AT&T Showcases Innovations That Might Have a Future
Instant Translation in Text Messaging
Imagine texting the front desk of your hotel in France, or a colleague in Peru, and having your message go through in the language the recipient prefers (as dictated by the settings in the phone), and likewise their responses coming back to you in your preferred language. Researchers at AT&T Labs have created such a solution.
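The delivery flow described above can be sketched in a few lines: the network looks up each party's preferred language and translates the message before handing it off. Everything here is illustrative; the phone numbers, preference table, and `translate()` stub are invented stand-ins, since AT&T's actual service interface isn't described in detail.

```python
# Invented per-device language preferences (in the real system these
# would come from the settings in each user's phone).
PREFERRED_LANGUAGE = {
    "+33-1-555-0100": "fr",  # hotel front desk in France
    "+1-212-555-0199": "en", # traveling guest
}

def translate(text, source_lang, target_lang):
    """Stand-in for a cloud translation call; tags the text here."""
    if source_lang == target_lang:
        return text
    return f"[{source_lang}->{target_lang}] {text}"

def deliver_sms(sender, recipient, text):
    """Translate the message into the recipient's preferred language."""
    src = PREFERRED_LANGUAGE[sender]
    dst = PREFERRED_LANGUAGE[recipient]
    return translate(text, src, dst)

msg = deliver_sms("+1-212-555-0199", "+33-1-555-0100",
                  "Is late checkout possible?")
```

The same lookup runs in reverse for the reply, so neither party ever sees the other's language.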
Your Personal UN Translator
As with the text-messaging solution, researchers have used the cloud-based AT&T Watson speech engine to create near-real-time language translation capabilities from video. Imagine a live earnings report Webcast delivered in Japanese that can be understood by reporters and analysts in Korea, Argentina and beyond, all reading in their preferred languages in a box below the display.
Real-Time Communication in the Browser
IBIS, an innovation from the AT&T Foundry, offers developers tools to extend enterprise-grade services—once relegated to separate applications—all through the Web. In a demo, an airplane technician video-conferenced with an engineer at headquarters, who was able to pull up the service ticket and confer on a problem, without leaving his browser.
SafeCell is a solution to address the problem of distracted driving. An employer could use the solution to see if employees—delivery workers, say—are following the rules and staying off their phones while driving, and even how fast they're going. The solution can also block access to select applications, such as Facebook, while the vehicle is moving.
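The policy logic the caption describes boils down to two checks: deny selected apps while the vehicle is in motion, and flag speed readings for the employer's report. This is a minimal sketch, not AT&T's implementation; the app list and speed limit are assumptions.

```python
# Apps an employer has chosen to block while the vehicle is moving
# (illustrative list, not from the product).
BLOCKED_WHILE_MOVING = {"facebook", "sms"}
SPEED_LIMIT_MPH = 65  # assumed policy limit

def app_allowed(app, vehicle_speed_mph):
    """Deny listed apps whenever the vehicle is in motion."""
    if vehicle_speed_mph > 0 and app in BLOCKED_WHILE_MOVING:
        return False
    return True

def flag_speeding(speed_samples_mph):
    """Return the readings that exceed the configured limit."""
    return [s for s in speed_samples_mph if s > SPEED_LIMIT_MPH]
```

In practice the speed would come from the phone's GPS or the vehicle's telematics unit rather than being passed in directly.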
Simplifying Fleet Management
SafeCell can complement tracking and management records in AT&T fleet-management solutions and offer individual drivers scores on their performance.
Researchers at AT&T Labs have introduced a Voice Biometrics application programming interface (API) into a recently announced Alpha API program. In an example, to log into her bank account via mobile phone, a user might be asked to type in a pin and then read aloud a sentence.
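The login flow described—type a PIN, then read a sentence aloud—is two-factor authentication: something the user knows plus something the user is. The sketch below fakes the voiceprint comparison with a toy similarity score; the real Voice Biometrics API's interface isn't public here, so every name and threshold is illustrative.

```python
STORED_PIN = "4921"     # assumed enrolled PIN
MATCH_THRESHOLD = 0.8   # assumed minimum voiceprint similarity

def voiceprint_score(enrolled, sample):
    """Toy stand-in for voiceprint matching: fraction of
    positions where the two feature strings agree."""
    matches = sum(1 for a, b in zip(enrolled, sample) if a == b)
    return matches / max(len(enrolled), len(sample))

def authenticate(pin, enrolled_print, live_sample):
    """Both factors must pass: knowledge (PIN) and voice (biometric)."""
    if pin != STORED_PIN:
        return False
    return voiceprint_score(enrolled_print, live_sample) >= MATCH_THRESHOLD
```

A real system would compare acoustic feature vectors extracted from the spoken sentence, not raw strings, but the two-gate structure is the same.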
Another solution offered the visual equivalent of Shazam, the music-identification service. Curious about a painting or a building? Take a photo with your device, and Visual Search can pull up information about it. The technology could also be used to organize photos stored in a cloud, whether by location or faces. Here, a researcher swipes through all the photo responses received to a blurry photo he took of a building in Oxford.
Visual Search in Video
The same solution could also be used within video. Sit down late to a program in progress? A search of a snippet could offer information on the movie or program and get you up to speed, the researcher, Zhu Liu, explained. Or, should you stop a movie and go back to it later, a searched snippet could act as a bookmark.
Water Cooler Opportunities for Remote Workers
Ambient Communication is a way to encourage communication, collaboration and maybe even just conversation among far-flung employees. All that's needed is a browser, a display and a camera. At any time, a user can click into one of the "bubbles" to start or join a conversation over audio.
For those who'd feel weird about having a live camera on them all day, there's a black-and-white silhouette option that makes clear enough whether someone's on the phone or available. It's also possible for a user to be alerted when a conversation is taking place in which a designated key phrase is mentioned several times.
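The key-phrase alert described above amounts to counting mentions in a rolling transcript and firing once the count crosses a threshold. The caption only says "several times," so the threshold here is an assumption, as is the idea that a speech-to-text transcript is the input.

```python
ALERT_THRESHOLD = 3  # assumed meaning of "several times"

def should_alert(transcript, phrase):
    """Fire once the designated phrase appears at least
    ALERT_THRESHOLD times in the conversation transcript."""
    return transcript.lower().count(phrase.lower()) >= ALERT_THRESHOLD
```

A production version would likely window the transcript so old mentions age out rather than accumulating forever.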
Eco Space Mobile
Eco Space hopes to literally put into people's hands the tools to make more environmentally friendly choices. One can find the nearest place to plug in an electric car, track one's carbon footprint and be given a "green score." Competition, it's hoped, could lead to smarter choices, whether for individuals or businesses.
Here, Ruggero Scorcioni, a participant in an AT&T Mobile App Hackathon, explains his creation, Good Times. It combines AT&T's call management and machine-to-machine (M2M) APIs with a headset that records brainwaves, all toward the goal of eliminating phone call distractions during key periods of focus. When Good Times senses that a user is deep in thought, it automatically redirects a call.
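The routing decision Good Times makes can be sketched as a simple threshold on the headset's focus reading: above it, the call diverts; below it, the phone rings. The numeric scale, threshold, and voicemail destination are all invented for illustration—the hackathon project's actual signals and call-management calls aren't specified here.

```python
FOCUS_THRESHOLD = 0.7  # assumed cutoff on a 0.0-1.0 focus reading

def route_call(caller, focus_level):
    """Divert the call while the headset reports deep focus;
    otherwise let the phone ring normally."""
    if focus_level >= FOCUS_THRESHOLD:
        return ("voicemail", caller)
    return ("ring", caller)
```

In the demo this decision would be pushed to the network via the call-management API, so the phone never rings at all during deep focus.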