IBM Targets Five Innovations to Change the World in Five Years
Current uses of haptic and graphic technology in the gaming industry take the user into a simulated environment, IBM said. The opportunity, and the challenge, is to make the technology so ubiquitous and so interwoven into everyday experiences that it brings greater context to our lives, the company said. This technology will become commonplace in our everyday lives, turning mobile phones into tools for natural, intuitive interaction with the world around us.

Prediction Two: Sight: A Pixel Will Be Worth a Thousand Words

Computers today understand pictures only through the text used to tag or title them; the majority of the information, the actual content of the image, is a mystery, IBM said. In the next five years, systems will not only look at and recognize the contents of images and visual data, they will turn the pixels into meaning, beginning to make sense of them much as a human views and interprets a photograph. In the future, "brain-like" capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media. This will have a profound impact on industries such as health care, retail and agriculture. Within five years, these capabilities will be put to work in health care, making sense of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture information tailored to particular anatomy or pathologies, IBM said. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement.
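To make the "edge information" feature concrete, here is a minimal sketch in pure Python. It computes per-pixel gradient magnitude with finite differences on a tiny grayscale grid; the function name and the toy image are illustrative assumptions, not part of IBM's system, and real systems use far richer learned features, but the idea of turning raw pixels into structure is the same.

```python
# Hypothetical sketch: one simple visual feature ("edge information")
# computed directly from pixels. Not IBM's method; a toy illustration.

def edge_magnitude(img):
    """Return per-pixel gradient magnitude (interior pixels only)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]  # horizontal difference
            gy = img[y + 1][x] - img[y - 1][x]  # vertical difference
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A dark/bright vertical split: large values concentrate at the boundary.
img = [[0, 0, 0, 9, 9, 9] for _ in range(5)]
edges = edge_magnitude(img)
```

Here the interior pixels adjacent to the dark/bright boundary get a large gradient magnitude while uniform regions stay at zero, which is exactly the kind of low-level structure a vision system builds on.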
By being trained to discriminate what to look for in images, such as differentiating healthy from diseased tissue, and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy.

Prediction Three: Hearing: Computers Will Hear What Matters

Within five years, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies, IBM said. It will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.

Sensors will detect raw sounds; much as the human brain does, a system receiving this data will take into account other "modalities," such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and its ability to recognize patterns. For example, "baby talk" will be understood as a language, telling parents or doctors what infants are trying to communicate, IBM predicts. And in the next five years, by learning about emotion and being able to sense mood, systems will pinpoint aspects of a conversation and analyze pitch, tone and hesitancy to help us have more productive dialogues that could improve customer call center interactions or allow us to interact seamlessly across different cultures.
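The pipeline described above, raw sound in, features out, a decision made against learned patterns, can be sketched in a few lines of pure Python. Everything here is a toy assumption: the two features (loudness and zero-crossing rate), the class names, and the "learned" prototype values are invented for illustration and are not how IBM's system works.

```python
import math

# Hypothetical sketch of sound classification against learned patterns:
# extract simple features from a clip, then pick the nearest class prototype.

def features(samples):
    """Two basic features: RMS loudness and zero-crossing rate."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / (n - 1)
    return (rms, zcr)

def classify(samples, prototypes):
    """Assign the clip to the class whose prototype features are nearest."""
    f = features(samples)
    return min(prototypes,
               key=lambda c: sum((x - y) ** 2 for x, y in zip(f, prototypes[c])))

# Toy "learned" prototypes: a rumble is loud with few zero crossings;
# a creak is quiet with many zero crossings.
prototypes = {"rumble": (0.8, 0.05), "creak": (0.1, 0.6)}

# A loud, slowly oscillating signal should come out as a rumble.
clip = [0.9 * math.sin(2 * math.pi * 2 * t / 100) for t in range(100)]
label = classify(clip, prototypes)
```

A production system would replace the hand-picked features with learned representations and fuse in the other "modalities" the article mentions, but the nearest-pattern decision step is the same shape.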