Dec 27 2012

IBM Research Predicts Computers Will Have All 5 Senses in the Future

Touch, sight, smell, hearing and taste — coming soon to a computer near you.

People have long dreamed of a computer that would allow them to touch or smell things in the same way they’ve been able to hear and see things with radio and TV.

But IBM believes there will be a great shift in human–computer sensory interaction in the next five years. The company is calling these predictions the 5 in 5.

In a post on the company’s Smarter Planet blog, Bernard Meyerson, a chief innovation officer at IBM, provides a real-world example of how the human senses would be useful to a machine.

Today, if you put a robotic track inspector in a railroad tunnel and equipped it with a video camera, it would not know what to make of an oncoming train.

But what if you enabled it to sense things more like humans do: not just vision from the video camera, but also the ability to detect the rumble of the train and the whoosh of air?

And what if you enabled it to draw inferences from the evidence that it observes, hears and feels? That would be one smart computer: a machine able to get out of the way before the train smashed into it.

Considering IBM’s own breakthrough with its artificial intelligence project, Watson, it’s safe to say that the company is on a mission to make significant advances in machine learning technology.

Here’s how the company sees computers embracing and enhancing the five senses with technology:

1. We’ll Be Able to Touch Through Our Smartphones

Right now, we’re able to touch and interact with virtual objects through the screens of our smartphones, but we can’t feel texture the way we do in the real world.

Robyn Schwartz, a retail industry expert for IBM, sees texture playing a big part in the future of touch technology. She believes that vibrations can convey textures through a smartphone, so that the textures are recognizable to the brain.

“If you think about buying a shirt online, we can use different technologies like vibration, like being able to manage vibration through an understood lexicon of texture, to be able to use vibration to translate burlap versus linen, versus silk,” she says.
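The “lexicon of texture” Schwartz describes can be pictured as a lookup table from fabric names to haptic playback parameters. Here is a minimal sketch of that idea; the fabric entries, parameter names and values are invented for illustration, not IBM’s actual mapping.

```python
# Hypothetical "vibration lexicon": each fabric maps to a vibration
# pattern (frequency in Hz, amplitude 0-1, pulse length in ms) that a
# phone's haptic motor could play back to suggest texture.
VIBRATION_LEXICON = {
    "burlap": {"frequency_hz": 40, "amplitude": 0.9, "pulse_ms": 120},  # coarse, heavy
    "linen":  {"frequency_hz": 90, "amplitude": 0.5, "pulse_ms": 60},   # medium weave
    "silk":   {"frequency_hz": 220, "amplitude": 0.2, "pulse_ms": 20},  # fine, smooth
}

def pattern_for(fabric: str) -> dict:
    """Return the haptic pattern for a fabric, falling back to a neutral buzz."""
    default = {"frequency_hz": 120, "amplitude": 0.4, "pulse_ms": 40}
    return VIBRATION_LEXICON.get(fabric.lower(), default)
```

A shopping app could call `pattern_for("silk")` while the user's finger rests on a product photo and feed the result to the phone's vibration motor.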


2. Computers Will Be Able to Understand Images

With barcode technology and QR codes, computers can already “see” certain things, but it requires humans to write software and unique languages to support that kind of vision. What if we could show a computer a scene of a beach and have it actually understand the scene?

Dr. John R. Smith, senior manager for IBM’s Intelligent Information Management department, believes computers can be trained to see in the same way as humans.

"In cognitive computing, a computer is basically taught to understand photos by being given examples, and it basically learns to detect the patterns that matter. So it could be for a beach scene, the color is very important. However, for another kind of scene, like a downtown city scene, well, perhaps it's edge information, something completely different,” he says.


3. Machine Learning Will Help Us Make Tastier, Healthier Food

What humans perceive as flavor and taste is really just a series of interactions between chemicals, taste buds and neurons. So if we could train computers to understand what humans perceive as “good” flavors while at the same time adhering to the tenets of a good diet, we’d be able to actually make healthy food that tastes good.

Dr. Lav Varshney, research scientist for IBM, believes it’s possible, and that big data can help lead the way.

"In the future, a computer will be able to access large repositories of data that tell us about the chemical structure of various ingredients. It will be able to tell us about what humans perceive in terms of the flavors and then be creative and actually put everything together,” he says.


4. Cognitive Computing Will Be Able to Sniff Out Trouble

We’ve longed for smell-o-vision for decades. Now, computers might be one step closer to growing an artificial nose.

Dr. Hendrik Hamann, research manager for physics analytics at IBM, believes that computers will be able to use smell to diagnose health problems before patients are aware they’re ill.

"It could smell things around you, maybe your breath. Your phone might know that you have a cold before you do,” he says.


5. Big Ears Will Lead the Way to Big Data

When we think about unstructured data, it’s usually text-based content. But what if your computer could take the data it “hears” and turn that data into information that it could then act upon? For example, the computer could listen to the conference room, hear that it’s silent and know that someone forgot to shut off the conferencing device. In response, the computer could automatically shut down the equipment.
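The conference-room scenario can be sketched as a simple silence detector: compute the root-mean-square (RMS) level of successive audio windows and power off once several consecutive windows fall below a threshold. The threshold and window count below are arbitrary placeholders, and the shutdown itself is left to a hypothetical device API.

```python
import math

def rms(samples):
    """Root-mean-square level of one window of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def should_power_off(sample_windows, threshold=0.01, quiet_windows_needed=3):
    """True once the last N windows are all below the silence threshold,
    i.e. the room has been quiet long enough to shut the equipment down."""
    recent = sample_windows[-quiet_windows_needed:]
    return (len(recent) == quiet_windows_needed
            and all(rms(w) < threshold for w in recent))
```

Requiring several consecutive quiet windows, rather than one, keeps a brief pause in conversation from cutting the call.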

Dimitri Kanevsky, a master inventor for IBM, believes that hearing sensors could help detect mudslides in Brazil.

"One solution from IBM labs to solve this is to put sensors that hear sound, so they can hear some movement in the mountains. It can predict that maybe a flood is coming," he says.
