
IBM 5 in 5

by Mark Cox
Major technological change can drive channel transitions. In this article, republished with permission from eChannelLine, Integratedmar.com Managing Editor Mark Cox writes about IBM's seventh annual "IBM 5 in 5" projections: innovations that have the potential to change the way people work, live and interact over the next five years.

The IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM's R&D labs around the world that can make these transformations possible.
"This isn't science fiction," said Don Aldridge, GM, Research & Life Sciences, IBM Canada. "These are all things where there is work going on today, at some level, but where we believe they will intensify in importance because they will be combined with more data, or will be ramped up because the cost has come down." IBM believes that all five trends will become market realities at some point within the next five years.

The first forecast trend is the ability to touch things through your phone.

"You can picture almost any form of haptic feedback, but the ability to do it on your phone is kind of cool," Aldridge said. "In the consumer world, being able to touch and feel things is kind of important." One example would be feeling the satin or silk of a wedding dress you are shopping for through the surface of your phone screen. Another would be surgeons and physicians performing digital exams.
IBM scientists are developing applications for the retail, healthcare and other sectors using haptic, infrared and pressure-sensitive technologies to simulate touch. Utilizing the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short fast patterns, or longer and stronger strings of vibrations. The vibration pattern will do things like differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material.
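The idea of encoding a texture as a vibration pattern can be sketched in a few lines. The sketch below is purely illustrative: the fabric names come from the article, but the pulse durations and intensities are invented placeholders, not IBM's actual haptic encoding.

```python
# Hypothetical sketch: mapping fabric textures to phone vibration patterns.
# Each pattern is a list of (duration_ms, intensity) pulses. Smooth fabrics
# get short, fast, light pulses; coarser fabrics get longer, stronger ones.
# All values here are illustrative assumptions.

TEXTURE_PATTERNS = {
    "silk":   [(20, 0.2)] * 8,   # short fast pattern
    "satin":  [(25, 0.25)] * 6,
    "linen":  [(80, 0.7)] * 3,   # longer, stronger string of vibrations
    "cotton": [(60, 0.5)] * 4,
}

def vibration_pattern(texture):
    """Return the pulse sequence that simulates touching `texture`."""
    try:
        return TEXTURE_PATTERNS[texture]
    except KeyError:
        raise ValueError(f"no touch profile for {texture!r}")

def total_duration_ms(pattern):
    """Total length of the haptic playback, in milliseconds."""
    return sum(duration for duration, _ in pattern)
```

On a real handset, a pattern like this would be handed to the platform's vibration API (for example, Android's waveform-style vibration calls) for playback.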

Secondly, in the next five years, systems will not only be able to look at and recognize the contents of images and visual data, but will turn the pixels into meaning, beginning to make sense out of it similar to the way a human views and interprets a photograph.
Computers today only understand pictures by the text we use to tag or title them; the majority of the information -- the actual content of the image -- is a mystery.

"This takes what we have now and puts it on steroids," Aldridge said. "Today we do feature ID on photos in things like the Neptune Project, where we have underwater cameras as a source of data. Today grad students sit and identify the features. We are working on doing it automatically. It's a challenge because things don't look the same at different angles."

The idea here is that "brain-like" capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media. This will have a profound impact for industries such as healthcare, retail and agriculture.
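One of the "features" mentioned above, edge information, can be illustrated with a minimal gradient computation. This is a toy sketch in pure Python, not IBM's technology: a real system would use trained models and proper computer-vision libraries.

```python
# Illustrative sketch of one low-level visual feature: edge strength,
# computed from horizontal and vertical brightness gradients on a tiny
# grayscale image (a 2-D list of pixel values).

def edge_strength(image):
    """Return a map of gradient magnitudes for the interior pixels."""
    h, w = len(image), len(image[0])
    edges = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]   # horizontal gradient
            gy = image[y + 1][x] - image[y - 1][x]   # vertical gradient
            edges[y][x] = (gx * gx + gy * gy) ** 0.5
    return edges

# A 5x5 image with a bright vertical stripe: the detector responds on
# either side of the stripe, not inside the flat regions.
img = [
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
]
edges = edge_strength(img)
```

The gap between this and what the article describes is exactly the point: extracting gradients is easy, while turning them into meaning ("a person who has been hurt") is the hard, "brain-like" part.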

"Computers will look at things in context, like we do," Aldridge said. "Computers are able to do feature extraction now, but they don't see things they aren't looking for. Traffic cameras in the future will have a lot more smarts, like being able to see people who have been hurt."

Aldridge said this ability of computers to learn -- a word he preferred to 'think,' which is a more human concept -- is growing significantly.

"Watson was revolutionary, but this kind of ability to learn is just starting, as compared to computers that do things quickly," he said.

Within five years, these capabilities will be put to work in healthcare by making sense out of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture information tailored to particular anatomy or pathologies. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images -- such as differentiating healthy from diseased tissue -- and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy.

The third trend is improved ability of computers to hear. Within five years, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. It will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.

"Computers will be able to hear what matters," Aldridge said. "Sensors today can detect sound at a higher pitch than humans."

Sensors will detect raw sounds, and a system that receives this data will, much like the human brain, take into account other "modalities," such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and the ability to recognize patterns.

For example, "baby talk" will be understood as a language, telling parents or doctors what infants are trying to communicate. Sounds can be a trigger for interpreting a baby's behavior or needs. By being taught what baby sounds mean - whether fussing indicates a baby is hungry, hot, tired or in pain - a sophisticated speech recognition system would correlate sounds and babbles with other sensory or physiological information such as heart rate, pulse and temperature.
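The fusion step described above -- correlating a classified sound with vitals like heart rate and temperature -- can be sketched as a simple rule-based function. Everything here is a hypothetical illustration: the sound categories, thresholds and conclusions are invented for the example, and a real system would learn these correlations rather than hard-code them.

```python
# Hypothetical sketch: fuse an acoustic label for a baby's cry with
# physiological readings to guess the likeliest cause. The labels,
# thresholds, and rules below are illustrative assumptions only.

def interpret_cry(sound_label, heart_rate_bpm, temp_c):
    """Combine a classified sound with vitals to pick a likely cause."""
    if temp_c >= 38.0:
        # temperature overrides acoustic evidence
        return "possible fever -- check temperature"
    if sound_label == "sharp_cry" and heart_rate_bpm >= 160:
        return "possible pain -- elevated heart rate"
    if sound_label == "rhythmic_fussing" and heart_rate_bpm < 140:
        return "likely hungry"
    if sound_label == "low_whimper":
        return "likely tired"
    return "unclear -- keep monitoring"
```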

A fourth trend is using computers to enhance taste.

"I don't like broccoli and I don't think a computer can make me like it if I don't like it," Aldridge said. "So can we identify tastes people do like and impart flavors people like to get them to eat things? It's all trial and error at this stage."

IBM researchers are developing a computing system that actually experiences flavor, to be used with chefs to create the tastiest and most novel recipes. It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavors and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham. The result would be making foods like broccoli taste delicious by imbuing them with other flavors.
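The pairing logic described above -- scoring ingredient combinations by the flavor chemistry they share -- can be shown with a toy model. The compound sets below are invented placeholders, not real chemistry data, and this is a sketch of the general "food pairing" idea rather than IBM's actual system.

```python
# Toy model of molecular flavor pairing: score how well two ingredients
# combine by counting the flavor compounds they share. The compound
# lists are illustrative placeholders, not real chemistry data.

FLAVOR_COMPOUNDS = {
    "roasted chestnut": {"pyrazine", "furaneol", "vanillin"},
    "cooked beetroot":  {"geosmin", "furaneol"},
    "fresh caviar":     {"trimethylamine", "bromophenol"},
    "dry-cured ham":    {"pyrazine", "methional", "furaneol"},
}

def pairing_score(a, b):
    """Number of flavor compounds two ingredients have in common."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

def best_pairings(ingredient):
    """Rank every other ingredient by shared-compound count."""
    others = (k for k in FLAVOR_COMPOUNDS if k != ingredient)
    return sorted(others,
                  key=lambda o: pairing_score(ingredient, o),
                  reverse=True)
```

A real system would also weight each pairing by learned human preferences, which is where the "psychology" side of the article's description comes in.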

Aldridge said that this is far removed from genetically modified foods and their controversies.

"There's potential for controversy in anything, but it's not the same as getting into the genetically modified world," he said. "What we are talking about at this stage is what makes food taste the way it does."

The final projection is that computers will have a sense of smell.

"All kinds of existing sensors smell today," Aldridge said. "They detect things like chlorine in the air, and they can search luggage for drugs, although they are not as accurate as a dog. But today they look for one particular scent -- one molecular pattern, and going forward, that needs to be much broader in range."

This ability to look for many scents will permit doctors to analyze odors, biomarkers and thousands of molecules in someone's breath, so they can diagnose and monitor the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odors are normal and which are not. Tiny sensors embedded in your computer or cell phone will also detect if you're coming down with a cold or other illness.

"Doctors have been smelling breath and doing diagnostics for centuries, because there are certain things you can smell," Aldridge said. "But much like in the genome world, there's a massive amount of information that we haven't figured out how to read yet. Reading these chemical imprints may find things we don't even know we are looking for."

Today IBM scientists are already sensing environmental conditions and gases to preserve works of art. This innovation is beginning to be applied to tackle clinical hygiene, one of the biggest challenges in healthcare today. For example, antibiotic-resistant bacteria such as Methicillin-resistant Staphylococcus aureus (MRSA), which in 2005 was associated with almost 19,000 hospital stay-related deaths in the United States, is commonly found on the skin and can be easily transmitted wherever people are in close contact. In the next five years, IBM technology will "smell" surfaces for disinfectants to determine whether rooms have been sanitized. Using novel wireless "mesh" networks, data on various chemicals will be gathered and measured by sensors, and the system will continuously learn and adapt to new smells over time.
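The "smell check" just described reduces to two pieces: an aggregation rule over sensor readings, and a way for the network to learn what a room normally smells like. Here is a hedged sketch; the threshold, units, and the moving-average learning rule are illustrative assumptions, not details from IBM.

```python
# Hedged sketch of mesh-network disinfectant sensing. Sensor nodes
# report disinfectant concentrations (assumed units: parts per million);
# an aggregator decides whether a room reads as sanitized. The threshold
# and the learning rule are illustrative assumptions.

SANITIZED_THRESHOLD_PPM = 5.0  # assumed minimum disinfectant level

def room_is_sanitized(readings_ppm):
    """A room passes only if every sensor reads above the threshold."""
    return bool(readings_ppm) and min(readings_ppm) >= SANITIZED_THRESHOLD_PPM

def update_baseline(baseline, reading, alpha=0.1):
    """Exponential moving average -- a crude stand-in for how the
    network could 'learn and adapt to new smells over time'."""
    return (1 - alpha) * baseline + alpha * reading
```

Taking the minimum rather than the average means one missed corner of the room fails the whole check, which matches the hygiene use case.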

Reprinted with the permission of eChannelLine, which retains all copyrights and other intellectual property rights.

Last modified on 6/30/2013 10:11:20 PM
