Your future home knows you well. Where you keep the keys; when you're home early. Who watches the most TV; how you prepare lasagna. If its walls could only speak, they would offer cues and instructions useful for managing everyday life.
A new Intel initiative in ubiquitous computing will develop applications that achieve just that kind of smart home. By continuously monitoring and "learning" an environment, a task or a person's state, the systems will adapt and attend to the personal habits and contexts of consumers as never before possible.
"A cellphone can figure out where you are within 200 feet. You can download a pedometer [app that tracks] how many steps you take in a day. We want to push beyond that, see what people do, what their preferences are," said Anthony LaMarca, Intel scientist and co-project leader.
In this new scheme, unobtrusive sensors would collect visible, radio and thermal radiation to build a picture of what activities occur in the house, and when. That awareness would then help apps both recognize those activities and assist with them.
One app already in the works, the "Family Coordination" app, aims to perpetually recognize and reflect on the activities of a household, intervening in tasks like alarm clock setting and schedule delegating if it detects changes in daily routines or heavy rush-hour traffic, according to a white paper from the Intel Science and Technology Center for Pervasive Computing. A child packing his own lunch could prompt the system to let his father sleep in an extra 10 minutes. Another function would assist with tasks like cooking, pairing real-time recognition of how you're chopping with domain knowledge such as "what a julienned carrot looks like," and even sharing the experience with friends online, LaMarca said.
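The white paper does not publish an implementation, but the alarm-clock example above amounts to a rule-based adjustment over observed household state. A toy sketch of that idea, with all names, fields and minute offsets purely hypothetical:

```python
from dataclasses import dataclass

@dataclass
class HouseholdState:
    """Facts the system might infer from its sensors (illustrative only)."""
    child_packed_own_lunch: bool
    heavy_traffic: bool

def adjust_alarm(base_alarm_minutes: int, state: HouseholdState) -> int:
    """Return an alarm time (minutes after midnight) nudged by routine changes."""
    alarm = base_alarm_minutes
    if state.child_packed_own_lunch:
        alarm += 10   # one less morning task, so sleep in 10 extra minutes
    if state.heavy_traffic:
        alarm -= 15   # wake earlier to absorb the rush-hour delay
    return alarm

# A usual 6:30 alarm is 390 minutes after midnight; the child packed lunch.
print(adjust_alarm(390, HouseholdState(child_packed_own_lunch=True,
                                       heavy_traffic=False)))  # 400, i.e. 6:40
```

The real system would presumably weigh many more signals, but the structure, observed state in, small schedule adjustments out, is the same.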
User values such as "spending time together as a family" or "caretaking a person with Alzheimer's" would direct the app's adjustments and presentations of patterns.
"An alert could pop up when you are leaving, 'Don't you have your yoga bag on Tuesdays?' when you just have your briefcase," LaMarca said.
With current mobile devices already carrying accelerometers and microphones, and with the depth cameras needed for the next generation already on the market, the remaining challenge for Intel is developing algorithms powerful enough to interpret the available data.
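To give a flavor of what "interpreting the available data" means at its simplest: raw accelerometer samples can be reduced to a motion-intensity estimate and matched against coarse activity labels. This threshold-based sketch is not Intel's method; the thresholds and labels are illustrative assumptions, and research systems use learned classifiers over far richer features:

```python
import math

def classify_activity(samples):
    """Crude activity guess from a window of (x, y, z) accelerometer
    samples measured in g. Thresholds are illustrative only."""
    # Magnitude of each sample minus gravity (~1 g) approximates motion intensity.
    intensities = [abs(math.sqrt(x*x + y*y + z*z) - 1.0) for x, y, z in samples]
    mean_intensity = sum(intensities) / len(intensities)
    if mean_intensity < 0.05:
        return "still"
    if mean_intensity < 0.5:
        return "walking"
    return "running"

# A phone lying flat reads roughly (0, 0, 1 g) with no motion.
print(classify_activity([(0.0, 0.0, 1.0)] * 50))                    # still
print(classify_activity([(0.2, 0.1, 1.2), (0.0, -0.2, 0.8)] * 25))  # walking
```

Recognizing "a child packing his own lunch" rather than "walking" is what separates today's pedometer apps from the algorithms the project is after.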