The future of computing lies in rich, context-driven user experiences, according to Justin Rattner, chief technology officer of Intel Corp. Future computers - whether mobile, portable or desktop - will be able to determine people's needs at a particular time and place.
Mr. Rattner described how context awareness is poised to fundamentally change how we interact with and relate to information devices and the services they provide. As computing devices gain processing power, improved connectivity and innovative sensing capabilities, Intel researchers are focused on delivering new “context-aware” user experiences. Context-aware devices will anticipate your needs, advise you, and guide you through your day in a manner more akin to a personal assistant than a traditional computer. Context-aware computing, via a combination of hard and soft sensors, will open up new opportunities for developers to create the next generation of products on Intel platforms.
“Imagine a device that uses a variety of sensory modalities to determine what you are doing at an instant, from being asleep in your bed to being out for a run with a friend. By combining hard sensor information, such as where you are and the conditions around you, with soft sensor information, such as your calendar, your social network and past preferences, future devices will constantly learn about who you are, how you live, work and play. As your devices learn about your life, they can begin to anticipate your needs,” explained the CTO of the world's largest chipmaker.
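The fusion Rattner describes - combining hard sensor readings with soft, app-derived signals to infer what a user is doing - might be sketched as follows. This is a minimal illustration only; every class, field, and rule here is a hypothetical assumption, not an Intel API.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical hard-sensor reading: physical signals from the device itself.
@dataclass
class HardSensors:
    location: str   # e.g. "home", "park"
    motion: str     # e.g. "still", "running"
    hour: int       # local hour of day (0-23)

# Hypothetical soft-sensor reading: data drawn from apps and services.
@dataclass
class SoftSensors:
    calendar_event: Optional[str]  # title of the next event, if any
    frequent_run_buddy: bool       # learned from past running preferences

def infer_activity(hard: HardSensors, soft: SoftSensors) -> str:
    """Fuse hard and soft signals into a guess about the user's activity."""
    if hard.location == "home" and hard.motion == "still" and hard.hour < 6:
        return "asleep"
    if hard.motion == "running":
        return ("out for a run with a friend"
                if soft.frequent_run_buddy else "out for a run")
    if soft.calendar_event:
        return "heading to: " + soft.calendar_event
    return "unknown"

print(infer_activity(HardSensors("park", "running", 7),
                     SoftSensors(None, frequent_run_buddy=True)))
# → out for a run with a friend
```

Real systems would of course replace these hand-written rules with learned models, but the shape - hard signals plus soft signals in, an activity guess out - is the same.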
In general, personal computers - big and small - should become personal assistants that help people every day.
"Imagine your PC advising you leave the house 10 minutes early for your next appointment due to a traffic tie-up on your way to work. Consider a “context aware” remote control that instantly determines who is holding it and automatically selects the smart TV preferences for that person,” said Mr. Rattner.
Rattner also showed the Socially ENabled Services (SENS) research project, which can sense and understand your real-time activities and, if you choose, share that knowledge “live and direct” with networked friends and family through animated avatars on whatever screen is handy, be it PC, smartphone, or TV.
“While we’re developing all of these new ways of sensing, gathering and sharing contextual data, we are even more focused on ensuring privacy and security as billions of devices get connected and become much smarter. Our vision is to enable devices to generate and use contextual information for a greatly enhanced user experience while ensuring the safety and privacy of an individual’s personal information. Underlying this new level of security are several forthcoming Intel hardware-enabled techniques that dramatically improve the ability of all computing devices to defend against possible attacks,” Rattner said.
At the end of his keynote, Rattner presented the ultimate example of sensing – a human brain-computer interface. Through the Human Brain project, Intel’s aim is to enable people to one day use their thoughts to directly interact with computers and mobile devices. In a joint project with Carnegie Mellon University and the University of Pittsburgh, Intel Labs is investigating what can be inferred about a person's cognitive state from their pattern of neural activity.