by Anton Shilov
07/30/2010 | 03:47 PM
Motion detection-based user interfaces have been exciting minds for a long time. But it looks like “Minority Report”-like user interfaces may be a little closer than we think. Nvidia Corp. on Friday confirmed that it is researching technologies that could enable real-time motion-detection-based user interfaces.
All three leading makers of video game consoles have enabled real-time motion detection for new games using different technologies, emphasizing the importance of the idea in general. Toshiba Corp. and some other companies have enabled recognition of certain gestures on Cell processor-based devices, showing an evolution of the TV remote control. Microsoft Corp. is also expected to be working on user interfaces that can be controlled using gestures and motions. Hardware companies clearly recognize the need for interfaces based on motion detection: Intel Corp. recently announced the creation of a special lab that will work, among other things, on various types of interfaces for computer users, including, but not limited to, those based on motion detection. Apparently, Nvidia Corp. is also working on various motion tracking technologies aimed at the future.
"Nvidia Research is working on some projects which involve image processing and image/object tracking and recognition. We also are aware of a number of University and other efforts to use CUDA and GPU computing for this sort of task," said David Kirk, an Nvidia fellow and the former chief scientist of the company, in a brief conversation with X-bit labs.
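The kind of image tracking Mr. Kirk refers to typically starts with detecting which pixels changed between consecutive video frames. As a rough illustration only (not Nvidia's actual method), a minimal frame-differencing sketch in Python with NumPy might look like this; the function name `detect_motion` and the toy frames are assumptions for the example, and a real-time system would run the same per-pixel arithmetic on the GPU via CUDA:

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, threshold=30):
    """Flag pixels whose grayscale intensity changed by more than
    `threshold` between two frames (basic frame differencing)."""
    # Widen the dtype so the subtraction cannot wrap around
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold  # boolean "motion mask"

# Two toy 4x4 frames: a bright 2x2 block shifts one pixel to the right.
prev_frame = np.zeros((4, 4), dtype=np.uint8)
prev_frame[1:3, 0:2] = 200
curr_frame = np.zeros((4, 4), dtype=np.uint8)
curr_frame[1:3, 1:3] = 200

mask = detect_motion(prev_frame, curr_frame)
print(mask.sum())  # prints 4: the trailing and leading edges of the block
```

Because every pixel is processed independently, the workload is embarrassingly parallel, which is exactly why it maps well onto GPUs.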
Developers of graphics processing units (GPUs) have come a long way, and their chips are now far more flexible and capable than they were a decade ago. At present, both Nvidia and ATI (the graphics business unit of AMD) are working hard to accelerate consumer applications with their GeForce and Radeon GPUs, since today graphics processors are fully utilized only in video games. Having powered games, consumer-oriented applications, and some special-purpose business or professional applications (there are, in fact, applications that use CUDA and graphics chips for audio-related computations), it is logical for graphics processing units to go even further. If interacting with a PC comes to require the performance of a massively parallel GPU, then the future of both chip designers is bright.
But there are things that may not be as important as user interfaces, yet may still represent value for the end-user. Several years ago, a then-unknown company called Ageia began promoting its PhysX platform for processing physics effects on special-purpose hardware; ATI and Nvidia said that GPUs could handle physics effects processing as well, whereas companies like AMD and Intel rightly noted that both effects physics and gameplay physics were already processed successfully on central processing units. Eventually, Nvidia acquired Ageia and began offering GPU-assisted physics technologies to game developers.
"As you observed with physics, and also I will mention ray tracing, there are many things that the 'conventional wisdom' asserts GPUs will not do well. Both of those areas now benefit substantially from GPU acceleration. We certainly enjoy proving conventional wisdom wrong, but our real goal is to delight end-users and make a difference in the marketplace," added Mr. Kirk.
In fact, the Nvidia fellow believes that anything that is hard or complex to accomplish and involves a lot of data to process is a good candidate for GPU computing.
"I cannot delve into greater detail at this time, but we will share developments as they emerge," concluded David Kirk.