General-purpose computing on graphics processing units (GPGPU) is a hot topic these days. Thanks to their massively parallel architecture, GPUs can process certain multi-threaded tasks much more rapidly than central processing units. But while GPGPU technologies provide a number of benefits, they can also do harm if GPUs execute malicious code. Intel Corp. believes that GPU viruses are about to emerge and is taking steps to ensure that its forthcoming Larrabee chip is secure enough.

“Without any doubts, as graphics processors get more complex, the question of security will become more and more significant and, perhaps, we will witness the first viruses for the GPU. This is why we are carefully studying all the possibilities to ensure [appropriate] security with Larrabee, both on API and on driver levels,” said Philipp Gerasimov, a software development relations specialist at Intel, during a public discussion of Larrabee at the Russian-language Habrahabr web-site.

At present graphics cards cannot delete or infect files on personal computers or steal confidential data due to limitations of driver models. Still, as GPUs are gaining functionality these days (e.g., Nvidia recently announced that its GeForce chips will be able to accelerate playback of Flash-based content), sooner or later hackers will find ways to use those capabilities in a harmful way. Moreover, since Intel's Larrabee is x86-compatible, and Nvidia promises to run C++ or Fortran code on its next-generation GPUs, virus developers are getting a whole new set of opportunities.

But the threat of GPGPU viruses should not be overestimated. Even though GPUs will be able to execute malicious programs (in fact, certain combinations of shaders can already damage graphics chips that are not properly cooled), there will always be a far more convenient executor of harmful programs: the microprocessor, which will remain inside every personal computer.
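To see why general-purpose GPU capability appeals to attackers in the first place, it helps to look at the shape of the workloads involved. Below is a minimal, benign, CPU-only Python sketch of a brute-force hash search (all names are illustrative and not from any real attack tool); each candidate is checked independently of the others, which is exactly the embarrassingly parallel pattern that a GPGPU port could spread across thousands of threads to test candidates simultaneously.

```python
import hashlib
import string
from itertools import product

def crack(target_hex, length=4, alphabet=string.ascii_lowercase):
    """Exhaustively test every candidate string of the given length.

    Each candidate is independent of all others -- the embarrassingly
    parallel shape that maps naturally onto a GPU, where a GPGPU port
    would simply assign a slice of the search space to each thread.
    """
    for combo in product(alphabet, repeat=length):
        candidate = "".join(combo)
        if hashlib.md5(candidate.encode()).hexdigest() == target_hex:
            return candidate
    return None  # not found in this search space

# Benign demo: recover a known 4-letter string from its MD5 digest.
target = hashlib.md5(b"test").hexdigest()
print(crack(target))  # prints: test
```

On a single CPU core this loop tests the 26^4 = 456,976 candidates one at a time; the appeal of a GPU for this class of computation is simply that the same independent checks can run in parallel, which is why distributed password cracking is often cited as a likely abuse of GPGPU-capable machines.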

Tags: Intel, Larrabee, GPGPU, Radeon, Geforce


Comments currently: 4
Discussion started: 10/07/09 08:53:43 PM
Latest comment: 10/12/09 01:19:42 PM


Here is a more likely scenario: Windows gets infected with a trojan, and then a bunch of systems turn into zombies, allowing the trojan to use GPGPU-capable machines for mass distributed computing to crack encryption algorithms. This, however, is not a hardware issue, but a complete lack of competence in OS security.

This is just Intel's way of putting FUD in people's minds about cGPU solutions, because they already see Larrabee getting pummeled by the competition.
[Posted by: deanjo | Date: 10/07/09 08:53:43 PM]

This is just Intel's way of putting FUD in people's minds about cGPU solutions, because they already see Larrabee getting pummeled by the competition.

:rofl: Totally true. An overburdened, zombie-reincarnated x86 GPU isn't really something to be proud of.

But still, nothing before the G300 (if they keep their word) can compete with the inherited CPU precision and versatility. And Larrabee needs a specially optimized compiler, so it is nothing with true legacy support in mind, as Intel intends to make us dream with their x86 blob.
[Posted by: OmegaHuman | Date: 10/12/09 01:13:44 PM]

deanjo is right on. Intel has nothing, and the day and age of GPU computing is here. LRB has been seen one time - doing ray-tracing badly and with artifacts.

Naturally, they have to make GPU computing seem unsafe and bad, since they can't do it, and won't be able to for a long time, if ever. I laugh when I see PCs with the 4500MHD advertised as capable of 3D games and 1080p. They can do neither, and the drivers are so awful that even if the hardware were able, the compatibility is not there. They need to get out of the graphics business. When HD Flash starts being accelerated by GPUs, there are going to be a lot of very angry customers who bought PCs with Intel's junk graphics.
[Posted by: Equinox | Date: 10/09/09 07:43:50 AM]

I laugh when I see PCs with the 4500MHD advertised as capable of 3D games and 1080p.

You don't even know how wrong you are. But the real problem is that they promised HD decoding with the X3000 (G965), and that's something that isn't true. Not counting 720p (H.264), which is lightweight on every SSE3-based dual-core CPU at 1.5GHz and above (not counting the P4D class). Of course, while decoding 1080p the X4500 uses up to 80% of a poor 45nm Penryn-based Celeron at 1.8GHz, but they're still capable.

On the other hand, 3D gaming on the X4500MHD at 800x600 resolution, even with some DX6.0-based manager games (arcade), has serious issues. But they're OpenGL 2.0 compliant, and that's a much more lightweight category than the old DX6-DX9.
[Posted by: OmegaHuman | Date: 10/12/09 01:19:42 PM]


