Nvidia Corp. recently confirmed that it has discontinued support for dedicated Ageia physics processing units (PPUs) for undisclosed reasons, forcing owners of dedicated PhysX accelerator cards to switch to Nvidia GeForce hardware instead. In addition, the company claimed that GPU-accelerated PhysX requires an Nvidia graphics chip to handle rendering, which is why it is not possible to use a GeForce for PhysX processing and an ATI Radeon for graphics rendering.

No More PhysX PPUs

“The current PhysX driver unfortunately does not support Ageia* PPU hardware which you're already aware. Support for Ageia PPU ended after the 8.xx.xx driver but like you we've receive other users asking to include support in current PhysX releases. There are discussions to release a stand-along PhysX driver that will support Ageia PPU but that is still in discussion,” an Nvidia technical support specialist is reported to have said.

Nvidia’s own web-site confirms the technician’s claim: users of Ageia PhysX processors should install an older version of the PhysX system software, such as 8.09.04, which enables Ageia PPU acceleration with SDK version 2.8.1 or earlier, and only under Windows Vista and Windows XP. The current version of the PhysX SDK is 2.8.3; hence, Nvidia has dropped support for PPUs both under Windows 7 and in the latest software development kits.
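The version boundary described above can be illustrated with a minimal sketch. This assumes a simple dotted-version comparison; the cutoff value (8.09.04) comes from the article, and the function name is our own, not part of any Nvidia tool.

```python
def parse_version(version):
    # "8.09.04" -> (8, 9, 4); dotted components compared numerically
    return tuple(int(part) for part in version.split("."))

def supports_ageia_ppu(system_software_version):
    """Per the article: Ageia PPU acceleration is only available in PhysX
    system software 8.09.04 or earlier; later releases dropped PPU support."""
    return parse_version(system_software_version) <= parse_version("8.09.04")

print(supports_ageia_ppu("8.09.04"))    # True  - last release with PPU support
print(supports_ageia_ppu("9.09.0814"))  # False - PPU support removed
```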

Although Nvidia’s move is unwelcome for owners of Ageia PhysX accelerators, it is entirely logical for Nvidia, which stopped selling PPUs as soon as it acquired Ageia in early 2008. Technology companies typically support products for about three years after their release, although history knows a number of exceptions.

No ATI Radeon

On another occasion, Nvidia also explained why it does not enable GPU-accelerated PhysX on Nvidia GeForce graphics processing units (GPUs) whenever another graphics solution – such as an ATI Radeon HD 5000-series graphics board – is used for rendering. Apparently, GPU PhysX requires tight collaboration between the graphics chip that renders a 3D game and the one that processes physics effects.

“There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, Nvidia PhysX technology has been fully verified and enabled using only Nvidia GPUs for graphics,” an official claim of Nvidia reads.

Interestingly, until late July, 2009, and the ForceWare 186.x drivers, Nvidia did not disable PhysX when an ATI graphics card was used for rendering. Getting the ATI Catalyst drivers, the Nvidia ForceWare drivers and the Nvidia PhysX system software to coexist on the same Windows Vista system – so that an ATI Radeon handled graphics rendering while an Nvidia GeForce or an Ageia PhysX physics processing unit handled physics – was by no means trivial, but some people did run such configurations. When Nvidia disabled this mode of operation, it cited “development expense, quality assurance and business reasons” behind the move. Enthusiasts, however, managed to re-enable GPU PhysX support on systems with an ATI Radeon inside.

Quite interestingly, Nvidia fully allows GPU PhysX acceleration on multi-GPU systems featuring a number of GeForce GPUs, which suggests there are no bandwidth or other technical obstacles to using one GPU solely for PhysX and the remaining GPU(s) for graphics rendering. Instead, there are indeed business reasons behind disabling PhysX on ATI Radeon-powered systems.

“A new configuration that’s now possible with PhysX is 2 non-matched (heterogeneous) GPUs. In this configuration, one GPU renders graphics (typically the more powerful GPU) while the second GPU is completely dedicated to PhysX. By offloading PhysX to a dedicated GPU, users will experience smoother gaming,” an explanation by Nvidia reads.
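The heterogeneous configuration Nvidia describes amounts to a simple role assignment: the stronger GPU renders, the second is dedicated to physics. A minimal sketch, with a hypothetical GPU descriptor of our own (`name`, `performance`) rather than any real driver API:

```python
def assign_roles(gpus):
    """Mirror the configuration Nvidia describes: dedicate the most powerful
    GPU to graphics rendering and the next one entirely to PhysX.
    `gpus` is a list of dicts with hypothetical 'name'/'performance' keys."""
    ranked = sorted(gpus, key=lambda g: g["performance"], reverse=True)
    return {"render": ranked[0]["name"], "physx": ranked[1]["name"]}

# Example: a strong card paired with an older one offloaded to PhysX
roles = assign_roles([
    {"name": "GTX 285", "performance": 10},
    {"name": "9800 GT", "performance": 6},
])
print(roles)  # {'render': 'GTX 285', 'physx': '9800 GT'}
```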

According to the company, the minimum requirement to support GPU-accelerated PhysX is a GeForce 8-series or later GPU with a minimum of 32 cores and a minimum of 256MB dedicated graphics memory. However, each PhysX application has its own GPU and memory recommendations. In general, 512MB of graphics memory is recommended unless you have a GPU that is dedicated to PhysX.
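Nvidia’s stated minimums above can be sketched as a simple check. The thresholds (32 cores, 256MB minimum, 512MB recommended unless a GPU is dedicated to PhysX) come from the article; the function itself is purely illustrative.

```python
def meets_physx_minimum(cores, dedicated_memory_mb, dedicated_physx_gpu=False):
    """Check Nvidia's stated GPU PhysX requirements: at least 32 cores and
    256MB of dedicated graphics memory; 512MB is recommended unless a
    separate GPU is dedicated to PhysX. Returns (meets_minimum, meets_recommended)."""
    meets_minimum = cores >= 32 and dedicated_memory_mb >= 256
    meets_recommended = meets_minimum and (dedicated_physx_gpu or dedicated_memory_mb >= 512)
    return meets_minimum, meets_recommended

print(meets_physx_minimum(112, 512))                           # (True, True)
print(meets_physx_minimum(32, 256))                            # (True, False)
print(meets_physx_minimum(32, 256, dedicated_physx_gpu=True))  # (True, True)
```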

It should be noted that only around fifteen games can take advantage of PhysX processing on Nvidia GeForce graphics processors.

*Interestingly, the Nvidia technician referred to Ageia as “Aegia” three times, which is surprising. We decided to correct the quote.

Tags: Nvidia, Geforce, PhysX, ATI, Radeon, Fermi


Comments currently: 17
Discussion started: 05/06/10 04:05:19 AM
Latest comment: 05/11/10 09:54:58 PM


fu nvidia.
0 0 [Posted by: bereft  | Date: 05/06/10 04:05:19 AM]

We believe you NVidia, don't worry. We believe you.
0 0 [Posted by: john_gre  | Date: 05/06/10 04:15:01 AM]

all nvidia want is the best possible experience for us, the gamers. they know ati gpus are cheap, affordable, available, perform really good, consume low power, and that is bad, so, they, huh?
0 0 [Posted by: tabovilla  | Date: 05/06/10 04:52:52 AM]

NVidia seems very intent on digging their own grave.

Sadly i think they might succeed in that, because they certainly dont seem to succeed in much else.
0 0 [Posted by: DIREWOLF75  | Date: 05/06/10 07:52:34 AM]

It seems that it is NOT quite fair.

And I think this is very relevant to the Game Manufacturers . They can use Havok also but at this time, UNFORTUNATELY, NVIDIA PhysX is the winner for them.
0 0 [Posted by: Pouria  | Date: 05/06/10 09:45:01 AM]

*looks at HD5870 running with 9800GT for PhysX*

Sure thing nvidia. Whatever you say.
0 0 [Posted by: lainofthewired  | Date: 05/06/10 10:23:54 AM]

What did you people expect? Nvidia is a company that pays people to spread guerrilla marketing on internet forums instead investing in engineering. They have to keep on holding to gimmicks like PhysX, because they sure can't sell their cards on performance to cost ratio.

Sure, ATi had a couple of rough years but they have improved a lot, while nvidia is going down the drain of inferior product lanes and infinite rebranding schemes. Now I'm so happy to have upgraded my 8800 SLI to 5870.
0 0 [Posted by: FLA  | Date: 05/06/10 10:43:46 AM]

And man ... there were many of these guys here, on XBitForums back in the days I had so many encounters with them, always wondering where did they get the time to be present on the forums all the time ... We'll it looks like they've been paid for the job.
0 0 [Posted by: East17  | Date: 05/06/10 02:35:21 PM]

nVIDIA : cheating our customers is our business! We thank you for supporting us
0 0 [Posted by: East17  | Date: 05/06/10 02:31:31 PM]

Bad move by Nvidia. They are losing the sale of graphics cards for PhysX processing by current ATI card owners and limiting their marketshare, not to mention creating bad press amongst the enthusiasts who know damn well it's simply a marketing decision and not a technical one since hacks have been made in the past to enable it, and it works just fine.

0 0 [Posted by: Astral Abyss  | Date: 05/06/10 04:32:24 PM]

This isn't exactly correct. I have a Dell XPS M1730 that has a PPU along with dual NVIDIA GPUs. I am using driver 195.62 and PhysX 09.09.0814. I can use the PPU PhysX acceleration. I suspect that this will be the last version that the PPU will work as it can not be enabled in driver 197.16. I have not tried to mix the 09.09.0814 Physics system software with driver 197.16 as I just does not seem to be worth the effort. There just are not enough games that use it.

0 0 [Posted by: bobwalt  | Date: 05/06/10 05:33:37 PM]

Poor Nvidia. Nobody respects a bully anymore...
0 1 [Posted by: Divide Overflow  | Date: 05/07/10 08:02:40 PM]

One more thing that AMD have to be more serious about, is 3D Vision technology.
ATi Eyefinity is a good one but 3D Vision Like Avatar movie has many advocators also.
0 0 [Posted by: Pouria  | Date: 05/07/10 08:37:33 PM]

“There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, Nvidia PhysX technology has been fully verified and enabled using only Nvidia GPUs for graphics.”

Talk about verbal diarrhea! Funnily, this didn't seem to have been a problem in previous driver releases. Technical connections? I don't get it! Do they mean technical aspects or physical connections? How about PCI-E 2.0 16x? Not "tight" enough of a connection? Oh, please! We (users) don't need that tight collaboration between our noses and fingers to point at where the smell comes from! Well OK, I'll take out one particular and many times renamed NV product, give it away to someone less fortunate and save on electricity bills. I'll gain a cooler, quieter running system and win some gratitude from some poor sod. What will I lose?? Anyone??
0 0 [Posted by: MyK  | Date: 05/08/10 06:11:13 AM]

I have never seen a bunch of more pathetic losers than ATI like they see anything about Nvidia and they come running like zombies and start posting to somehow feel more machos and better about their purchases....oh keep going at it...meanwhile I would keep loving my Cuda, Physx, folding monster and above all REAL 3D gaming card
0 1 [Posted by: shaolin95  | Date: 05/11/10 09:55:22 AM]

How many of you remember when nVidia was saying that PhysX and CUDA don't work on ATI cards because "AMD doesn't want to do it"..? Now you need "tight nVidia-only integration".. yeah right..

Whatever shaolin95 thinks - this is not ATI fanboys thing, this is same thing that got a lot of other companies a bad reputation, and nVidia shouldn't expect to slip by so easily either. Their own "fanboy" crowd is way smaller than Apple has, and it can't just spit out FUD and expect glorious press support. Or enthusiast support for that matter.

i'm really looking forward to technologies like Lucid's Hydra, so we can mix&match graphics cards, and I sure hope that PhysX will die out with slow and painful death, and that more open technologies like OpenCL and Havok will replace it. nVidia took a great company (AGEIA) and turned it into something that's not pleasant to watch at all...
1 0 [Posted by: LuxZg  | Date: 05/11/10 12:55:56 PM]

So why would this 'news' surprise anyone ??

NV have been actively screwing customers with product re-spins for how long now ???, thats why they are trying to save a buck wherever they can because ppl are catching on and they dont have any worthwhile/decent products to sell any more
1 0 [Posted by: alpha0ne  | Date: 05/11/10 09:54:58 PM]
