

FOLLOW UP: Nvidia Denies Bribing Game Developers for Implementation of PhysX.

ATI, the graphics business unit of Advanced Micro Devices, has accused its arch-rival Nvidia Corp. of making marketing deals with video game developers to promote GPU-accelerated physics processing using the PhysX application programming interface.

“What I have seen with physics, or PhysX rather, is that Nvidia create a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game. The problem with that is obviously that the game developer doesn’t actually want it. They are not doing it because they want it; they’re doing it because they are paid to do it,” said Richard Huddy, AMD’s senior manager of developer relations in Europe, in an interview with a web-site.

Earlier this year AMD accused Nvidia of modifying the PhysX API so that it would not use all available cores of multi-core central processing units (CPUs) when processing physics effects in games. The company claimed that Nvidia did this to make GeForce-accelerated physics processing look faster than CPU-accelerated processing. Nvidia later denied making any such modifications to the PhysX API.

AMD and ATI have always been strong proponents of open industry standards. PhysX is a proprietary Nvidia technology that the company obtained when it acquired Ageia back in 2008. There are not a lot of PhysX-based game titles on the market, but they do exist. Still, according to Mr. Huddy, with the exception of Epic, game developers do not use PhysX because they want to, but rather as part of a marketing deal with Nvidia.

“I am not aware of any GPU-accelerated PhysX code which is there because the games developer wanted it with the exception of the Unreal stuff. I don’t know of any games company that’s actually said ‘you know what, I really want GPU-accelerated PhysX, I’d like to tie myself to Nvidia and that sounds like a great plan’,” said Mr. Huddy.

AMD’s senior manager of developer relations in Europe is sure that PhysX will eventually share the fate of 3dfx’s Glide API as open standards develop.

“I think the proprietary stuff will eventually go away. If you go back ten years or so to when Glide was there as a proprietary 3D graphics API, it could have coexisted, but instead of putting their effort into getting D3D to go well, 3dfx focused on Glide. As a result, they found themselves competing with a proprietary standard against an open standard, and they lost. It’s the way it is with many of the standards we work with,” said Mr. Huddy.

Tags: AMD, ATI, Nvidia, PhysX, Radeon, GeForce, Phenom, Athlon


Comments currently: 13
Discussion started: 03/09/10 06:25:14 AM
Latest comment: 03/17/10 03:14:13 PM


I thought so, but AMD still needs proof.
0 0 [Posted by: ibmas400  | Date: 03/09/10 06:25:14 AM]

Obviously the developers have a choice, and they choose PhysX because, oh wait, there isn't any better choice. Meanwhile Ati just cries about PhysX and twiddles their thumbs like they've been doing for a few years already.
0 0 [Posted by: knowom  | Date: 03/09/10 07:08:18 AM]

It's just yet another Ati vs Nvidia problem, in my opinion.
Not really worth discussing.
0 0 [Posted by: Kurata  | Date: 03/09/10 09:07:14 AM]

I'm very happy an open standard is looking like an option now. Hopefully it won't be too long before we see some games supporting this.
0 0 [Posted by: Divide Overflow  | Date: 03/09/10 10:28:05 AM]

Yes, agreed. AMD needs to prove that; if it is true, Nvidia must go to court, as we know bribery is not allowed in any country.
0 0 [Posted by: Blackcode  | Date: 03/09/10 11:18:06 AM]

Eh, AMD really needs to lay off the accusations against nVidia regarding their PhysX program. Frankly, AMD should concentrate on their Open Physics program, which I support, instead of saying these things.

Personally, I would like to see PhysX either opened up or simply die off, so that open standards can populate the field, devs can have more choice, and customers are not locked to certain hardware.
0 0 [Posted by: RtFusion  | Date: 03/09/10 04:32:39 PM]

Devs have and make the choices in the end. The difference between Nvidia and Ati is that Nvidia won't wait 2+ years to start offering DX11, whereas Ati just sits idly by, hoping for the open source community to do the work for them and come up with a miracle physics product to compete against PhysX.
0 0 [Posted by: knowom  | Date: 03/10/10 08:14:47 AM]
There were more impressive games using Havok than those using Physx.

Even Half Life 2, released nearly 6 years ago, had some of the most impressive physics--the effects were more impressive than those of games today using PhysX.

Throw in the fact that PhysX is not optimized to use more than one CPU core. Furthermore, it is not "honestly" optimized even for a single CPU core. An honest optimization could have at least tripled or quadrupled the algorithm's efficiency there.

The bottom line of the bigger picture is that GPUs are already utilized to maximum capacity with graphics processing work (shaders, antialiasing, etc.). Most newer games today leave at least 50% of a typical quad-core CPU idling. As CPUs become more massively multi-core in the future, while GPUs (including multi-GPU solutions) continue to be pushed at over 99% capacity, the true solution for the consumer is a CPU-optimized physics API.
0 0 [Posted by: Bo_Fox  | Date: 03/10/10 09:17:06 PM]
It's a case-by-case basis: some games would be better suited to a more GPU-optimized physics engine, and others to a CPU-optimized one.

Bottom line, though, is that it's not Nvidia's job to come up with a CPU-optimized physics engine or to optimize theirs for such a case, nor is it in their best interest to do so.
0 0 [Posted by: knowom  | Date: 03/11/10 02:06:33 PM]
You made a point there, but the more important "bottom line" is that it is in the consumer's best interest to use the idle CPU cycles for physics effects and let the GPU use all of its horsepower for heavy shader work. Anyway, PhysX has been rather lackluster compared to the physics effects found in games several years ago. Only Cryostasis showed mildly impressive physics using PhysX, but the performance was plainly horrible even with GTX 295s. PhysX started out as something "niche": it was cool to have a PPU in addition to a GPU with that Ageia card, which allowed us to play GRAW with plenty of debris. Now even Mirror's Edge or Batman: AA shows physics effects that could have been done more efficiently using Havok, with much higher frame rates. I find the Ghostbusters video game (an excellent game, by the way) to show far more impressive physics using the Infernal engine.
0 0 [Posted by: Bo_Fox  | Date: 03/13/10 02:02:40 PM]

Whine and moan. Make better products and they won't have so many problems. Professionals use Intel and Nvidia four to one over AMD.
0 0 [Posted by: beck2448  | Date: 03/09/10 10:41:00 PM]

Well, I can't deny that in the real war, money talks. It still doesn't make sense, in my opinion: PhysX is not an efficient physics library and consumes a huge amount of processing power (we still don't know if AMD's claims about PhysX disabling multi-core potential are true...). For example, if you look at what Havok can do to create physics effects in 3D design, you will understand why Batman: AA or whatever other PhysX showcase is "nothing special", just under-ordinary physics effects, especially since Havok only uses the CPU.

Is that a crime? Not really; Nvidia needs to do its best to survive by trying to push its proprietary technology to its "full potential".
0 0 [Posted by: am_drs  | Date: 03/10/10 06:45:33 AM]

Finally the truth comes out. Nvidia has been bribing game developers for years now, forcing their own standards and not following "open standards" like other companies. It really screws up PC gaming in the long run and will force major delays for games that could have been designed around an open physics standard recognisable by the industry, along with Intel, AMD, IBM, Microsoft, etc.
It has also been shown that Nvidia is trying to implement their own version of 3D and not letting the market settle down until the major LCD & plasma corporations finalize an industry-wide open standard. Both Nvidia's version of PhysX and its 3D are going to eventually bite them in the arse.
Anybody not seeing the truth within this article is obviously blinded by Nvidia's egotistical, selfish behaviour.
0 0 [Posted by: nt300  | Date: 03/17/10 03:14:13 PM]


