AMD does not consider the lack of PhysX support on systems featuring ATI Radeon graphics processing units (GPUs) a problem. In the end, says AMD, the PhysX application programming interface (API) will simply become irrelevant.

Advanced Micro Devices, the world’s second-largest developer of x86 central processing units (CPUs) and a leading designer of graphics processing units, said that it hardly regrets Nvidia Corp.’s decision to disable hardware physics processing via the PhysX API on a GeForce GPU or Ageia PhysX PPU in systems where an ATI Radeon graphics card handles rendering. In fact, AMD believes that with the rising popularity of the DirectCompute and OpenCL APIs, the proprietary PhysX will soon vanish into oblivion.

“Physics can be a good thing for gamers, but it should be for all gamers. When it’s available for everyone, game developers will be able to make physics an integral part of gameplay, rather than just extra eye candy. This requires a physics solution built on industry standards. That’s why DirectX 11 is such a great inflection point for our industry – DirectCompute allows game physics that can be enjoyed by everyone. There are several initiatives (some open-source) that will deliver awesome GPU-based physics for everyone, using either DirectCompute or OpenCL. Industry standards will make any proprietary standard irrelevant,” said Neal Robison, director of global independent software vendor (ISV) relations at AMD, in an interview with the Icrontic web-site.
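The workload Robison describes maps naturally onto compute APIs because effects physics is embarrassingly parallel: each particle (or debris chip, or cloth vertex) is updated independently, so a DirectCompute or OpenCL kernel can assign one GPU thread per element. As a hedged illustration only (the names and numbers below are hypothetical, not from AMD), this C++ sketch shows the per-element loop such a kernel would parallelize:

```cpp
// Hypothetical per-particle integration step: the data-parallel loop that a
// DirectCompute or OpenCL kernel would run with one GPU thread per particle.
#include <cstdio>
#include <vector>

struct Particle {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
};

// Semi-implicit Euler step under gravity. Every iteration touches only its
// own particle, which is what makes effects physics a good fit for GPU compute.
void step(std::vector<Particle>& ps, float dt) {
    const float g = -9.8f;
    for (Particle& p : ps) { // on a GPU: one thread per particle, no loop
        p.vy += g * dt;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
        if (p.y < 0.0f) { p.y = 0.0f; p.vy *= -0.5f; } // bounce off the floor
    }
}

int main() {
    std::vector<Particle> ps(10000, Particle{0.0f, 5.0f, 0.0f, 1.0f, 0.0f, 0.0f});
    for (int i = 0; i < 600; ++i) step(ps, 1.0f / 60.0f); // 10 seconds at 60 Hz
    std::printf("particle 0 at (%.2f, %.2f, %.2f)\n", ps[0].x, ps[0].y, ps[0].z);
    return 0;
}
```

Because each iteration reads and writes only its own element, the same step can run across thousands of GPU threads with no synchronization, regardless of which compute API dispatches it.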

The AMD representative also accused Nvidia of not acting in gamers’ best interests: disabling both GPU- and PPU-based hardware PhysX processing does nothing to help end users.

“There’s a real discrepancy between what Nvidia says, and what they do. They “say” that they are looking out for gamers’ best interests. However, decisions like this are the exact opposite of gamers’ best interests,” added Mr. Robison.

Tags: Nvidia, PhysX, AMD, ATI, Radeon, GeForce

Discussion

Comments currently: 6
Discussion started: 10/01/09 06:59:11 PM
Latest comment: 12/02/10 03:16:45 PM


1. 
The story:
NV bought PhysX when no other hardware physics API out there existed, so everyone was using it. Now, with DX11, there's a viable alternative. Soon PhysX will die, even if it's opened up. So in the meantime NV is being an ass about this by not sharing: they're trying to squeeze whatever marketing value is left in PhysX to gain as many customers as possible until time's up.

What should've been:
The best-case scenario for them: they could've lobbied to have PhysX integrated into DX11, giving them the early advantage they needed and a leg-up in understanding how it worked, all without alienating their (potential) customers.

Consumer result:
I have an 8800 GT here, and my Radeon 5870 arrived today. After reading this article, do you think I'm going to buy Nvidia next time?
[Posted by: yenic | Date: 10/01/09 06:59:11 PM]

2. 
^^^^
Totally agree. It seems to me that Nvidia is into everything but graphics these days.
[Posted by: zaratustra06 | Date: 10/04/09 04:32:48 AM]

3. 
DX11 does not provide a hardware physics library - I'd put that in bold to make it very obvious!
DirectCompute isn't even DX11-exclusive - it's a competitor to OpenCL and CUDA that's available for DX10 cards too. A hardware physics library could be built with it, but at the moment I don't think anyone is building one.

If you want hardware physics, currently the only option is PhysX.

Looking ahead, another company is developing the "Bullet" physics library on top of OpenCL. Since OpenCL works on both ATI and Nvidia hardware, this will provide a PhysX competitor. However, this is going to take years to take off, because:

1) PhysX is available on both consoles (software-only), so it was incorporated into the multi-platform engines. Hence, if you want hardware physics in your PC port of some UE3 game, you will use PhysX, because that's what the engine supports. That's not going to change till the next round of consoles - software PhysX on the consoles is cheap and works well, and nobody is going to switch to software Bullet just for better PC hardware-physics coverage.

2) The only reason we see hardware physics in games at all is that Nvidia is leveraging TWIMTBP and getting it put into games. Without Nvidia's backing there would be no hardware physics. Obviously Nvidia isn't going to spend time and money convincing devs to use hardware Bullet physics when it already has PhysX.

ATI talks the talk but isn't willing to actually do anything - it would have to work closely with Bullet and push hard to get it adopted. The only people currently working with Bullet are Nvidia (Bullet's website says it was developed using Nvidia hardware).

It seems that all of ATI's game-relations team is busy trying to push some DX11 into games; everything they say regarding hardware physics is just a load of FUD at the moment.

(This is not to say that I approve of Nvidia disabling PhysX when an ATI card is detected - a very bad move. Fortunately, it's already been hacked to work with the latest drivers.)
[Posted by: Dribble | Date: 10/05/09 07:41:52 AM]
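For readers unfamiliar with it, Bullet is an open-source C++ rigid-body library; its conventional CPU path looks roughly like the sketch below (written against Bullet's standard 2.x API; the OpenCL-accelerated backend the comment refers to is a separate, experimental effort):

```cpp
// Minimal Bullet rigid-body world: one 1 kg sphere falling under gravity.
// A sketch of Bullet's standard CPU API, not the experimental OpenCL path.
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard boilerplate: collision configuration, dispatcher,
    // broadphase, and constraint solver feeding the dynamics world.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0.0f, -9.8f, 0.0f));

    // A sphere of radius 0.5 starting 10 units above the origin.
    btSphereShape sphere(0.5f);
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState state(
        btTransform(btQuaternion::getIdentity(), btVector3(0.0f, 10.0f, 0.0f)));
    btRigidBody body(1.0f, &state, &sphere, inertia);
    world.addRigidBody(&body);

    // Step the simulation at 60 Hz for two seconds and print the height.
    for (int i = 0; i < 120; ++i) {
        world.stepSimulation(1.0f / 60.0f);
        btTransform t;
        body.getMotionState()->getWorldTransform(t);
        std::printf("t=%.2fs y=%.3f\n", (i + 1) / 60.0f, t.getOrigin().getY());
    }
    world.removeRigidBody(&body);
    return 0;
}
```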

 
Hardware physics has failed to impress us with any display of physics processing power that wasn't already done in earlier games such as Half-Life 2 several years ago, or Crysis a few years back. Whatever we see being done with PhysX could be done just as well using Havok physics or similar game engines such as Infernal, which runs physics on more than one CPU core. Nearly all four cores of a quad-core CPU were fully utilized for the impressive display of "tornado" physics in the Ghostbusters game.

When I played Mirror's Edge with PhysX enabled, it pained me to see how much more impressive the effects would have been with Havok, even a version from the 2004-2005 timeframe.

The mere fact that it is proprietary could mean that it will go the way of the dodo, just like 3dfx's Glide API. Glide did the job well for a while, but after a couple of years it failed to impress with its 16-bit color and such (while OpenGL and DirectX were widely supported and rapidly improving).

Software PhysX on the consoles does not mean that PC ports of those games will utilize it. Mass Effect, for instance, has an option in its .ini file for changing PhysX support from the default OFF to ON, and it makes no difference whatsoever to either visuals or performance, even with a dedicated PhysX card and the latest drivers.

Also, PhysX causes some stutter even at 40+ fps in more PhysX-supported games than not. There have been numerous complaints across the internet about how unimpressive PhysX is compared to games of 4-5 years ago. And with the number of CPU cores rising and waiting to be utilized, most of those cores are left idle, while the GPU is already used to the fullest for graphics.

With Intel so set on competing with ATI and Nvidia, and highly unlikely to start paying Nvidia astronomical royalty fees for PhysX, future iterations of Larrabee will give us one more reason to accept that PhysX (an API that refuses to utilize multiple CPU cores) will go the way of Glide.
[Posted by: Bo_Fox | Date: 10/05/09 02:42:16 PM]
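The multi-core argument is easy to illustrate: because per-object updates are independent, a Havok-style engine can split them across however many CPU cores are present. Below is a hedged C++ sketch of that partitioning (generic code, not any engine's actual implementation):

```cpp
// Sketch: splitting an independent per-object physics update across CPU
// worker threads, roughly what a multi-threaded engine like Havok does.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    std::vector<float> vy(1000000, 0.0f); // vertical velocities of 1M objects
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    // Each worker integrates a contiguous slice; no locking is needed
    // because no two workers ever touch the same element.
    std::vector<std::thread> pool;
    const std::size_t chunk = vy.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = (w + 1 == workers) ? vy.size() : begin + chunk;
        pool.emplace_back([&vy, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                vy[i] += -9.8f * (1.0f / 60.0f); // one Euler velocity step
        });
    }
    for (std::thread& t : pool) t.join();
    std::printf("updated %zu objects on %u threads\n", vy.size(), workers);
    return 0;
}
```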

4. 
This is an interesting article. It appears to me that AMD has a strategy in motion (hopefully).
[Posted by: CPUGuy | Date: 10/05/09 01:32:04 PM]

 
What strategy? They have no strategy at all. They just go with the trend... and the wind.
[Posted by:  | Date: 12/02/10 03:16:45 PM]

