News
 


Nvidia Corp. has denied accusations from its arch-rival Advanced Micro Devices that it provides cash to game developers for implementing GPU-accelerated processing of physics effects using PhysX middleware. While Nvidia admits that it can provide engineers or artists to help game developers incorporate certain effects into their titles, the company says it cannot influence their decision to use PhysX rather than other libraries or engines.

“There could be no deal under which we would hand somebody cash for using PhysX,” said Ashutosh Rege, the worldwide director of developer technology at Nvidia.

Nvidia Cannot Force Game Developers to Use PhysX

Physics engines are developed along with the graphics, audio and other parts of a game. As a result, game designers have to decide whether or not to use a given piece of middleware well before much of the title's content takes shape. It would therefore make little sense for Nvidia to pay game developers to choose PhysX instead of Havok or other competing technologies, and equally little sense for developers to take cash from a third party and adopt a tool that does not meet their requirements. The logic is straightforward: even though Nvidia has substantial financial resources, it cannot afford to fund the development of a game that will eventually fail on the market, and for a game developer it makes more sense to create a playable title than a tech demo for a technology company.

“Physics engines are critical components of games. Game developers are not going to choose a physics engine based on any kind of incentive if that is going to jeopardize the game itself. The primary criteria for game developers [when selecting a physics engine] are the feature set, algorithms, tools and support from the vendor. The most important factor for a game developer today is the target platform; in other words, whether it supports the Xbox 360, PS3 and PC, and some developers are targeting the iPhone and Wii with our PhysX engine. We support all of those, which is the reason why PhysX has become so popular,” said Mr. Rege.

We Help to Implement GPU PhysX – Nvidia

Nvidia's PhysX middleware is indeed used by many games that process physics effects on various kinds of central processing units: the homogeneous multi-core chips inside personal computers and the Xbox 360, the heterogeneous multi-core chip inside the PlayStation 3, and the much less advanced processors inside the Wii and other platforms. There are also around fifteen games that can take advantage of PhysX processing on Nvidia GeForce graphics processors, and it is those titles that AMD's developer-relations managers usually criticize. The reason? According to Richard Huddy, the head of AMD's software developer relations in Europe, Nvidia provides cash to game developers for using GPU PhysX. Nvidia admits that it actively supports such titles, but says it does not bribe game designers in any way.

“Developers can also choose to add some GPU PhysX features. We will, of course, help them to do that; we will help them with engineering and we will even help them with artists, who go on-site and spend a lot of time with their artists to [help create content]. Adding GPU PhysX to a game is a lot more involved than adding just general physics effects; there is more work than adding post-processing effects. So we help them with that. We also help them with marketing and with any kind of bundle deals with add-in-card makers, if the latter are interested in bundling those games,” stressed the worldwide director of developer technology at Nvidia.

Nvidia makes no secret of the fact that it sends engineers and even artists to help development companies implement certain functionality; nevertheless, it is also no secret that ATI, the graphics business unit of Advanced Micro Devices, does the same to support certain game developers and to ensure that its graphics innovations are implemented in a timely manner. In fact, large game creators from time to time sign marketing deals with ATI or Nvidia about half a year or a little more before the launch of a game, after which the appropriate independent hardware vendor (IHV) works with the designers to incorporate certain functionality into the title. Needless to say, it is nearly impossible to make a game look radically different in half a year, which is why software constantly lags behind hardware.

Do No Harm

Earlier this year ATI/AMD also accused Nvidia of intentionally lowering the performance of non-Nvidia platforms in titles that support GPU PhysX; however, Nvidia has pointed out a number of times that it does not do that.

“What we do when we add a GPU PhysX engagement with the developer is that in no shape or form do we do anything harmful to the rest of the platforms, those that do not support GPU PhysX. It is just additive value for our GeForce customers and eventually it boosts the game experience on the PC,” said Mr. Rege.
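To illustrate the "additive" model Mr. Rege describes, here is a minimal sketch of how a title might layer optional GPU-accelerated effects on top of a baseline CPU physics path. It is not based on any actual PhysX or driver API; the queryGpuPhysicsSupport() function and the class names are hypothetical placeholders.

    #include <memory>
    #include <vector>

    struct Debris { float x, y, z; };

    // Baseline simulation that every platform runs (consoles, any PC GPU).
    class CpuPhysics {
    public:
        void step(float dt) { /* gameplay-critical rigid bodies, ragdolls, joints */ }
    };

    // Optional, purely cosmetic extras (particle debris, cloth, fluid splashes).
    class GpuEffects {
    public:
        void step(float dt, std::vector<Debris>& debris) { /* offload to the GPU */ }
    };

    // Hypothetical capability query; a real engine would ask the middleware or
    // the driver whether hardware-accelerated physics is available.
    bool queryGpuPhysicsSupport() { return false; }  // stub for illustration

    class PhysicsSystem {
        CpuPhysics base_;
        std::unique_ptr<GpuEffects> extras_;  // stays null when unsupported
        std::vector<Debris> debris_;
    public:
        PhysicsSystem() {
            if (queryGpuPhysicsSupport())
                extras_ = std::make_unique<GpuEffects>();
        }
        void step(float dt) {
            base_.step(dt);                  // identical on every platform
            if (extras_)
                extras_->step(dt, debris_);  // additive eye candy only
        }
    };

The point of the split is that the baseline path is never altered: platforms without GPU physics simply skip the extras, which is the behaviour Nvidia claims for GPU PhysX titles.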

PhysX Is Not Glide!

Ashutosh Rege of Nvidia also disagrees that PhysX is akin to the 3dfx Glide application programming interface, which vanished into oblivion after 3dfx was acquired by Nvidia in 2000. According to Mr. Rege, Glide only worked on 3dfx hardware, whereas PhysX middleware is compatible with all three modern video game consoles (PlayStation 3, Wii and Xbox 360) as well as all personal computers and even Apple's iPhone platform. In games that use GPU PhysX, however, end-users get a better experience.

“The comparison of Glide against PhysX is not smart. PhysX is not an API, it is a full set of software, it is middleware. In the middleware business you have game developers saying ‘I’ve got these features, I’ve got these licensing terms and I need to deploy on these platforms. What is the best solution here?’ Of course, the cost of the license is also important to the developer. Based on all of that, they make their decision about which package to choose. […] I will be honest: GPU PhysX is not the biggest consideration for game developers, it is something that is cool, it is something that comes as a bonus, but it is not the main deciding factor,” explained Mr. Rege.

Still, GPU-accelerated PhysX only works on Nvidia hardware and only when an Nvidia GeForce graphics card is used for graphics rendering. The practice is rather controversial since it limits consumers' choice, but Nvidia has always claimed that it would not validate such functionality on non-Nvidia platforms.

Nvidia Supports Industry-Standard Implementations, Too

It is also noteworthy that besides its own PhysX middleware, Nvidia is working with open-source developers of physics processing tools, including those that use OpenCL.

“We are happy to support all OpenCL or DirectCompute [implementations of physics engines]. If a developer asks us to help implement a certain feature, we will add it. If they ask us to port something to DirectCompute, we will certainly do our best to get that to them. […] We will support game developers to the extent of our knowledge of, [for example], Bullet. Obviously, we do not have engineers who are immersed in Bullet to [provide technical support], but we are working with the Bullet engine team on specific things. […] In the end, we are selling GPUs, not PhysX,” said the worldwide director of developer technology at Nvidia.
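For context on the Bullet engine mentioned above: Bullet is an open-source, CPU-based physics library with a C++ API, and a minimal rigid-body simulation with it looks roughly like the sketch below. The calls are standard Bullet, but the scene itself (a single sphere dropped from a height) is just an illustration.

    #include <btBulletDynamicsCommon.h>

    int main() {
        // Standard Bullet world setup: collision configuration, dispatcher,
        // broadphase, constraint solver and the dynamics world itself.
        btDefaultCollisionConfiguration config;
        btCollisionDispatcher dispatcher(&config);
        btDbvtBroadphase broadphase;
        btSequentialImpulseConstraintSolver solver;
        btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
        world.setGravity(btVector3(0, -9.81f, 0));

        // One dynamic rigid body: a sphere of radius 0.5 dropped from 10 units up.
        btSphereShape sphere(0.5f);
        btScalar mass = 1.0f;
        btVector3 inertia(0, 0, 0);
        sphere.calculateLocalInertia(mass, inertia);
        btDefaultMotionState motion(
            btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
        btRigidBody::btRigidBodyConstructionInfo info(mass, &motion, &sphere, inertia);
        btRigidBody body(info);
        world.addRigidBody(&body);

        // Step the simulation at 60 Hz for one simulated second, entirely on the CPU.
        for (int i = 0; i < 60; ++i)
            world.stepSimulation(1.0f / 60.0f);

        world.removeRigidBody(&body);
        return 0;
    }

Because everything here runs on the CPU, such an engine behaves identically regardless of the GPU vendor, which is the portability argument raised repeatedly in the comments below.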

Tags: Nvidia, PhysX, Geforce, GPGPU, AMD, ATI, Phenom, Athlon

Discussion



1. 
AMD loves to whine
[Posted by: beck2448 | Date: 03/11/10 12:35:43 PM]

2. 
Ouch, that backfired. AMD sure did a good job of making themselves look bad; who's got egg on their face now?
[Posted by: knowom | Date: 03/11/10 02:00:42 PM]

 
What's wrong with AMD telling the truth? It's not ridiculous to say that Nvidia is lying through their teeth once again. Ask yourself why Nvidia disabled the ability for ATI cards to access Nvidia's PhysX. You know very well ATI cards are more than capable of running NV's version of PhysX, but Nvidia disabled that feature within the PhysX driver and wants AMD to pay A LOT of money before they allow access.

AMD told them to shove a pickle where the sun doesn't shine. Here comes Havok: it's free and it's guaranteed to work on all PCs regardless of what components you have.

Why on earth should I buy an Nvidia card just to use PhysX LOL, no thank you, I already have a GPU and it starts with Radeon…
[Posted by: nt300 | Date: 03/17/10 03:41:43 PM]

3. 
Nvidia is not a very good liar. Batman Arkham Asylum is a perfect example of locking out the competition with code that disables AA on ATi cards.

Anyone believing anything Nvidia says has a brain of a mollusk.

On a side note, didn't they say that Fermi was supposed to be released in November 2009?
[Posted by: FLA | Date: 03/11/10 06:12:22 PM]

 
Right and AMD is just a caring loving company which does everything for their customers...

Batman: AA is a perfect example of AMD being lazy and then whining about competition not doing AMD's job for them.

Oh and by the way didn't AMD say we should have had Bulldozer already ?
[Posted by: opOOr | Date: 03/12/10 02:09:10 AM]
 
Considering that everything AMD/ATI does is (or is turned into) an open standard, then yes, they are a much more caring company than Nvidia in my book.

edit @ Bulldozer: that was YEARS ago when they said that. Since then AMD has delivered on time, on price point and in most cases on performance.
[Posted by: Countess | Date: 03/12/10 03:05:04 PM]
 
AMD have to play nice. They don't have the same OEM/game dev relations as NVIDIA or Intel do. It's the only strategy they've got. If you think for a second that AMD would have been so "open" and nice if they had more money and/or market share, you are seriously kidding yourself.

Barcelona was late and buggy and Bulldozer has been pushed back many many times so it's still a valid point.

Open standards ? Intel bought Havok and NVIDIA bought Ageia - what choice did they (AMD) have ? If AMD bought Ageia do you really think they would have been pushing for open standards so much ? Btw didn't NVIDIA enable OpenCL support in their drivers long before AMD did ? Don't NVIDIA also have superior OpenGL support ?

Don't get me wrong I like AMD. Hell I've owned their CPUs exclusively since the Athlon Thunderbird. Lately though they seem to be the cry-baby of the industry. They should focus on delivering competing solutions rather than crying foul all the time. Either introduce something better than the competition, take NVIDIA to court or shut the hell up.
[Posted by: opOOr | Date: 03/13/10 07:26:38 AM]

4. 
Please, give us consumers some proof; Batman: AA is an example.

If we think with clear minds, there's another good example: when we try a dual-GPU solution for graphics plus PhysX, Nvidia doesn't allow a non-GeForce GPU as the primary graphics GPU.

Do you really think Nvidia's claims are true...?
[Posted by: am_drs | Date: 03/12/10 07:07:34 AM]

5. 
We Do Not bribe game developers!

We Only pay them money and implement PhysX in their games for free!
[Posted by: zark | Date: 03/12/10 09:54:43 AM]

6. 
Yeah right. Like there are any more suckers to believe the bedtime stories Nvidia tells! Some might even believe their crap if it wasn't for games like Batman: Arkham Asylum, Cryostasis etc... Not to mention that they disabled in their drivers the use of a dedicated PhysX card paired with an ATI one. I mean... come on!!! How stupid do you think we are?!
[Posted by: TAViX | Date: 03/12/10 09:33:55 PM]

7. 
AMD are hypocrites: they complain about PhysX but act lovey-dovey about Havok. AMD and Intel can both suck it; Nvidia doesn't have to play by their rules.

I can't wait for ARM to become competitive so AMD/Intel can't just strong arm out all the other potential competition because of patent monopolies.
[Posted by: knowom | Date: 03/13/10 11:34:41 AM]

 
Havok doesn't have artificial driver limitations on the hardware it can be used with.

Considering how many games actually use GPU PhysX (as opposed to normal PhysX, which works on all systems), I have no problem seeing Nvidia needing a bribe to get even those few.
[Posted by: Countess | Date: 03/14/10 11:32:42 PM]
 
Who's strong-arming whom? Here we have Nvidia trying to force their own non-industry standard that only works with NV cards. This is utterly ridiculous, to say the least. Don't you want to play games regardless of what graphics card you have, knowing that you can enable any and all features within the game? Nvidia wants to make AMD pay them BIG BIG MONEY to have them enable NV's PhysX on ATI graphics. How much more selfish can Nvidia be with the PC gaming world?

Read this then you’ll see what I mean:
Nvidia Disables PhysX in Presence of ATI GPU
http://www.tomshardware.c...ati-gpu-disable,8742.html
[Posted by: nt300 | Date: 03/17/10 03:36:40 PM]

8. 
There were more impressive games using Havok than those using Physx.

Even Half-Life 2, released nearly 6 years ago, had some of the most impressive physics; the effects were more impressive than in games today using PhysX. Only Cryostasis showed mildly impressive physics using PhysX, but the performance was plainly horrible even with GTX 295s. PhysX started out as something "niche" because it was cool to have a PPU in addition to a GPU with that Ageia card, which allowed us to play GRAW with plenty of debris. Now even Mirror's Edge or Batman: AA shows physics effects that could have been done more efficiently using Havok with much higher frame rates. I find Ghostbusters: The Video Game (an excellent game, by the way) to show far more impressive physics using the Infernal engine.

Throw in the fact that PhysX is not optimized to use more than one CPU core. Furthermore, it is not "honestly" optimized for even a single CPU core. An honest optimization could have at least tripled or quadrupled the algorithm's efficiency there.

The bottom line of the bigger picture is that GPUs are already utilized to maximum capacity by the work done for graphics processing (shaders, antialiasing work, etc.). Most of the newer games today leave at least 50% of a typical quad-core CPU idling along. As CPUs become more massively multi-core in the future, while the GPUs (including multi-GPU solutions) continue to be pushed at over 99% capacity, the true solution for the consumer is a CPU-optimized physics API. I would even claim that it was "always" better to use the additional CPU cores (at least the idle CPU cycles) for physics animation effects than the GPU, which is usually pushed to the maximum in any of the more recent games with all the eye candy turned on (SSAA anybody?).
[Posted by: Bo_Fox | Date: 03/13/10 02:09:25 PM]

 
First of all, it's not what PhysX can do but which of its features devs will actually implement. All of the basic physics effects have to run on every piece of hardware a given game is meant to be played on: consoles, PCs with both ATI and NVIDIA GPUs, 1/2/4/6-core CPUs and so on. No one will go all out with PhysX, because if they do they will cater only to a handful of people with hardware capable of running such a game.

Have you even played Cryostasis? I have, and I didn't have any problems running it on my GTX 280, so I don't know where you got all that GTX 295 BS from.

"Throw in the fact that PhysX is not optimized to use more than one CPU core"
Source?

"Now even Mirror's Edge or Batman: AA shows physics effects that could have been done more efficiently using Havok with much higher frame rates."
Source?

GPUs utilized to the maximum ? Really ? Why the introduction of Eyefinity and 3D Vision/Surround then ?
[Posted by: opOOr | Date: 03/13/10 04:07:36 PM]
 
If you want sources, I suggest reading the articles from this site, and after you finish, start with The Tech Report, Anand and the rest; then you will understand how crappy and unoptimized this PhysX is.
[Posted by: TAViX | Date: 03/13/10 07:04:55 PM]
 
All I see are AMD's rants. If that's your idea of a credible source then I rest my case.

[Posted by: opOOr | Date: 03/14/10 03:21:26 AM]

9. 

opOOr: you've got it wrong. If you use a 3D design program (not a 3D game) with Havok physics effects (rendered on the CPU), you'll see much better physics effects; the developers who use Havok just may not have optimized it yet in 3D games (because they're not paid..?).

But I'm not on AMD's side; all they do lately is complain, they had better DO something too.

This bribery-denial story is a bedtime story, like hell I believe it, lol. Please take note that if AMD were in Nvidia's position (had bought Ageia), I doubt they wouldn't do something similar. Business is like this, full of tricks, and this is just one of Nvidia's tricks.
[Posted by: am_drs | Date: 03/13/10 06:07:14 PM]

 
"(because they're not paid..?)"

Or maybe they don't get the same support NVIDIA is willing to offer their partners?
I think that's why NVIDIA dominates the professional GPU market: support.
[Posted by: opOOr | Date: 03/14/10 03:50:17 AM]

10. 
The whole point AMD/ATI has been making in the first place is that the PC is a 'common platform'. If we buy a USB stick, we expect it to be Plug and Play; if it does not support that, it's basically useless. The same goes for everything else: DVD, Blu-ray, hard drives, PCI, PCIe, whatever. We expect that if it works on your PC, then it has to work on mine, if my motherboard supports that standard. So, what about software? What if a message box came up in Photoshop when you tried to copy a picture, saying 'You need an NVIDIA graphics card for this feature!'? Now what kind of effect would that have on the use of Photoshop? When we try to get rid of our sins from the past, we at least confess; some of us even promise never to do it again. So who is lying?
[Posted by: poula | Date: 03/13/10 11:46:43 PM]

 
Good points, which strengthen the reason why AMD is pushing for an open "FREE" standard: so that it's guaranteed to be compatible with the PC and we may all use those features. Nvidia is in it for themselves only; they don't care about PC gamers. They want their own non-industry standard and they want to get paid for it.
[Posted by: nt300 | Date: 03/17/10 03:30:36 PM]

11. 
Nvidia had the opportunity to be the pinnacle of PC Gaming and they blew it. It’s obviously never too late, but denying the facts about paying off game developers to use non-industry standards is utterly selfish behaviour.

Pixelux and Bullet Physics offerings enable more realistic games that run on any OpenCL™- and DirectCompute-capable platform. Don't kid yourselves, most games if not all have the Bullet Physics capability; all it takes is ATI's drivers to enable this sucker to take off. But I am not sure whether AMD has the go-ahead from these game developers.
I do fear some game developers are used to getting paid loads of $$$, though, something I don't see AMD giving in to anytime soon, all thanks to Nvidia's briberies.

If anybody can’t see how bad Nvidia is hurting PC gaming, Oh Boy, lol....
[Posted by: nt300 | Date: 03/17/10 03:25:37 PM]





