FOLLOW UP: Nvidia Denies Accusations of Disabling Multi-Core CPU Support in PhysX API.

Advanced Micro Devices has claimed that Nvidia Corp. deliberately altered its PhysX application programming interface (API) so that it cannot take advantage of multi-core central processing units (CPUs) when computing physics effects. According to AMD, the purpose of the change was to inflate the importance of the graphics processing units (GPUs) that process physics effects in select PhysX-powered games.

“The other thing is that all these CPU cores we have are under-utilised and I'm going to take another pop at Nvidia here. When they bought Ageia, they had a fairly respectable multi-core implementation of PhysX. If you look at it now it basically runs predominantly on one, or at most, two cores. […] I wonder why Nvidia has done that? I wonder why Nvidia has failed to do all their QA on stuff they don't care about – making it run efficiently on CPU cores – because the company doesn't care about the consumer experience it just cares about selling you more graphics cards by coding it so the GPU appears faster than the CPU. It's the same thing as Intel's old compiler tricks that it used to do; Nvidia simply takes out all the multi-core optimisations in PhysX,” said Richard Huddy, AMD’s Worldwide Developer Relations manager, in an interview with the Bit-tech.net website.
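For context, the kind of code path Huddy is talking about is straightforward to picture. Below is a minimal illustrative sketch in C++ (invented names, not PhysX source) of a physics integration step fanned out across every available CPU core, the sort of multi-core implementation he claims Ageia had and Nvidia dropped:

    // Illustrative only -- not PhysX code. Integrates simple rigid bodies
    // in parallel, one chunk of the array per hardware thread.
    #include <algorithm>
    #include <functional>
    #include <thread>
    #include <vector>

    struct Body { float px, py, pz, vx, vy, vz; };

    void integrate_range(std::vector<Body>& bodies, std::size_t begin,
                         std::size_t end, float dt) {
        for (std::size_t i = begin; i < end; ++i) {
            Body& b = bodies[i];
            b.vy -= 9.81f * dt;   // apply gravity
            b.px += b.vx * dt;    // advance position
            b.py += b.vy * dt;
            b.pz += b.vz * dt;
        }
    }

    void integrate_parallel(std::vector<Body>& bodies, float dt) {
        unsigned cores = std::max(1u, std::thread::hardware_concurrency());
        std::size_t chunk = (bodies.size() + cores - 1) / cores;
        std::vector<std::thread> workers;
        for (unsigned t = 0; t < cores; ++t) {
            std::size_t begin = t * chunk;
            std::size_t end = std::min(bodies.size(), begin + chunk);
            if (begin >= end) break;
            workers.emplace_back(integrate_range, std::ref(bodies), begin, end, dt);
        }
        for (std::thread& w : workers) w.join();  // wait for all cores
    }

Capping such a loop at one or two threads, as Huddy alleges, is a one-line change; whether that is what actually happened inside PhysX is exactly what is in dispute.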

Nvidia has been proclaiming the importance of physics processing on graphics processing units using the PhysX API, which it acquired with the takeover of Ageia back in 2008. The company has also shown benchmarks demonstrating a substantial performance advantage of GPUs over CPUs when processing PhysX effects. However, AMD believes that such performance advantages are artificial.

“If coded well, the CPU can tackle most of the physics situations presented to it. The emphasis we're seeing on GPU physics is an over-emphasis that comes from one company having GPU physics... promoting PhysX as if it's God's answer to all physics problems, when actually it's more a solution in search of problems,” added Mr. Huddy.

AMD is working with a number of developers of physics processing technologies to enable such computations on its ATI Radeon graphics chips, but so far Nvidia's PhysX remains the popular choice for physics-intensive game titles.

Back in 2009, Nvidia also disabled PhysX support on its own GeForce GPUs, as well as on Ageia PhysX accelerator cards, whenever its drivers detected ATI Radeon hardware in the system.
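To make the nature of that lockout concrete, here is a hypothetical sketch (invented types and function names; this is not actual driver code) of the kind of check being described, where the mere presence of a competitor's adapter vetoes hardware PhysX:

    // Hypothetical illustration of a vendor-based lockout -- not Nvidia code.
    #include <string>
    #include <vector>

    struct GpuInfo {
        std::string vendor;     // e.g. "NVIDIA", "ATI"
        bool supports_physx;    // capable of hardware PhysX
    };

    bool allow_hardware_physx(const std::vector<GpuInfo>& adapters) {
        bool capable_nvidia_present = false;
        for (const GpuInfo& gpu : adapters) {
            if (gpu.vendor != "NVIDIA")
                return false;   // any non-Nvidia adapter disables GPU PhysX outright
            if (gpu.supports_physx)
                capable_nvidia_present = true;
        }
        return capable_nvidia_present;
    }

The point of the sketch is that nothing in such a check depends on capability: a perfectly capable GeForce is refused PhysX duty simply because a Radeon shares the system.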

Tags: AMD, ATI, Radeon, Nvidia, PhysX

Discussion

Comments currently: 30
Discussion started: 01/19/10 08:22:44 PM
Latest comment: 01/23/10 03:45:03 AM


1. 
Why am I not surprised?

0 0 [Posted by: RtFusion  | Date: 01/19/10 08:22:44 PM]

2. 
NVIDIA does its best to make PhysX useless unless you have NVIDIA and only NVIDIA cards:

- Deliberately not taking advantage of CPU power.
- No support for Ageia cards.
- PhysX disabled if you also have an ATI card in the system. (At least it's possible to overcome this with some work.)

I really dislike NVIDIA for doing this, and I hope developers move away from PhysX. I can understand not supporting Ageia cards (though I have one I'd love to use, so I find it annoying), but the other two points are just deliberate anticompetitive practices.
0 0 [Posted by: ET3D  | Date: 01/19/10 10:35:15 PM]

 
You first need something to actually compete against to be considered anti-competitive. PhysX is a supported standard because it's good and offers something other physics solutions can't or don't. It's no more anti-competitive than, say, Pro Tools being an industry standard in the music business.
0 0 [Posted by: knowom  | Date: 01/20/10 01:56:17 PM]

3. 
Perhaps the people running Nvidia are the same ones running this country into the ground as fast as possible, desperately hoping for any sign of success to boast about.
0 0 [Posted by: aaaayes  | Date: 01/19/10 11:13:31 PM]

4. 
Can nobody see AMD's propaganda tactics? While they are making big profits on their cards these days, they have invested very little to give back to their customers, and instead of looking bad for it they turn everything Nvidia gives its customers into something evil and wrong. It's insanity, and it's sad that so many buy into it. Do you really think you can run the GPU PhysX effects of a modern game (Batman, Dark Void, etc.) on the CPU while it's also processing the game? Anyone with basic computing knowledge should know it's not that simple. Even AMD knows this, and to prove it they are "working with a number of developers" to bring GPU physics to their cards. Imagine that: they have CPUs as well as GPUs, but choose to work on GPU physics. Why hasn't Havok brought this kind of advanced physics with its many years' head start over Nvidia? Why is AMD working with developers to get free physics for its GPUs instead of paying out of its own pocket for its customers, like Nvidia has done? What a joke.
I buy AMD and can tell you this is utter crap. They point fingers so as not to reveal their own unwillingness to fund something like PhysX for their customers. There is no CPU physics engine out there that can compare to the PhysX Nvidia has worked into its GPUs, and many of them have been around since long before PhysX, by a long shot. Why would Nvidia invest time developing CPU code paths further when it doesn't even sell CPUs? Why does the company that sells CPUs complain that Nvidia didn't develop PhysX to run specifically on the CPUs it makes money selling? No doubt Nvidia could spend loads developing CPU physics for them, and could make better progress, but never to the degree of what it achieves with a dedicated GPU. Look at Havok! But Nvidia could magically do it better on the CPU than them, and then it should just hand it to AMD? Hundreds of millions already spent, and it's a sin to give it only to their own customers; AMD should have it too, because, you know, they shed some tears while holding tightly onto their money, so they should get it for free, developed and all.
AMD is looking for handouts and it's a shame. Of course Nvidia is the bad guy for giving PhysX to its customers for free, and not ATI. AMD has the ability to do these things but would rather not. Your money fits better in their pockets while they point fingers instead of actually giving something back to their customers.
BTW, I own a Phenom II on the 790GX ATI chipset. But I also own an Nvidia GTX, because I like to give my money to a company that is making new and exciting things I can enjoy; I want to see my money work. So far the PhysX titles are a trip, and it's still in its infancy. It's great that there are efforts to further the industry I love with new, innovative uses of the technology. Why is Nvidia the bad guy when I can't run PhysX on my ATI card? ATI hasn't worked hard and spent the cash, and that's the real reason why. Nvidia developed it, the GTX runs it well, I bought it at a very competitive price, and I still look forward to seeing more PhysX titles roll out. They are a breath of fresh air.
ATI, get off your butt!!!
0 0 [Posted by: ocre  | Date: 01/20/10 01:55:46 AM]

 
I'm sure AMD would also be interested in seeing these 'mega profits' you claim they are making. Let's not forget that it was ATI, with the 4850, that ended a decade of Nvidia price gouging on even midrange cards. That it was ATI that focused on image quality while Nvidia was loading up its drivers with benchmark-specific hacks that downgraded quality to win benchmarks. That it was ATI that first brought DX9 and DX11 to the market.

That's a lot of firsts for a company that apparently just sits around asking for handouts. In regards to the article, there is a difference between not supporting something and deliberately gimping it to suit Nvidia at the expense of consumers. How is Nvidia's blocking of PhysX support for Ageia card owners fair to consumers? How is Nvidia's blocking of PhysX support for people who own both an Nvidia and an ATI card fair to consumers, especially when it's been shown that there is no underlying technical reason to do so? And let's not bring up the Batman: AA debacle again.

ATI supports open standards, like OpenCL, for physics acceleration. That's why it helps developers build their own physics engines on OpenCL, so that everyone wins, because the same engine will work just as well on both Nvidia and ATI hardware.
0 2 [Posted by: genie  | Date: 01/20/10 04:22:37 AM]
 
"Thats why it supports developers to develop their own physic engines that run on OpenCL, so that everyone wins, because that same engine will work just as well on both nVidia and ATI hardware."

Nvidia released OpenCL drivers prior to Ati.

I don't see Ati developing a OpenCl physics engine either and the reason they talk about supporting it so much is because it benefits their CPU and GPU businesses.

The reason you don't hear about Ati making hardware physics is because Nvidia owns the major important patents related to it now.
0 0 [Posted by: knowom  | Date: 01/20/10 02:24:33 PM]
 
Nvidia price gouging? What about ATI and the 58xx release? Soon after launch they jacked the prices up big time. What about the 5650? Did you know the GT240 is much more expensive to manufacture than the 5650, and that every competing card in the GTX series cost more to manufacture than its ATI counterpart? Do you think a company is evil for selling products at a profit? Well then aim your comments straight at ATI. Let's roast Kellogg's, Kraft, and every name-brand food, shoe, clothing and watch company that sells things for more than another can. That is absurd. Nvidia's mammoth-size cards cost a lot to make, you know that. And as soon as the GF100 hits, AMD will sell its cards cheaper, but you won't think ill of them, even though it's the exact same case. Get over that stupid talk.
Look, I own AMD/ATI and I have always been a fan. But this PhysX crap talk is just that: crap. Get over it. ATI/AMD can make things better for their customers; they can.
Why can't ATI invent Eyefinity for Nvidia owners? It's not fair that you must have an ATI card for it, blah blah blah. It's stupid!
 
Ocre, this is exactly what I was thinking. PhysX is intended to run on a GPU, and draining all of a CPU's power to use it would definitely not be a good thing to do.
0 0 [Posted by: Mango  | Date: 01/20/10 10:32:27 AM]

5. 
I suspect the developers are the ones you should ask, not the AMD marketing guy. They (as always) are developing for consoles first, and I strongly suspect they don't want PhysX on the Xbox 360, for example, to use more than one of the cores. Hence all those console-port games will be set up to run fine on a single CPU core.

Any fancy *effects* physics, as in Batman, is effectively an Nvidia add-on. While I agree Nvidia could put effort into making PhysX do some of those fancy extra effects on the CPU, why would it? Nvidia doesn't sell multi-core CPUs.

If AMD wants multi-core CPU physics then it should be developing it; AMD is the one selling multi-core CPUs, after all, not Nvidia. The AMD marketing bloke seems to be all talk and no action.
0 0 [Posted by: Dribble  | Date: 01/20/10 02:34:44 AM]

6. 
As far as I'm aware, multithreaded PhysX is already supported in games that only do CPU PhysX calculations; it's restricted to one thread only in games that support PhysX via the GPU.
0 0 [Posted by: Rand  | Date: 01/20/10 08:20:30 AM]
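If Rand's description is accurate, the gating would be trivial to implement; a hypothetical sketch (invented names, not from the PhysX SDK):

    // Hypothetical illustration of the behaviour Rand describes.
    #include <thread>

    unsigned cpu_physx_thread_count(bool gpu_physx_active) {
        if (gpu_physx_active)
            return 1;   // GPU title: CPU fallback pinned to one thread
        unsigned n = std::thread::hardware_concurrency();
        return n != 0 ? n : 1;   // CPU-only title: scale across all cores
    }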

7. 
Here's an example of a PhysX game restricted to one thread on an i7 920 @ 3.06 GHz: http://www.tomshardware.c...rkham-asylum,2465-10.html.
I don't know whether contemporary CPUs can rival or better the physics produced by Nvidia's PhysX cards, but if CPUs were allowed to process PhysX at 100% on all threads, I'm sure they would be able to push higher frame rates.
Nvidia is just trying to capitalize with its propaganda about a "need" for its PhysX video cards, especially with the way CPUs are progressing.
0 0 [Posted by: tensai  | Date: 01/20/10 10:00:38 AM]

 
What? If you don't know, then what are you posting about it for? The answer is no! While someone so inclined could program PhysX to take more advantage of AMD's quad-core CPUs and Intel's i7s, the way PhysX is being programmed is to take advantage of the massive parallel power of the GPU. Havok is owned by Intel, has been around longer than Nvidia has had PhysX, and runs on the CPU, so why don't we see Intel or Havok helping developers bring advanced physics comparable to Nvidia's GPU PhysX? Intel is worth many times more than Nvidia; they have endless billions. So why isn't Havok doing this? Because it takes a lot of effort and money, and the results aren't going to be as impressive. They could do much better with Havok on the CPU, and they don't!
Why is Nvidia the bad guy? Think about it!
0 0 [Posted by: ocre  | Date: 01/20/10 06:44:10 PM]

8. 
They have it like this because, if all of the cores were being used for physics processing, the actual game would run very slowly. Having every core in use would definitely slow down games that use PhysX.
0 0 [Posted by: Mango  | Date: 01/20/10 10:28:45 AM]

9. 
I'd agree with you, sir Mango, if the CPU's threads were running at max load (PhysX on with a non-PhysX video card) while showing the same poor performance. But they're not: CPU load is actually lower with PhysX on than with it off... either way, the CPU's cores are not being utilized properly. Seeing this, how can I say that CPUs can't run PhysX when they're not even trying (or not allowed to try)?
0 0 [Posted by: tensai  | Date: 01/20/10 11:09:18 AM]

10. 
Current games rarely use more than 2 cores, while many gamers have 4. But just because the game runs on 2 cores does not mean you don't want to load-balance other processes across the remaining cores: modern CPUs execute out of order, and some Intel CPUs use Hyper-Threading to work on two things at once. So you want all tasks to be able to use any free resources on all the cores.
0 0 [Posted by: cashkennedy  | Date: 01/20/10 12:26:34 PM]

 
Actually, do your homework. Quads are getting more popular with the hardcore crowd, but the average PC gamer is by far not using 4 cores. On Steam (where the gamers who play hardcore modern games are), 57% use only dual cores; the rest are single-core (18%) and quads (24%).
From December '09:
http://store.steampowered.com/hwsurvey

AMD is way out there with this; it really is bull and a huge load of crap. This is a sad, sad attempt to mask their own inactivity by pointing fingers and making others look bad. Don't listen to AMD's crap.

0 0 [Posted by: ocre  | Date: 01/20/10 07:26:33 PM]

11. 
Nvidia's response..."Tough titty said the kitty when the milk went dry"

ATI has had plenty of time to come out with a physics solution of its own, or at least have one in heavy development by now.

Nvidia wants to sell more of its own products; go figure. And you're telling me AMD and Intel don't have the same business model?

Honestly, I'm more often CPU-limited in situations where I'm below 60 FPS, so I'd rather have the GPU handle more of the workload, though a toggle would be the best option.
0 0 [Posted by: knowom  | Date: 01/20/10 01:38:49 PM]

12. 
Similar story:
Hardware support for some OpenGL functions is intentionally disabled in the desktop versions of both AMD and Nvidia cards (Radeon/GeForce). Basically the same cards, with a different BIOS, are sold as professional cards (FireGL|FirePro 3D/Quadro) at a much higher price.
0 0 [Posted by: KonradK  | Date: 01/20/10 04:03:51 PM]

13. 
Look, everyone!

AMD owns ATI GPUs and their chipsets. They have x86 CPUs and everything they need to exist in the PC world. No one has stopped them; they have had every opportunity to make GPU physics happen on their GPUs or, since the CPU is supposedly so capable, to develop some great advanced physics to run on CPUs. Every gamer would be able to enjoy it that way. But guess what? They don't, they have not, and they are not. The new open physics for DX11 will be developed by anyone but them, on other people's money.

Intel owns CPUs and chipsets and has the market cornered, with 70% of all GPUs being Intel integrated graphics. They have all they need to exist in the PC market. On top of that, Intel owns Havok, a major physics platform that runs on the CPU and is more popular than PhysX by number of games that use it. Yet no game with Havok looks anywhere close to what Nvidia has brought with its GPU PhysX. Why doesn't AMD complain about Intel not stepping Havok up to look better running on their CPUs? AMD is ridiculous!

Nvidia has GPUs; its chipset division was crushed by Intel. Nvidia desperately needs to hang on tight, as it can't exist in the PC market on its own. It is giving people reasons to choose its GPUs. It has stepped up their role so as to make owning an Nvidia GPU more useful and desirable. When you buy an Nvidia GPU you get graphics, PhysX and CUDA, all things Nvidia worked on and paid for. Nvidia wants the GPU to have as many uses as possible so as to have a place in the PC world. It has no CPU. It isn't punishing ATI, and it isn't greedy for not sharing PhysX. It's part of its ideology: using the only thing it's got to the fullest, and offering these things to the people who buy its GPUs to make them more worthwhile and purposeful. Why is that a crime?

Look anywhere there is real data and you will see the average gamer does not have 4 cores but 2. Should Nvidia spend all its time and money developing PhysX to run better on CPUs in the first place? It doesn't even sell them. Make PhysX max out 4 cores, and then what would the game run on? Or should Nvidia develop it to use all the spare CPU power left over after the game takes what it needs, so everyone from single cores to quads could experience it perfectly on the CPU? Like that would work out. Nvidia should just add advanced PhysX to AMD and Intel CPUs for nothing, all out of its own pocket, so AMD can be happy? Havok, owned by Intel, doesn't do it, but Nvidia should take up that role?

Can anybody see how insanely off AMD is with these claims? AMD would benefit tremendously if Nvidia spent all that money doing something like that. Why the heck isn't every other AMD owner PO'd like me over their lack of action? They should be giving us things like this on their own. They have everything in the whole PC market (CPU, GPU, chipsets), and they want Nvidia handouts on PhysX.

AMD, get off your butt, quit complaining and give something back, because Nvidia really is making its GPUs important and a must-have for a lot of gamers. But then again, they must do that in order to exist. I don't blame them!
0 0 [Posted by: ocre  | Date: 01/20/10 07:17:56 PM]

 
OK.
It is Nvidia's right to make PhysX on the CPU as poor as it wishes. Nvidia also has the right, and reason, not to support others' products and to put its efforts into its own chips.
Quite another thing, however, is publishing purpose-built benchmarks that compare its own intentionally capped CPU PhysX with (uncapped) GPU PhysX.
Similarly, Intel could write a program that uses SSE 4.2 and compare it with another version that does not use SSE 4.2 and is intentionally poorly written. And what next?
"Look, everyone, at the potential of SSE 4.2. Over 100% speedup compared to..."
Would that be OK?
I'm very glad that such things are disclosed. People should know where a speedup comes from: from the power of the hardware and software, or from capping the alternative solution used in the comparison test.
Such marketing practices are not acceptable (except maybe to the devoted fans of a certain vendor).
0 0 [Posted by: KonradK  | Date: 01/20/10 11:25:28 PM]
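KonradK's SSE 4.2 analogy maps onto real code cleanly. As a sketch of the sort of comparison he means (assuming CRC32C as the workload, since a hardware CRC32 instruction is SSE 4.2's best-known feature), the bitwise version below stands in for the deliberately naive path and the intrinsic version for the showcased one:

    // Illustrative comparison: naive scalar CRC32C vs. the SSE4.2 instruction.
    #include <nmmintrin.h>   // _mm_crc32_u8 (SSE4.2)
    #include <cstddef>
    #include <cstdint>

    // Bit-at-a-time CRC32C -- the intentionally slow "baseline".
    uint32_t crc32c_scalar(const uint8_t* data, std::size_t len) {
        uint32_t crc = ~0u;
        for (std::size_t i = 0; i < len; ++i) {
            crc ^= data[i];
            for (int bit = 0; bit < 8; ++bit)
                crc = (crc >> 1) ^ (0x82F63B78u & (0u - (crc & 1u)));
        }
        return ~crc;
    }

    // Hardware CRC32C via SSE4.2 -- the "showcase" path.
    uint32_t crc32c_sse42(const uint8_t* data, std::size_t len) {
        uint32_t crc = ~0u;
        for (std::size_t i = 0; i < len; ++i)
            crc = _mm_crc32_u8(crc, data[i]);
        return ~crc;
    }

Both functions compute the same checksum; benchmarking the pair and crediting the gap to "SSE 4.2 potential" would be honest only if the scalar path were also written as well as it can be, which is KonradK's point about CPU versus GPU PhysX.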
 
Look, if you want Nvidia PhysX to run well then you must run it on their hardware; that is what it is optimized for. Who in the world would run a benchmark with high PhysX settings without a card for PhysX? With PhysX on, the benchmark lets you see how well your GPU-accelerated PhysX is keeping up. Different cards give different numbers, and any card without PhysX support is going to give very poor results. It doesn't matter whether it's an ATI GPU or one of Nvidia's own GPUs that doesn't support PhysX: a 7950 GTX will give you poor results, and an Nvidia 8600GT will give you poor results. What the heck are you trying to mix up here? If you don't have a PhysX-supporting card then don't run the advanced PhysX settings; how confusing is that? Even on Nvidia cards that don't support it, or can't handle it, the numbers will be low. It's for people who have GPUs for PhysX! Mix some more crap up; it's an agenda, that's all.
0 0 [Posted by: ocre  | Date: 01/21/10 12:43:42 AM]
 
So what is the purpose of comparing CPU PhysX to GPU PhysX? Why does Nvidia run such comparison tests? I have little hope that you understand these simple questions.
0 0 [Posted by: KonradK  | Date: 01/21/10 09:55:23 PM]

14. 
Once again: AMD is way out there with this; it really is bull and a huge load of crap. This is a sad, sad attempt to mask their own inactivity by pointing fingers and making others look bad. Don't listen to AMD's crap.

Quads are getting more popular with the hardcore crowd, but the average PC gamer is by far not using 4 cores. On Steam (where the gamers who play hardcore modern games are), 57% use only dual cores; the rest are single-core (18%) and quads (24%).
From December '09:
http://store.steampowered.com/hwsurvey

0 0 [Posted by: ocre  | Date: 01/20/10 07:29:03 PM]

15. 
Yes, in order to take advantage of the massively parallel GPU architecture, PhysX has to be multithreaded. Using more than one thread on a CPU should not be an issue either, and it wasn't until Nvidia started doing its "magic".

The ATI card detection and the CPU usage are clear indicators of the path Nvidia has chosen. Anyway, I don't see why they make such a big fuss about it. All the PhysX demos I have seen have been unimpressive and unrealistic. They don't have any advantage over the CPU-based physics engines.
0 0 [Posted by: uibo  | Date: 01/20/10 10:58:41 PM]

 
That's generally what most people who haven't experienced it say. It's exactly one of those cases where people act like they have a point and don't have a clue. PhysX is something that adds to the experience, drawing the player in as it adds depth and feel. It's something to experience, not to watch videos of. Until you ride the ride, you haven't a clue and have no reason to dis it. Doesn't review after review of every title show you it has become something of a higher level? Even the most PhysX-negative reviewers of the past have taken note and swung to almost completely positive talk.

And on your first note: when you say Nvidia is doing its "magic", aren't you referring to the fact that they bought PhysX out of pocket and developed it much further so it runs on their GPUs? Are you claiming that Ageia PhysX ran the same on the CPU as it did with their PhysX cards? It's a shame you make statements that imply as much. To enjoy the super-primitive Ageia PhysX in any glory you had to buy their PhysX cards. The CPU software fallback was the alternative, and it offered very little compared to their cards; yes, it was trimmed to basically nothing extra. Ageia PhysX was also in no way close to the glory Nvidia has pulled off in its latest and soon-to-be-released titles. Ageia was never open, free PhysX, but back then no one hated them for it. Nvidia bought the failing, troubled Ageia to run PhysX on its GPUs; what is the crime in that? Why would Nvidia buy it to develop it to run massively well on CPUs? Ageia saw the need for dedicated cards to further physics; does it not make sense that you may need more parallel power? Havok is CPU-based and owned by Intel; why can't they give AMD this awesome CPU physics that Nvidia is so evil to withhold? Why can't AMD develop physics for its own darn CPUs?
0 0 [Posted by: ocre  | Date: 01/20/10 11:33:43 PM]

16. 
This is a big fight! Only consumers win!
0 0 [Posted by: samueldiogo  | Date: 01/21/10 10:42:36 AM]

17. 
HEY OCRE! Do you work for Nvidia or do you just have sh!t for brains (i.e., an Nvidia fanboi)? "But I also own an Nvidia GTX, because I like to give my money to a company that is making new and exciting things I can enjoy; I want to see my money work."
- Ocre

Yeaaahhh right!!!

We have to thank Nvidia (and consoles) for being stuck with DX9 for the last 4 years of PC games. DX10 does not have a whole lot to offer. The real DX10, as originally intended by Microsoft, would have offered a whole lot more eye candy for us gamers, but Nvidia managed to convince/manipulate Microsoft into cutting some key DX10 features at the last minute, because Nvidia wanted to be first with a DX10-compliant card. And they were! Almost a whole 8 months before ATI released their DX10 card, but ATI actually released DX10 done right, or what we call today DX10.1. ATI's DX10.1 hardware includes a tessellation engine, one of the key features in DX11; we could have been enjoying beautiful games for the last several years if Nvidia had worked with ATI and Microsoft on DX10 done right. Not only did Nvidia not want the extra DX10 stuff, they went around telling developers that DX10.1 is not important! Wow! And now I just saw Nvidia releasing DX10.1 cards a few months ago. WTF. Also, when a game like Assassin's Creed came out with some DX10.1 code, it actually ran faster on ATI cards! Like 20% faster! What did Nvidia do? They got Ubisoft (the developers of Assassin's Creed) to remove the DX10.1 patch, because Nvidia doesn't support DX10.1. Now that's stifling progress! Just do a Google search on "Assassin's Creed DX10.1 patch" and see the controversy. Now, here are some other examples of ATI's R&D efforts improving the gamer/consumer experience, and Nvidia doing just the opposite:

1. Who was first to come out with GDDR5 memory on graphics cards? ATI! At least a year and a half ahead of Nvidia, and that's no accident: ATI worked closely with memory makers like Qimonda to set out the specs. GDDR5 provides twice the memory bandwidth of GDDR3 and simplifies the PCB layout for AIB partners to manufacture. You get improved FPS and cheaper cards.

2. Who has been first to come out with GPUs on a new manufacturing process? ATI, for the last 3 processes (90nm, 65nm and 40nm). TSMC and ATI worked together on getting those new processes working, paving the way for Nvidia to FOLLOW. You get faster clock speeds, lower power requirements from your PSU and... cheaper cards.

3. More recently: who came out with DX11? ATI, back in September. Nvidia still has no DX11 card because they didn't do their homework on DX10 done right. In fact, not only does Nvidia have no DX11 card, they spent the better part of fall (September through November) chanting that DX11 is not important!!! Again, just google "Nvidia thinks DX11 won't drive graphics cards sales". How wrong is that! I'm upgrading my DX9 card to DX11 because of the eye candy of DX11, and most gamers on earth are! And everybody who upgraded from DX9 to DX10 will upgrade to DX11. DX10 is a failure; it didn't do d!ck for us gamers. Thanks Nvidia!

4. Another example of Nvidia stifling the gamer experience: Nvidia purposely disables PhysX if you have an ATI card in your PC. Wow, talk about Nvidia shooting themselves in the foot! Fat chance of PhysX becoming an open standard with that attitude. PhysX will remain Nvidia proprietary technology at the cost of gamers; I can't run PhysX alongside an ATI card that is faster than Nvidia's. Can you just imagine if Microsoft came out with its own graphics card and purposely disabled DirectX elsewhere because DirectX is its technology? I guess we would all move to the open standard OpenGL, which isn't bad; Doom 3 is an OpenGL game. Open API standards exist because no one wants to be held hostage and be forced to purchase hardware from a specific manufacturer. Yeah, that's Nvidia "progress", leading us down the hole. I would never buy an Nvidia card, because otherwise you would be endorsing their business model, which is about screwing consumers.

5. ATI Eyefinity sure beats the hell out of Nvidia's phoney 3D stuff. The ability to game with a 180-degree view is way better than having to wear 3D glasses all day and possibly get some weird side effects from having them on for a long period. I am convinced that Nvidia will copy ATI's Eyefinity idea.
0 0 [Posted by: steph  | Date: 01/22/10 11:13:31 AM]
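On the bandwidth claim in point 1: as a rough sanity check, with illustrative figures not taken from the comment (a 256-bit bus at a 1 GHz interface clock), the doubling follows from GDDR5 moving twice as many bits per pin per clock as GDDR3:

    Peak bandwidth = interface clock x bits per pin per clock x bus width / 8
    GDDR3: 1 GHz x 2 x 256 / 8 =  64 GB/s
    GDDR5: 1 GHz x 4 x 256 / 8 = 128 GB/s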

18. 
Here are 2 more:

6. Nvidia's constant rebranding scheme. They did it again just recently (2 weeks ago) with the G100 and G120. Google search: "Nvidia rebrand". What's the harm? Misleading consumers into thinking it's a new card...

7. Nvidia's Bumpgate. Google "nvidia bump gate": thousands and thousands of laptop owners (and some desktop owners) got shafted by a faulty Nvidia chip.

Give me some time and I can probably refer you to more problems with Nvidia...
0 1 [Posted by: steph  | Date: 01/22/10 12:02:02 PM]

19. 
Nvidia is an evil company.
It will fail; it must be defeated!
0 2 [Posted by: TrueGamer  | Date: 01/22/10 01:56:52 PM]
