Discussion

Discussion on Article:
Half-Life 2 Performance Preview: The Graphics Hardware Squeezer

Started by: Carfax | Date 11/18/04 04:15:28 PM
Comments: 84 | Last Comment:  08/25/06 07:03:44 AM



1. 
Your results are much lower than everyone else's for the 6800 cards.

You did something wrong. You probably left trilinear optimizations off.
0 0 [Posted by:  | Date: 11/18/04 04:15:28 PM]

2. 
There's definitely something wrong with the NVIDIA results here - I know for a fact that they don't perform that low!

Sounds pretty ATI-biased if you ask me...
0 0 [Posted by:  | Date: 11/18/04 04:46:49 PM]

 
They didn’t use overclocked NVIDIA cards like the other web sites did.

If ATI's demos are ATI-biased, why aren't the demos from HardOCP, AnandTech, etc. NVIDIA-biased?

Is it too difficult to say?

0 0 [Posted by:  | Date: 11/19/04 08:25:52 AM]

3. 
Quote from page 6: "The RADEON X800 XT cannot run this game at a comfortable speed even in the lowest resolution."

Error?
0 0 [Posted by:  | Date: 11/18/04 05:38:08 PM]

4. 
And it would be nice if we could see how last-generation cards handle the game, e.g. the 5200, 5600, 9200 and 9600. Try it on medium settings, for example; you don't need 'pure' mode (highest settings) to make it playable.
0 0 [Posted by:  | Date: 11/18/04 05:41:45 PM]

5. 
Bias. What is ATI paying you guys?
0 0 [Posted by:  | Date: 11/18/04 06:18:16 PM]

6. 
You idiots claiming bias should notice that there are no standard demos out there; the results you see on different sites will vary, since everyone is recording something different.
0 0 [Posted by:  | Date: 11/18/04 06:26:51 PM]

7. 
I would like to see the results on a newer set of drivers, namely ForceWare 67.02. So far it is known to be the driver to use when playing HL2.
0 0 [Posted by:  | Date: 11/18/04 07:12:23 PM]

8. 
They probably used ATi's demos instead of making their own.


Shame.
0 0 [Posted by:  | Date: 11/18/04 07:49:21 PM]

9. 
I love XbitLabs, but your results match DriverHeaven's - ATI won every single benchmark with NVIDIA staying behind. How can your demos perform so differently that they don't match AnandTech's, HardOCP's, and FiringSquad's results?

Are you sure you measured the performance correctly?
0 0 [Posted by:  | Date: 11/18/04 08:04:20 PM]

 
ATI comes out on top in Anand's, FiringSquad's, etc. The only review I've seen that had them relatively even was HardOCP, which is known for crap reviews.
0 0 [Posted by:  | Date: 11/18/04 11:23:55 PM]
 
HardOCP used overclocked NVIDIA cards, and had the nerve to say it was the stock clock.
0 0 [Posted by:  | Date: 11/19/04 08:31:35 AM]

10. 
Another very useful review, guys! At least most of the range of both major IHVs' cards was tested, esp. ATI's X800 XT PE vs X800 XT and X700 vs 6600GT. Perhaps you should've kept the 67.02 drivers, although your reasons are clear.

For all you incredulous naysayers... Note that Xbit did not use overclocked NVIDIA cards, and correctly enabled the high-quality water shader option for all cards. Not only that, they also correctly identified CPU-limited benches, where all IHVs' products tested similarly. Have a look at other sites and note which ones are promoting a close bunching of results and then concluding that NVIDIA's products are as fast as ATI's in HL2. Well-chosen demos, indeed... ;) The driver/IQ comments are also very useful, esp. where one IHV has an issue with fog... ;) Compare FiringSquad vs HardOCP, heh...

The thing to note is the great performance of the A64 system...
0 0 [Posted by:  | Date: 11/18/04 11:00:42 PM]

 
100% agree with you!!!
0 0 [Posted by:  | Date: 11/19/04 08:33:45 AM]

11. 
What codepath does the 6800 use by default, DX8.1 or DX9.0? Looking here I'd surely say DX9.0, but looking at PCperspective I'd say DX8.1. Did Xbit Labs force DX9.0 on the 6800?
0 0 [Posted by:  | Date: 11/19/04 02:16:06 AM]

12. 
What the? It won't post the rest? Trying again: I like this comment! So let me get this straight: it's not fair to use an overclocked NV Ultra card, but it is fair to use ATI's OC edition called the PE? The XT is 500/500, right? The XT PE is 520/560, or am I wrong?
0 0 [Posted by:  | Date: 11/19/04 02:30:00 AM]

 
There's one big difference... The XT PE is clocked at 520/560 according to ATI's specifications. According to NVIDIA, a 6800 Ultra is clocked at 400/550, so how can you justify that HardOCP chose to use a non-standard 6800 Ultra?
One more thing... They chose to bench an entire level. Why on earth did they do that? Any hardware reviewer knows that if you review a graphics card you have to choose a graphically intensive test, otherwise there won't be much difference...
0 0 [Posted by:  | Date: 11/19/04 03:50:48 AM]
 
[H] noted LONG ago that they were out to seek real-world performance. Benching an entire level a) removes card bias and b) represents the real world better than the two minutes you're out in open space, fighting a dozen things at once, with ten light sources.
0 0 [Posted by:  | Date: 11/19/04 08:56:02 AM]

13. 
I'd say you ran the test in a way that wouldn't contradict the news item you posted the other day, which had the exact scores ATI gave. I don't think you did right.
0 0 [Posted by:  | Date: 11/19/04 03:46:22 AM]

14. 
As always, a good in-depth review, thank you :-)
Did you use the "reflect all" setting? How much does this setting slow down the 6800 as opposed to "reflect world"?
0 0 [Posted by:  | Date: 11/19/04 03:58:57 AM]

15. 
OK, pretty interesting review. I'll note just the following points:
- how much does "Reflect All" impact performance on both cards versus "Reflect World"?
- where are your demos from? ATI? NV? A vendor-independent source? Or did you create your own?
- how are the demos balanced between "shader intensive" and "real gameplay"?

AWx
0 0 [Posted by:  | Date: 11/19/04 07:08:14 AM]

16. 
I think a number of you are getting a little carried away, however FYI, courtesy of jb:

http://www.chaoticdreams.org/ce/jb/ReflectAll.jpg
http://www.chaoticdreams.org/ce/jb/ReflectWorld.jpg
0 0 [Posted by:  | Date: 11/19/04 07:13:18 AM]

17. 
Regarding FP32, why don't we see any image quality difference yet?
Is it because DX9 only requires FP24 compliance?
I thought that when DX9.0 games started appearing we were going to see some differences.
0 0 [Posted by:  | Date: 11/19/04 08:49:31 AM]

 
Because FP16 is *overkill* for most calculations. Standard 8-bit is enough for most calculations. Maybe a couple of bits of extra internal precision helps sometimes, but generally 12-bit fixed point, or often even 8-bit, is so precise you won't be able to see any artefacts even if you look at the image side by side with an INFINITE precision version.

Going from FP16 to FP24 to FP32 is just one-upmanship for the sake of one-upmanship. The only time you can need more than FP16 is when doing texture coordinates on REALLY large textures, or if you're doing world-space calculations in the pixel shader instead of the vertex shader.

FP32 is complete overkill for PIXEL operations. Imagine you had a mouse that was 65,000x more precise than your current mouse. Would that make you better at FPS games? No, it'd be overkill.

When is FP32 generally useful? When mapping non-graphics algorithms that don't require IEEE floating-point compliance off the CPU onto something faster.

So don't expect any difference in image quality between FP16, FP24, or FP32. Expect a difference from the old things like anisotropic filtering or anti-aliasing.
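
If you want to see the quantization argument in numbers, here is a rough Python/numpy sketch (the colour and lighting values are made up for illustration, not taken from the article):

import numpy as np

# Do some "shader-like" per-pixel math at different float precisions,
# then quantize to the 8-bit value that actually reaches the monitor.
def to_8bit(x):
    return int(np.clip(np.round(float(x) * 255.0), 0, 255))

color, light, ambient = 0.3712, 0.8141, 0.0525   # arbitrary example inputs

ref  = np.float64(color) * np.float64(light) + np.float64(ambient)   # high-precision reference
fp32 = np.float32(color) * np.float32(light) + np.float32(ambient)
fp16 = np.float16(color) * np.float16(light) + np.float16(ambient)

print(to_8bit(ref), to_8bit(fp32), to_8bit(fp16))   # typically the same 8-bit value three times

Once the rounding error is smaller than one 8-bit step (1/255), the extra shader precision simply cannot show up on screen.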
0 0 [Posted by:  | Date: 11/19/04 11:25:06 PM]
 
Thanks for the reply!!

But one thing:
if I understand your response, the FX5xxx's worse image quality is not because of FP16 itself but because of the chip?

But if that isn't very important, why is ATI going for FP32 in the future, since FP24 is giving it the performance edge, and the image quality too?

Thanks!
0 0 [Posted by:  | Date: 11/24/04 07:10:47 AM]

18. 
Sped: "Hardocp used an overclocked nvidia card, and didn't enable the high quality water shaders"

How do you know this? Quoting the Hardocp article:
"All the game settings were set at their highest values with “Reflect World” selected."

I'm not taking sides here, I just think the considerable gap between Nv and ATI cards in the article is a little strange. Most benchmarks see ATI on top but not by THAT much.
0 0 [Posted by:  | Date: 11/19/04 08:51:48 AM]

19. 
3. I'd really love to see a couple benchmarks with older/slower CPUs.

5. And it would be nice if we could see how last generation cards handle the game... e.g. (5200, 5600, 9200, 9600) Try it on medium settings for an example, you don't need 'pure' mode (highest settings) to make it playable.

Agreed with posts 3 and 5. Most hardware sites are so caught up with fast CPUs and video cards, they forget that the majority of gamers don't have fast machines or video cards.

How about a future article that uses various lower-end CPUs and video cards to give us an idea of how well (or how acceptably) HL2 runs on the "average Jane/Joe" system? That would be a far more helpful article.
0 0 [Posted by:  | Date: 11/19/04 06:09:47 PM]

20. 
I am running a GeForce 2; I wonder if HL2 will work on it...
0 0 [Posted by:  | Date: 11/20/04 04:35:09 PM]

