Discussion on Article:
Choosing the Best CPU for Doom III

Started by: Dewil | Date 08/25/04 04:37:25 AM
Comments: 95 | Last Comment:  09/01/16 06:17:55 PM



It’s a little strange that every site points to AMD as the best (fastest), but that isn’t the case here.

Do you think you may have selected a board/BIOS that isn’t very good?

It would be nice to see some lower-end P4 platforms, since 8 out of 10 people I know have a ULi, SiS, or VIA chipset, not Intel. It reminds me of the time when everyone tested the P4 with the i850 and RDRAM against AMD systems; I have seen more than 100 Intel motherboards and only one of them had that chipset inside. Quite popular, ahem.
It paints a better picture than it really is.
[Posted by:  | Date: 08/25/04 09:38:05 AM]

1) a] 30 monsters? Does this represent the situation in real gameplay? I would expect 5, maybe 8-10 monsters. The large number of monsters makes your test a little bit synthetic.
b] Why did id decide to use the "wrong situation" for benchmarking? Any suggestions?
2) I have noticed that the most boring part of the gameplay is the loading of a new map. Maybe that is another candidate for benchmarking?
3) The differences between processors in your test are a little low. Maybe this is a sign of GPU-dependence?
[Posted by:  | Date: 08/25/04 04:55:59 PM]

Clearly the tester has, in trying to make cpu-limited tests, created very gpu-limited tests. As others have pointed out, there is very little difference between the processors (excluding the Athlon XP) in the custom tests.

The reason intel edges ahead of amd in xbit's tests (by an insignificant margin when you look at the numbers) is probably the very slight interface advantage offered by PCI Express, which only makes a difference in the rankings because of the gpu-dependent nature of the tests. You can see the two intel chipsets, when paired with 3.4XEs, swap positions: the 925X takes the lead only in xbit's tests but loses to the 875P in the default test.

Finally, as others have mentioned, cheating to get a "score" of monsters into one room is not a real-world test. Adding monsters may stress the cpu, but judging by xbit's results it clearly stresses the gpu more in most cases. I don't know how biased the shadowing is towards the cpu or gpu, but xbit might have been better off finding a place with a lot of light sources instead of a lot of monsters.
[Posted by:  | Date: 08/26/04 11:53:47 AM]

It would be wise to read carefully before posting. The Xbit article clearly states right at the beginning that their custom test was not gpu-limited; they tested for exactly that, and that was the result.
[Posted by:  | Date: 08/26/04 02:26:14 PM]

There's no problem with the way they did their custom benchmark. We all know that our boxes tend to choke in situations where there are many enemies on the screen. In fact, that's the only time you really NEED fps: if you are just strolling around and enjoying the scenery, even 25 fps would probably be sufficient. In this respect, their test is what we really need to judge how well someone's system would perform in the game, i.e. if it doesn't drop below 25 fps with 2 million monsters on the screen, you are good to go.
As for comparing cpus, it's the same story when you consider what one needs from the cpu: fluid fps numbers in heavy firefights.
I have only one problem with these tests.
#13 said it well: how can a more cpu-demanding demo give a higher average fps?
My guess is that either 1) this demo is legit, or 2) this demo is not more cpu-dependent.
Let me explain. In case 1), average fps is higher but min fps is lower: some parts of the custom demo are very easy on the cpu/gpu and give insane numbers (200-300 fps) for short periods of time.
In case 2), both average fps and min fps are higher in the custom timedemo.

There's a third possibility, similar to the second: min fps has nothing to do with a cpu bottleneck in this particular demo. But that's a very remote possibility. So far, min fps in games has mainly been due to cpu bottlenecking in busy scenery/fights.

I have to say that there's really only one way to resolve these issues. Run fraps on the regular timedemo1 under your conditions. If it shows higher min fps numbers than the custom timedemo, you are OK. If not, then you, xbit labs, goofed.
[Posted by:  | Date: 08/26/04 02:21:22 PM]

In response to your suggestion of using fraps to measure the timedemo speeds: I do not know what the results would be, but it would not prove anything either way. Even if the general assumption that low fps = cpu-dependent holds true, there could be a split second of very low fps (= cpu dependency) in an otherwise gpu-limited timedemo. This theoretical demo would still be better suited to measuring graphics card performance.

Imagine a demo that runs, on a baseline system, at:

100 fps for 55 seconds (completely gpu limited)
1 fps for 5 seconds (completely cpu limited)

there are 100*55+1*5=5505 frames, completed in 60 seconds

the average fps =5505/60=91.75 fps

doubling the gpu speed would yield: 200 fps for 27.5 seconds and still 1 fps for 5 seconds. avg fps is 5505/32.5=169.38 fps
which is quite different from baseline.

doubling the cpu speed yields: 5505/(55+2.5)=95.74 fps
which is hardly different from baseline.

Clearly, this first example is a primarily gpu dependant test.

Now if a second demo has a scene that is not quite as demanding as the difficult scene (say 2 fps baseline), but goes on for longer (say 55 seconds), the results would be as follows (I will spare you the math, but you can check it yourself; the other 5 seconds are at 100 fps):

baseline: 10.17 fps
2xgpu: 10.61 fps
2xcpu: 18.77 fps

which is clearly a cpu-limited test, even though it has a higher min fps (2) than the first example (1). These examples may not be very realistic, but they serve as a counterpoint to the assertion that low min fps = cpu-limited, as that isn't always true when looking at the entire test. As a side note, this is the reason average fps is still a more popular measure than min fps: average fps contains information about the entire test and is therefore more representative than min fps, which only captures a single point. Perhaps a hybrid measure would be more useful?
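The arithmetic in the two demos above is easy to check in a few lines of Python. This is just a sketch of the reasoning, not anything from the article; the function name and the "gpu"/"cpu" segment labels are my own. The key idea is that a demo has a fixed frame count, so speeding up the limiting component shrinks only the time spent in the segments it bounds.

```python
def avg_fps(segments, gpu_mult=1.0, cpu_mult=1.0):
    """Average fps for a timedemo built from (baseline_fps, duration_s, bound)
    segments, where bound is "gpu" or "cpu". Doubling a component halves the
    time spent in the segments that component limits."""
    frames = 0.0
    seconds = 0.0
    for fps, duration, bound in segments:
        mult = gpu_mult if bound == "gpu" else cpu_mult
        frames += fps * duration        # total frame count is fixed by the demo
        seconds += duration / mult      # the limited segment runs mult times faster
    return frames / seconds

# First demo: 100 fps for 55 s (gpu-limited), 1 fps for 5 s (cpu-limited)
demo1 = [(100, 55, "gpu"), (1, 5, "cpu")]
# Second demo: 100 fps for 5 s (gpu-limited), 2 fps for 55 s (cpu-limited)
demo2 = [(100, 5, "gpu"), (2, 55, "cpu")]

print(round(avg_fps(demo1), 2))              # 91.75  (baseline)
print(round(avg_fps(demo1, gpu_mult=2), 2))  # 169.38 (2x gpu: big jump)
print(round(avg_fps(demo1, cpu_mult=2), 2))  # 95.74  (2x cpu: barely moves)
print(round(avg_fps(demo2), 2))              # 10.17  (baseline)
print(round(avg_fps(demo2, gpu_mult=2), 2))  # 10.61  (2x gpu: barely moves)
print(round(avg_fps(demo2, cpu_mult=2), 2))  # 18.77  (2x cpu: big jump)
```

Running it reproduces the numbers in the post: the first demo responds to the gpu and the second to the cpu, even though the second has the higher min fps.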
[Posted by:  | Date: 08/26/04 05:24:17 PM]

How can you declare that your demo has a higher CPU load than the standard one? Everyone can see that the standard one is 104.4 and yours is 130.4. So your cpu-burn demo is simple and valueless.
[Posted by:  | Date: 08/26/04 07:03:12 PM]

