In response to your suggestion of using Fraps to measure the timedemo speeds, I do not know what the results would be, but it would not prove anything either way. Even if the general assumption that low fps = cpu dependent holds true, there could be a split second of very low fps (= cpu dependency) in an otherwise gpu limited timedemo. This theoretical demo would still be better suited to measuring graphics card performance.
Imagine a demo that runs, on a baseline system, at:
100 fps for 55 seconds (completely gpu limited)
1 fps for 5 seconds (completely cpu limited)
There are 100*55 + 1*5 = 5505 frames, completed in 60 seconds,
so the average fps = 5505/60 = 91.75 fps.
Doubling the gpu speed would yield 200 fps for 27.5 seconds and still 1 fps for 5 seconds; the average fps is 5505/32.5 = 169.38 fps,
which is quite different from baseline.
Doubling the cpu speed yields 5505/(55+2.5) = 95.74 fps,
which is hardly different from baseline.
Clearly, this first example is a primarily gpu dependent test.
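The arithmetic above can be checked with a short sketch. The fixed-frame-count reasoning (doubling a component's speed halves the duration of the segment it limits) is modeled by just listing (fps, seconds) segments:

```python
def average_fps(segments):
    """Average fps of a demo made up of (fps, seconds) segments."""
    frames = sum(fps * secs for fps, secs in segments)
    total_time = sum(secs for _, secs in segments)
    return frames / total_time

# Baseline: 100 fps for 55 s (gpu limited) + 1 fps for 5 s (cpu limited)
baseline = average_fps([(100, 55), (1, 5)])   # 91.75

# 2x gpu: the gpu-limited segment's 5500 frames now take 27.5 s
gpu2x = average_fps([(200, 27.5), (1, 5)])    # 169.38

# 2x cpu: the cpu-limited segment's 5 frames now take 2.5 s
cpu2x = average_fps([(2, 2.5), (100, 55)])    # 95.74
```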
Now suppose a second demo has a scene that is not quite as demanding as the first demo's difficult scene (say 2 fps baseline) but goes on for longer (say 55 seconds), with the remaining 5 seconds at 100 fps. The results would be (I will spare you the math, but you can check it yourself):
baseline: 10.17 fps
2xgpu: 10.61 fps
2xcpu: 18.77 fps
which is clearly a cpu limited test, even though its minimum fps (2) is higher than the first example's (1). These examples may not be very realistic, but they serve as a counterpoint to the assertion that low min fps = cpu limited, because it isn't always, when you look at the entire test. As a side note, this is why average fps is still a more popular measure than min fps: average fps contains information about the entire test and is therefore more representative than min fps, which only reflects a single point. Perhaps a hybrid measure, something that combines average and minimum fps, would be more useful?
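The second demo's numbers can be verified the same way; the segment durations below follow the same fixed-frame-count logic as before:

```python
def average_fps(segments):
    """Average fps of a demo made up of (fps, seconds) segments."""
    frames = sum(fps * secs for fps, secs in segments)
    return frames / sum(secs for _, secs in segments)

# Second demo baseline: 2 fps for 55 s (cpu limited) + 100 fps for 5 s (gpu limited)
baseline = average_fps([(2, 55), (100, 5)])    # ~10.17

# 2x gpu: the 500 gpu-limited frames now take 2.5 s
gpu2x = average_fps([(2, 55), (200, 2.5)])     # ~10.61

# 2x cpu: the 110 cpu-limited frames now take 27.5 s
cpu2x = average_fps([(4, 27.5), (100, 5)])     # ~18.77
```

Despite the higher minimum fps, the averages show that this demo responds to the cpu, not the gpu.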
[08/26/04 05:24:17 PM]