Memory Clock Rate and Integrated Graphics Core

Graphics cores integrated into modern CPUs share system memory with the CPU proper, so their performance depends directly on the speed of the installed DDR3 SDRAM. Moreover, since 3D rendering involves heavy texture traffic, this correlation may be even stronger than in conventional computing tasks. So we want to see how the Intel HD Graphics 4000 core performs with system memory clocked at different frequencies.
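For reference, the peak theoretical bandwidth at each memory speed can be estimated from the transfer rate. The sketch below assumes a dual-channel configuration with a 64-bit (8-byte) bus per channel, which is typical for desktop Ivy Bridge platforms; the dual-channel setup is our assumption, not a detail taken from the test description.

# Peak theoretical DDR3 bandwidth: transfer rate (MT/s) x 8 bytes per
# channel x number of channels. Dual-channel operation is assumed here.
def ddr3_peak_bandwidth_gbs(transfer_rate_mts, channels=2, bytes_per_channel=8):
    return transfer_rate_mts * bytes_per_channel * channels / 1000.0

for speed in (1333, 1600, 1866, 2133, 2400, 2666):
    print(f"DDR3-{speed}: {ddr3_peak_bandwidth_gbs(speed):.1f} GB/s")

Going from DDR3-1333 (21.3 GB/s) to DDR3-2666 (42.7 GB/s) thus doubles the theoretical bandwidth available to the graphics core.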

Of course, we start out with the popular 3DMark 11.

Curiously, 3DMark 11 seems almost indifferent to memory subsystem bandwidth. The difference between the best (DDR3-2666) and worst (DDR3-1333) results is a mere 2.5%, which is even lower than in most non-graphics applications. The Ivy Bridge design, in which the graphics core has a dedicated cache and can also access the CPU's L3 cache, seems to mask the effect of higher memory bandwidth on the performance of the HD Graphics 4000 core.

The picture is somewhat different in actual games, though.

It looks like games do run faster on the integrated Ivy Bridge graphics when the memory subsystem is clocked higher. Each extra 266 MHz of memory clock rate adds a few percent to the frame rate, for a total gain of up to 25%. This doesn't mean the graphics core itself becomes faster: as we saw earlier, the frame rate grows with faster memory on a discrete graphics card as well. Still, if you plan to use the integrated graphics core of your Ivy Bridge processor, you can make it faster by installing high-frequency DDR3 SDRAM.
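To put the scaling into perspective, here is a minimal sketch built on the article's own numbers: five nominal steps of roughly 266 MHz separate DDR3-1333 from DDR3-2666, so a gain of about 4.5% per step compounds to roughly the 25% total observed. The per-step figure is an assumed average chosen to match that total, not a measured value.

# Hypothetical illustration: ~4.5% per 266 MHz step is an assumed
# average; five such steps reproduce the ~25% overall gain.
speeds = [1333, 1600, 1866, 2133, 2400, 2666]
gain_per_step = 1.045
total_gain = gain_per_step ** (len(speeds) - 1) - 1
print(f"cumulative frame rate gain: {total_gain * 100:.0f}%")  # ~25%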

 