by Ilya Gavrichenkov
02/02/2006 | 11:23 AM
How much RAM is enough for comfortable work on an up-to-date platform? Most of today's systems are equipped with 512MB or 1GB of RAM. Until recently this amount of memory was quite sufficient for most contemporary applications. However, memory makers and retailers have started pushing the idea that today's systems need as much as 2GB of SDRAM. Is it really so? Some people believe it makes sense, some don't, but it is us who will be digging out the truth. Since more and more 2GB memory kits are appearing in the market, we decided to carry out our own investigation to find out whether contemporary computer systems really require over 1GB of system memory.
We are going to run our tests on an AMD Athlon 64 platform. Firstly, these systems are more widespread among computer enthusiasts, and secondly, the 1GB DDR SDRAM modules seem to be the most interesting ones from the user perspective these days. The thing is that their characteristics differ considerably from those of 512MB modules, because they are based on fundamentally different microchips.
By the way, 1GB memory modules used to be pretty rare and cost quite a bit of money until recently. However, the situation has changed lately, as the memory chips needed for 1GB memory module production became more widely available. As a result, some pretty fast overclocker DDR SDRAM modules of 1GB capacity appeared in the market. Unfortunately, they cannot boast the same characteristics as the 512MB solutions most popular among computer enthusiasts. Of course, you can equip your system with 2GB of memory by installing 4 x 512MB modules. However, this configuration will always work slower than a system equipped with a pair of 1GB DIMMs (this is primarily true for Athlon 64 systems, but also applies to Pentium 4 based platforms in many cases). Moreover, it would also be less economical. Anyway, all these statements need to be proven by practical tests, so let's get straight to them.
We compared the performance of systems equipped with different amounts of RAM following a very simple pattern. We assembled a system with an AMD Athlon 64 FX-60 processor and an Nvidia GeForce 7800 GT graphics card. Then we installed either two 512MB memory modules or two 1024MB memory modules. The memory worked at 400MHz in both cases with the 1T Command Rate setting. All other timing settings were also identical to ensure that the performance comparison would be correct and fair.
I have to point out that it is impossible to run the memory at the most aggressive (and best from the performance standpoint) timings of 2-2-2-10, even with the best DDR400 SDRAM 1024MB memory modules at our disposal: with timings that aggressive, all these modules work unstably. This is exactly why we set the timings to 2-3-2-10.
Besides the side-by-side performance comparison of the 512MB and 1024MB memory modules, we decided to measure the performance of our test platform with four 512MB DDR400 SDRAM DIMMs installed. However, I have to stress right away that this is not the optimal memory subsystem configuration. The thing is that four double-sided DIMM modules in an Athlon 64 based platform require the 2T Command Rate setting, which has a negative effect on performance.
Moreover, there is one more disadvantage of having four DIMM modules in your system, which is indirectly related to the topic of our today's discussion. The use of four memory modules in Athlon 64 platforms significantly reduces the overclocking potential of the platform by limiting the bus frequency increase. For example, a lot of Socket 939 mainboards refuse to raise the clock generator frequency over 240-250MHz when they are equipped with a full set of four memory DIMMs. By simply removing a pair of DIMMs you can immediately push the stability bar beyond 300MHz.
Here is a list of hardware components we used to assemble our test platforms:
We decided to start our tests of systems equipped with different amount of RAM with the popular synthetic benchmarks.
PCMark05 uses the same algorithms as real applications to estimate the system performance. However, it doesn't require much memory: platforms with 1GB and 2GB of RAM onboard demonstrated pretty much the same results here.
At the same time the system with four 512MB DIMMs stands out a little bit. This memory configuration works slightly slower than the systems with only two memory DIMMs installed. This is a great illustration of the performance drop caused by the forced switch from 1T to 2T Command Rate. With more memory modules in the system, the number of memory banks to manage increases, and hence the memory bus of the Athlon 64 processor gets loaded more intensively. In this situation the memory controller has to switch to a less aggressive mode to ensure system stability. So, as you can see, the influence of a memory size increase can be twofold: installing more RAM can not only speed up the system, but in some cases also slow it down.
PCMark05 test contains an individual benchmark for the memory subsystem. Let’s take a look at the results we can obtain here.
It is evident that this subtest from the PCMark05 suite also uses less than 1GB of RAM. Therefore we cannot notice any significant performance difference between the platforms equipped with different amounts of RAM. However, when it comes to a system with four double-sided DDR400 SDRAM modules, we see the negative effect from the 2T Command Rate setting very clearly.
The recently released synthetic 3DMark06 benchmarking suite serves primarily for the graphics subsystem performance analysis. However, the developers position it as a means for evaluating the overall gaming performance of the system, so we decided to run it today as well.
However, the results obtained in 3DMark06 proved to be only slightly dependent on the amount of RAM in the tested systems.
The CPU test from the same benchmarking suite measures the system performance during game physics calculations and game character AI processing. These results are very similar to what we have just seen in PCMark05: in this case 1GB of RAM is quite enough for comfortable operation.
The last synthetic benchmark we are going to use in our today's investigation is ScienceMark 2.0. It will allow us to evaluate whether we really need more than 1GB of RAM for typical scientific calculations (mostly molecular dynamics workloads).
The situation is the same again. This test application doesn't reveal any real advantages of having a large amount of RAM installed. However, it clearly shows the drawbacks of having all four memory slots occupied.
I have to mention here that one of the arguments the CPU makers use in favor of the shift to the x86-64 architecture is the possibility of having over 4GB of RAM in the system. However, at this time we don't even see the need for 2GB of RAM. Strange, isn't it? Anyway, it is still too early to draw any conclusions, especially since we have only run a few synthetic benchmarks. Let's take a look at the results our platforms will show in real applications.
With 2GB of system memory, archiving utilities could compress data with a larger dictionary than they use today. Theoretically, this could improve their efficiency quite noticeably. However, most of today's popular archiving utilities do not take this opportunity. As a result, there is no visible dependence of the data compression speed on the amount of system RAM.
However, the performance of data compression utilities appears more sensitive to memory subsystem latency: according to our results, a system with four DIMM modules working at 2T Command Rate falls over 7% behind the same system with two DIMM modules and 1T Command Rate.
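The dictionary trade-off mentioned above can be illustrated with a sketch using zlib, whose `wbits` parameter caps the size of the sliding match window; this only demonstrates the principle, not any particular archiver's implementation:

```python
import random
import zlib

# Deterministic pseudo-random 4 KB block repeated four times:
# the only redundancy is long-range (the repeats are 4 KB apart).
random.seed(0)
block = bytes(random.randrange(256) for _ in range(4096))
data = block * 4

def compressed_size(wbits):
    """Compress with a 2**wbits-byte history window (dictionary)."""
    c = zlib.compressobj(level=9, wbits=wbits)
    return len(c.compress(data) + c.flush())

small_window = compressed_size(9)   # 512-byte window: cannot reach the repeats
large_window = compressed_size(15)  # 32 KB window: encodes the repeats as matches
```

A larger dictionary costs more memory but catches redundancy a small one physically cannot see; the same logic scales up to the multi-megabyte dictionaries a 2GB system could afford.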
Video encoding with contemporary codecs depends only slightly on the memory subsystem parameters. We have already pointed this out in our previous articles. Therefore, it is not at all surprising that neither the size of system memory nor the number of memory modules installed really affects the performance in this type of task.
Image editing and nonlinear video editing tools have traditionally been considered quite resource-hungry when it comes to hardware. However, let's see what changes if these applications have 2GB of RAM at their disposal.
We can see that the test script runs somewhat faster in Adobe Photoshop CS2 as the amount of system RAM grows. However, once four memory DIMMs are installed instead of two, the advantage we have just gained disappears because of the increased memory latency. So, installing more RAM will get you a definite performance improvement, but only if you use two 1GB modules.
At the same time I have to say that during our tests in Photoshop we used a reference image that eats up only 100MB of system memory. In some specific cases, in prepress work for instance, more memory may be required to process much larger images. Of course, additional memory will be a great thing to have in that case. However, such situations will most likely occur in specialized professional applications and will hardly affect regular PC users.
Adobe Premiere Pro 1.5 is absolutely indifferent to the system RAM size. It can hardly ever consume more than 1GB of memory for its needs.
The popular professional application for 3D modeling and rendering, 3ds max 7, doesn't show any significant performance jump when we increase the amount of system memory. However, we do notice some positive effect from the second gigabyte of RAM we install.
For example, when we run the SPECapc test in viewports, the performance improves by 3.2% with four 512MB DDR400 SDRAM modules and by 4.3% with two 1GB DDR400 SDRAM modules. In other words, the positive effect from the larger amount of RAM even outweighs the negative influence of the 2T Command Rate setting when all four memory slots of the mainboard are occupied.
During final rendering in 3ds max 7 the situation looks very similar, but the effect from the additional system memory is somewhat smaller this time. The platform equipped with two 1GB DDR400 SDRAM memory modules is only 1.9% faster than a platform with two 512MB DDR400 SDRAM DIMMs.
Even though we have finally managed to detect performance growth from the use of 2GB of system memory, we can hardly regard it as a significant improvement. The ordinary user will barely sense a 2-4% performance gain.
The previous-generation games do not actually require a lot of system memory. Adding another gigabyte of RAM to the system does not really boost the gaming performance. However, when four 512MB memory modules are installed, the performance drops because of the forced 2T Command Rate setting. The performance drop in this case is about 1-2%.
The results obtained in the new F.E.A.R. game suggest that the latest games feel quite comfortable with 1GB of RAM in the system.
However, the results obtained in Quake 4 do not support this statement. Additional RAM provides a significant fps rate increase: in the best case (with 1GB modules installed) the performance gain reaches up to 10%. This is the biggest performance improvement so far in our today's test session.
The situation with the performance dependence on RAM size seems more or less clear by now. Most contemporary tasks are satisfied with 1GB of memory. Only some heavy professional applications and the latest games can benefit a little bit from 2GB of RAM, because their high-quality graphics may require additional memory space for data storage.
However, we won't make any final conclusions just yet. Let's find out how we could benefit from additional system memory during multi-tasking. Usually you run more than one application on your PC, and together they might eat up more than 1GB of RAM. To evaluate the system performance in the most common multi-tasking scenarios we resorted to the SYSMark 2004 SE benchmarking suite, which contains six more or less standard working patterns.
In this test an image is rendered by 3ds max 5.1 into a BMP file while the user prepares web pages in Dreamweaver MX. Then the user renders a 3D animation into a vector graphics format.
The obtained results show that when two "heavy" applications run at the same time, the second gigabyte of RAM can actually bring some performance improvement, although the improvement we observed in this case turned out quite small, I should say.
Now the test emulates the user's work in Premiere 6.5: he creates a video movie in raw format from a few other movies and separate sound tracks. While waiting for the operation to complete, the user also modifies a picture in Photoshop 7.01 and saves it to the hard drive. When the video is finished, the user edits it and adds special effects in After Effects 5.5.
Just like in the previous test, the performance of test platforms equipped with 2GB of memory turns out higher than that of a system with just 1GB of RAM. And again the performance improvement is no bigger than 2%.
Here, our hypothetical user extracts the web-site content from a zip archive and at the same time opens an exported 3D vector video in Flash MX. Then the user modifies it by adding some new pictures and optimizes it for faster animation. The final video with applied special effects is then compressed with Windows Media Encoder 9 so that it can later be broadcast over the Internet. The created web-site is then composed in Dreamweaver MX, while the system is scanned for viruses with VirusScan 7.0.
This work pattern is less demanding of system memory. However, the additional gigabyte of RAM has its positive effect here, too, although I have to admit that the performance advantage in this case is hardly comparable with the price you will have to pay for one more gigabyte of system memory.
Here the test emulates the user's work when he receives an e-mail in Outlook 2002 with a number of documents attached in a zip file. While the files are being scanned for viruses with VirusScan 7.0, the user looks through the e-mails and makes notes in the Outlook calendar. After that the user checks a corporate web-site and some documents in Internet Explorer 6.0.
This test uses a relatively simple set of tasks. So, it is not at all surprising that 1GB of RAM appears more than enough here.
In this test the hypothetical user is editing some text in Word 2002 and uses Dragon NaturallySpeaking 6 to convert an audio file into a text document. The document is then converted into PDF format in Acrobat 5.0.5. After that the prepared document is used to create a PowerPoint 2002 presentation.
Here I can only repeat what I have just said about the previous diagram. No additional RAM is required in this case.
In the next test the user opens a database in Access 2002 and runs a number of queries. The documents are archived with WinZip 8.1. The query results are exported into Excel 2002 and a chart is created.
Summing up, I would like to say that most usage patterns dealing with office applications running simultaneously can do perfectly well with only 1GB of RAM. At the same time, digital content creation and processing tasks running in parallel can benefit from additional system memory. However, if you are not working with any super high-resolution data, the advantages gained from additional RAM in your system will be quite tiny: the results of SYSMark 2004 SE indicate a 1-2% performance improvement, not more than that.
The practical tests we have just discussed indicate one thing: 1GB of memory is more than enough for contemporary applications. However, there will definitely be users among you who strongly disagree with this statement. This section of our article is dedicated exactly to you, guys. Here we will try to show that there are applications where 2GB of system memory will be absolutely appropriate. The thing is that, despite everything we have already said, there are quite a few situations when 1GB of RAM may turn out insufficient.
First, applications working with large amounts of data will certainly require more system memory. We have already mentioned this above. For instance, when you need to edit large, graphic-arts-quality images in Adobe Photoshop, 1GB of RAM may simply be too little to complete the desired operations. And there can be more examples like that. However, all of them are professional tasks and are very unlikely to occur in everyday life. And the high-performance workstations assembled for professional needs are already being equipped with 2GB or 4GB of RAM.
Another situation when 1GB of memory will not suffice for comfortable work is when several simultaneous tasks compete for memory. Dual-core processors, which can run several computational threads at the same time, often push us toward this way of working. Why should we wait for a video encoding or movie rendering task to complete? We could easily do something else in the meantime, especially since dual-core CPUs have more than enough resources for smooth multi-tasking (without irritating delays) even with resource-hungry applications running in the background.
Of course, if you launch more and more tasks in the background you can soon exhaust the physical RAM, which will force the Windows memory manager to actively involve the virtual memory resources. In other words, some of the data required by certain applications will end up in the swap file on the system hard disk drive. This way, the HDD gets involved in the data exchange between the CPU and the system memory, turning into the system bottleneck in no time.
Since HDD requests take much longer to complete than memory requests, the CPU will idle longer waiting for new data. Of course, the system performance will drop noticeably. For example, here is a screenshot showing CPU utilization when three copies of the 7-zip archiving utility run at the same time with a 32MB dictionary:
System equipped with 1GB of RAM
While two running copies of this archiving tool keep the CPU 100% busy, launching the third copy requires active use of the swap file. The result is obvious: CPU utilization, and with it the overall system performance, drops to about a quarter because of insufficient RAM. Three fourths of the time the CPU remains idle, waiting for data that has been pushed out to the swap file. Adding more system memory to this platform easily solves the problem. In fact, this situation should also be regarded as a pretty rare occasion: you won't really see many people working this way. So, it looks like we desperately need an essential argument in favor of 2GB of RAM. And here it is!
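The order-of-magnitude gap behind this slowdown is easy to model. A minimal sketch, with assumed round-number latencies (roughly typical for hardware of that era, not measured values):

```python
# Assumed latencies: ~100 ns for a random RAM access on a DDR400
# system, ~8 ms average seek for a desktop HDD. Both are rough,
# illustrative figures.
RAM_NS = 100
HDD_NS = 8_000_000

def avg_access_ns(fault_rate):
    """Average memory-access cost when a given fraction of
    accesses page-fault and must be served from disk."""
    return (1 - fault_rate) * RAM_NS + fault_rate * HDD_NS

# Even one page fault per thousand accesses makes the average
# access dozens of times slower than RAM alone, so the CPU
# spends most of its time idle, exactly as on the screenshot.
```

With a fault rate of just 0.1%, the average access cost climbs from 100 ns to roughly 8100 ns, which is why even light swapping cripples throughput.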
I am talking about the latest new-generation games. Although tests measuring the fps rate in a pre-recorded demo scene do not reveal any vital need for a lot of memory, the real situation is completely different. During real gameplay, textures get uploaded from the hard drive as needed. As a result, with more RAM in the system, new chunks of texture data have to be requested from the HDD much less often. Contemporary games such as Quake 4, Battlefield 2 or F.E.A.R. launched at high image quality do not feel at home with only 1GB of system memory at their disposal. In complex scenes the game sends data requests to the hard disk drive one after another, causing unpleasant delays that spoil the gaming dynamics and simply prevent you from aiming correctly.
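The texture-streaming effect described above is essentially a caching problem: RAM acts as an LRU cache in front of the HDD, and a working set only slightly larger than the cache can make nearly every access miss. A hypothetical sketch (the texture counts and capacities are illustrative, not measured from any game):

```python
from collections import OrderedDict

class TextureCache:
    """Models RAM as an LRU cache of textures; a miss means an HDD read."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cached = OrderedDict()
        self.disk_reads = 0

    def fetch(self, tex_id):
        if tex_id in self.cached:
            self.cached.move_to_end(tex_id)      # mark as recently used
        else:
            self.disk_reads += 1                 # stall: load from HDD
            self.cached[tex_id] = True
            if len(self.cached) > self.capacity:
                self.cached.popitem(last=False)  # evict least recently used

# A level cycling through 150 textures, three passes over the set:
accesses = list(range(150)) * 3
small, large = TextureCache(100), TextureCache(200)
for t in accesses:
    small.fetch(t)
    large.fetch(t)
# The small cache thrashes (every access misses); the large one
# pays only the 150 cold misses of the first pass.
```

This is why the jump from "slightly too little" to "enough" RAM eliminates the stuttering almost completely rather than improving it gradually.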
To illustrate what has just been said, we recorded a log of momentary fps rates on the Point of Entry level (Interval 08 – Desolation) run on systems with different amounts of RAM installed. The graphics quality settings were at their maximum with one single exception: we disabled soft shadows.
The system with only 1GB of RAM demonstrated disappointing delays, especially in the very beginning of the game level, when the action takes place in an open space.
The upgrade of our system memory to 2GB changed the situation dramatically: most delays were eliminated, and even in the very beginning of the game all movements were comfortably smooth.
Note that the performance of the system with 1GB of RAM dropped below 20fps in 8% of all cases. The system with 2GB of system memory fell below 20fps in only 1% of all cases. And in fact, this 1% comes almost entirely from the auto-save at preset checkpoints, where the stutter doesn't irritate at all. In other words, it is really hard to enjoy playing F.E.A.R. at the highest image quality settings if your system has only 1GB of RAM.
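The "percentage of frames below 20fps" figures quoted here are straightforward to compute from such a log. A sketch, assuming the log is simply a sequence of per-sample fps values (the sample values below are hypothetical, not our measured data):

```python
def fraction_below(fps_samples, threshold=20.0):
    """Share of samples in an fps log that fall below the threshold."""
    below = sum(1 for fps in fps_samples if fps < threshold)
    return below / len(fps_samples)

# Hypothetical log excerpt: two of the eight samples dip below 20 fps.
log = [45, 38, 12, 41, 19, 55, 33, 60]
share = fraction_below(log)  # -> 0.25
```

A threshold-based share like this captures the stutters that an average fps figure hides, which is exactly why the demo-scene averages above looked fine while real gameplay did not.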
A similar picture can be observed in the other new game titles we listed above. So, if you are a happy owner of an expensive high-performance latest-generation graphics card, go for a memory upgrade: 1GB will not be enough for your appetite. This is exactly the user group (plus some professionals) that really needs 2GB RAM kits.
Well, it's high time we summed up the results of our today's discussion. 1GB of RAM will be enough for almost any mainstream system. This amount of system memory guarantees comfortable work in all contemporary applications. You will even be able to run a few programs at the same time without much trouble when switching between them. The previous-generation games will also run without delays at the maximum quality settings, and the latest games released at the end of last year will work fine with medium texture settings.
However, high-performance systems have to get 2GB of RAM. Gamers feel the need for this amount of system memory more than anybody else, if they want to be able to play the latest games with the high-quality image settings. Moreover, 2GB of RAM or even more than that will be very handy for professionals working in “heavy” applications processing large amounts of data.
So, I have every right to state that hardware enthusiasts do need 2GB of RAM in their systems. And this need will keep growing with time. Therefore, in our next article we are going to take a closer look at the 2GB memory kits from the leading memory makers available in today's market. Especially since we have already proven that the best way to install 2GB of RAM into your system is to get a 2-DIMM kit with 2 x 1GB modules.