Analysis: How Much System Memory Is Really Enough?
When it comes to adding memory, we all have to deal with the cost vs. speed equation. So how much memory do you really need? Our tester got some surprising results.
How much memory is enough? That's a question that's bothered me -- and thousands of other computer users -- for years. And so far, I haven't seen too many answers that have really satisfied me.
It's especially important because, while the type of memory -- whether it be DDR, DDR2, or some other -- is locked in stone by which motherboard and processor your system works with, you get to choose the amount of memory that comes with a new machine (and add to it later).
However, it's not easy to figure out how much is enough -- computer memory is situational. What you're doing and the software you're using to do it are the deciding factors in determining the optimal memory size for your computer -- and they can change from PC to PC.
For example, according to Microsoft, all you need to run the Professional version of its Windows XP operating system is "128 megabytes (MB) of RAM or higher recommended (64 MB minimum supported; may limit performance and some features)." There's a minimum specification for the processor as well, but let's face it, chances are that your processor is well beyond that minimum.
In other words, my ancient and puny IBM ThinkPad 600X with its 64MB of memory should run Windows XP Pro. Stop laughing. It can -- to a point. Microsoft Word and Lotus Notes sail along smoothly. But that's about as far as you'll get. Windows is crafty -- instead of grinding to a screeching halt when memory runs short, it starts using your hard disk as if it were memory, paging data to and from the drive as needed. The difference in speed (and, therefore, overall performance) is like that between walking and driving a NASCAR racecar.
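You can feel that walking-vs.-racecar gap for yourself. The sketch below (a rough illustration, not a real page-fault measurement -- operating-system caching will blunt the numbers) times copying a buffer entirely in RAM against round-tripping the same buffer through a file on disk, which is roughly what Windows does when it pages:

```python
import os
import tempfile
import time

def time_memory_vs_disk(size_mb=64):
    """Compare copying a buffer in RAM with round-tripping it through disk.

    Returns (ram_seconds, disk_seconds). The disk path mimics what the OS
    does when it pages memory out to the swap file and reads it back.
    """
    data = bytearray(os.urandom(size_mb * 1024 * 1024))

    t0 = time.perf_counter()
    in_ram = bytes(data)              # copy stays in memory
    ram_seconds = time.perf_counter() - t0

    t0 = time.perf_counter()
    with tempfile.TemporaryFile() as f:
        f.write(data)                 # out to disk...
        f.flush()
        os.fsync(f.fileno())          # force it past the OS write cache
        f.seek(0)
        from_disk = f.read()          # ...and back, like servicing a page fault
    disk_seconds = time.perf_counter() - t0

    return ram_seconds, disk_seconds
```

On a typical machine the disk round trip is slower by an order of magnitude or more, which is exactly why a memory-starved system that starts paging feels like it hit a wall.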
Putting It To The Test
But what's the ideal as far as memory is concerned? In order to find that out, I decided to take a typical Media Center system and ramp it up by degrees from 512MB of memory (which is actually a more reasonable memory baseline than 64MB) to 2GB, which is half the maximum memory most consumer motherboards support, and the absolute maximum you'll find for some.
For those purposes, I acquired four Ballistix 240-pin DIMM, DDR2 PC2-6400 memory modules (P/N # BL6464AA804) from Crucial Technology. Because these are high-performance modules (complete with heat sinks), they can be expensive -- upward of $100 per module. You can find equivalents, like Crucial's standard PC2-4200 modules (P/N # CT6464AA53E), for about $40 less per module.
To measure performance at each memory level, I used two tests: OSMark and VideoStudio. OSMark is a synthetic benchmark -- that means there are no actual commercial applications in the software. Instead, OSMark was designed to test all of the subsystems (CPU, memory, graphics, hard drives) and then derive a single performance number by combining and weighting those individual results. VideoStudio is a real application I used to separate 43 minutes of video clips from a one-hour captured television video and then stitch them back together to create a complete show, except without the commercials. It's the computer equivalent of heavy lifting.
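The "combine and weight" step is the essence of any synthetic benchmark. A minimal sketch of the idea -- the subsystem scores and weights below are made up for illustration, not OSMark's actual figures -- uses a weighted geometric mean so that no single fast subsystem can mask a slow one:

```python
import math

# Hypothetical subsystem results and weights -- illustrative only;
# OSMark's real tests and weighting scheme are not reproduced here.
SCORES = {"cpu": 1420.0, "memory": 980.0, "graphics": 1150.0, "disk": 760.0}
WEIGHTS = {"cpu": 0.35, "memory": 0.25, "graphics": 0.25, "disk": 0.15}

def composite_score(scores, weights):
    """Collapse per-subsystem scores into one number via a weighted
    geometric mean. Weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return math.exp(sum(w * math.log(scores[k]) for k, w in weights.items()))
```

A geometric mean is a common choice for composite benchmarks because doubling one subsystem's score can't fully paper over a subsystem that's half as fast.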
Incidentally, the only other change I'll be making in the system, besides adding more memory, is moving in and out of dual-channel memory architecture. Not up to speed on dual-channel? No problem.
Dual-channel is the use of memory modules in pairs rather than as single devices. Why is that good? Think of a deck of playing cards. If you use just one hand to take a card off the top of the deck, bring it to you, and then go back for the next, you'll eventually end up with all of the cards in front of you. However, if you alternate hands -- starting one moving toward the deck while the other is moving away from it -- you'll accomplish the same thing a lot faster.