Does installing more RAM mean consuming more energy?
In general, each DIMM will consume roughly the same amount of power. The more RAM you add to a system, the more power it will consume (RAM is always on and ready for use). The following diagram is from Micron:
The line points represent the speed of the RAM, and the bars represent the energy cost of transfers (in Ws/GB, watt-seconds per gigabyte, which is equivalent to watts per GB/s of bandwidth).
This yields the following figures (as averages) for each DIMM:
SDRAM = 1.1 GB/s * 3.0 Ws/GB = 3.3W
DDR = 2.9 GB/s * 1.5 Ws/GB = 4.4W
DDR2 = 5.0 GB/s * 0.5 Ws/GB = 2.5W
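The arithmetic above (bandwidth times energy per gigabyte gives watts) can be reproduced with a few lines of Python; the numbers are the same averages read off the Micron chart:

```python
# Approximate per-DIMM power: bandwidth (GB/s) * energy per gigabyte
# transferred (Ws/GB) yields watts.
dimms = {
    "SDRAM": (1.1, 3.0),  # (GB/s, Ws/GB) -> ~3.3 W
    "DDR":   (2.9, 1.5),  # -> ~4.4 W
    "DDR2":  (5.0, 0.5),  # -> ~2.5 W
}

for name, (bandwidth, energy_per_gb) in dimms.items():
    watts = bandwidth * energy_per_gb
    print(f"{name}: ~{watts:.2f} W per DIMM")
```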
Just a reminder that we're dealing with averages here, and remember that the figures above are per DIMM. Modern DIMMs consume less power than older ones, and overclocked/high-performance modules use more. In general, the figures above are accurate enough for most purposes.
Unfortunately, this doesn't mean you can just stop there and tally up some numbers. A page worth reading is this Tom's Hardware article. To quote it briefly:
...memory power requirements depend directly on the motherboard, as the efficiency of voltage regulators has an impact as well.
I would assume that laptops would be much more "efficient" compared to desktops. In their article, they outline that the best case for desktop RAM is 5-10 W... They make no mention of how many DIMMs this covers, or which RAM type it is for.
Because you're dealing with a laptop, I would assume that the figures I outlined above would be a good mid-to-upper-bound estimate. If you're up for it, you could get HWMonitor. You can just download the .ZIP version (portable), fire it up, and see whether it provides power usage statistics (some computers expose extensive power draw information, others don't; it depends on what sensors are available). If that doesn't work, you can also get a Kill A Watt (basically, a multimeter that plugs into a wall socket), and plug your laptop (sans battery) in to see how much power it consumes.
Then you could just let it sit idle (either with HWMonitor or a Kill A Watt), get the average power consumption, take a few DIMM's out (or add a few), and repeat.
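That difference-based procedure is easy to sketch; the readings below are hypothetical placeholders, not real measurements:

```python
# Sketch of the measure-remove-remeasure method: the difference between two
# idle wall-socket readings is attributed to the removed DIMMs.
def per_dimm_power(watts_full, watts_reduced, dimms_removed):
    """Estimate average power per DIMM from two idle measurements."""
    return (watts_full - watts_reduced) / dimms_removed

# Hypothetical Kill A Watt readings: 34.2 W with 4 DIMMs, 29.8 W with 2.
estimate = per_dimm_power(34.2, 29.8, dimms_removed=2)
print(f"~{estimate:.1f} W per DIMM")
```

Averaging each reading over several minutes of idle time helps smooth out background activity that would otherwise swamp the small per-DIMM difference.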
...but if the extra memory implies less disk access, then the total power usage might decrease...?
(Just guessing here.)
If you skimp on the memory for your system, it’s just going to swap-to and read-from the hard-drive more often which will definitely use much more power.
Also, despite the excellent answers you’ve received, power draw definitely depends on how the memory is used. The refresh cycle you are concerned about only happens about every 10 milliseconds, but the bits also have to be refreshed again after every read, since a read also depletes the capacitors. The typical read latency for RAM is about 5 nanoseconds. That difference is 6 orders of magnitude! That means that simply reading through your whole memory once will use a million times more energy than the refresh ("idle") workload did during that time.
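The orders-of-magnitude claim can be checked directly from the two figures quoted above (~10 ms between refresh cycles, ~5 ns per read):

```python
import math

refresh_interval = 10e-3  # ~10 ms between refresh cycles (figure from the text)
read_latency = 5e-9       # ~5 ns typical read latency (figure from the text)

# Ratio of the two timescales, expressed in orders of magnitude.
orders = math.log10(refresh_interval / read_latency)
print(f"~{orders:.1f} orders of magnitude")  # ~6.3
```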
So, if you have more memory than you need, the excess memory you are not using draws far less energy than memory you are actively utilizing. (In other words, by a wide margin, you only pay for what you use.) Combine this with the disk-swapping consideration, and you would likely find that adding RAM appropriate to the workload reduces overall energy usage.
Of course, this guideline does not scale to absurd levels: clearly, if you install 4 GB of RAM only to run Solitaire on Windows 95, that extra RAM would be a pure energy waste.
On the other hand, if (as it sounds) you are weighing the battery-life and utility-bill consequences of putting more RAM in your laptop, and you will be multitasking with XP or later, any number of factors are going to be far more significant to your consideration. Here are a few:
- The number of apps you run.
- The efficiency of the Antivirus software you use.
- The features of the OS that you enable (Aero, indexing, background services, etc.).
- Backlight level.
Any one of the above factors is going to matter more in the long run than the power used to keep refreshing under-utilized RAM.
Yes, it increases the power consumption on a desktop, but only minimally. Over the course of a year, with the PC left on 24/7, it might add up to a couple of £/$; hardly worth turning eco-protester and insisting everyone go to a single socket... (mind you, if everyone did it... can't think like that!)
In a laptop, on the other hand, each memory module you add may take around 10 minutes off the battery life (assuming a standard battery).
All this being said, memory of different specifications can have different (though similar) power requirements.
Also, remember that overclocking your system will not only consume more power in memory; it will use a lot more power in everything.
Edit: Looking this up for you (from Kingston), it seems the average module voltage for desktop, laptop, and server is around 2.1-2.2 V.