What's the difference between server-grade RAM and desktop-grade RAM?
Solution 1:
Quality (i.e. reliability), error correction, and often the ability to have modules replaced when they start warning of impending failure rather than after they fail.
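That early warning is concrete and scriptable: on Linux, the kernel's EDAC subsystem counts corrected ECC errors per memory controller, and a climbing count on an otherwise healthy machine is the usual cue to swap a DIMM. A minimal sketch, assuming a server with ECC RAM and an EDAC driver loaded (these are the standard EDAC sysfs paths, but whether they exist depends on your hardware and kernel):

```python
# Minimal sketch: read the corrected/uncorrected ECC error counters
# that the Linux EDAC subsystem exposes in sysfs. These paths only
# exist on hardware with ECC RAM and an EDAC driver loaded.

from pathlib import Path

for mc in sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*")):
    ce = int((mc / "ce_count").read_text())  # corrected (single-bit) errors
    ue = int((mc / "ue_count").read_text())  # uncorrected errors
    print(f"{mc.name}: {ce} corrected, {ue} uncorrected")
    if ce:
        print(f"  {mc.name}: corrected errors are accumulating; "
              "plan a DIMM swap before a hard failure")
```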
Solution 2:
Server RAM can be error-correcting (ECC) or non-ECC; with the current quality of RAM, ECC is losing ground. The other difference is registered (buffered) RAM, which puts a lower electrical load on the memory controller and therefore lets a server address more RAM. Many servers require registered memory; some newer ones don't, but even those can handle more RAM when registered memory is used. Both ECC and registered memory are more expensive than desktop memory.
Hardware is cheap; downtime is expensive.
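To make the ECC point concrete, here is a toy Python sketch of a SECDED (single-error-correct, double-error-detect) scheme using a Hamming(7,4) code plus an overall parity bit. Real ECC DIMMs apply the same idea scaled up to Hamming(72,64), 8 check bits per 64 data bits; the bit layout below is illustrative, not any particular memory controller's implementation:

```python
# Toy SECDED demo: Hamming(7,4) plus an overall parity bit. ECC DIMMs
# use the same idea scaled up to Hamming(72,64), which is why an ECC
# module carries extra check-bit storage per rank.

def encode(d):
    """Encode 4 data bits (list of 0/1) into 8 code bits."""
    p1 = d[0] ^ d[1] ^ d[3]            # covers code positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]            # covers code positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]            # covers code positions 4,5,6,7
    code = [p1, p2, d[0], p3, d[1], d[2], d[3]]
    overall = 0
    for b in code:                     # overall parity over the 7 bits
        overall ^= b
    return code + [overall]

def decode(code):
    """Return (corrected data bits, status)."""
    c = code[:7]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3    # position of a single flipped bit
    overall = 0
    for b in code:
        overall ^= b                   # re-check parity over all 8 bits
    if syndrome == 0 and overall == 0:
        status = "no error"
    elif overall == 1:                 # odd number of flips: fix one bit
        if syndrome:
            c[syndrome - 1] ^= 1
        status = "corrected a single-bit error"
    else:                              # syndrome set but parity even: 2 flips
        status = "uncorrectable double-bit error detected"
    return [c[2], c[4], c[5], c[6]], status

word = [1, 0, 1, 1]
damaged = encode(word)
damaged[4] ^= 1                        # simulate a single bit flip
data, status = decode(damaged)
print(data, status)                    # [1, 0, 1, 1] corrected a single-bit error
```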
Solution 3:
What's the difference between server-grade RAM and desktop-grade RAM?
Quality. The rest is not a matter of "server vs. workstation" but of specific technical requirements (ECC, registered, etc.) that are well documented.
Solution 4:
As the others said, but I will expand:
- ECC -- Most "server" DIMMs have an extra DRAM per rank that stores the check bits enabling ECC.
- Buffered vs. unbuffered:
  - RDIMM -- Some server DIMMs have a buffer or register chip that re-drives the CA (command/address) signals to all the DRAMs on the DIMM. This reduces the load on the CA bus, but has the negative effect of slowing everything down by one clock (think CAS latency: you get +1 on all timings; see the sketch after this list).
  - LRDIMM and FBDIMM -- All signals, data as well as command/address, are buffered, so these DIMMs are typically very expensive.
- DRAM width -- Most desktop products (CPUs) support only x8 and x16 DRAMs, whereas servers also support x4.
- Number of slots -- Servers often support more slots per channel than desktops (hence the need for buffered DIMMs).
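To put the register's +1 clock in perspective, here is a back-of-envelope calculation. The part and CAS latency are assumptions for illustration only: a DDR4-3200 module (3200 MT/s, so the command clock runs at 1600 MHz) with CL22 on the unbuffered version:

```python
# Back-of-envelope cost of the RDIMM register's +1 clock, using an
# assumed DDR4-3200 part with an assumed CL22 on the unbuffered DIMM.

data_rate_mts = 3200                 # mega-transfers per second
clock_mhz = data_rate_mts / 2        # DDR transfers twice per clock
clock_ns = 1000 / clock_mhz          # one clock = 0.625 ns here

cl_udimm = 22                        # assumed CAS latency, unbuffered
cl_rdimm = cl_udimm + 1              # register adds one clock

print(f"one clock : {clock_ns:.3f} ns")
print(f"UDIMM CL{cl_udimm}: {cl_udimm * clock_ns:.2f} ns")
print(f"RDIMM CL{cl_rdimm}: {cl_rdimm * clock_ns:.2f} ns")
```

At these assumed numbers the penalty works out to roughly 0.6 ns on a ~14 ns access, which is why the capacity gain usually wins on servers.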