Do Windows virtual machines suffer from entropy shortage too?
Recently we ran into a problem where one of our Linux-based virtual machines was really slow due to a chronic shortage of "entropy".
I'm wondering if Windows virtual machines would suffer from the same problem. (A Google search gave me no relevant hits, but I could be using the wrong search terms.)
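For context on the Linux side of the problem: the kernel exposes its entropy estimate in `/proc`, and a pool that sits near zero while processes block on `/dev/random` is the classic symptom of the slowdown described above. A minimal check (the path is Linux-specific; note that on recent kernels the value reads as a constant 256, because `/dev/random` was reworked to stop blocking once the pool is initialized):

```python
def entropy_avail(path="/proc/sys/kernel/random/entropy_avail"):
    """Return the Linux kernel's current entropy estimate in bits."""
    with open(path) as f:
        return int(f.read().strip())

# On an entropy-starved VM with an older kernel, this value
# hovers near zero; a healthy value is in the hundreds.
print(entropy_avail())
```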
Solution 1:
The documentation for the Windows cryptographic API (`CryptGenRandom` and its successor `BCryptGenRandom`) does not suggest that the calls for generating a key or generating random data can fail or block due to insufficient entropy. So, no, Windows does not suffer from the problem you're describing.
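As an illustrative comparison (not a Windows-specific test), most platform CSPRNG front ends behave the same way: Python's `os.urandom`, which is backed by the system generator (`BCryptGenRandom` on Windows, `getrandom` on modern Linux), returns immediately rather than blocking to wait for entropy. A quick sketch:

```python
import os
import time

# Request a modest amount of random data and time the call;
# once the system CSPRNG is seeded, it never blocks.
start = time.monotonic()
data = os.urandom(4096)
elapsed = time.monotonic() - start

print(len(data), elapsed)
```

On any healthy system this completes in well under a millisecond, regardless of how much "external" entropy has accumulated since boot.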
This may mean that in a virtual machine, or any other environment starved of external entropy, some cryptographic functions might not be as strong as one would like. However, I've never seen an analysis of this. I'm inclined to think that modern computers are complicated enough that internal entropy sources are adequate, and that Linux is just being overly cautious - but I'm not a cryptographer, so my opinion doesn't really count!