Commit charge is 100% full but physical memory is just 60% when using no page file
I have disabled the page file on my system (the hard disk is too slow, I cannot buy a new one right away, and I cannot move the page file to another partition). When I look at Resource Monitor while running memory-demanding applications, it shows the commit charge almost 100% full. Indeed, if I keep demanding more memory, programs start to crash as the commit charge effectively reaches 100%.
Meanwhile, the system says I'm using just 50-60% of physical memory and have around 1 GB of memory available (free + standby).
If commit charge is the total memory actually requested, why does the system say so much memory is free? Is physical memory going unused by Windows? Is the memory graph wrong? Am I missing something?
Running out of commit limit while you still have lots of available RAM is not at all unusual. Neither the commit limit nor the commit charge are directly related to free or available RAM.
The commit limit = current pagefile size + RAM size.
Since you have no page file, the commit limit is smaller than it would be if you had one. It doesn't matter how much of the RAM is free; for the commit limit, only the amount of RAM installed matters. You can run out of commit limit even with 90% of your RAM free or available.
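As an illustration of that arithmetic (the sizes below are made up, not the asker's actual configuration), a sketch of the commit-limit check:

```python
# Toy illustration of Windows commit-limit arithmetic.
# All sizes here are hypothetical.
GB = 1024 ** 3

ram = 4 * GB                  # installed RAM
pagefile = 0                  # no page file, as in the question

commit_limit = ram + pagefile          # free RAM plays no part here
commit_charge = int(3.9 * GB)          # already committed by all processes

request = 200 * 1024 * 1024            # a new 200 MB allocation
# The allocation fails if it would push commit charge past the limit,
# regardless of how much RAM is actually free or available.
would_fail = commit_charge + request > commit_limit
print(would_fail)  # True: out of commit, even with RAM to spare
```

Adding a page file raises `commit_limit` by the pagefile size, which is why restoring one makes these failures go away even though RAM is unchanged.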
Commit charge is a count of virtual memory, not physical. Suppose my program asks for 2 GB committed, but then only accesses 0.5 GB of it. The remaining 1.5 GB is never faulted in and never assigned to RAM, so RAM usage reflects only 0.5 GB, not the full 2 GB.
Still, "system commit" is increased by 2 GB, because the system has "committed" that there WILL be a place to hold my 2 GB should I actually need it all. The fact that on any given run of the program I won't necessarily use it all doesn't help. I asked for 2 GB, and the successful return from that call tells me the OS "committed" - i.e., promised - that I can use that much virtual address space. The OS can't make that promise unless there is somewhere to keep it all.
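A minimal simulation of that bookkeeping (this is a toy model of the accounting described above, not the real Windows implementation):

```python
GB = 1024 ** 3

class CommitAccounting:
    """Toy model: commit charge rises at allocation time; RAM use
    rises only when pages are actually touched (demand paging)."""

    def __init__(self, ram, pagefile):
        self.commit_limit = ram + pagefile
        self.commit_charge = 0
        self.resident = 0          # bytes actually backed by RAM

    def commit(self, size):
        # The OS must promise backing store up front, so the whole
        # size counts against the commit limit immediately.
        if self.commit_charge + size > self.commit_limit:
            raise MemoryError("out of commit")
        self.commit_charge += size

    def touch(self, size):
        # Only pages the program actually reads or writes get
        # faulted in and consume RAM.
        self.resident += size

mem = CommitAccounting(ram=8 * GB, pagefile=0)
mem.commit(2 * GB)       # program asks for 2 GB: commit charge +2 GB
mem.touch(GB // 2)       # ...but only ever touches 0.5 GB of it
print(mem.commit_charge // GB, mem.resident // GB)
```

With these numbers, commit charge shows 2 GB while resident RAM use shows only 0.5 GB: exactly the gap the asker is seeing between "commit 100% full" and "physical memory 60% used".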
So: put your pagefile back, add more RAM, or run less stuff at one time. Or some combination of the three. These are your only options for avoiding the "low on memory" and "out of memory" errors.
See also my answers here (longer) and here (much longer).
As the memory allocation test in the article at http://brandonlive.com/2010/02/21/measuring-memory-usage-in-windows-7/ illustrates, Windows fails a large memory allocation if that allocation, together with all prior allocations (what Microsoft calls "commit"), would bring the total commit above the sum of physical memory and all page files (swap).
Note that an allocation by itself doesn't use any actual memory (neither physical nor swap) until a read or write happens within that part of the process's virtual address space. E.g., a 2 GB allocation by itself only affects the "Commit" numbers (in Windows 7 terms), leaving "Physical Memory" alone until reads or writes within that allocation happen.
As far as OS design goes, the alternative approach is to always allow an allocation of any size (unless memory is already completely exhausted) and let applications fail on read/write instead. See https://cs.stackexchange.com/questions/42877/when-theres-no-memory-should-malloc-or-read-write-fail for more details.
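The two policies can be contrasted in a small sketch (hypothetical function and sizes, just to show where the failure lands in each design):

```python
def allocate(commit_charge, commit_limit, size, overcommit):
    """Toy comparison of the two allocation policies.
    Strict accounting rejects the request immediately; an
    overcommit policy accepts it and defers any failure to the
    moment the memory is first written."""
    if not overcommit and commit_charge + size > commit_limit:
        return "fail at allocation"    # Windows-style strict commit
    return "allocation succeeds"       # overcommit: may still fail
                                       # later, on first write

# 1 unit of commit headroom left; ask for 2 units:
print(allocate(3, 4, 2, overcommit=False))  # fail at allocation
print(allocate(3, 4, 2, overcommit=True))   # allocation succeeds
```

Strict accounting gives programs a clean error they can handle at allocation time; overcommit lets more programs start but moves the failure to an arbitrary later write, where it is much harder to recover from.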
The available memory is not what you think it is. It isn't unused: much of it is really a cache of pages from recently terminated processes, or from trimmed processes that have been forced to give up memory to other processes. Those pages can be called back to their original purpose. See the following for more detail:
http://support.microsoft.com/kb/312628
Running without a page file is very bad; Windows degrades poorly without one. Remember that even executable files get used as backing store when there is no page file. Even if the drive is slow, it is better to have a page file until you get up to 8 to 16 GB of memory; some people think even Windows 7 can run without one at that point.
I regularly give old machines a boost by doing a few things. Clean up the hard drive as much as possible. Copy anything you can temporarily remove from the drive onto a backup. Remove applications you don't need. Remove apps you can reinstall later.
When all that is done, defragment your hard disk. At that point, recreate your page file: it will end up as close to the front of the drive as possible. Create it at a fixed size, about 1.5 times memory. That's my rule; I have usually seen sizes between 1 and 3 times memory recommended. This gives it a slight speed boost over the places it would usually be put.
I use the Auslogics defragmenter; it's free (with ads for more tools, though). There are others that do this too; check out the defragmenters at portableapps.com. It optimizes the disk by placing recently accessed files near the front of the drive for faster access, and it shows where the page file is placed so you can see whether you moved it into the top 25% of the drive.
After that reinstall apps and copy back your data.
I would say you get a 10 to 20% boost. But the main value is that a lot of the hesitation goes away, for a smoother experience.