OutOfMemoryException when reading a 500 MB file with FileStream
I'm using FileStream to read a big file (> 500 MB) and I get an OutOfMemoryException.
Are there any solutions for this?
My code is:
using (var fs3 = new FileStream(filePath2, FileMode.Open, FileAccess.Read))
{
    byte[] b2 = ReadFully(fs3, 1024);
}
public static byte[] ReadFully(Stream stream, int initialLength)
{
    // If we've been passed an unhelpful initial length, just
    // use 32K.
    if (initialLength < 1)
    {
        initialLength = 32768;
    }

    byte[] buffer = new byte[initialLength];
    int read = 0;
    int chunk;
    while ((chunk = stream.Read(buffer, read, buffer.Length - read)) > 0)
    {
        read += chunk;

        // If we've reached the end of our buffer, check to see if there's
        // any more information
        if (read == buffer.Length)
        {
            int nextByte = stream.ReadByte();

            // End of stream? If so, we're done
            if (nextByte == -1)
            {
                return buffer;
            }

            // Nope. Resize the buffer, put in the byte we've just
            // read, and continue
            byte[] newBuffer = new byte[buffer.Length * 2];
            Array.Copy(buffer, newBuffer, buffer.Length);
            newBuffer[read] = (byte)nextByte;
            buffer = newBuffer;
            read++;
        }
    }

    // Buffer is now too big. Shrink it.
    byte[] ret = new byte[read];
    Array.Copy(buffer, ret, read);
    return ret;
}
Solution 1:
The code you show reads the entire contents of the 500 MB file into one contiguous region of memory. It's not surprising that you get an out-of-memory condition.
The solution is, "don't do that."
What are you really trying to do?
If you want to read a file completely, it's much simpler than the ReadFully method you use. Try this:
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    byte[] buffer = new byte[fs.Length];
    int bytesRead = fs.Read(buffer, 0, buffer.Length);
    // buffer now contains the entire contents of the file
}
But... using this code won't solve your problem. It might work for a 500 MB file. It won't work for a 750 MB file, or a 1 GB file. At some point you will reach the limit of memory on your system, and you will get the same out-of-memory error you started with.
The problem is that you are trying to hold the entire contents of the file in memory at one time. This is usually unnecessary, and it is doomed to fail as the files grow in size. It's no problem when the file is 16 KB. At 500 MB, it's the wrong approach.
This is why I have asked several times: what are you really trying to do?
It sounds like you want to send the contents of a file out to an ASP.NET response stream. That is the real question. Not "how do I read a 500 MB file into memory?" but "how do I send a large file to the ASP.NET response stream?"
For this, once again, it's fairly simple.
// emit the contents of a file into the ASP.NET Response stream
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    Response.BufferOutput = false;   // to prevent buffering
    byte[] buffer = new byte[1024];
    int bytesRead = 0;
    while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        Response.OutputStream.Write(buffer, 0, bytesRead);
    }
}
What this does is iteratively read a chunk from the file and write that chunk to the Response stream, until there is nothing more to read from the file. This is what is meant by "streaming IO". The data passes through your logic, but is never all held in one place, just as a stream of water passes through a sluice. In this example there is never more than 1 KB of file data in memory at one time (not held by your application code, anyway; there are other IO buffers lower in the stack).
This is a common pattern in streamed IO. Learn it, use it.
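Incidentally, if you are on .NET 4 or later, Stream.CopyTo wraps that same read/write loop for you. A minimal sketch, assuming the same Response context as above:

// Same streaming copy, letting Stream.CopyTo do the chunked loop internally
// (requires .NET 4 or later).
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    Response.BufferOutput = false;
    fs.CopyTo(Response.OutputStream);
}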
The one trick when pumping data out to ASP.NET's Response.OutputStream is to set BufferOutput = false. By default, ASP.NET tries to buffer its output. In this case (a 500 MB file), buffering is a bad idea. Setting the BufferOutput property to false prevents ASP.NET from attempting to buffer all the file data before sending the first byte. Use that when you know the file you're sending is very large. The data will still get sent to the browser correctly.
And even this isn't the complete solution. You'll need to set the response headers and so on. I guess you're aware of that, though.
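For example, a minimal sketch of the kind of headers you might set for a download; the content type and file name below are placeholders for whatever you are actually sending:

// Typical download headers (placeholders; adjust the content type and
// file name for the file you are actually sending).
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Disposition", "attachment; filename=\"bigfile.bin\"");
Response.AddHeader("Content-Length", new FileInfo(filePath).Length.ToString());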
Solution 2:
You are doubling your buffer size on each reallocation, which means the previously allocated blocks can never be reused (they effectively leak). By the time you get to 500 MB, you've chewed up 1 GB plus overhead. In fact, it might be 2 GB, since, if you cross 512 MB, your next allocation will be 1 GB. On a 32-bit system, this bankrupts your process.
Since it's an ordinary file you are reading, just query the file system for its size and preallocate the buffer in one go.
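A minimal sketch of that approach, assuming the file fits in memory and stays under the 2 GB array limit:

using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    // Allocate the buffer once, sized from the known file length.
    byte[] buffer = new byte[fs.Length];
    int offset = 0;

    // Read may return fewer bytes than requested, so loop until the
    // buffer is full (or the stream unexpectedly ends).
    while (offset < buffer.Length)
    {
        int read = fs.Read(buffer, offset, buffer.Length - offset);
        if (read == 0)
        {
            break;
        }
        offset += read;
    }
}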