Created by Jesse on 6/4/2024 in #help
Handling reading large files in C#
How do I go about reducing memory usage when handling large files in an x86 C# build? My current code is as follows:
FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read);
using (BinaryReader br = new BinaryReader(fs))
{
    byte[] bytes = new byte[0];
    using (MemoryStream test = new MemoryStream())
    {
        fs.CopyTo(test);
        bytes = test.ToArray();
    }

    byte[] searchBytes = Encoding.UTF8.GetBytes("test");
    List<long> positions = new List<long>();

    foreach (long pos in Extensions.SearchStringInBytes(bytes, searchBytes))
    {
        positions.Add(pos - 4);
    }
}
When reading a large file (>500MB), memory usage skyrockets to 2GB. As a result it only works in an x64 build; the x86 build throws an OutOfMemoryException at around 1GB of memory usage. I have thought of reading the file in "chunks" but I'm not sure how. Any suggestions other than making the program x64-only?
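For the chunked approach, here is a minimal sketch (ChunkedSearch, FindPattern and the 1 MB buffer size are illustrative names chosen for this example, not anything from the thread): read the file in fixed-size blocks and carry the last pattern.Length - 1 bytes of each block over to the next one, so a match that straddles a block boundary is still found. The naive inner scan can be swapped for Extensions.SearchStringInBytes on each block, as long as blockStart is added to every returned index.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

static class ChunkedSearch
{
    // Scans the file block by block instead of loading it all at once,
    // so memory use stays at roughly bufferSize regardless of file size.
    public static List<long> FindPattern(string filepath, byte[] pattern, int bufferSize = 1 << 20)
    {
        var positions = new List<long>();
        int overlap = pattern.Length - 1;      // carried between blocks so boundary matches are found
        byte[] buffer = new byte[bufferSize + overlap];
        int carried = 0;                       // bytes kept from the end of the previous block
        long blockStart = 0;                   // file offset of buffer[0]

        using (var fs = new FileStream(filepath, FileMode.Open, FileAccess.Read))
        {
            int read;
            while ((read = fs.Read(buffer, carried, bufferSize)) > 0)
            {
                int length = carried + read;

                // Naive scan over the current block; a block-wise call to
                // Extensions.SearchStringInBytes(buffer, pattern) would work here too.
                for (int i = 0; i + pattern.Length <= length; i++)
                {
                    bool match = true;
                    for (int j = 0; j < pattern.Length; j++)
                    {
                        if (buffer[i + j] != pattern[j]) { match = false; break; }
                    }
                    if (match) positions.Add(blockStart + i);
                }

                // Keep the tail of this block at the front of the buffer for the next read.
                int keep = Math.Min(overlap, length);
                blockStart += length - keep;
                Array.Copy(buffer, length - keep, buffer, 0, keep);
                carried = keep;
            }
        }
        return positions;
    }
}

Usage, mirroring the original snippet:

byte[] searchBytes = Encoding.UTF8.GetBytes("test");
var positions = new List<long>();
foreach (long pos in ChunkedSearch.FindPattern(filepath, searchBytes))
{
    positions.Add(pos - 4);   // same -4 adjustment as in the snippet above
}

Only one buffer of about bufferSize bytes is ever alive, so the x86 build never approaches the 2GB it needed before; the MemoryStream plus ToArray() in the original effectively holds two full copies of the file at once, which is where that spike comes from.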