It is pretty easy to cause an out-of-memory exception in a process.
Just run a loop that allocates memory in blocks small enough to stay off the large object heap (but not so many of them that the loop itself throws the exception), then try to open a modest-sized file. The open will be unable to allocate enough contiguous memory, and you will get your OOM exception when opening the file without needing a huge file. Something like this...
// Keep each block under the 85,000-byte large object heap threshold,
// and allocate enough of them to use up most of the 32-bit address space.
List<byte[]> items = new List<byte[]>();
for (int i = 0; i < 20000; i++)
{
    byte[] c = new byte[80000]; // small enough to stay off the LOH
    items.Add(c);
}

// This large contiguous allocation will now fail with OutOfMemoryException.
byte[] next = new byte[1000000000];
If you run the above code as is, you will get an OOM exception on the last line, but if you comment out the loop first it will execute with no errors. You will probably have to tweak the loop a little to make your file open fail every time, but you can do it: run the loop before the call that opens your file in your test, and with most of the address space used up, the open should fail.
Also, you might want to look into the /3GB switch if it is an option for you. It is not always the right answer and it has drawbacks, but it changes the virtual memory split from 2GB/2GB to 1GB/3GB, giving your process access to more virtual address space and a little more breathing room in the size of files you can open. Read up on the downsides before going after this as a solution, and make sure it is worth it in your situation.
Here is how to enable it on the server:
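A sketch of both mechanisms: on Windows Server 2003 and earlier the switch is added to the OS entry in boot.ini, while on Windows Server 2008 and later the equivalent is set with bcdedit. The boot.ini line below is an illustrative placeholder; the multi/disk/partition values are specific to each machine.

```shell
rem Windows Server 2003 and earlier: append /3GB to the OS line in C:\boot.ini
rem [operating systems]
rem multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB

rem Windows Server 2008 and later: grant 3072 MB of user address space
bcdedit /set IncreaseUserVa 3072

rem To revert the bcdedit change later:
rem bcdedit /deletevalue IncreaseUserVa
```

Either change requires a reboot to take effect, and a 32-bit process only benefits if its executable is marked large-address-aware (the /LARGEADDRESSAWARE linker flag).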