I couldn't find a good title for this question; here is what I'm trying to do:
- This is a .NET application.
- I need to store up to 200,000 objects (between 3 KB and 500 KB each).
- I need to store about 10 of them per second, from multiple threads.
- I use binary serialization before storing them (rough sketch below this list).
- I need to access them later by a unique integer id.
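To be concrete, this is roughly how I serialize each object before storing it. It's only a minimal sketch; `Record` is just a placeholder for my real object types, which are 3 KB to 500 KB each:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
class Record                  // placeholder for the real object type
{
    public int Id;
    public byte[] Payload;
}

static class Serializer
{
    // Turn one object into a byte[] so it can be written anywhere (file, DB blob, ...).
    public static byte[] ToBytes(object obj)
    {
        using (var ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, obj);
            return ms.ToArray();
        }
    }

    // Reverse of ToBytes.
    public static object FromBytes(byte[] data)
    {
        using (var ms = new MemoryStream(data))
        {
            return new BinaryFormatter().Deserialize(ms);
        }
    }
}
```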
What's the best way to do this?
- I can't keep them in memory, as I'll get OutOfMemory exceptions.
- If I store them on disk as separate files, what are the possible performance issues? Would it decrease overall performance much?
- Should I implement some sort of caching, for example combining 100 objects and writing them out as a single file, then parsing them back later on (see the sketch after this list), or something similar?
- Should I use a database? (Access time is not important; there won't be any searching, and I'll only access each object a couple of times by its known unique id.) In theory I don't need a database, and I don't want to complicate this.
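To clarify the "combine objects into one file" idea, I mean something like appending each serialized blob to a container file and keeping an index of id -> (offset, length) so a single object can be read back later. This is just an untested sketch (the index here is in-memory only; persisting it is part of what I'm unsure about, and the lock is there because writes come from multiple threads):

```csharp
using System.Collections.Generic;
using System.IO;

// Append each serialized blob to one container file and remember
// where it landed so it can be read back by id.
class BlobBatchFile
{
    private struct Entry { public long Offset; public int Length; }

    private readonly string path;
    private readonly Dictionary<int, Entry> index = new Dictionary<int, Entry>();
    private readonly object gate = new object();   // writes come from multiple threads

    public BlobBatchFile(string path) { this.path = path; }

    public void Append(int id, byte[] blob)
    {
        lock (gate)
        {
            using (var fs = new FileStream(path, FileMode.Append, FileAccess.Write))
            {
                index[id] = new Entry { Offset = fs.Position, Length = blob.Length };
                fs.Write(blob, 0, blob.Length);
            }
        }
    }

    public byte[] Read(int id)
    {
        Entry entry;
        lock (gate) { entry = index[id]; }
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            fs.Seek(entry.Offset, SeekOrigin.Begin);
            var buffer = new byte[entry.Length];
            fs.Read(buffer, 0, buffer.Length);
            return buffer;
        }
    }
}
```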
UPDATE:
- I assume a database would be slower than the file system; prove me wrong if you have data on that. That's why I'm also leaning towards the file system. What I'm really worried about is writing 200 KB * 10 per second to the HDD (this can be any HDD; I don't control the hardware, as it's a desktop tool that will be deployed on different systems).
- If I use the file system, I'll store files in separate folders to avoid file-system-related issues (so you can ignore that limitation); a rough sketch of that layout follows.
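This is the one-file-per-object layout I have in mind. The root folder and the choice of 1000 files per subfolder are just illustrative:

```csharp
using System.IO;

static class ObjectStore
{
    private const string Root = @"C:\MyAppData";    // illustrative root folder

    // Put at most ~1000 files in each subfolder so no single
    // directory grows too large.
    private static string PathFor(int id)
    {
        string folder = Path.Combine(Root, (id / 1000).ToString());
        Directory.CreateDirectory(folder);           // no-op if it already exists
        return Path.Combine(folder, id + ".bin");
    }

    public static void Save(int id, byte[] blob)
    {
        File.WriteAllBytes(PathFor(id), blob);
    }

    public static byte[] Load(int id)
    {
        return File.ReadAllBytes(PathFor(id));
    }
}
```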