I'm currently developing a web application whose primary function is letting users upload and download files. The files will be stored on the hard disk (no cloud storage yet).
Given that the data could run into gigabytes spread across a large number of files, do I need to organize the files into subfolders to keep individual lookups fast, or is the file system's own indexing efficient enough that I can ignore this potential bottleneck?
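For illustration, this is roughly the kind of bucketing I have in mind if subfolders turn out to be necessary; the two-character prefix and the method name are placeholders, and it assumes stored file names are uniformly distributed (e.g. GUIDs):

using System.IO;

// Hypothetical layout: the first two characters of the stored name pick
// a subfolder, so no single directory ever holds every file.
static string GetBucketedPath(string storageRoot, string storedFileName)
{
    string bucket = storedFileName.Substring(0, 2); // e.g. "3f"
    return Path.Combine(storageRoot, bucket, storedFileName);
}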
Update:
On a side note, I plan to store file names and any additional information in a SQL database and only query the disk when a user actually wants to download the file. This is how I plan on retrieving files:
// Read the whole file into memory (the path is just an example).
using (FileStream stream = File.Open(@"C:\file.txt", FileMode.Open, FileAccess.Read))
{
    byte[] fileContent = new byte[stream.Length];
    stream.Read(fileContent, 0, fileContent.Length);
}
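Since individual files could run into the gigabytes, I'm also considering copying straight to the output instead of buffering the whole file in a byte array; a rough sketch, where outputStream stands in for e.g. the HTTP response stream:

// Stream the file without holding it all in memory.
using (FileStream stream = File.OpenRead(@"C:\file.txt"))
{
    stream.CopyTo(outputStream);
}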
Any file information will be retrieved from the database. The hard disk will only be used for saving and fetching files.
Update 2:
Files will be saved as GUID + EXTENSION on the hard disk, while the actual file name is stored in the database.
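Concretely, the save path I have in mind looks something like this; SaveUpload and storageRoot are placeholder names, not code from the app yet:

using System;
using System.IO;

// Store the upload under a GUID-based name; the original name is
// persisted in the database row alongside the stored name.
static string SaveUpload(string storageRoot, string originalFileName, Stream upload)
{
    string extension = Path.GetExtension(originalFileName); // e.g. ".pdf"
    string storedName = Guid.NewGuid().ToString() + extension; // GUID + original extension
    string fullPath = Path.Combine(storageRoot, storedName);
    using (FileStream target = File.Create(fullPath))
    {
        upload.CopyTo(target);
    }
    return storedName; // saved in the database together with originalFileName
}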