I'm working on a new storage system for a business solution package that consists of around 40 applications. Some of those applications generate documents (mostly docx, some pdf) that are currently saved and organized in a network share folder.
The applications generate about 150,000-200,000 documents a year on average, and those documents should be persisted in a more consistent and reliable form (i.e. a separate SQL database).
SharePoint is a leading candidate, since we plan on eventually using its other features beyond the DMS capabilities. I've read about the document library limitations, i.e. 2,000 files per folder and up to 1,000,000 files across all folders of a document library. I've also read that the 2,000 limit can be bypassed, BUT that it affects performance. What I haven't found is real-world experience with such a large number of files in one library. And what happens if I raise the folder limit to, say, 50,000? What impact would that have on performance: slower requests for reading/editing/writing documents through the web services (especially writing, if it checks for duplicate file names), indexing, searching, etc.?
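To make the volume concrete, the folder layout I'm considering to stay under that limit (rather than raising it) would look roughly like the Python sketch below. The year/month/day scheme, the `document_url` helper and the library URL are just illustrative assumptions, not an existing implementation:

```python
from datetime import date

# Rough sketch of the layout: at ~150,000-200,000 documents a year, a
# year/month/day folder structure works out to a few hundred files per
# folder, comfortably under the 2,000-items-per-folder guidance.

def subfolder_for(created: date) -> str:
    """Relative folder path inside the document library, e.g. '2013/04/02'."""
    return f"{created.year}/{created.month:02d}/{created.day:02d}"

def document_url(library_url: str, created: date, file_name: str) -> str:
    """Full URL the application would write the document to."""
    return f"{library_url}/{subfolder_for(created)}/{file_name}"

# 'http://sharepoint/sites/docs/Documents' is a placeholder library URL.
print(document_url("http://sharepoint/sites/docs/Documents",
                   date(2013, 4, 2), "invoice-0001.docx"))
# -> http://sharepoint/sites/docs/Documents/2013/04/02/invoice-0001.docx
```

The question is whether per-folder counts in that range behave noticeably better than one big folder once the whole library approaches the 1,000,000-file mark.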
One important note: we will not be using the SharePoint web portal at all if we don't have to; instead we'll do everything from our applications via the web services, so slower rendering of data views is not an issue.
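For what it's worth, the write path I have in mind per document is a single round trip along these lines (Python with `requests`/`requests_ntlm`, purely for illustration). The URL and credentials are placeholders, and I'm assuming a plain HTTP PUT into the library here; the Copy.asmx / Lists.asmx SOAP services would be the alternative route:

```python
import requests
from requests_ntlm import HttpNtlmAuth  # assumption: Windows/NTLM authentication

# Placeholder URL reusing the year/month/day layout from the sketch above.
url = "http://sharepoint/sites/docs/Documents/2013/04/02/invoice-0001.docx"

with open("invoice-0001.docx", "rb") as f:
    response = requests.put(
        url,
        data=f,
        auth=HttpNtlmAuth("DOMAIN\\svc_docs", "secret"),  # placeholder credentials
    )

response.raise_for_status()  # raises on a 4xx/5xx response
```

So my concern is purely about how long calls like this (and the corresponding reads) take once a library holds hundreds of thousands of files, not about the browser UI.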