Hi,
It's a requirement in my app that users can upload files to the database. At the moment, the user uploads the file and the webpage saves it in a temp directory; then the logic loads the file into a Byte[] and passes this array as a parameter to the INSERT SQL statement.
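Simplified, the current logic looks something like this (a sketch assuming ADO.NET against SQL Server; the Files table and its Name/Content columns are placeholder names):

```csharp
using System.Data.SqlClient;
using System.IO;

public static class FileUploader
{
    public static void InsertWholeFile(string connectionString, string tempPath)
    {
        // The whole file ends up in memory here -- this is the problem.
        byte[] data = File.ReadAllBytes(tempPath);

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO Files (Name, Content) VALUES (@name, @content)", conn))
        {
            cmd.Parameters.AddWithValue("@name", Path.GetFileName(tempPath));
            cmd.Parameters.AddWithValue("@content", data);

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```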
The problem with that approach is that if a user tries to upload a 1 GB file, the server needs 1 GB of memory to hold the file as a Byte[], and that memory has to be garbage-collected later on. If several users do that at the same time, the server could collapse.
One way to avoid this is to limit the file size, but the customer doesn't want that. So the best approach seems to be uploading the file to the database sequentially using a pointer. I've found an example of this for SQL Server using a special function named UPDATETEXT (sketched below), but I'd like to know an approach that is valid for all kinds of databases, i.e. whether it is possible, and how, to upload a file to a database in chunks.
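For reference, the UPDATETEXT example I found works roughly like this. This is my own sketch: the Files table with an identity column Id, the Name/Content columns (Content being an image column) and the 80 KB chunk size are just assumptions, not working code from my app:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class ChunkedUploader
{
    private const int ChunkSize = 81920; // ~80 KB per round trip (arbitrary choice)

    public static void InsertInChunks(string connectionString, string tempPath)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Insert a row with an empty value and get a text pointer to the image column.
            object ptr;
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO Files (Name, Content) VALUES (@name, 0x);" +
                "SELECT TEXTPTR(Content) FROM Files WHERE Id = SCOPE_IDENTITY();", conn))
            {
                cmd.Parameters.AddWithValue("@name", Path.GetFileName(tempPath));
                ptr = cmd.ExecuteScalar();
            }

            // 2. Stream the file from disk and append it chunk by chunk with UPDATETEXT,
            //    so only one chunk is ever held in memory at a time.
            using (FileStream fs = File.OpenRead(tempPath))
            using (SqlCommand cmd = new SqlCommand(
                "UPDATETEXT Files.Content @ptr @offset 0 @chunk", conn))
            {
                cmd.Parameters.Add("@ptr", SqlDbType.VarBinary, 16).Value = ptr;
                SqlParameter offset = cmd.Parameters.Add("@offset", SqlDbType.Int);
                SqlParameter chunk = cmd.Parameters.Add("@chunk", SqlDbType.Image);

                byte[] buffer = new byte[ChunkSize];
                long position = 0;
                int read;
                while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                {
                    offset.Value = (int)position;
                    if (read == buffer.Length)
                    {
                        chunk.Value = buffer;
                    }
                    else
                    {
                        // Last, partial chunk.
                        byte[] last = new byte[read];
                        Array.Copy(buffer, last, read);
                        chunk.Value = last;
                    }
                    cmd.ExecuteNonQuery();
                    position += read;
                }
            }
        }
    }
}
```

This keeps only one chunk in memory at a time, which is exactly what I want, but it relies on TEXTPTR/UPDATETEXT and so is SQL Server specific.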
Cheers.