views: 761
answers: 3

I've built a WSSv3 application which uploads files in small chunks; as each data piece arrives, I temporarily store it in a SQL 2005 image data type field, for performance reasons**.

The problem comes when the upload ends: I need to move the data from SQL Server to a SharePoint document library through the WSSv3 object model.

Right now, I can think of two approaches:

folder.Files.Add("filename", (byte[])reader[0]); // OutOfMemoryException

and

SPFile file = folder.Files.Add("filename", new byte[]{ });
using(Stream stream = file.OpenBinaryStream())
{
    // ... init vars and stuff ...
    while ((bytes = reader.GetBytes(0, offset, buffer, 0, BUFFER_SIZE)) > 0)
    {
        stream.Write(buffer, 0, (int)bytes); // Timeout issues
    }
    file.SaveBinary(stream);
}

Is there any other way to complete this task successfully?

** Performance reasons: if you try to write every chunk directly to SharePoint, you'll notice performance degradation as the file grows (>100 MB).

A: 

The first thing I would say is that SharePoint is really, really not designed for this. It stores all files in its own database so that's where these large files are going. This is not a good idea for lots of reasons: scalability, cost, backup/restore, performance, etc... So I strongly recommend using file shares instead.

You can increase the timeout of the web request by changing the executionTimeout attribute of the httpRuntime element in web.config.
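For example, a minimal web.config sketch (the value is in seconds; 3600 is only illustrative, not a recommendation):

<configuration>
  <system.web>
    <!-- executionTimeout defaults to 110 seconds and is only honored when debug="false" -->
    <httpRuntime executionTimeout="3600" />
  </system.web>
</configuration>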

Apart from that, I'm not sure what else to suggest. I haven't heard of such large files being stored in SharePoint. If you absolutely must do this, try also asking on Server Fault.

Alex Angas
Hi Alex, thanks for your comments. I'm aware of those issues: I'm building a WSS app as a network share replacement (mostly because of bandwidth restrictions); while users should use this app for small files, big ones must be supported. I'm not having problems with HttpRuntime, since I upload files in several chunks.
Rubens Farias
A: 

As mentioned previously, storing large files in SharePoint is generally a bad idea. See this article for more information: http://blogs.msdn.com/joelo/archive/2007/11/08/what-not-to-store-in-sharepoint.aspx

With that said, it is possible to use external storage for BLOBs, which may or may not help your performance issues -- Microsoft released a half-complete external BLOB storage provider that does the trick, but it unfortunately works at the farm level and affects all uploads. Ick.

Fortunately, since you can implement your own external BLOB provider, you may be able to write something to better handle these specific files. See this article for details: http://207.46.16.252/en-us/magazine/2009.06.insidesharepoint.aspx

Whether or not this would be worth the overhead depends on how much of a problem you're having. :)

Brett Coburn
Interesting links, Brett, I'll study them carefully; right now I must finish this app this way. Thanks for your time.
Rubens Farias
@Brett: There is a company out there that has developed a complete external storage solution for SharePoint. Can't remember the name or find it now though... Looks interesting
Alex Angas
+1  A: 

Hi all,

I ended up with the following code:


myFolder.Files.Add("filename", 
   new DataRecordStream(dataReader, 
      dataReader.GetOrdinal("Content"), length));

You can find the DataRecordStream implementation here. It's basically a Stream whose Read pulls data from a DbDataRecord through .GetBytes.

This approach is similar to OpenBinaryStream()/SaveBinary(stream), but it doesn't keep the whole byte[] in memory while you transfer the data. At some point, DataRecordStream will be read by Microsoft.SharePoint.SPFile.CloneStreamToSPFileStream in 64 KB chunks.
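For reference, here is a minimal sketch of what such a stream can look like; the shape follows the constructor call above, but it's a reconstruction, not the exact implementation behind the link:

using System;
using System.Data;
using System.IO;

// Read-only, forward-only Stream over a single BLOB column of an IDataRecord/IDataReader.
public class DataRecordStream : Stream
{
    private readonly IDataRecord record;  // reader already positioned on the row
    private readonly int ordinal;         // ordinal of the BLOB column
    private readonly long length;         // total BLOB length in bytes
    private long position;                // current read offset within the BLOB

    public DataRecordStream(IDataRecord record, int ordinal, long length)
    {
        this.record = record;
        this.ordinal = ordinal;
        this.length = length;
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { return length; } }

    public override long Position
    {
        get { return position; }
        set { throw new NotSupportedException(); }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        // GetBytes copies up to 'count' bytes of the column, starting at 'position', into 'buffer'.
        long read = record.GetBytes(ordinal, position, buffer, offset, count);
        position += read;
        return (int)read;
    }

    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}

For BLOBs this big the reader should be opened with CommandBehavior.SequentialAccess, so GetBytes streams the column instead of buffering the whole row in memory.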

Thank you all for the valuable info!

Rubens Farias