(Please read the update sections below; I'm leaving the original question too, for clarity.)

I am inserting many files into a SQL Server database configured for FILESTREAM.

In a loop, I insert the files from a folder into a database table.

Everything goes fine until I try to insert a 600 MB file.

As it inserts the file, Task Manager shows memory usage climb by more than 600 MB, and then I get the error.

The DB size is less than 1 GB and the total size of the documents is 8 GB. I am using SQL Server Express R2, and according to the documentation I should only run into problems when inserting a document larger than the 10 GB Express limit minus the current DB size.

Can anyone tell me why I get this error? It is crucial for me.

UPDATE FOR BOUNTY:

I offered a bounty of 150 because this is crucial for me!

This seems to be a limitation of the Delphi memory manager when inserting a document bigger than 500 MB (I didn't check the exact threshold, but it is somewhere between 500 and 600 MB). I use SDAC components, in particular a TMSQuery (though I think the same applies to any TDataSet descendant). To insert the document into a table that has a PK (ID_DOC_FILE) and a varbinary(max) field (DOCUMENT), I do:

procedure UploadBigFile;
var
  sFilePath: String;
begin
  sFilePath := 'D:\Test\VeryBigFile.dat';
  sqlInsertDoc.ParamByName('ID_DOC_FILE').AsInteger := 1;
  // Loads the entire file into the parameter in a single allocation
  sqlInsertDoc.ParamByName('DOCUMENT').LoadFromFile(sFilePath, ftBlob);
  sqlInsertDoc.Execute;
  sqlInsertDoc.Close;
end;

The SDAC team told me this is a limitation of the Delphi memory manager. Since SDAC doesn't support FILESTREAM, I cannot do what was suggested in C# in the first answer. Is the only solution to report this to Embarcadero and ask for a bug fix?

FINAL UPDATE:

Thanks, really, to all of you who answered. Inserting big blobs can certainly be a problem for the Express Edition (because of its 1 GB RAM limitation); however, I got the error on the Enterprise edition too, and it was a "Delphi" error, not a SQL Server one. So I think the answer I accepted really hits the problem, even if I have no time to verify it now.

+2  A: 

You could loop through the byte stream of the object you are trying to insert, buffering a piece of it at a time into your database until the entire object is stored.

I would take a look at the Buffer.BlockCopy() method if you're using .NET.

Off the top of my head, the method to parse your file could look something like this:

        // Read the whole file into memory, then copy it out in chunks
        // (100 bytes here for illustration; use a larger buffer in practice)
        byte[] fileBytes = File.ReadAllBytes(@"c:\file.exe");
        byte[] buffer = new byte[100];
        for (int i = 0; i < fileBytes.Length; i += 100)
        {
            int count = Math.Min(100, fileBytes.Length - i);
            Buffer.BlockCopy(fileBytes, i, buffer, 0, count);
            // Do database processing on the first `count` bytes of buffer
        }
George
+2  A: 

Here is an example that reads a disk file and saves it into a FILESTREAM column. (It assumes that you already have the transaction context and the file path in the variables "txContext" and "filePath".)

'Open the FILESTREAM data file for writing
Dim fs As New SqlFileStream(filePath, txContext, FileAccess.Write)

'Open the source file for reading
Dim localFile As New FileStream("C:\temp\microsoftmouse.jpg",
                                FileMode.Open,
                                FileAccess.Read)

'Start transferring data from the source file to FILESTREAM data file
Dim bw As New BinaryWriter(fs)
Const bufferSize As Integer = 4096
Dim buffer As Byte() = New Byte(bufferSize - 1) {} 'VB array bounds are inclusive
Dim bytes As Integer = localFile.Read(buffer, 0, bufferSize)

While bytes > 0
    bw.Write(buffer, 0, bytes)
    bw.Flush()
    bytes = localFile.Read(buffer, 0, bufferSize)
End While

'Close the files
bw.Close()
localFile.Close()
fs.Close()
jacob sebastian
If you do something like this, you'll need to make sure that you have configured file system access from SQL, which isn't on by default.
George
Well, it is not really 'system file access from SQL' that needs to be enabled. FILESTREAM is a hybrid feature that needs to be enabled at the Windows admin level as well as at the SQL Server level. First of all, FILESTREAM needs to be enabled for 'file I/O streaming access'. This can be done either at installation time or using the SQL Server Configuration Manager (post installation). In addition, the FILESTREAM access level needs to be configured (for full access) at the SQL Server instance level. This can be done using either T-SQL or SSMS.
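
For reference, the instance-level setting can be changed with the standard sp_configure option (level 2 enables both T-SQL and Win32 streaming access):

-- 0 = disabled, 1 = T-SQL access only, 2 = T-SQL and Win32 streaming access
EXEC sp_configure 'filestream access level', 2;
RECONFIGURE;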
jacob sebastian
+1  A: 

The SDAC team told me this is a limitation of the Delphi memory manager

To me that looked like a simplistic answer, so I investigated. I don't have the SDAC components and I don't use SQL Server either; my favorites are Firebird SQL and the IBX component set. I tried inserting a 600 MB blob into a table using IBX, then tried the same using ADO (covering two connection technologies, both TDataSet descendants). I discovered the truth is somewhere in the middle: it's not really the memory manager, and it's not SDAC's fault (well... they are in a position to do something about it, if many more people attempt inserting 600 MB blobs into databases, but that's irrelevant to this discussion). The "problem" is with the DB code in Delphi.

As it turns out, Delphi insists on using a single Variant to hold whatever type of data one might load into a parameter, and that makes sense: after all, we can load lots of different things into a parameter for an INSERT. The second problem is that Delphi wants to treat that Variant like a VALUE type: it copies it around at least twice, maybe three times! The first copy is made when the parameter is loaded from the file. The second copy is made when the parameter is prepared to be sent to the database engine.

Writing this is easy:

var V1, V2: Variant;
V1 := V2;

and works just fine for Integers and Dates and small Strings, but when V2 is a 600 MB Variant array, that assignment apparently makes a full copy! Now think about the memory space available to a 32-bit application that's not running in "3G" mode: only 2 GB of address space are available. Some of that space is reserved, some is used by the executable itself, then there are the libraries, then there's space reserved for the memory manager. After making the first 600 MB allocation, there just might not be enough contiguous address space left to allocate another 600 MB buffer! Because of this it's easy to blame the memory manager, but then again, why exactly does the DB code need another copy of the 600 MB monster?

One possible fix

Try splitting the file into smaller, more manageable chunks. Set up the database table to have 3 fields: ID_DOCUMENT, SEQUENCE, DOCUMENT, and make the primary key (ID_DOCUMENT, SEQUENCE). Next, try this:

procedure UploadBigFile(id_doc:Integer; sFilePath: String);
var FS:TFileStream;
    MS:TMemoryStream;
    AvailableSize, ReadNow:Int64;
    Sequence:Integer;
const MaxPerSequence = 10 * 1024 * 1024; // 10 MB
begin

  FS := TFileStream.Create(sFilePath, fmOpenRead);
  try
    AvailableSize := FS.Size;
    Sequence := 0;
    while AvailableSize > 0 do
    begin
      if AvailableSize > MaxPerSequence then
        begin
          ReadNow := MaxPerSequence;
          Dec(AvailableSize, MaxPerSequence);
        end
      else
        begin
          ReadNow := AvailableSize;
          AvailableSize := 0;
        end;
      Inc(Sequence); // Prep sequence; First sequence into DB will be "1"
      MS := TMemoryStream.Create;
      try
        MS.CopyFrom(FS, ReadNow);

        sqlInsertDoc.ParamByName('ID_DOC_FILE').AsInteger := id_doc;
        sqlInsertDoc.ParamByName('SEQUENCE').AsInteger := Sequence;
        sqlInsertDoc.ParamByName('DOCUMENT').LoadFromStream(MS, ftBlob);
        sqlInsertDoc.Execute; 

      finally MS.Free;
      end;
    end;
  finally FS.Free;
  end;

  sqlInsertDoc.Close;       

end;
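
For completeness, reading the document back means concatenating the chunks in SEQUENCE order. A minimal sketch, assuming a hypothetical query sqlSelectDoc with SQL like SELECT DOCUMENT FROM ... WHERE ID_DOCUMENT = :ID_DOCUMENT ORDER BY SEQUENCE:

procedure DownloadBigFile(id_doc: Integer; const sTargetPath: String);
var FS: TFileStream;
begin
  FS := TFileStream.Create(sTargetPath, fmCreate);
  try
    sqlSelectDoc.ParamByName('ID_DOCUMENT').AsInteger := id_doc;
    sqlSelectDoc.Open;
    try
      while not sqlSelectDoc.Eof do
      begin
        // Each record holds one chunk; SaveToStream appends it to the file
        (sqlSelectDoc.FieldByName('DOCUMENT') as TBlobField).SaveToStream(FS);
        sqlSelectDoc.Next;
      end;
    finally sqlSelectDoc.Close;
    end;
  finally FS.Free;
  end;
end;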
Cosmin Prund
Thanks for the answer and for the code. I will probably go with this approach; thanks also for the clear explanation of the problem.
+1  A: 

You're probably running into memory fragmentation issues somewhere. Playing around with really large blocks of memory, especially in any situation where they might need to be reallocated, tends to cause out-of-memory errors even when in theory you have enough memory for the job. If it needs a 600 MB block and it can't find a hole 600 MB wide, that's it: out of memory.

While I have never tried it, my inclination for a workaround would be to create a very minimal program that does ONLY the one operation. Keep it absolutely as simple as possible to keep the memory allocation minimal. When faced with a risky operation like this, call the external program to do the job: the program runs, performs the one operation, and exits. The point is that the new program is in its own address space.

The only true fix is 64-bit, and we don't have that option yet.
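
A minimal sketch of that separate-process idea in Delphi, assuming a hypothetical Uploader.exe helper that takes the document ID and the file path on its command line (requires Windows and SysUtils in the uses clause):

procedure RunUploaderAndWait(id_doc: Integer; const sFilePath: String);
var
  SI: TStartupInfo;
  PI: TProcessInformation;
  Cmd: String;
begin
  Cmd := Format('Uploader.exe %d "%s"', [id_doc, sFilePath]);
  UniqueString(Cmd); // CreateProcess may modify the command-line buffer
  FillChar(SI, SizeOf(SI), 0);
  SI.cb := SizeOf(SI);
  if not CreateProcess(nil, PChar(Cmd), nil, nil, False, 0, nil, nil, SI, PI) then
    RaiseLastOSError;
  try
    // The helper gets its own fresh 2 GB address space, so fragmentation
    // in the main process no longer matters
    WaitForSingleObject(PI.hProcess, INFINITE);
  finally
    CloseHandle(PI.hThread);
    CloseHandle(PI.hProcess);
  end;
end;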

Loren Pechtel
+1  A: 

I recently experienced a similar problem while running DBCC CHECKDB on a very large table. I would get this error:

There is insufficient system memory in resource pool 'internal' to run this query.

This was on SQL Server 2008 R2 Express. The interesting thing was that I could control the occurrence of the error by adding or deleting a certain number of rows in the table.

After extensive research and discussions with various SQL Server experts, I came to the conclusion that the problem was a combination of memory pressure and the 1 GB memory limitation of SQL Server Express.

The recommendation given to me was to either

  1. Acquire a machine with more memory and a licensed edition of SQL Server or...
  2. Partition the table into sizeable chunks that DBCC CHECKDB could handle

Due to the complicated nature of parsing these files into the FILESTREAM object, I would recommend the filesystem method: simply use SQL Server to store the locations of the files.
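
A minimal sketch of that approach, assuming a hypothetical table with ID_DOC_FILE and FILE_PATH columns, a matching insert query sqlInsertDocPath, and a file share at \\fileserver\docs (requires Windows and SysUtils in the uses clause):

procedure StoreFileLocation(id_doc: Integer; const sFilePath: String);
var
  sStoredPath: String;
begin
  // Copy the document into a managed folder; the DB stores only its path
  sStoredPath := '\\fileserver\docs\' + IntToStr(id_doc) + ExtractFileExt(sFilePath);
  if not CopyFile(PChar(sFilePath), PChar(sStoredPath), False) then
    RaiseLastOSError;
  sqlInsertDocPath.ParamByName('ID_DOC_FILE').AsInteger := id_doc;
  sqlInsertDocPath.ParamByName('FILE_PATH').AsString := sStoredPath;
  sqlInsertDocPath.Execute;
end;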

8kb
A: 

"While there are no limitations on the number of databases or users supported, it is limited to using one processor, 1 GB memory and 4 GB database files (10 GB database files from SQL Server Express 2008 R2)." It is not the size of the database files that is an insue but "1 GB memory". Try spitting the 600MB+ file but putting it in the stream.

Carnotaurus