views: 59
answers: 2

Does anybody have an example of how to import a huge text file into a specific row using SqlBulkCopy and StreamReader? Basically, I need to put the entire text file into one column of a specific row that has a value I am looking for.

A: 

Not sure SqlBulkCopy is the best tool for that. I would look at the BLOB-writing TSQL commands via a regular SqlCommand (usually in a loop). However, I also wonder at the wisdom of storing a huge BLOB in the db when an external file may be preferable and simpler.

Marc Gravell
Okay, so how do I inject a 1.8-million-line file into a specific row in a way that won't take 2 days to finish?
Mike
@mike if it is just a blob, then read in chunks of some decent multiple of 8040 bytes (maybe 100 * 8040) and append each chunk in turn. The TSQL differs between image, text, ntext, varbinary(max), varchar(max), etc., but each has an append mechanism.
Marc Gravell
@mike - additionally: what *good* is this as a blob? IMO, either transform it into structured data (where you *can* use SqlBulkCopy), or leave it outside the db (and just store the path in the db).
Marc Gravell
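
For the legacy image/text types, the append primitive Marc is alluding to is UPDATETEXT, which is driven by a text pointer. A rough, untested sketch of appending one chunk from C# (TableX, BlobCol, KeyCol, conn, keyVal, and chunk are hypothetical names; the column must already hold a non-NULL value so that TEXTPTR returns a valid pointer):

string appendSql =
    "DECLARE @ptr binary(16); " +
    "SELECT @ptr = TEXTPTR(BlobCol) FROM TableX WHERE KeyCol = @KeyVal; " +
    "UPDATETEXT TableX.BlobCol @ptr NULL 0 @Bytes;"; // NULL offset = append at end, 0 = delete nothing

using (SqlCommand cmd = conn.CreateCommand())
{
    cmd.CommandText = appendSql;
    cmd.Parameters.Add("@KeyVal", SqlDbType.BigInt).Value = keyVal;
    cmd.Parameters.Add("@Bytes", SqlDbType.Image).Value = chunk; // e.g. a 100 * 8040 byte chunk
    cmd.ExecuteNonQuery();
}

Running that statement once per chunk, in order, builds up the blob without ever holding the whole file in memory.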
A: 

I think you are looking for something like the OPENROWSET command, which lets you read from a file directly into SQL Server. There are a couple of options, SINGLE_BLOB, SINGLE_CLOB, and SINGLE_NCLOB, which let you specify that the file contains data for a single-row, single-column rowset. Here is an example that is pretty close to what you are looking for:

INSERT INTO T(XmlCol)
SELECT * FROM OPENROWSET(
   BULK 'c:\SampleFolder\SampleData3.txt',
   SINGLE_BLOB) AS x
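
If the goal is to fill the specific existing row from the question rather than insert a new one, the same OPENROWSET source can feed an UPDATE. A minimal, untested C# sketch (TableX, BlobCol, KeyCol, conn, and keyVal are hypothetical names; note the file path must be a literal inside the SQL text, and it is read by the SQL Server service account, not the client):

using (SqlCommand cmd = conn.CreateCommand())
{
    // BulkColumn is the column name OPENROWSET returns for SINGLE_BLOB reads.
    cmd.CommandText =
        @"UPDATE TableX
          SET BlobCol = (SELECT BulkColumn
                         FROM OPENROWSET(BULK 'c:\SampleFolder\SampleData3.txt',
                                         SINGLE_BLOB) AS x)
          WHERE KeyCol = @KeyVal";
    cmd.Parameters.Add("@KeyVal", SqlDbType.BigInt).Value = keyVal; // identifies the target row
    cmd.ExecuteNonQuery();
}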

EDIT

As a second solution, if you are planning on doing this entirely from C#, you could use the .WRITE syntax with the UPDATE command as follows:

update TableX set BlobCol.Write(@Bytes, @Offset, @Length) where KeyCol = @KeyVal

Here is some untested code that takes an input Stream (FileStream, HttpPostedFile, etc.) and copies its contents into a BLOB field:

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

public void SaveStream(SqlConnection conn, Stream inputStream, long keyVal)
{
    long offset = 0;
    int maxChunkSize = 1024 * 1024; // read a meg at a time...
    string sql = "update TableX set BlobCol.Write(@Bytes, @Offset, @Length) where KeyCol = @KeyVal";

    using (SqlCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = sql;

        // .WRITE works against a varbinary(max) column, so parameterize as VarBinary;
        // the column must not be NULL (initialize it to 0x first) or .WRITE will fail.
        SqlParameter bytesParam = cmd.Parameters.Add("@Bytes", SqlDbType.VarBinary, maxChunkSize);
        SqlParameter offsetParam = cmd.Parameters.Add("@Offset", SqlDbType.BigInt);
        SqlParameter lengthParam = cmd.Parameters.Add("@Length", SqlDbType.BigInt);
        SqlParameter keyParam = cmd.Parameters.Add("@KeyVal", SqlDbType.BigInt);

        using (BinaryReader reader = new BinaryReader(inputStream))
        {
            while (true)
            {
                // Read the next chunk; an empty array means end of stream.
                byte[] bytes = reader.ReadBytes(maxChunkSize);

                if (bytes.Length == 0)
                {
                    break;
                }

                // Write this chunk at the current offset, then advance.
                bytesParam.Value = bytes;
                offsetParam.Value = offset;
                lengthParam.Value = (long)bytes.Length;
                keyParam.Value = keyVal;
                cmd.ExecuteNonQuery();

                offset += bytes.Length;
            }
        }
    }
}
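
A hypothetical call site for the method above (the connection string, file path, and key value are placeholders):

using (SqlConnection conn = new SqlConnection(connectionString))
using (FileStream fs = File.OpenRead(@"c:\SampleFolder\SampleData3.txt"))
{
    conn.Open();
    SaveStream(conn, fs, 42); // stream the whole file into the row where KeyCol = 42
}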
dana