I have data that is streamed from disk, processed in memory by a Java application, and finally copied into SQL Server. The data can be fairly large (hence the streaming) and can require up to several hundred thousand rows to be inserted. The fastest solution appears to be SQL Server's bulk-copy feature, but I haven't found a way for a Java program to use it easily or anywhere near fast enough.
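For reference, the baseline I'm comparing everything against is a plain JDBC batch insert, roughly like the sketch below (connection URL, table and column names are placeholders for my actual schema):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Plain JDBC batch insert -- the baseline I'm comparing against.
// Connection URL, table and column names are placeholders.
public class BatchInsertBaseline {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:sqlserver://localhost;databaseName=MyDb", "user", "password")) {
            con.setAutoCommit(false);
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO dbo.MyTable (id, payload) VALUES (?, ?)")) {
                for (int i = 0; i < 100000; i++) {   // rows actually come from my stream
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();
                    if (i % 10000 == 9999) {
                        ps.executeBatch();           // flush periodically
                    }
                }
                ps.executeBatch();                   // flush any remainder
            }
            con.commit();
        }
    }
}
```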
Here are some ways that I've already investigated:
Using the SqlBulkCopy class in .NET. This is very efficient, since you can stream data straight from a data source into SQL Server. The problem with this approach is that it requires .NET. It might be usable through a Java-to-.NET bridge, although I wonder about the cost of marshalling data between the two runtimes.
Using the BULK INSERT T-SQL statement. The problem here is that you need to create a properly formatted file on disk first. I've seen only small performance gains over JDBC batch inserts with this. Also, it's only useful locally, since the file path is resolved on the server.
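Roughly what that looks like from JDBC (connection URL, table name and file path are placeholders; the path must be visible to the SQL Server machine):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Rough sketch: the data is written to a delimited file first (not shown),
// then the server loads it. The path is resolved on the SQL Server machine,
// so the file has to live there (or on a share it can reach).
// Connection URL, table name and path are placeholders.
public class BulkInsertStatement {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                     "jdbc:sqlserver://localhost;databaseName=MyDb", "user", "password");
             Statement st = con.createStatement()) {
            st.execute(
                    "BULK INSERT dbo.MyTable " +
                    "FROM 'C:\\temp\\mytable.dat' " +
                    "WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n', TABLOCK)");
        }
    }
}
```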
Writing files to disk and using the bcp command-line utility. This is still a little faster than a JDBC batch insert, but not by much, and I also lose the ability to wrap the load in a transaction.
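Roughly how I invoke bcp from Java (server, database, table and file path are placeholders; -T uses Windows authentication, or -U/-P for a SQL login):

```java
import java.io.File;

// Rough sketch: write the data file first (not shown), then shell out to bcp.
// Server, database, table and file path are placeholders.
public class BcpLoad {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                "bcp", "MyDb.dbo.MyTable", "in", "C:\\temp\\mytable.dat",
                "-c",              // character mode: tab field / newline row terminators by default
                "-S", "localhost", // target server
                "-T");             // Windows authentication (use -U/-P for a SQL login)
        pb.redirectErrorStream(true);            // merge stderr into stdout
        pb.redirectOutput(new File("bcp.log"));  // keep bcp's output for troubleshooting
        int exitCode = pb.start().waitFor();
        if (exitCode != 0) {
            throw new RuntimeException("bcp failed with exit code " + exitCode);
        }
    }
}
```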
Using the C API. Again, this is very efficient, but it requires C. It could presumably be wrapped through JNI; if there's some free Java library out there that already does this, I'd like to know about it.
I'm looking for the fastest solution. Memory is not an issue.
Thanks!