I am trying to migrate from a MySQL setup to a PostgreSQL one, and one of the fields I'm having trouble moving is a MySQL LONGBLOB. I am trying to move it into a LargeObject in PostgreSQL and I'm having some speed issues. I'm doing this in Java/Groovy and, frankly, the streaming business has me confused.
I've tried two approaches: hold the LONGBLOB in memory and write it directly to the LO, or write the LONGBLOB to disk (it is a file, after all) and then read the file back in and write it to the LO.
The second approach is many, many times faster, and I can't figure out why; on top of that, even the faster one still seems entirely too slow.
Here is the first approach:
InputStream ins = rs.getBinaryStream(1);
def b
// read one byte at a time; read() returns -1 at end of stream
while ((b = ins.read()) > -1) {
    // write that single byte to the LargeObject
    obj.write(b.toInteger().byteValue())
}
"ins" is a ByteArrayInputStream so I read that in (to an int) and then write it to "obj", the LO. This takes about 7 minutes for a 1MB file. My gut tells me this should be more efficient than the second one, but it is not.
The write-to-file version is pretty basic; a sketch of it is below. It reads from the database the same way, but writes the output to a file on disk. Then I read the file back from disk and write it to the LO. That approach takes about 8 seconds for the same file.
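Roughly, it looks like this (a sketch with names simplified; "ins" and "obj" are the same as above, and the temp-file handling is tidied up for posting):

// step 1: stream the blob from MySQL to a file on disk, still one byte at a time
def tmp = File.createTempFile("blob", ".bin")
tmp.withOutputStream { out ->
    def b
    while ((b = ins.read()) > -1) {
        out.write(b)
    }
}

// step 2: read the whole file back into memory and write it to the LO in one call
byte[] data = tmp.bytes
obj.write(data)
tmp.delete()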
What is going on?