I want to copy the content of an object stored in one docbase to an object stored in another docbase. I do not want to go through a file on disk because I have more than 300,000 files to copy. Below is part of my code:

    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    IOUtils.copy(source.getContent(), baos);
    [...]
    targetObj.setContent(baos); // Documentum DFC
    targetObj.save();           // Documentum DFC

If I do not tune the JVM, IOUtils.copy(source.getContent(), baos); gives java.lang.OutOfMemoryError: Java heap space.

If I tune the JVM by raising the -Xmx value, the previous instruction is fine, but java.lang.OutOfMemoryError: Java heap space then occurs at targetObj.setContent(baos);.

And this is with content of only 8,332,175 bytes (7.94 MB)...
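
For context, here is a fuller sketch of what I am doing (the two sessions, the IDs and the loop over all objects are placeholders; only the copy itself is shown):

    // Hypothetical setup: one DFC session per docbase, object IDs obtained elsewhere.
    IDfSysObject source = (IDfSysObject) sourceSession.getObject(new DfId(sourceId));
    IDfSysObject targetObj = (IDfSysObject) targetSession.getObject(new DfId(targetId));

    // Pull the content into memory, then push it to the target object.
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    IOUtils.copy(source.getContent(), baos);   // OutOfMemoryError here without JVM tuning
    targetObj.setContentType(source.getContentType());
    targetObj.setContent(baos);                // OutOfMemoryError here even with a larger -Xmx
    targetObj.save();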

Any idea what's wrong? Is there a better way to copy from a ByteArrayInputStream to a ByteArrayOutputStream? Something else?


Some Documentum API

getContent

public ByteArrayInputStream getContent() throws DfException

Copies this object's content from the Documentum server into a ByteArrayInputStream object.

The following code example demonstrates how to copy an object's content from the Documentum server into memory:

    IDfSysObject sysObj = (IDfSysObject)session.getObject(new DfId("0900d5bb8001f900"));
    ByteArrayInputStream bais = sysObj.getContent();
    if (bais.available() > 0)
    {
         // Data successfully fetched from the server...
    }

Returns: a ByteArrayInputStream object containing the object's content. Throws: DfException - if a server error occurs.

And

setContent

public boolean setContent(ByteArrayOutputStream content) throws DfException

Sets new content to an object. Use this method when you want to set data that resides in working memory.

The following code example demonstrates how to set content residing in memory to a new document:

    IDfSysObject sysObj = (IDfSysObject)sess.newObject("dm_document");
    sysObj.setObjectName("testDoc");
    sysObj.setContentType("crtext");
    byte b[] = {35,36,37,38,39};
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    out.write(b, 0, 5);
    sysObj.setContent(out);
    sysObj.save();

Parameters: content - the content as a ByteArrayOutputStream. Throws: DfException - if a server error occurs.

+2  A: 

As long as you use ByteArrayOutputStream, the data will have to fit in memory.

I know nothing about Documentum, but is there maybe a targetObj.setContent(File) or setContent(InputStream), so that you can avoid reading the whole chunk into a byte[]?

(8 MB is not all that huge, though; maybe you can just increase the Java heap size. It could also help to pre-size the buffer used by the BAOS; you can pass the initial size to its constructor.)
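
Something like this, for example (untested; I am assuming the sysobject reports its content length via getContentSize()):

    // Pre-size the buffer so the BAOS never has to grow and copy its internal byte[].
    int size = (int) source.getContentSize();   // assumption: returns the content length in bytes
    ByteArrayOutputStream baos = new ByteArrayOutputStream(size);
    IOUtils.copy(source.getContent(), baos);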

Update: Are you sure setContent takes a ByteArrayOutputStream? Usually, a setter would read from an InputStream.

Thilo
There is no other known way but setContent(BAOS) with Documentum 5.3. Of course my first action was to increase Xmx to 1024m, but I still get java heap space. It is not a solution anyway, because I do not know the maximum file size in the database. setContent() takes a BAOS and getContent() returns a BAIS; that's why I use IOUtils.copy().
enguerran
weird. got a JavaDoc link?
Thilo
If possible eliminate all links to source and its BAIS immediately after the copy operation, so that its copy of the data can be garbage-collected.
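
A sketch of that idea, using the variable names from the question:

    IOUtils.copy(source.getContent(), baos);
    source = null;      // drop the reference so the BAIS and its byte[] can be garbage-collected
    targetObj.setContent(baos);
    targetObj.save();
    baos = null;        // likewise once the content has been handed to the target object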
Thilo
documentum is weird...
enguerran
That is what I thought, I'll try to null my useless source asap.
enguerran
I patched my code with your advice. Setting the size of the BAOS is great for memory management and execution speed. Setting references to null as soon as we no longer need an object seems to be useful for memory management. Increasing the Xms and Xmx JVM memory sizes is important. Thanks to all of you, and Thilo!
enguerran
@Thilo Update: "Are you sure setContent takes a ByteArrayOutputStream? Usually, a setter would read from an InputStream." Documentum is weird...
enguerran
I validated your answer because the correct answer is: with Documentum, if you want to be slow but robust, use setFile(String filepath). If you want to be fast, use a big machine with a large Xmx flag.
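
For the record, a rough sketch of that file-based route (the temp path is just an illustration; getFile()/setFile() are what I believe to be the relevant IDfSysObject calls, to be checked against the DFC docs):

    // Export the source content to a temporary file, then attach that file to the target object.
    String tmpPath = source.getFile("/tmp/" + source.getObjectId().getId() + ".content");
    targetObj.setContentType(source.getContentType());
    targetObj.setFile(tmpPath);
    targetObj.save();
    new java.io.File(tmpPath).delete();   // remove the temporary copy afterwards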
enguerran
A: 

If you need to store this temporarily in memory, then increase your JVM max memory size using:

java -Xmx256m

to increase the maximum heap size to 256 MB (the default is 64 MB). See here for more details.

Brian Agnew
Sadly, I already set -Xms1024m and -Xmx1024m... still OutOfMemoryError.
enguerran
+2  A: 

I've run into problems like this when dealing with large files, and there really isn't any magic to it other than trying to increase your heap size. I know you said you don't want to create a file locally on the client where your code is running, but you may want to take a look at doing this with DFC operations. Essentially, you would run an export operation to get the file from the source repository, then use an import operation to create it in the target. As part of the import operation, there is a flag you can set to delete the local file when the operation completes.

   IDfClientX clientx = new DfClientX();

   // Export the content from the source repository to a local file
   IDfExportOperation exOp = clientx.getExportOperation();
   IDfSysObject exportObj = getObjectToExport();
   IDfExportNode exportNode = (IDfExportNode) exOp.add(exportObj);
   exOp.execute();
   String path = exOp.getFilePath();

   // Import that file into the target repository
   IDfImportOperation impOper = clientx.getImportOperation();
   IDfFile dfFile = new DfFile(path);
   IDfImportNode impNode = (IDfImportNode) impOper.add(dfFile);
   impNode.setDocbaseObjectType("dm_document");
   impNode.setDestinationFolderId(importFolderId);
   impNode.setNewObjectName("testDoc");
   impNode.setFormat("crtext");
   impOper.setKeepLocalFile(false);   // delete the local file when the import completes
   impOper.execute();
shsteimer
You are right, no magical solution: I increased the heap size to 1640m (my system's max) and it seems to be enough for my needs (max file size: 8 MB). I also tried creating a local temporary file and that worked well too. So no magical solution! ^^
enguerran