Being forced to read the whole file into memory with that first `getFileBytes()`, in order to transmit it in one piece, is most likely what is running the system out of memory.
Find a way to read about 100K, transmit it, then read another 100K, and so on until the whole file is done.
The `HttpMultipartRequest` class's constructor as written only allows the file to be transmitted as one single object. Even though it is an implementation of the MIME multipart content protocol, it is limited to the case of transmitting just one part.
The class can be modified to allow sending multiple parts. Have a look at the protocol specification, RFC 1341, especially the example about halfway through.
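To make the pieces below concrete, here is a minimal, hedged sketch of how the two delimiter strings might be built; the boundary token, field name, and file name are invented for the example, and the headers follow the usual multipart/form-data convention rather than anything specific to the original class:

String boundary = "----fileChunkBoundary";            // arbitrary example token
// Each part starts with a delimiter line, then its headers, then a blank line before the bytes.
String boundaryMessage = "\r\n--" + boundary + "\r\n"
        + "Content-Disposition: form-data; name=\"chunk\"; filename=\"photo.jpg\"\r\n"
        + "Content-Type: application/octet-stream\r\n\r\n";
// The closing delimiter repeats the token with two trailing dashes.
String endBoundary = "\r\n--" + boundary + "--\r\n";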
With these three lines as they stand in the constructor, the whole file is sent in one part:
bos.write(boundaryMessage.getBytes()); // opening boundary plus the part's headers
bos.write(fileBytes);                  // the entire file as a single part
bos.write(endBoundary.getBytes());     // closing boundary
But in the multipart case there need to be multiple boundaries before the `endBoundary`:
// getMoreFileBytes() is a helper to add: it returns the next chunk of the file,
// or an empty array once the whole file has been read (see the sketch below).
for (byte[] bytes = getMoreFileBytes(); bytes.length > 0; bytes = getMoreFileBytes()) {
    bos.write(boundaryMessage.getBytes()); // a boundary before every chunk
    bos.write(bytes);                      // the next ~100K of the file
}
bos.write(endBoundary.getBytes());         // the closing boundary, once, at the very end
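One possible shape for getMoreFileBytes(), as a minimal sketch: it assumes the class keeps the opened file in an InputStream field (the fileStream name and the empty-array end-of-file convention are assumptions here; it needs java.io.InputStream and java.io.IOException):

private InputStream fileStream;               // assumed field: the file, opened elsewhere in the class

private byte[] getMoreFileBytes() throws IOException {
    byte[] buffer = new byte[100 * 1024];     // roughly 100K per part
    int read = fileStream.read(buffer);
    if (read <= 0) {
        return new byte[0];                   // empty array: tells the loop above to stop
    }
    if (read < buffer.length) {
        byte[] chunk = new byte[read];        // read() returned fewer bytes than the buffer holds
        System.arraycopy(buffer, 0, chunk, 0, read);
        return chunk;
    }
    return buffer;
}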
As a quick fix, let the constructor open the file and read it 100K at a time; it already receives a `fileName` parameter.
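Putting those pieces together, something along these lines could work; writeFileInChunks is a hypothetical name, java.io.FileInputStream stands in for whatever file API the platform offers (on Java ME, a JSR-75 FileConnection's openInputStream() would supply the stream instead), and it reuses the fileStream field and getMoreFileBytes() sketched above (imports: java.io.FileInputStream, java.io.IOException, java.io.OutputStream):

private void writeFileInChunks(String fileName, OutputStream bos,
                               String boundaryMessage, String endBoundary) throws IOException {
    fileStream = new FileInputStream(fileName);   // not available on CLDC; use a FileConnection there
    try {
        for (byte[] bytes = getMoreFileBytes(); bytes.length > 0; bytes = getMoreFileBytes()) {
            bos.write(boundaryMessage.getBytes());
            bos.write(bytes);
        }
        bos.write(endBoundary.getBytes());
    } finally {
        fileStream.close();
    }
}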
The PHP script on the other end should then reassemble the original file from the pieces.