A user has a few Lookup selections on a JSP page. After he's selected what data he wants exported, the JSP calls my servlet, which should do something like this (a rough skeleton of the servlet is sketched right after the list):

  1. get request data
  2. generate SQL for the request data
  3. execute the SQL and write it to a XML file
  4. package the XML file to a ZIP file and return it as a response
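Roughly, the surrounding servlet would look like this; the class name and the two helper methods are just placeholders for code I already have:

    import java.io.IOException;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ExportServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            // 1. get request data
            String idVrsteSifarnika = request.getParameter("idVrsteSifarnika");

            // 2. generate SQL for the request data
            String sqlQuery = buildQuery(idVrsteSifarnika);

            // tell the browser it is getting a ZIP download
            response.setContentType("application/zip");
            response.setHeader("Content-Disposition",
                    "attachment; filename=\"" + idVrsteSifarnika + ".zip\"");

            // 3. + 4. execute the SQL, write the XML and ZIP it into the response
            writeZipResponse(sqlQuery, idVrsteSifarnika, response);
        }

        private String buildQuery(String idVrsteSifarnika) {
            // placeholder: builds the SQL from the lookup selections
            return "SELECT ...";
        }

        private void writeZipResponse(String sqlQuery, String idVrsteSifarnika,
                HttpServletResponse response) throws IOException {
            // placeholder: steps 3 and 4 shown below go here
        }
    }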

Now, the first two steps are trivial, but the last two are what I'm unsure of:

 //3. execute the SQL and write it to an XML file
    ConnectionManager cm = new ConnectionManager();
    Connection conn = null;
    ResultSet rs = null;
    Statement stmt = null;

    try {
        conn = cm.getConnection();
        stmt = conn.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
        rs = stmt.executeQuery(sqlQuery);
        // generates XML and writes it to a java.io.Writer
        OutputStreamWriter writer = new OutputStreamWriter(
                new FileOutputStream(Konstante.zipFolder + idVrsteSifarnika + ".xml"), "UTF-8");
        XMLWriter.writeXMLToWriter(rs, writer);
        writer.close(); // flush and close the file before it gets zipped below
    } catch (Exception ex) {
        logger.error(ex);
    } finally {
        ConnectionManager.close(rs, stmt, conn);
    }

//4. package the XML file to a ZIP file and return it as a response
    ZipEntry ze = new ZipEntry(idVrsteSifarnika + ".xml");
    BufferedOutputStream bos = new BufferedOutputStream(response.getOutputStream());
    ZipOutputStream zos = new ZipOutputStream(bos);
    zos.putNextEntry(ze);
    FileInputStream fis = new FileInputStream(new File(Konstante.zipFolder + idVrsteSifarnika + ".xml"));
    final byte[] buffer = new byte[1024];
    int n;
    while ((n = fis.read(buffer)) != -1) {
        zos.write(buffer, 0, n);
    }
    fis.close();
    zos.closeEntry();
    zos.flush();
    zos.close(); // also closes the wrapped BufferedOutputStream

What I would like to know is whether I could optimize memory usage and execution speed by changing the way I do all this, or by changing the various streams.

P.S. This is the writeXMLToWriter method:

//Set up XML
    DataWriter w = new DataWriter(writer);
    w.startDocument();
    w.setIndentStep(2);
    w.startElement(startingXMLElement);
    // Get the metadata
    ResultSetMetaData meta = rs.getMetaData();
    int count = meta.getColumnCount();
    // Iterate over the set
    while (rs.next()) {
        w.startElement(rowElement);
        for (int i = 0; i < count; i++) {
            Object ob = rs.getObject(i + 1);
            if (rs.wasNull()) {
                ob = null;
            }
            String colName = meta.getColumnLabel(i + 1);
            if (ob != null) {
                if (ob instanceof Timestamp) {
                    w.dataElement(colName, Util.formatDate((Timestamp) ob, dateFormat));
                } else if (ob instanceof BigDecimal) {
                    w.dataElement(colName, Util.transformToHTML(new Integer(((BigDecimal) ob).intValue())));
                } else {
                    w.dataElement(colName, ob.toString());
                }
            } else {
                w.emptyElement(colName);
            }
        }
        w.endElement(rowElement);
    }
    w.endElement(startingXMLElement);
    w.endDocument();
+4  A: 

There's no need to write the XML to disk at all. Simply create an OutputStreamWriter wrapping the ZipOutputStream and pass that to your writeXMLToWriter method:

ZipOutputStream zos = new ZipOutputStream(bos);
zos.putNextEntry(ze);
OutputStreamWriter writer = new OutputStreamWriter(zos, "UTF-8");
XMLWriter.writeXMLToWriter(rs, writer);
writer.flush();   // flush rather than close: closing the writer would also close the ZipOutputStream
zos.closeEntry();
zos.finish();     // writes the ZIP directory without closing the response stream
zos.flush();
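
Put together with the rest of your servlet code it could look roughly like this. I'm reusing the names from your question (ConnectionManager, XMLWriter, idVrsteSifarnika, sqlQuery, logger), and the Content-Disposition header is just a guess at what you want the download to be called:

response.setContentType("application/zip");
response.setHeader("Content-Disposition",
        "attachment; filename=\"" + idVrsteSifarnika + ".zip\"");

ConnectionManager cm = new ConnectionManager();
Connection conn = null;
Statement stmt = null;
ResultSet rs = null;
try {
    conn = cm.getConnection();
    stmt = conn.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
    rs = stmt.executeQuery(sqlQuery);

    // stream straight from the database into the zipped response, no temp file
    ZipOutputStream zos = new ZipOutputStream(new BufferedOutputStream(response.getOutputStream()));
    zos.putNextEntry(new ZipEntry(idVrsteSifarnika + ".xml"));

    OutputStreamWriter writer = new OutputStreamWriter(zos, "UTF-8");
    XMLWriter.writeXMLToWriter(rs, writer);
    writer.flush();

    zos.closeEntry();
    zos.finish();
} catch (Exception ex) {
    logger.error(ex);
} finally {
    ConnectionManager.close(rs, stmt, conn);
}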
Joachim Sauer
Wouldn't it create a memory problem if there are large amounts of data? A user can select data whose XML is around 200 MB unpacked.
Andrija
@Andrija: not really. You wouldn't hold a lot more data in memory than you currently do. You never hold the whole XML in memory (at least not in your code) and the ZIP file is held in memory exactly as long as it is in your version. Also: why don't you simply test it?
Joachim Sauer
To clarify: with "at least not in your code" I meant: not in any code you showed us here. It could be that `DataWriter` holds the XML in memory, but judging from how it's used, I assume it doesn't. I don't know if `ZipOutputStream` needs to do some in-memory buffering or not, but that would apply to both your original solution and mine.
Joachim Sauer
I've done memory profiling and it seems that the biggest memory hog is the ResultSet itself: even though it uses ResultSet.TYPE_SCROLL_INSENSITIVE and ResultSet.CONCUR_READ_ONLY, it still loads everything into memory, which in the worst case is about 225 MB. Other than that (which is another problem :) ) your solution works really well, thank you.
Andrija
@Andrija: you don't happen to use MySQL? It tends to fully load the `ResultSet` into memory no matter what you tell it to do. There might be a connection parameter that changes that.
Joachim Sauer
Saw that it could be a problem, but I'm using Oracle.
Andrija
@Andrija: also try using `TYPE_FORWARD_ONLY`.
Joachim Sauer
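
For reference, that last suggestion amounts to changing how the statement is created, along these lines. This is a minimal sketch that reuses the conn, sqlQuery and rs variables from the question; the fetch size of 500 is an arbitrary example (the Oracle JDBC driver prefetches only 10 rows per round trip by default):

    // Forward-only, read-only cursor: the driver can stream rows as they are
    // read instead of caching the whole result set client-side (scrollable
    // result sets are typically cached so they can be revisited).
    Statement stmt = conn.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);

    // Rows fetched per database round trip; 500 is just an example value.
    stmt.setFetchSize(500);

    ResultSet rs = stmt.executeQuery(sqlQuery);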