I am attempting to read in a stream and save the read images to a zip file as this is going to be running over multiple days and would generate far too many individual files.

I now have an issue where I seem to be unable to save images into a zip file. The worker thread I have built for it is below. I am sure that the image is making it to the ImageIO.write call. The result at the end, however, is a zip file full of empty jpgs. I am wondering if perhaps ImageIO is not writing properly to the ZipOutputStream.

Thanks for your help.

public class ZipSaveWorker implements Runnable{

    public static ZipOutputStream out=null;
    BufferedImage myImage;
    private static int counter=0;



    public void run() {
        ZipEntry entry=new ZipEntry("video"+counter+".jpg");
        counter++;
        try {
            out.putNextEntry(entry);
            ImageIO.write(myImage, ".jpg", out);

        } catch (IOException ex) {
            Logger.getLogger(ZipSaveWorker.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

    public ZipSaveWorker(BufferedImage image)
    {
        if (out==null)
        {
            try {
                out = new ZipOutputStream(new BufferedOutputStream(new FileOutputStream(new File("images" + File.separator + "video.zip"))));
            } catch (FileNotFoundException ex) {
                Logger.getLogger(ZipSaveWorker.class.getName()).log(Level.SEVERE, null, ex);
            }
            counter=0;
        }

        myImage=image;

    }

    public static void closeStream()
    {
        try {
            out.flush();
            out.close();
        } catch (IOException ex) {
            Logger.getLogger(ZipSaveWorker.class.getName()).log(Level.SEVERE, null, ex);
        }
    }


}
+1  A: 

I am attempting to read in a stream and save the read images to a zip file as this is going to be running over multiple days and would generate far too many individual files.

I'm not convinced by this reasoning. The actual number of files should not really matter that much. You might lose on average 1/2 a (file system) disk block per file, but with terabyte disc drives available for a couple of hundred dollars, that's probably insignificant.

But the more important problem is what happens if your application crashes, or the power goes off. If you write all of your images straight into a ZIP file, the chances are that you will end up with nothing but a corrupt ZIP file for a multi-day run. I expect that the ZIP file contents would be mostly recoverable, but only using some third-party (non-Java) application.

If file system resources (disk space, number of inodes, whatever) are a realistic concern, then maybe you should write a script to run (say) once an hour and ZIP up the files that were written in the last hour and (maybe) put the ZIP file somewhere else.
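A minimal sketch of the kind of hourly job Stephen C describes, written in Java rather than as a shell script (the class name, method name, and delete-after-archive policy are illustrative assumptions, not part of the original answer):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class HourlyArchiver {

    // Moves every regular file in srcDir modified at or after cutoffMillis
    // into a single zip archive, deleting the originals afterwards.
    // Returns the number of files archived.
    static int archive(File srcDir, File zipFile, long cutoffMillis) throws IOException {
        File[] all = srcDir.listFiles();
        List<File> recent = new ArrayList<>();
        if (all != null) {
            for (File f : all) {
                if (f.isFile() && f.lastModified() >= cutoffMillis) recent.add(f);
            }
        }
        // ZipOutputStream throws on close() if no entry was ever written,
        // so bail out before opening the stream when there is nothing to do.
        if (recent.isEmpty()) return 0;

        try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(zipFile))) {
            byte[] buf = new byte[8192];
            for (File f : recent) {
                zip.putNextEntry(new ZipEntry(f.getName()));
                try (FileInputStream in = new FileInputStream(f)) {
                    int n;
                    while ((n = in.read(buf)) > 0) zip.write(buf, 0, n);
                }
                zip.closeEntry();
            }
        }
        // Only delete originals once the archive has been closed successfully.
        for (File f : recent) f.delete();
        return recent.size();
    }

    public static void main(String[] args) throws IOException {
        File dir = java.nio.file.Files.createTempDirectory("frames").toFile();
        java.nio.file.Files.write(new File(dir, "frame0.jpg").toPath(), new byte[] {1, 2, 3});
        File zip = File.createTempFile("frames-hour", ".zip");
        long oneHourAgo = System.currentTimeMillis() - 3_600_000L;
        System.out.println("archived " + archive(dir, zip, oneHourAgo) + " file(s)");
    }
}
```

Because each hour's archive is closed before the originals are deleted, a crash mid-run costs at most the current hour's archive rather than the whole multi-day zip.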

Stephen C
Agreed. Ultimately I don't think you'd save what you *think* you're saving. A ZIP file needs to maintain an index of all the files in the archive, just like a standard non-compressed filesystem. Your JPEGs are unlikely to compress well under ZIP, so you're probably shooting yourself in the foot.
Quintus
The reasoning for a zip file was to save the file system from having to index all the individual jpgs; I am worried about running low on inodes if they are stored separately. The goal was never to save space.
Bryan
+1  A: 

The error in your code is in the line:

ImageIO.write(myImage, ".jpg", out);

It should be:

ImageIO.write(myImage, "jpg", out);

I am also not sure whether you need to call closeEntry() after each image has been written.
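For illustration, here is a minimal, self-contained sketch of writing one image per zip entry with the corrected format name and an explicit closeEntry() (the class and method names are made up, and it writes to a byte array rather than a file so it is easy to verify):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import javax.imageio.ImageIO;

public class ZipImageDemo {

    // Writes one image as one zip entry. The format name ImageIO expects
    // is the informal name "jpg" with no leading dot; with ".jpg" no
    // writer is found and nothing is written, leaving the entry empty.
    static byte[] zipOneImage(BufferedImage img, String entryName) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bytes)) {
            zip.putNextEntry(new ZipEntry(entryName));
            ImageIO.write(img, "jpg", zip);   // "jpg", not ".jpg"
            zip.closeEntry();                 // finish this entry before the next
        }
        return bytes.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        BufferedImage img = new BufferedImage(16, 16, BufferedImage.TYPE_INT_RGB);
        byte[] zipped = zipOneImage(img, "video0.jpg");
        System.out.println("zip size: " + zipped.length + " bytes");
    }
}
```

Strictly speaking, putNextEntry() also closes any previous entry, but calling closeEntry() yourself makes the per-image boundary explicit.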

As Stephen C notes, this code could produce corrupt zip files if the power is cut or the VM dies. Consider backing up the zip file a few times a week, or even a few times a day, to ensure your multi-day runs aren't completely ruined (I assume a run can be resumed).

Jes
I will try it out and see if it works tonight. Thanks.
Bryan