The processing I have in mind is this:

  • there are thousands of png files
  • each of them should be loaded, and its pixels accessed
  • each pixel's channels will be processed in some way, and then written to a binary file

I was thinking of using some sort of module, like ImageMagick wrappers, or some other wrapper for a C image processing backend. Will Perl slow me down if I choose it to implement this task? I have a tool already that's written in Java (it uses JDK's BufferedImage), and it's reasonably fast. Would I be crazy to expect the same speed from Perl?

+3  A: 

I don't think so, unless your Perl code is over-reliant on method calls in a tight loop. If the actual image processing is done in a C backend, Perl will not be the bottleneck performance-wise.

DVK
The only C backend will be the one exposed by the module's API. I'm not planning on writing C myself :)
Geo
@Geo: I suspect that the "C backend" DVK referred to was "ImageMagick... or some other wrapper for a C image processing backend". The actual image processing will be handled by ImageMagick (or whatever other graphics lib) and will probably be the most time-consuming part of the operation, so it doesn't much matter what language you use to call the library.
Dave Sherohman
@Dave - yep, that's what I meant
DVK
+2  A: 

The answer depends on what is limiting performance in the Java version. If you're limited by file I/O (including .png decompression), then moving to Perl will probably be fine. Otherwise, you're likely to pay a steep performance penalty for processing each pixel in Perl, but if you can call C routines to process entire images, you're likely to be just as fast (possibly faster, depending on the relative performance of the C and Java libraries).

So, in brief: if Perl must touch pixels, it will be slow. If Perl touches images and C touches pixels, it's probably fine.
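To illustrate the distinction in pure Perl: the toy transform below (byte inversion, a stand-in for whatever your real per-pixel processing is) can be done pixel-by-pixel in the interpreter, or with a single string operation whose loop runs in C:

```perl
use strict;
use warnings;

# Toy "image": 4 pixels of packed 8-bit RGB data in one scalar.
my $pixels = pack 'C*', (10, 20, 30) x 4;

# Per-pixel in Perl (slow): every byte passes through the interpreter.
my @bytes = unpack 'C*', $pixels;
my $slow  = pack 'C*', map { 255 - $_ } @bytes;

# Whole-buffer in C (fast): one bitwise NOT over the packed string.
my $fast = ~$pixels;

print $slow eq $fast ? "same result\n" : "mismatch\n";
```

Both produce identical bytes; only the second keeps the inner loop out of Perl, which is the same trade-off you make when handing a whole image to a C library.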

Rex Kerr
+8  A: 

If you're using ImageMagick, or any other C-based processing tool, Perl will most certainly not be the bottleneck. The bottlenecks I could see (especially when processing thousands of files) would be:

  • Disk IO speeds
  • Memory access speeds
  • Library algorithm speed

Perl will make a great glue for doing what you want. The slow parts will still be slow. You might as well make the fast parts easy. :)

Also, remember the two Rules of Optimization:

  1. Don't do it.
  2. (For experts only:) Don't do it yet.

When you do get it put together, run a profiler on it before trying to optimize anything. When you reach that point, check out:

http://search.cpan.org/dist/Devel-NYTProf

Devel::NYTProf is pretty much the bee's knees when it comes to profiling tools. It'll show you exactly where your slowdowns are, so you don't just have a "warm fuzzy" feeling that you have it right...you'll know for sure.
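For reference, getting a profile out of Devel::NYTProf is a two-step affair (the script name below is just a placeholder for your own):

```shell
# Profile the run; this writes ./nytprof.out
perl -d:NYTProf process_images.pl

# Convert the raw profile into browsable HTML under ./nytprof/
nytprofhtml
```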

Robert P
+1  A: 

Yes, I expect the performance of a Perl implementation would be incredibly sucky at pixel-level image manipulation.

Yes, you could do it, but Perl's data structures don't lend themselves to this kind of thing. If you use a library that doesn't require one call per pixel, though, you'll be fine.

MarkR
I'd need to get the pixel at each X,Y location and then process it using Perl code ( simple byte extraction ).
Geo
You'd probably want to put the image into a raw bitmap in a single Perl scalar - then use slicing to get the appropriate bytes out. That would suck, but not as much as using Perl arrays. It'd still be slow for processing lots of big images - at least much slower than something like Java, which can properly optimise code on byte arrays (I imagine) - and that's assuming you have libraries which are efficient for loading / saving to PNG or whatever.
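A minimal sketch of that slicing approach, assuming a hypothetical 3x2 image already unpacked to raw 8-bit RGB in one scalar:

```perl
use strict;
use warnings;

# Hypothetical 3x2 image, 8-bit RGB, packed row-major into one scalar.
my ($width, $height, $bpp) = (3, 2, 3);
my $raw = pack 'C*', 1 .. $width * $height * $bpp;

# Fetch the pixel at (x, y) by slicing the scalar with substr,
# rather than holding one Perl array element per byte.
sub pixel_at {
    my ($x, $y) = @_;
    my $offset = ($y * $width + $x) * $bpp;
    return unpack 'C3', substr($raw, $offset, $bpp);
}

my ($r, $g, $b) = pixel_at(1, 0);   # second pixel of the first row
```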
MarkR