I am interested in doing some image-hacking apps. To get a better sense of expected performance, can someone give me some idea of the overhead of touching each pixel at fullscreen resolution?

Typical use case: the user pulls a photo out of the Photo Album, selects a visual effect, and - unlike a Photoshop filter - gestural manipulation of the device drives the effect in real time.

I'm just looking for ballpark performance numbers here. Obviously, the more compute-intensive my effect, the more lag I can expect.

Cheers, Doug

A: 

One thing that will make a big difference is whether you're going to do this at device resolution or at the resolution of the photo itself. Typically, photos transferred from iTunes are scaled to 640x480 (twice as many pixels as the 320x480 screen). Pictures from the camera roll will be larger than that - up to 3 Mpix for 3GS photos.

I've only played around with this a little bit, but doing it the obvious way - i.e. a CGImage backed by an array in your code - you could see something in the range of 5-10 FPS. If you want something more responsive than that, you'll have to come up with a more creative solution. Maybe map the image as textures on a grid of points and render with OpenGL?
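A minimal sketch of that "obvious way", assuming an RGBA8888 buffer holding the decoded photo and a made-up gesture-driven parameter - the per-pixel loop is where the 5-10 FPS goes:

    #include <stdint.h>
    #include <CoreGraphics/CoreGraphics.h>

    // Sketch of the CPU path: a pixel buffer wrapped in a CGBitmapContext,
    // with every pixel touched each frame on the CPU.
    // `pixels` is assumed to already hold the photo in RGBA, 8 bits per channel;
    // `amount` is the gesture-driven parameter. (In a real app you would read
    // from an untouched source copy rather than modify the buffer in place.)
    CGImageRef renderEffectFrame(uint8_t *pixels, size_t width, size_t height, float amount)
    {
        size_t bytesPerRow = width * 4;   // 4 bytes per pixel: R, G, B, A

        // The per-pixel loop - this is what limits the frame rate.
        for (size_t y = 0; y < height; y++) {
            for (size_t x = 0; x < width; x++) {
                uint8_t *p = pixels + y * bytesPerRow + x * 4;
                // Toy effect: scale brightness by the gesture-driven amount.
                p[0] = (uint8_t)(p[0] * amount);
                p[1] = (uint8_t)(p[1] * amount);
                p[2] = (uint8_t)(p[2] * amount);
            }
        }

        // Wrap the buffer so Core Graphics / UIKit can draw it.
        CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, bytesPerRow,
                                                 space, kCGImageAlphaPremultipliedLast);
        CGImageRef image = CGBitmapContextCreateImage(ctx);
        CGContextRelease(ctx);
        CGColorSpaceRelease(space);
        return image;   // caller releases with CGImageRelease
    }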

Look up FaceGoo in the App Store. That's an example of an app that uses a straightforward OpenGL rendering loop to do something similar to what you're talking about.

Mark Bessey
I wouldn't dream of attempting this stuff at higher than fullscreen res. Cheers.
dugla
A: 

Not doable, not with the current APIs and a generic image filter. Currently you can only access the screen through OpenGL or higher abstractions, and OpenGL is not well suited to framebuffer operations. (Certainly not the OpenGL ES implementation on the iPhone.) If you change the image every frame you have to upload new textures, which is too expensive. In my opinion the only solution is to do the effects on the GPU, using OpenGL operations on the texture.
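For concreteness, the per-frame upload path looks roughly like this (OpenGL ES 1.1 calls; the texture size and pixel buffer are placeholders) - the full-size glTexSubImage2D copy every frame is what makes it too expensive:

    #include <OpenGLES/ES1/gl.h>

    // Rough sketch of the "re-upload every frame" path under discussion.
    // Assumes a 512x512 power-of-two texture was created once with glTexImage2D,
    // and `pixels` is an RGBA buffer the CPU-side effect rewrites each frame.
    void uploadFrame(GLuint texture, const void *pixels)
    {
        glBindTexture(GL_TEXTURE_2D, texture);
        // Full-size re-upload: this copy, repeated at 30 fps, is the expensive part.
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        // ... then draw a textured quad covering the screen ...
    }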

zoul
As in write a fragment shader?
dugla
I don't know much about shaders or about the newer OpenGL ES version on recent iPhones. I just think that it is not possible to push new textures every frame and sustain a decent framerate. You have to find a way around this - either get the data onto the screen without using a texture (I don't think that's possible) or process the data on the GPU (I don't know how exactly).
zoul
A fragment shader is analogous to a RenderMan shader: a (typically small) program run at each "fragment" of a model to compute the final color. So I would not actually swap textures but rather repeatedly - hopefully at 30 fps - manipulate the parameters to a given shader. Hmmm, I think I just answered my own question as to a strategy.
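A minimal sketch of that strategy under OpenGL ES 2.0 (the shader, the uniform name, and the toy grayscale effect are all just for illustration): the photo sits in a texture that never changes, and each frame only a uniform is updated from the gesture.

    #include <OpenGLES/ES2/gl.h>

    // Hypothetical fragment shader: the photo texture is uploaded once at setup;
    // only the gesture-driven uniform changes per frame.
    static const char *kFragmentShader =
        "precision mediump float;\n"
        "uniform sampler2D uPhoto;   // the photo, uploaded once\n"
        "uniform float uAmount;      // gesture-driven parameter, updated per frame\n"
        "varying vec2 vTexCoord;\n"
        "void main() {\n"
        "    vec4 c = texture2D(uPhoto, vTexCoord);\n"
        "    float g = dot(c.rgb, vec3(0.299, 0.587, 0.114));\n"
        "    // toy effect: fade toward grayscale as uAmount goes 0 -> 1\n"
        "    gl_FragColor = vec4(mix(c.rgb, vec3(g), uAmount), c.a);\n"
        "}\n";

    // Per-frame work: no texture upload at all - just one uniform and a redraw.
    void drawEffectFrame(GLuint program, GLfloat gestureAmount)
    {
        glUseProgram(program);
        glUniform1f(glGetUniformLocation(program, "uAmount"), gestureAmount);
        // ... draw the full-screen textured quad here ...
    }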
dugla
A: 

My answer is: just wait a little until they get rid of the OpenGL ES 1.x devices and finally bring Core Image over to the iPhone SDK.

With fragment shaders this is very doable on the newer devices.

zupamario
+2  A: 

You will need to know OpenGL well to do this. The iPhone OpenGL ES hardware has a distinct advantage over many desktop systems in that there is only one place for memory - so textures don't really need to be 'uploaded to the card'. There are ways to access the memory of a texture pretty much directly.

The 3GS has a much faster OpenGL stack than the 3G; you will need to try it on the 3GS or an equivalent iPod touch.

Also compile and run the GLImageProcessing example code.

Tom Andersen
Thanks Tom. Do you have a link to the GLImageProcessing code? Cheers.
dugla
Never mind. Got it.
dugla
A: 

I'm beginning to think the only way to pull this off is to write a suite of vertex/fragment shaders and do it all in OpenGL ES 2.0. I'd prefer not to incur the restriction of limiting the app to the iPhone 3GS, but I think that's the only viable way to go here.

I was really hoping there was some Core Graphics approach that would work, but that does not appear to be the case.

Thanks, Doug

dugla