views: 3221

answers: 5

I am building up a view with various text and image elements.

I want to display some text in the view with a blurry copy of the text behind it, but not just a text shadow.

How do I apply Gaussian-blurred text onto a UIImage or layer?

+1  A: 

On the desktop, no question, you'd use CoreImage to do this.

On the phone, though, I don't think there is a way to do this using CoreGraphics. If it's absolutely critical, OpenGL ES may be able to help.

However, I would suggest rethinking your interface. I would think the blurred text would be distracting.

Edit: mledford points out in the comments that you could use Core Animation. I don't know whether Core Animation on the phone supports a blur radius the way it does on the desktop, but you could try it.

Colin Barrett
Colin, couldn't you also use Core Animation layers to build this up? Not that I recommend it... by creating a layer with a blur filter and then compositing another layer on top?
Michael Ledford
I should have checked. :-( An excerpt from the documentation for the CALayer filters property: "Special Considerations: While the CALayer class exposes this property, Core Image is not available in iPhone OS. Currently the filters available for this property are undefined."
Michael Ledford
Thanks Michael for the comments.
Chris Samuels
+1  A: 

iPhone OS doesn't provide any Core Image filters that I know of - otherwise, yes, a filtered CALayer would be the right way to do it. If NSBitmapImageRep were available, you could do a primitive blur by drawing the text to it, shrinking the image (downsampling), then enlarging the image again (upsampling) - unfortunately it seems to be missing as well. I've seen blurred text accomplished in Flash, which (last I checked) doesn't have pixel-level filtering; you might try looking for a tutorial on that and seeing what you can adapt to Cocoa Touch.
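
The same shrink-then-enlarge trick can be done without NSBitmapImageRep by using plain CoreGraphics bitmap contexts, which are available on iPhone OS. A minimal sketch, assuming you already have a CGImageRef of the rendered text (the function name and the 4x shrink factor are arbitrary choices, not part of any Apple API):

#include <CoreGraphics/CoreGraphics.h>

// Crude blur: draw the text into a small bitmap, then stretch it back up.
// The interpolation on the way back up smears the pixels into a soft copy.
static CGImageRef CreateBlurredCopy(CGImageRef textImage)
{
    size_t w = CGImageGetWidth(textImage);
    size_t h = CGImageGetHeight(textImage);
    size_t smallW = (w / 4 > 0) ? w / 4 : 1;
    size_t smallH = (h / 4 > 0) ? h / 4 : 1;

    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();

    // Downsample: render the text image into a much smaller context.
    CGContextRef small = CGBitmapContextCreate(NULL, smallW, smallH, 8,
                                               smallW * 4, space,
                                               kCGImageAlphaPremultipliedLast);
    CGContextSetInterpolationQuality(small, kCGInterpolationHigh);
    CGContextDrawImage(small, CGRectMake(0, 0, smallW, smallH), textImage);
    CGImageRef shrunk = CGBitmapContextCreateImage(small);

    // Upsample: draw the small image back at the original size.
    CGContextRef big = CGBitmapContextCreate(NULL, w, h, 8, w * 4, space,
                                             kCGImageAlphaPremultipliedLast);
    CGContextSetInterpolationQuality(big, kCGInterpolationHigh);
    CGContextDrawImage(big, CGRectMake(0, 0, w, h), shrunk);
    CGImageRef blurred = CGBitmapContextCreateImage(big);

    CGImageRelease(shrunk);
    CGContextRelease(small);
    CGContextRelease(big);
    CGColorSpaceRelease(space);
    return blurred; // caller releases
}

Draw the result underneath the sharp text (or set it as a layer's contents) to get the blurry-copy-behind effect.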

Noah Witherspoon
A: 

You will take a performance hit if you use alpha layers. Consider a different approach if possible (maybe even precompositing the text and flattening it into a graphic instead of multiple layers).
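
For instance, rather than keeping a separate blurred layer and sharp layer alive in the view hierarchy, you can bake the two into a single CGImage once and hand that to one layer. A rough sketch, assuming you already have CGImageRefs for the blurred and crisp renderings of the text (the function and parameter names are placeholders):

#include <CoreGraphics/CoreGraphics.h>

// Composite the blurry copy and the crisp text into one bitmap up front,
// so only a single flattened image gets composited at runtime.
static CGImageRef CreateFlattenedText(CGImageRef blurredText, CGImageRef sharpText)
{
    size_t w = CGImageGetWidth(sharpText);
    size_t h = CGImageGetHeight(sharpText);
    CGRect rect = CGRectMake(0, 0, w, h);

    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, w, h, 8, w * 4, space,
                                             kCGImageAlphaPremultipliedLast);

    CGContextDrawImage(ctx, rect, blurredText); // blurry copy underneath
    CGContextDrawImage(ctx, rect, sharpText);   // crisp text on top

    CGImageRef flattened = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    CGColorSpaceRelease(space);
    return flattened; // set as a layer's contents; caller releases
}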

Try it, and use Instruments to check out the performance and see if it's acceptable. If you're doing it in a scrolling view, your scrolling will bog down a lot.

Joe McMahon
+3  A: 

Take a look at Apple's GLImageProcessing iPhone sample. It does some blurring, among other things.

The relevant code includes (V2fT2f, Input, Half, and validateTexEnv() are defined elsewhere in the sample):

static void blur(V2fT2f *quad, float t) // t = 1
{
    GLint tex;
    V2fT2f tmpquad[4];
    float offw = t / Input.wide;
    float offh = t / Input.high;
    int i;

    glGetIntegerv(GL_TEXTURE_BINDING_2D, &tex);

    // Three pass small blur, using rotated pattern to sample 17 texels:
    //
    // .\/.. 
    // ./\\/ 
    // \/X/\   rotated samples filter across texel corners
    // /\\/. 
    // ../\. 

    // Pass one: center nearest sample
    glVertexPointer  (2, GL_FLOAT, sizeof(V2fT2f), &quad[0].x);
    glTexCoordPointer(2, GL_FLOAT, sizeof(V2fT2f), &quad[0].s);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glColor4f(1.0/5, 1.0/5, 1.0/5, 1.0);
    validateTexEnv();
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Pass two: accumulate two rotated linear samples
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);
    for (i = 0; i < 4; i++)
    {
        tmpquad[i].x = quad[i].s + 1.5 * offw;
        tmpquad[i].y = quad[i].t + 0.5 * offh;
        tmpquad[i].s = quad[i].s - 1.5 * offw;
        tmpquad[i].t = quad[i].t - 0.5 * offh;
    }
    glTexCoordPointer(2, GL_FLOAT, sizeof(V2fT2f), &tmpquad[0].x);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
    glActiveTexture(GL_TEXTURE1);
    glEnable(GL_TEXTURE_2D);
    glClientActiveTexture(GL_TEXTURE1);
    glTexCoordPointer(2, GL_FLOAT, sizeof(V2fT2f), &tmpquad[0].s);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB,      GL_INTERPOLATE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB,         GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB,         GL_PREVIOUS);
    glTexEnvi(GL_TEXTURE_ENV, GL_SRC2_RGB,         GL_PRIMARY_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_RGB,     GL_SRC_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA,    GL_REPLACE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA,       GL_PRIMARY_COLOR);

    glColor4f(0.5, 0.5, 0.5, 2.0/5);
    validateTexEnv();
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Pass three: accumulate two rotated linear samples
    for (i = 0; i < 4; i++)
    {
        tmpquad[i].x = quad[i].s - 0.5 * offw;
        tmpquad[i].y = quad[i].t + 1.5 * offh;
        tmpquad[i].s = quad[i].s + 0.5 * offw;
        tmpquad[i].t = quad[i].t - 1.5 * offh;
    }
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Restore state
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glClientActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, Half.texID);
    glDisable(GL_TEXTURE_2D);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_RGB,     GL_SRC_ALPHA);
    glActiveTexture(GL_TEXTURE0);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glDisable(GL_BLEND);
}
mahboudz
Is it correct to assume that it is possible to use OpenGL ES in a simple Cocoa Touch app without having an OpenGL ES view? Could I use OpenGL ES just for the blur filtering and write the output into a UIImage?
HelloMoon
No, I seem to recall a bit of EAGLView and glview peppered through the code. I'm experimenting with code I found at the following URL, which produces much nicer blurring than the code above: http://incubator.quasimondo.com/processing/fast_blur_deluxe.php
mahboudz
It's not fast, but if you only need it for blurring UILabels or text, it might work rather well.
mahboudz
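
For anyone following that link: the idea is simply a CPU-side blur over the raw pixel data, for example the backing buffer of a CGBitmapContext the text was drawn into. The sketch below is not Quasimondo's stack blur, just a naive separable box blur for illustration (the function name and the RGBA8 pixel layout are assumptions); running it two or three times gives a reasonable approximation of a Gaussian.

#include <stdlib.h>

// In-place box blur of an RGBA8 buffer: average each pixel with its
// neighbours within `radius`, first along rows, then along columns.
static void boxBlurRGBA(unsigned char *pixels, int width, int height, int radius)
{
    unsigned char *tmp = malloc((size_t)width * height * 4);

    // Horizontal pass: pixels -> tmp
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int sum[4] = {0, 0, 0, 0}, count = 0;
            for (int k = -radius; k <= radius; k++) {
                int xx = x + k;
                if (xx < 0 || xx >= width) continue;
                const unsigned char *p = pixels + 4 * (y * width + xx);
                for (int c = 0; c < 4; c++) sum[c] += p[c];
                count++;
            }
            unsigned char *q = tmp + 4 * (y * width + x);
            for (int c = 0; c < 4; c++) q[c] = (unsigned char)(sum[c] / count);
        }
    }

    // Vertical pass: tmp -> pixels
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int sum[4] = {0, 0, 0, 0}, count = 0;
            for (int k = -radius; k <= radius; k++) {
                int yy = y + k;
                if (yy < 0 || yy >= height) continue;
                const unsigned char *p = tmp + 4 * (yy * width + x);
                for (int c = 0; c < 4; c++) sum[c] += p[c];
                count++;
            }
            unsigned char *q = pixels + 4 * (y * width + x);
            for (int c = 0; c < 4; c++) q[c] = (unsigned char)(sum[c] / count);
        }
    }

    free(tmp);
}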
A: 

Did anyone come up with a way to do this?

codecowboy
Please don't post an answer that doesn't answer the question.
Peter Hosey