views: 1394

answers: 3

I've implemented a simple application which shows the camera picture on the screen. What I'd like to do now is grab a single frame and process it as a bitmap. From what I've found out so far, this is not an easy thing to do.

I've tried the onPreviewFrame method, which gives you the current frame as a byte array, and tried to decode it with the BitmapFactory class, but it returns null. The format of the frame is headerless YUV, which could be translated to a bitmap, but that takes too long on a phone. I've also read that the onPreviewFrame method has constraints on its runtime; if it takes too long, the application could crash.
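
The failing attempt looks roughly like this (a minimal sketch with placeholder names, not the original code):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.hardware.Camera;

public class PreviewGrabber {
    // The preview frame arrives as raw, headerless YUV (NV21 by default),
    // not a compressed image, so BitmapFactory has nothing to parse and
    // decodeByteArray() returns null.
    public static void register(Camera camera) {
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                Bitmap frame = BitmapFactory.decodeByteArray(data, 0, data.length);
                // frame is null here
            }
        });
    }
}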

So what is the right way to do this?

A: 

Using ffmpeg we can do it in native code.

Vinay
+2  A: 

OK, what we ended up doing is using the onPreviewFrame method and decoding the data in a separate thread, using a method that can be found in the Android developers group (a rough sketch of the threading setup follows after the link below):

// argb8888 is an int[width * height] output buffer for the decoded pixels
int[] argb8888 = new int[camSize.width * camSize.height];
decodeYUV(argb8888, data, camSize.width, camSize.height);
Bitmap bitmap = Bitmap.createBitmap(argb8888, camSize.width,
                    camSize.height, Config.ARGB_8888);

...

// decode Y, U, and V values on the YUV 420 buffer described as YCbCr_422_SP by Android 
// David Manpearl 081201 
public void decodeYUV(int[] out, byte[] fg, int width, int height)
        throws NullPointerException, IllegalArgumentException {
    int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer out is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer out size " + out.length
                + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz * 3 / 2)
        throw new IllegalArgumentException("buffer fg size " + fg.length
                + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
        for (i = 0; i < width; i++) {
            Y = fg[pixPtr];
            if (Y < 0)
                Y += 255;
            if ((i & 0x1) != 1) {
                final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
                Cb = fg[cOff];
                if (Cb < 0)
                    Cb += 127;
                else
                    Cb -= 128;
                Cr = fg[cOff + 1];
                if (Cr < 0)
                    Cr += 127;
                else
                    Cr -= 128;
            }
            int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if (R < 0)
                R = 0;
            else if (R > 255)
                R = 255;
            int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1)
                    + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if (G < 0)
                G = 0;
            else if (G > 255)
                G = 255;
            int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if (B < 0)
                B = 0;
            else if (B > 255)
                B = 255;
            out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }

}

Link: http://groups.google.com/group/android-developers/browse_thread/thread/c85e829ab209ceea/3f180a16a4872b58?lnk=gst&q=onpreviewframe#3f180a16a4872b58
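
For context, a rough sketch of how the callback and the decoding thread can be wired together (the class and field names here are illustrative, not from the thread above):

import android.graphics.Bitmap;
import android.graphics.Bitmap.Config;
import android.hardware.Camera;
import android.hardware.Camera.Size;

public class PreviewDecoder implements Camera.PreviewCallback {
    private final Size camSize;
    private final int[] argb8888;

    public PreviewDecoder(Size camSize) {
        this.camSize = camSize;
        this.argb8888 = new int[camSize.width * camSize.height];
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Copy the buffer and return quickly; the heavy YUV -> RGB conversion
        // runs on a separate thread so the callback itself stays cheap.
        final byte[] frame = data.clone();
        new Thread(new Runnable() {
            @Override
            public void run() {
                decodeYUV(argb8888, frame, camSize.width, camSize.height);
                Bitmap bitmap = Bitmap.createBitmap(argb8888, camSize.width,
                        camSize.height, Config.ARGB_8888);
                // ... process the bitmap ...
            }
        }, "preview-decoder").start();
    }

    // Paste the decodeYUV() method from above here.
    public void decodeYUV(int[] out, byte[] fg, int width, int height) {
        // ...
    }
}

Spawning a new thread per frame and sharing one output buffer is only for illustration; in practice you would reuse a single worker thread (or a Handler) and skip frames that arrive while a conversion is still running.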

Alexander Stolz
A: 

I actually tried the code given in the previous answer and found that the color values are not exact. I checked by comparing the preview frame with camera.takePicture(), which directly returns a JPEG array, and the colors were very different. After a little more searching I found another example that converts the preview image from YCrCb to RGB:

public static void decodeYUV420SP(int[] rgb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;

    for (int j = 0, yp = 0; j < height; j++) {
        // uvp points into the interleaved V/U plane that follows the Y plane
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            // one V/U pair is shared by two horizontally adjacent pixels
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            // fixed-point YUV -> RGB conversion (coefficients scaled by 1024)
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);

            // clamp to the 18-bit intermediate range before packing
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;

            // pack into ARGB_8888
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}

The color values given by this method and by takePicture() match exactly, so I thought I should post it here. This is where I got the code from. Hope this helps.
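
For completeness, a usage sketch along the same lines as the snippet in the previous answer (previewSize and data are illustrative names for the preview size and the onPreviewFrame byte array):

// Inside onPreviewFrame(byte[] data, Camera camera), with
// previewSize = camera.getParameters().getPreviewSize():
int[] rgb = new int[previewSize.width * previewSize.height];
decodeYUV420SP(rgb, data, previewSize.width, previewSize.height);
Bitmap bitmap = Bitmap.createBitmap(rgb, previewSize.width,
        previewSize.height, Bitmap.Config.ARGB_8888);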

Codevalley