Hi. I am trying to encode a series of images into one video file. I am using the code from api-example.c; it works, but it gives me weird green colors in the video. I know I need to convert my RGB images to YUV, and I found a solution, but it doesn't work either: the colors are no longer green, just very strange. Here is the code:

    // Register all formats and codecs
    av_register_all();

    AVCodec *codec;
    AVCodecContext *c= NULL;
    int i, out_size, size, outbuf_size;
    FILE *f;
    AVFrame *picture;
    uint8_t *outbuf;

    printf("Video encoding\n");

    /* find the mpeg video encoder */
    codec = avcodec_find_encoder(CODEC_ID_MPEG2VIDEO);
    if (!codec) {
        fprintf(stderr, "codec not found\n");
        exit(1);
    }

    c= avcodec_alloc_context();
    picture= avcodec_alloc_frame();

    /* put sample parameters */
    c->bit_rate = 400000;
    /* resolution must be a multiple of two */
    c->width = 352;
    c->height = 288;
    /* frames per second */
    c->time_base= (AVRational){1,25};
    c->gop_size = 10; /* emit one intra frame every ten frames */
    c->max_b_frames=1;
    c->pix_fmt = PIX_FMT_YUV420P;

    /* open it */
    if (avcodec_open(c, codec) < 0) {
        fprintf(stderr, "could not open codec\n");
        exit(1);
    }

    f = fopen(filename, "wb");
    if (!f) {
        fprintf(stderr, "could not open %s\n", filename);
        exit(1);
    }

    /* alloc image and output buffer */
    outbuf_size = 100000;
    outbuf = malloc(outbuf_size);
    size = c->width * c->height;

#pragma mark -
    AVFrame* outpic = avcodec_alloc_frame();
    int nbytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);

    //create buffer for the output image
    uint8_t* outbuffer = (uint8_t*)av_malloc(nbytes);

#pragma mark -  
    for(i=1;i<77;i++) {
        fflush(stdout);

        UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"10%d", i]];
        CGImageRef newCgImage = [image CGImage];

        CGDataProviderRef dataProvider = CGImageGetDataProvider(newCgImage);
        CFDataRef bitmapData = CGDataProviderCopyData(dataProvider);
        // point directly at the pixel bytes owned by bitmapData
        uint8_t *buffer = (uint8_t *)CFDataGetBytePtr(bitmapData);

        avpicture_fill((AVPicture*)picture, buffer, PIX_FMT_RGB8, c->width, c->height);
        avpicture_fill((AVPicture*)outpic, outbuffer, PIX_FMT_YUV420P, c->width, c->height);

        struct SwsContext* fooContext = sws_getContext(c->width, c->height, 
                                                PIX_FMT_RGB8, 
                                                c->width, c->height, 
                                                PIX_FMT_YUV420P, 
                                                SWS_FAST_BILINEAR, NULL, NULL, NULL);

        // perform the conversion; this is where I try to convert to YUV
        sws_scale(fooContext, picture->data, picture->linesize, 0, c->height, outpic->data, outpic->linesize);
        sws_freeContext(fooContext);

        /* encode the image */
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, outpic);
        printf("encoding frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, out_size, f);

        // release the CFData that owns the pixel bytes
        CFRelease(bitmapData);

    }

    /* get the delayed frames */
    for(; out_size; i++) {
        fflush(stdout);

        out_size = avcodec_encode_video(c, outbuf, outbuf_size, NULL);
        printf("write frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, out_size, f);      
    }

    /* add sequence end code to have a real mpeg file */
    outbuf[0] = 0x00;
    outbuf[1] = 0x00;
    outbuf[2] = 0x01;
    outbuf[3] = 0xb7;
    fwrite(outbuf, 1, 4, f);
    fclose(f);
    free(outbuf);

    avcodec_close(c);
    av_free(c);
    av_free(picture);
    av_free(outpic);
    av_free(outbuffer);
    printf("\n");

Please give me some advice on how to fix this problem.

A: 

I think the problem is most likely that you are using PIX_FMT_RGB8 as your input pixel format. This does not mean 8 bits per channel like the commonly used 24-bit RGB or 32-bit ARGB. It means 8 bits per pixel, meaning that all three color channels are housed in a single byte. I am guessing that this is not the format of your image since it is quite uncommon, so you need to use PIX_FMT_RGB24 or PIX_FMT_RGB32 depending on whether or not your input image has an alpha channel. See this documentation page for info on the pixel formats.
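
Roughly, the conversion part of your loop would then look something like this. This is only a sketch: it assumes the decoded CGImage is packed 24-bit RGB or 32-bit RGB(A) with no row padding, and the names srcFmt and ctx are mine, not from your code.

    enum PixelFormat srcFmt;
    switch (CGImageGetBitsPerPixel(newCgImage)) {
    case 24: srcFmt = PIX_FMT_RGB24; break;  /* 8 bits per channel, no alpha   */
    case 32: srcFmt = PIX_FMT_RGB32; break;  /* 8 bits per channel, with alpha */
    default: srcFmt = PIX_FMT_NONE;  break;  /* unexpected layout              */
    }

    avpicture_fill((AVPicture *)picture, buffer, srcFmt, c->width, c->height);
    avpicture_fill((AVPicture *)outpic, outbuffer, PIX_FMT_YUV420P, c->width, c->height);

    struct SwsContext *ctx = sws_getContext(c->width, c->height, srcFmt,
                                            c->width, c->height, PIX_FMT_YUV420P,
                                            SWS_FAST_BILINEAR, NULL, NULL, NULL);
    sws_scale(ctx, picture->data, picture->linesize, 0, c->height,
              outpic->data, outpic->linesize);
    sws_freeContext(ctx);

If the bitmap does have row padding, set picture->linesize[0] to CGImageGetBytesPerRow(newCgImage) after avpicture_fill, otherwise the rows will be sheared. Also keep in mind that the 32-bit formats (PIX_FMT_RGB32, PIX_FMT_BGR32, and friends) are endianness-dependent macros, so the right one depends on the byte order of your bitmap.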

Jason
Thanks. I tried PIX_FMT_RGB24 and PIX_FMT_RGB32 and many others. I used PIX_FMT_RGB8 because my images were optimized for the network: they are indexed 8-bit color PNGs, with other Photoshop tricks to make them smaller. But I also tried other images in the usual formats, just to check, with different PIX_FMT options, and it still gives the wrong result, so I think the trouble is in the code.
Steve
Did you try PIX_FMT_PAL8? That format is for 8-bit indexed color. With PIX_FMT_PAL8, a color palette must be sent along with the frame. That documentation page I linked to talks about it right before defining all of the pixel formats. It says that AVFrame.data[0] contains indexes into the palette for each pixel while AVFrame.data[1] contains a 1024 byte array defining the actual ARGB color values for each 8-bit index.
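If you want to try it, something along these lines should be close. Again, just a sketch: it assumes the PNG really decodes to an indexed color space with an RGB base (so CGColorSpaceGetColorTable returns 3 bytes per entry), and pal, rgb and palCtx are illustrative names.

    /* build the 256-entry ARGB palette that swscale expects in data[1] */
    uint32_t pal[256] = {0};
    uint8_t  rgb[256 * 3];
    CGColorSpaceRef cs = CGImageGetColorSpace(newCgImage);
    size_t n, count = CGColorSpaceGetColorTableCount(cs);   /* 0 if not indexed */
    CGColorSpaceGetColorTable(cs, rgb);
    for (n = 0; n < count && n < 256; n++)
        pal[n] = (0xFFu << 24) | (rgb[n*3] << 16) | (rgb[n*3+1] << 8) | rgb[n*3+2];

    picture->data[0]     = (uint8_t *)CFDataGetBytePtr(bitmapData);  /* 8-bit palette indexes */
    picture->data[1]     = (uint8_t *)pal;                           /* 1024-byte color table */
    picture->linesize[0] = (int)CGImageGetBytesPerRow(newCgImage);

    struct SwsContext *palCtx = sws_getContext(c->width, c->height, PIX_FMT_PAL8,
                                               c->width, c->height, PIX_FMT_YUV420P,
                                               SWS_FAST_BILINEAR, NULL, NULL, NULL);
    sws_scale(palCtx, picture->data, picture->linesize, 0, c->height,
              outpic->data, outpic->linesize);
    sws_freeContext(palCtx);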
Jason
Thanks, I'll try it.
Steve
A: 

You can see the article at http://unick-soft.ru/Articles.cgi?id=20. It is in Russian, but it includes code samples and a Visual Studio example.

Unick
A: 

Hi Steve, I have the same issue. I think you might be able to help me solve this. Can anybody guide me, please?

Sat
Try 32-bit color pictures.
Steve
A: 

PIX_FMT_BGR32 fixed the problem. Thanks for the code.
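
For anyone hitting the same thing, the change is just to use PIX_FMT_BGR32 wherever the source format appears, roughly like this (an untested sketch; it assumes your decoded CGImage's byte order is the one PIX_FMT_BGR32 selects on your device, since the 32-bit formats are endianness-dependent macros):

    avpicture_fill((AVPicture *)picture, buffer, PIX_FMT_BGR32, c->width, c->height);

    struct SwsContext *fooContext = sws_getContext(c->width, c->height, PIX_FMT_BGR32,
                                                   c->width, c->height, PIX_FMT_YUV420P,
                                                   SWS_FAST_BILINEAR, NULL, NULL, NULL);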

drag