Mark,

thank you for a great email. It makes a bit more sense now. I am using YUV420, 
so I think I will be OK in terms of subsampling. I also understand that the Y 
pixels are stored in data[0], followed by the U and V planes, which are 
subsampled 2x2. One thing I am unclear on is how to build a new frame. Do I 
just allocate one and then assign each pixel from data[0] of the original 
frame to data[0] of the new frame, transposed according to the formulas 
y' = -x and x' = y to rotate the image? What do I do with U and V, i.e. 
data[1] and data[2]? Will it be the same translation?

For example:
int num;
int h, w, newh, neww;
uint8_t *buffer;
AVFrame *newframe;

newframe = avcodec_alloc_frame();
if (newframe == NULL)
        return;

num = avpicture_get_size(PIX_FMT_YUV420P, width, height);
buffer = (uint8_t *) av_malloc(num * sizeof(uint8_t));
avpicture_fill((AVPicture *) newframe, buffer, PIX_FMT_YUV420P, width, height);

for (h = 0; h < height; h++) {
        for (w = 0; w < width; w++) {
                newh = w * (-1);
                neww = h;
                *(newframe->data[0] + (newh * newframe->linesize[0]) + neww) =
                        *(origFrame->data[0] + (h * origFrame->linesize[0]) + w);
        }
}

Once the transpose is complete, I would pass the new frame to the encoder.

One thing I am not so sure about is whether my formula for the pixel 
transpose is correct. Can you provide corrections?
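In case it helps, here is a fuller sketch of what I think I am trying to do. 
I am assuming a 90-degree clockwise rotation, x' = h - 1 - y and y' = x 
where h is the plane height, which keeps the indices non-negative, unlike my 
formula above; I am also assuming the same mapping applies to the half-size 
chroma planes, and that newframe would have to be filled with avpicture_fill() 
using (height, width), i.e. dimensions swapped, since the rotation swaps 
width and height. Please correct me if any of that is wrong:

/* Sketch only: rotate all three planes of a YUV420P frame 90 degrees
   clockwise.  Assumes newframe was allocated with dimensions swapped
   relative to origFrame. */
int plane, pw, ph, x, y;

for (plane = 0; plane < 3; plane++) {
        /* plane 0 (Y) is full size; planes 1 and 2 (U, V) are 2x2 subsampled */
        pw = (plane == 0) ? width  : width  / 2;
        ph = (plane == 0) ? height : height / 2;

        for (y = 0; y < ph; y++) {
                for (x = 0; x < pw; x++) {
                        /* source (x, y) lands at column (ph - 1 - y), row x
                           in the rotated plane, which is ph wide and pw tall */
                        *(newframe->data[plane]
                                + x * newframe->linesize[plane]
                                + (ph - 1 - y)) =
                        *(origFrame->data[plane]
                                + y * origFrame->linesize[plane]
                                + x);
                }
        }
}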

Alex

On Nov 2, 2010, at 8:10 PM, Mark Heath wrote:

> 
> Hi Alex,
> 
> I do a lot of work creating filters for ffmpeg and other YUV-based video 
> libraries.
> 
> When doing a rotate, there are 2 gotchas you have to be aware of.
> 
> 1) Chroma subsampling.  Anything that has different vertical and horizontal 
> sampling will cause problems for rotation, e.g. chroma subsampling such as 
> 422, 411 and 410. (444 and 420 should be OK.)
> 
> 2) Interlaced video.  If your video is from an interlaced source you will 
> have a lot of headaches.
> 
> 
> The video data is pretty simply stored.
> 
> It's 3 planes of data: one for Y, one for U and one for V, which are 
> referenced in the AVFrame struct as data[0], data[1] and data[2].
> To find the pixel at X and Y in a plane you multiply the Y by the 
> linesize and add the X,
> 
> eg
> pixel = *(frame->data[0] + y * frame->linesize[0] + x);
> 
> Due to chroma subsampling the size of the U and V planes will be smaller 
> than the Y plane.  444 video has no chroma subsampling.
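> 
> For 420 the chroma coordinates are just the luma coordinates halved, so a 
> sketch of reading all three components of one pixel might look like this 
> (frame, x and y being placeholders for your own frame and coordinates):
> 
> uint8_t Y = *(frame->data[0] + y * frame->linesize[0] + x);
> uint8_t U = *(frame->data[1] + (y / 2) * frame->linesize[1] + (x / 2));
> uint8_t V = *(frame->data[2] + (y / 2) * frame->linesize[2] + (x / 2));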
> 
> Here's a quick table to illustrate how the subsampling affects the chroma 
> plane size, using a video size of 720x576.
> 
> 444 720x576
> 422 360x576
> 411 180x576
> 420 360x288
> 410 180x288
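> 
> Rather than hard-coding those sizes, you can ask libavcodec for the 
> subsampling shifts; something like this should work (from memory, so do 
> check it against avcodec.h):
> 
> int h_shift, v_shift;
> avcodec_get_chroma_sub_sample(PIX_FMT_YUV420P, &h_shift, &v_shift);
> int chroma_width  = width  >> h_shift;   /* 720 >> 1 = 360 for 420 */
> int chroma_height = height >> v_shift;   /* 576 >> 1 = 288 for 420 */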
> 
> Each chroma pixel (UV) refers to a number of luma pixels (Y).
> It'd be a good idea to learn how these pixels are arranged.
> 
> 
> Any questions, feel free to ask.
> I hope this helps.
> 
> Mark
> 
> 
> On 02/11/2010, at 8:30 AM, [email protected] wrote:
> 
>> sorry for the newbie questions....
>> 
>> I need to perform a 90 degree rotation of each AVFrame. I need to do it 
>> programmatically rather than using filters on the ffmpeg command line. I 
>> was trying to understand how to do it, but unfortunately there are not 
>> many details on the AVFrame.data and linesize elements. I am using the 
>> YUV420 format and I understand there will be only 3 references in data 
>> and linesize. That is as far as I got. Any information would be greatly 
>> appreciated.
>> 
>> Alex