On 03/18/2012 07:12 PM, Ronald S. Bultje wrote:
> @@ -960,19 +887,17 @@ static void ipvideo_decode_opcodes(IpvideoContext *s)
> av_dlog(NULL, "------------------ frame %d\n", frame);
> frame++;
>
> + bytestream2_skip(&s->stream_ptr, 14); /* data starts 14 bytes in */
> if (!s->is_16bpp) {
> /* this is PAL8, so make the palette available */
> memcpy(s->current_frame.data[1], s->pal, AVPALETTE_SIZE);
>
> s->stride = s->current_frame.linesize[0];
> - s->stream_ptr = s->buf + 14; /* data starts 14 bytes in */
> - s->stream_end = s->buf + s->size;
> } else {
> s->stride = s->current_frame.linesize[0] >> 1;
> - s->stream_ptr = s->buf + 16;
> - s->stream_end =
> - s->mv_ptr = s->buf + 14 + AV_RL16(s->buf+14);
> - s->mv_end = s->buf + s->size;
> + s->mv_ptr = s->stream_ptr;
> + bytestream2_skip(&s->mv_ptr, bytestream2_get_le16(&s->stream_ptr));
> + s->stream_ptr.buffer_end = s->mv_ptr.buffer;
Writing to s->stream_ptr.buffer_end directly in that last line looks kind of
odd. Why do you need to reach into the internal GetByteContext fields like
that?
-Justin
_______________________________________________
libav-devel mailing list
[email protected]
https://lists.libav.org/mailman/listinfo/libav-devel