Hi all,
I am decoding a real-time stream sent from another machine; everything works
flawlessly on the LAN. To test performance, I started simulating various
network characteristics with netem on the server side. The artifacts stay
within the bounds of expectation for corruption, duplication and packet loss,
but I get an unusable image (a white overlay over the whole frame) as soon as
I add the slightest amount of variance to the delay parameter.
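For reference, a fixed delay alone leaves the picture intact; it breaks the
moment jitter is added. The interface name and values here are only examples:

tc qdisc add dev eth0 root netem delay 100ms           # picture still fine
tc qdisc change dev eth0 root netem delay 100ms 10ms   # white overlay appears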
Can anyone point me in the right direction as to what might be causing this?
Are there certain flags I need to set on the codec context, maybe? The stream
is H.264 in an MPEG-TS container over UDP. I tried to optimize the codec
context for minimal delay, aiming at real-time encoding:
c->bit_rate      = 400000;
c->width         = M_WIDTH;
c->height        = M_HEIGHT;
c->time_base.num = 1;
c->time_base.den = 15;                 /* 15 fps */
c->gop_size      = 15;                 /* one keyframe per second */
c->pix_fmt       = AV_PIX_FMT_YUV420P;
c->codec         = *codec;
c->codec_type    = AVMEDIA_TYPE_VIDEO;

/* latency-oriented settings */
c->coder_type    = FF_CODER_TYPE_VLC;  /* CAVLC entropy coding */
c->delay         = 0;
c->max_b_frames  = 0;
c->thread_count  = 1;

/* x264 private options */
av_opt_set(c->priv_data, "preset", "ultrafast", 0);
av_opt_set(c->priv_data, "tune", "zerolatency", 0);
av_opt_set_int(c->priv_data, "intra-refresh", 1, 0);
av_opt_set_int(c->priv_data, "slice-max-size", 1200, 0);
av_opt_set_int(c->priv_data, "vbv-maxrate", 3000, 0);
av_opt_set_int(c->priv_data, "vbv-bufsize", 200, 0);
av_opt_set_int(c->priv_data, "crf", 23, 0);
av_opt_set(c->priv_data, "x264opts",
           "no-mbtree:sliced-threads:sync-lookahead=0", 0);
Any suggestions on what might be causing this, as well as improvements to the
settings, are welcome.
Best Regards,
Janis