> Losing the alpha channel is not ideal. I can do that by encoding both
> images to JPG and blending the two together, but the top image is
> mostly blank, so that just results in a dark background. Sending the
> background, overlay, and a separate grayscale alpha mask for the
> overlay, all as JPGs, is another option, but that increases image
> encoding time and bandwidth requirements.
I can’t say for sure since I don’t know what your source is, but there probably 
was no alpha channel to begin with. If the top image is blank, maybe you want to 
switch the order and/or use a different blend mode? As you say, generating a 
mask and applying it adds encoding and bandwidth overhead, which is probably too 
much if 10 frames of delay is already unacceptable.
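(To illustrate what "applying a separate grayscale mask" amounts to, here's a rough Python sketch of the per-pixel math. The function names are made up for illustration; real code would do this per channel on whole frames, with numpy or an ffmpeg filter like alphamerge rather than a Python loop.)

```python
# Sketch of linear alpha compositing with a separate grayscale mask.
# Pixel values are 0-255; the mask is normalized to 0.0-1.0.
# Names here are illustrative, not from any particular library.

def blend_pixel(bg, fg, mask):
    """Blend one channel value: mask=255 -> overlay wins, mask=0 -> background wins."""
    a = mask / 255.0
    return round(a * fg + (1.0 - a) * bg)

def composite(bg_row, fg_row, mask_row):
    """Blend one row of single-channel pixels."""
    return [blend_pixel(b, f, m) for b, f, m in zip(bg_row, fg_row, mask_row)]

# Opaque, half-transparent, and fully transparent mask values:
print(composite([10, 10, 10], [200, 200, 200], [255, 128, 0]))  # → [200, 105, 10]
```

The cost the original poster describes is that this needs a third encoded image (the mask) per frame on the wire.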
 
> Also, it looks like I made a mistake when testing a PNG-only stream. I'm
> now seeing the same 10+ frame latency with a single PNG input as with
> JPG background and PNG overlay, which actually makes me feel a little
> better since that eliminates the filter graph as the source of the
> delay (unless it's the RGB -> YUV conversion?). I think it has to be
> the PNG decoder.

How exactly is this latency being measured, by the way? I thought it was 
between the two streams of images, but I guess not? Are you sure it’s not the 
network or the image generator taking longer?

As long as you have a PNG in there, the slower speed is unavoidable because of 
the conversion to RGB. The PNG stream is probably also a lot bigger, so if you 
have to transfer it over a network to process (back into YUV, it sounds like), 
you should probably just keep everything in YUV. Anything you do in RGB should 
be doable in YUV too with a little math, as long as you’re not rotoscoping 
things out or something.
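(Rough illustration of the "little math" point, in Python: full-range BT.601 RGB<->YUV is an affine transform, so any linear operation like alpha blending gives the same answer whether you do it on RGB or on YUV values. The coefficients below are the usual JFIF/BT.601 full-range ones, rounded to three places.)

```python
# Full-range BT.601 (JFIF-style) RGB -> YUV, coefficients rounded.
def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128
    return (y, u, v)

def blend(p, q, alpha):
    """Linear alpha blend of two pixel tuples."""
    return tuple(alpha * x + (1 - alpha) * y for x, y in zip(p, q))

rgb1, rgb2, alpha = (250, 20, 30), (10, 200, 60), 0.25

# Blend in RGB, then convert...
via_rgb = rgb_to_yuv(*blend(rgb1, rgb2, alpha))
# ...versus convert first, then blend in YUV.
via_yuv = blend(rgb_to_yuv(*rgb1), rgb_to_yuv(*rgb2), alpha)

# The two paths agree (up to float rounding) because the transform is
# affine: the 128 chroma offsets cancel when the blend weights sum to 1.
assert all(abs(p - q) < 1e-9 for p, q in zip(via_rgb, via_yuv))
```

So a blend/overlay stage can stay in YUV and skip the round trip entirely; it's only mask generation or color-keying logic written in RGB terms that needs rethinking.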
_______________________________________________
ffmpeg-user mailing list
[email protected]
https://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
[email protected] with subject "unsubscribe".
