Hello,
my goal is to play/record audio data and encode/decode it using MediaCodec
and AAC.
I worked my way through the MediaCodec documentation (which I find rather
poor) and am now able to encode and decode audio.
Currently I'm writing the encoded data to a file. The problem is that
there
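The message is cut off here, but a common pitfall when writing raw MediaCodec AAC output straight to a file is that the frames carry no ADTS headers, so standard players cannot parse the result. A minimal sketch of building such a header, assuming AAC-LC at 44.1 kHz stereo (the class and method names are illustrative, not from the original post):

```java
public class AdtsHeader {
    // Build the 7-byte ADTS header that must precede each raw AAC frame
    // so a plain .aac file becomes parseable by standard players.
    // Assumes AAC-LC (profile 2), 44100 Hz (frequency index 4), stereo.
    static byte[] makeHeader(int rawFrameLength) {
        int profile = 2;                   // AAC LC; ADTS stores profile - 1
        int freqIdx = 4;                   // 44100 Hz
        int chanCfg = 2;                   // stereo
        int frameLen = rawFrameLength + 7; // length field includes the header
        byte[] h = new byte[7];
        h[0] = (byte) 0xFF;                // syncword 0xFFF, continued in h[1]
        h[1] = (byte) 0xF1;                // MPEG-4, layer 0, no CRC
        h[2] = (byte) (((profile - 1) << 6) | (freqIdx << 2) | (chanCfg >> 2));
        h[3] = (byte) (((chanCfg & 3) << 6) | (frameLen >> 11));
        h[4] = (byte) ((frameLen >> 3) & 0xFF);
        h[5] = (byte) (((frameLen & 7) << 5) | 0x1F);
        h[6] = (byte) 0xFC;
        return h;
    }
}
```

Writing this header before every encoded buffer that MediaCodec emits yields a playable ADTS stream.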
Dear all,
I am using the new MediaCodec API to successfully decode an H.264 stream
from a live source. The decoder outputs the decoded data as YUV420 planar,
but I would like to have RGB data. Since a YUV-to-RGB conversion would
introduce additional overhead in my application, we would like to tell the
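The message is truncated here, but for context, the per-pixel conversion the poster is trying to avoid looks roughly like the following (standard BT.601 integer coefficients for video-range YUV; the class and method names are illustrative):

```java
public class Yuv2Rgb {
    // Convert one video-range YUV pixel (BT.601) to packed 0xRRGGBB.
    // Shown only to illustrate the per-pixel cost of a CPU-side
    // conversion of the decoder's YUV420 planar output.
    static int yuvToRgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return (r << 16) | (g << 8) | b;
    }

    static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }
}
```

Running this over every pixel of every frame is exactly the overhead the poster wants to push off the CPU.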
Dear all,
we are trying to use the new MediaCodec API, following the example at
http://developer.android.com/reference/android/media/MediaCodec.html, to
decode live H.264 video. We queue individual H.264 frames to the decoder
and expect to receive decoded data after dequeueing the output
Hello, I'm working on a video player using the latest MediaCodec API. I have
the audio and video tracks playing in separate threads. The problem is that
the video is rendered too fast, so the audio and video tracks are not
synchronized. How can I achieve synchronization, and what could cause the
video to play too fast?
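The usual approach is to treat the audio clock as the master and delay each video frame until its presentation timestamp is due; rendering frames as fast as the decoder produces them is what makes video run ahead. A minimal sketch of the timing decision, with timestamps in microseconds as in MediaCodec's `BufferInfo.presentationTimeUs` (the class and method names are illustrative):

```java
public class AvSync {
    // Decide how long (in ms) the video thread should wait before
    // rendering the next frame, slaving video to the audio clock.
    // videoPtsUs:   presentation timestamp of the decoded video frame.
    // audioClockUs: timestamp of the audio sample currently playing.
    static long renderDelayMillis(long videoPtsUs, long audioClockUs) {
        long aheadUs = videoPtsUs - audioClockUs;
        if (aheadUs <= 0) {
            return 0; // frame is due or late: render now (or drop if very late)
        }
        return aheadUs / 1000; // frame is early: sleep until its PTS is due
    }
}
```

The video thread would call this before each render and sleep for the returned duration, which keeps it from outrunning the audio thread.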