Hello,

*My Cases:*

1. I have an RTP stack, through which I am receiving raw audio and video streams from the network.
2. Similarly, I will be sending audio and video streams towards the network using the same RTP stack.

*Following are my questions/doubts:*

For working with audio formats to realize cases 1 and 2, I can use the AudioTrack and AudioRecord APIs to play and record audio streams. However, for working with video formats to realize cases 1 and 2, are there any alternatives available similar to the audio APIs?

If I receive a raw video stream from the network through my stack and wish to play it in real time, how do I go about it? Can I use MediaController and MediaPlayer to render the video in a custom VideoRenderer activity (GUI)?

For video transmission, I am thinking of capturing the "preview" stream from the camera and transmitting it as live video through my stack.

*Problem Summary:*

In short, I can find ways to "encode" and "decode" audio streams on Android, as well as "encode" video from the camera source. However, I don't know how to "decode" incoming video streams on Android using its APIs.

Inputs and suggestions are appreciated and will be very helpful. If there is anything above that is conceptually incorrect, please let me know.

Best Regards

--
You received this message because you are subscribed to the Google Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at http://groups.google.com/group/android-developers?hl=en
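[Editor's note] As context for case 1, the frames the RTP stack delivers each begin with the fixed 12-byte RTP header defined in RFC 3550; the decoder only ever sees the payload that follows it. Below is a minimal, illustrative plain-Java sketch of parsing that header. The class name `RtpHeader` and the example packet values are assumptions for illustration and are not part of any particular RTP stack or Android API.

```java
// Minimal parser for the fixed 12-byte RTP header (RFC 3550).
// Field names follow the RFC; this class is illustrative only.
public class RtpHeader {
    public final int version;          // should be 2 for RTP
    public final boolean marker;       // often flags the last packet of a video frame
    public final int payloadType;      // identifies the codec (e.g. dynamic types 96-127)
    public final int sequenceNumber;   // for loss detection / reordering
    public final long timestamp;       // media clock, units depend on the codec
    public final long ssrc;            // identifies the sending source

    public RtpHeader(byte[] packet) {
        if (packet.length < 12) {
            throw new IllegalArgumentException("RTP packet too short");
        }
        version        = (packet[0] >> 6) & 0x03;
        marker         = ((packet[1] >> 7) & 0x01) == 1;
        payloadType    = packet[1] & 0x7F;
        sequenceNumber = ((packet[2] & 0xFF) << 8) | (packet[3] & 0xFF);
        timestamp      = ((long) (packet[4] & 0xFF) << 24)
                       | ((packet[5] & 0xFF) << 16)
                       | ((packet[6] & 0xFF) << 8)
                       |  (packet[7] & 0xFF);
        ssrc           = ((long) (packet[8] & 0xFF) << 24)
                       | ((packet[9] & 0xFF) << 16)
                       | ((packet[10] & 0xFF) << 8)
                       |  (packet[11] & 0xFF);
    }

    public static void main(String[] args) {
        // Hypothetical example packet: version 2, marker set,
        // payload type 96 (a common dynamic type), sequence number 1000.
        byte[] pkt = new byte[12];
        pkt[0] = (byte) 0x80;             // V=2, no padding/extension/CSRC
        pkt[1] = (byte) (0x80 | 96);      // marker bit + payload type 96
        pkt[2] = (byte) (1000 >> 8);
        pkt[3] = (byte) (1000 & 0xFF);
        RtpHeader h = new RtpHeader(pkt);
        System.out.println(h.version + " " + h.payloadType + " " + h.sequenceNumber);
    }
}
```

For playback, the payload bytes after this header (reassembled per the payload format of the codec in use) are what would be fed into whatever decode path Android provides.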

