Hi,
I know the Android Multimedia Framework uses Stagefright and OpenCore. Now I am
trying to use audio/video streaming functionality through *gstreamer* commands
(as well as C applications built on them) in the adb shell.

For example, when I plug a web cam into my Ubuntu machine and issue this
command:

gst-launch-0.10 autovideosrc ! video/x-raw-yuv,framerate=\(fraction\)30/1,width=640,height=480 ! ffmpegcolorspace ! autovideosink

the web cam turns on and I can see its video. Now I want to do the same on an
Android phone and issue the same command (with some modification, after cross
compiling gstreamer for the Android phone or tablet).
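As a starting point, here is a minimal sketch of what the Android-side pipeline might look like. This is an assumption, not a verified device command: videotestsrc and fakesink are stand-ins, since which camera source and video sink elements actually exist in a minimal cross-compiled gstreamer build on Android is exactly what needs checking.

```shell
# Hedged sketch: the desktop pipeline with the hardware-dependent elements
# (autovideosrc/autovideosink) swapped for test elements that ship with
# gstreamer core, so the pipeline can be smoke-tested before a real camera
# source is available on the device.
CAPS='video/x-raw-yuv,framerate=(fraction)30/1,width=640,height=480'
CMD="gst-launch-0.10 videotestsrc ! $CAPS ! ffmpegcolorspace ! fakesink"
# Print the command; it would be run inside `adb shell` once the
# cross-compiled gst-launch binary has been pushed to the device.
echo "$CMD"
```

Once a real camera element is identified on the device, it would replace videotestsrc, and fakesink would become whatever video sink the Android port provides.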

I have found this link for compiling gstreamer for the Nexus S:
http://gstreamer.freedesktop.org/wiki/GstreamerAndroid_InstallInstructions
Before getting started I want to ask:
(1) Has anyone tried cross compiling *gstreamer* for an Android phone and
using it (from an application or via commands)?
(2) Is it possible to compile it with the NDK toolchain, put the *gstreamer*
code inside our Android application by creating a shared library out of it,
and then load and use its functionality from the app code?
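On question (2), the usual shape of such a build is to generate a standalone toolchain from the NDK and point gstreamer's configure script at it. The sketch below only assembles the configure invocation; the install directory, prefix, and target triplet are all assumptions to adapt to your NDK version, not verified values.

```shell
# Hedged sketch (paths, prefix, and triplet are assumptions): the configure
# line you would run after creating a standalone toolchain with the NDK's
# make-standalone-toolchain.sh script.
HOST=arm-linux-androideabi                 # common ARM Android target triplet
TOOLCHAIN="$HOME/android-toolchain"        # assumed --install-dir of the standalone toolchain
CONFIGURE="./configure --host=$HOST CC=$TOOLCHAIN/bin/$HOST-gcc --prefix=/data/local/gstreamer"
echo "$CONFIGURE"
```

The resulting .so files could then be bundled in the app's libs/armeabi directory and loaded with System.loadLibrary() from the Java side, which is the standard NDK pattern for shipping native code inside an APK.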

I also want to request:

*(1) Any tutorial/blog link, if you have done this before*

Thanks in advance. I will keep adding my progress to this post;
please assist and send your inputs.
Rgds,
Softy

-- 
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
