Hi Riasat,

The APIs you mention probably won't do what you need. You can build a command-line app (shell access usually *does* give you access to the hardware codecs), but there is no hope from Java/JNI. You can use the hardware codecs (have a look at ffmpeg HEAD), but don't tell anyone, because they are not public APIs, just internal interfaces to Stagefright (I am referring to OMXClient here). Actually, those "not an API"s might be stable enough for your needs: it seems the Flash player uses them, and I'd say Google video call does too, which might make it undesirable to change them.
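As a rough sketch of the command-line route (the binary name and build are hypothetical; the point is only that a standalone executable run from the shell sits outside the normal app sandbox):

```shell
# Hypothetical: omx_encode_test is a standalone NDK-built executable,
# not an APK. Push it to a writable, executable location on the device.
adb push omx_encode_test /data/local/tmp/
adb shell chmod 755 /data/local/tmp/omx_encode_test

# Run it as root. Riasat reports that the same code which fails with
# OMX_ErrorInsufficientResources inside an app works when run as su
# from the shell.
adb shell su -c /data/local/tmp/omx_encode_test
```

This only works on devices where you actually have su, of course; a regular unrooted user device won't run it.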
Please remember that if you use those APIs, you will contribute to Android fragmentation.

Hope it helps,
Luca.

On Sun, Oct 16, 2011 at 12:04 PM, Abir <[email protected]> wrote:
> Hi,
>
> Has anyone integrated OpenMAX hardware encoding/decoding with an Android
> NDK program?
>
> I've written a program following this:
>
> http://developer.nvidia.com/archived-tegra-forums/forum/android-how-call-libpvnvomxso-successfully
>
> But I get OMX_ErrorInsufficientResources, and my application doesn't
> have system privileges.
>
> I've managed to run the same program in the Android shell (as su), and it
> works!
>
> So, how can I integrate OMX HW encoding/decoding with an Android app?
>
> Thanks,
> Riasat
>
> --
> unsubscribe: [email protected]
> website: http://groups.google.com/group/android-porting
