long duration a/v sync in opencore

2009-07-22 Thread Andy Quan
Hi all, esp. PV engineers,
I am working on a hard bug triggered during long-duration video playback with
OpenCORE v1.0; it hangs mediaserver.

I found an old email from Dave Sparks about bugs related to long-duration
a/v sync playback in OpenCORE. It seems to refer to OpenCORE v1.0. I'd like
to know the details of this bug, e.g. its description, root cause, and
workarounds. Thanks.

Please refer to the following statements by Dave Sparks. Thanks.
On Thu, Feb 26, 2009 at 4:02 PM, Dave Sparks davidspa...@android.com wrote:


 The audio MIO counts the number of audio samples output to the audio
 sink and divides by the frame rate. This value is adjusted by the
 latency reported by the audio sink (which ultimately comes from the
 audio HAL plus the latency in the software mixer engine). For some
 reason with OpenCore 1.0, we could never reconcile the real hardware
 latency with the presentation of video frames in the MIO, so we had to
 add a fudge factor. We are hoping that OpenCore 2.0's new media
 playback clock will resolve this issue.

 I think there is an outstanding bug related to the playback clock and
 long streams, but I seem to recall that was 5 hours. For most mobile
 devices, the battery is going to die long before that if the DSP and
 backlight are powered up the entire time.
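
For illustration, here is a minimal sketch (hypothetical names, not the
actual MIO code) of the clock derivation Dave describes: frames written
divided by the sample rate, adjusted by the reported sink latency, plus the
empirical fudge factor. With 64-bit arithmetic the derivation itself cannot
overflow at realistic durations.

// Hypothetical sketch of the timing logic described above, not the real MIO.
#include <cstdint>

struct AudioClockModel {
    uint64_t framesWritten;  // audio frames pushed to the sink so far
    uint32_t sampleRate;     // e.g. 44100 Hz
    uint32_t sinkLatencyMs;  // reported by the audio HAL + mixer engine
    int64_t  fudgeMs;        // the empirical correction Dave mentions

    // Current media time in milliseconds.
    int64_t nowMs() const {
        int64_t playedMs = (int64_t)(framesWritten * 1000ull / sampleRate);
        // Frames still buffered downstream have not been heard yet.
        int64_t heardMs = playedMs - (int64_t)sinkLatencyMs;
        if (heardMs < 0) heardMs = 0;
        return heardMs + fudgeMs;
    }
};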
 --
 Thanks,
 Andy





Re: long duration a/v sync in opencore

2009-07-22 Thread Andy Quan
Thanks, Ravi. I will have a look at UpdateClock.
The crash eventually occurs in the video decoding thread, but before that I
see GetCurrentTime64 returning unreasonable values.
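
One thing worth ruling out (an assumption to verify, not a confirmed
diagnosis): if the clock is kept in 32-bit microseconds anywhere along the
path, it wraps at 2^32 us, i.e. about 71.6 minutes, suspiciously close to
where an 80-minute clip would die. A self-contained sketch of the wrap:

// A 32-bit microsecond timestamp wraps at 2^32 us (~71.6 minutes).
#include <cstdint>
#include <cstdio>

int main() {
    uint32_t sampleRate = 44100;
    uint64_t samples = 190000000ull;                  // ~71.8 min of audio
    uint64_t usec64 = samples * 1000000ull / sampleRate;
    uint32_t usec32 = (uint32_t)usec64;               // truncating cast
    printf("64-bit clock: %llu us, 32-bit clock: %u us\n",
           (unsigned long long)usec64, usec32);       // 32-bit value wrapped
    return 0;
}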

On Thu, Jul 23, 2009 at 11:07 AM, RaviY yend...@pv.com wrote:


 Have you tried logging the android_audio_mio.cpp ... function:
 AndroidAudioMIOActiveTimingSupport::UpdateClock(). Check if the clock
 value is overflowing.

 Also, when you did see a crash, did you get a stack trace, or do you
 know where it is crashing?

 -Ravi



 On Jul 22, 9:37 pm, Andy Quan androidr...@gmail.com wrote:
  My video clip is 80 min, but it sometimes crashed somewhere in the middle,
  and other times after one or two rounds of playback. According to Dave it
  was 5 hours; I thought the difference might be due to the hardware
  platform.
 
  Do you have any idea of this?
 
 
 
   On Thu, Jul 23, 2009 at 10:11 AM, RaviY yend...@pv.com wrote:
 
   How long is the long duration?
 
   On Jul 22, 11:08 am, Andy Quan androidr...@gmail.com wrote:
    Hi all, esp. PV engineers,
    I am working on a hard bug triggered during long-duration video playback
    with OpenCORE v1.0; it hangs mediaserver.

    I found an old email from Dave Sparks about bugs related to
    long-duration a/v sync playback in OpenCORE. It seems to refer to
    OpenCORE v1.0. I'd like to know the details of this bug, e.g. its
    description, root cause, and workarounds. Thanks.

    Please refer to the following statements by Dave Sparks. Thanks.
    On Thu, Feb 26, 2009 at 4:02 PM, Dave Sparks davidspa...@android.com wrote:

     The audio MIO counts the number of audio samples output to the audio
     sink and divides by the frame rate. This value is adjusted by the
     latency reported by the audio sink (which ultimately comes from the
     audio HAL plus the latency in the software mixer engine). For some
     reason with OpenCore 1.0, we could never reconcile the real hardware
     latency with the presentation of video frames in the MIO, so we had to
     add a fudge factor. We are hoping that OpenCore 2.0's new media
     playback clock will resolve this issue.

     I think there is an outstanding bug related to the playback clock and
     long streams, but I seem to recall that was 5 hours. For most mobile
     devices, the battery is going to die long before that if the DSP and
     backlight are powered up the entire time.
 --
 Thanks,
 Andy
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




Intra frame recognition in OpenCORE

2009-07-20 Thread Andy Quan
Hi,
I am working on video playback in Android 1.5 with OpenCORE. I want to know
whether OpenCORE provides any method to recognize an intra frame in the
framework, that is, outside the OMX IL component.

I see there is an OMX flag named OMX_BUFFERFLAG_SYNCFRAME, but I did not find
it set when an intra frame was passed in SendInputBuffertoOMXComponent.

Thanks a lot.

-- 
Thanks,
Andy




Re: Intra frame recognition in OpenCORE

2009-07-20 Thread Andy Quan
I want to recognize an intra frame before it is sent to OMX IL for decoding.
I guess the random-access-point information might be in the mp4/3gp
container, but I don't think that is guaranteed for every stream...

Do you have any suggestions? Thanks.
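
When the container does not flag sync samples reliably, one fallback for
H.264 is to inspect the NAL unit type directly: type 5 is an IDR slice, i.e.
a random access point. A minimal sketch, assuming AVCC-style samples where
each NAL carries a 4-byte big-endian length prefix (the real prefix size
must be read from the avcC box):

// Detect an IDR (intra) frame in an AVCC-style H.264 access unit.
#include <cstdint>
#include <cstddef>

bool isIdrFrame(const uint8_t* sample, size_t size) {
    size_t pos = 0;
    while (pos + 4 < size) {
        uint32_t nalLen = ((uint32_t)sample[pos]     << 24) |
                          ((uint32_t)sample[pos + 1] << 16) |
                          ((uint32_t)sample[pos + 2] <<  8) |
                           (uint32_t)sample[pos + 3];
        pos += 4;
        if (nalLen == 0 || nalLen > size - pos) break;
        uint8_t nalType = sample[pos] & 0x1F;  // low 5 bits of NAL header
        if (nalType == 5) return true;         // coded slice of IDR picture
        pos += nalLen;
    }
    return false;
}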

On Tue, Jul 21, 2009 at 12:27 AM, RaviY yend...@pv.com wrote:


 This is usually set by the source node.

 But, it looks like, in some cases, it is being set only during
 repositioning and not during regular playback.

 Can you explain what your usecase is and I can try to see what could
 be done?

 -Ravi

 On Jul 20, 11:04 am, Andy Quan androidr...@gmail.com wrote:
  Ravi, how is the PVMF_MEDIA_DATA_MARKER_INFO_RANDOM_ACCESS_POINT_BIT bit
  set during decoding? I mean, what does PVMF rely on to recognize a random
  access point?

  I printed the nFlags of a certain H264 mp4 file but found no
  OMX_BUFFERFLAG_SYNCFRAME set.
 
  Could you please help? Thanks.
 
 
 
  On Mon, Jul 20, 2009 at 11:52 PM, RaviY yend...@pv.com wrote:
 
  http://android.git.kernel.org/?p=platform%2Fexternal%2Fopencore.gita.
 ..
 
   PVMF has a bit set for each frame
   (PVMF_MEDIA_DATA_MARKER_INFO_RANDOM_ACCESS_POINT_BIT) that is
   considered to be a random access point. The OMX node looks at that bit
   and then sets the OMX_BUFFERFLAG_SYNCFRAME bit to the input buffer
   header flags.
 
   -Ravi
 
   On Jul 20, 1:47 am, Andy Quan androidr...@gmail.com wrote:
     Hi,
     I am working on video playback in Android 1.5 with OpenCORE. I want to
     know whether OpenCORE provides any method to recognize an intra frame
     in the framework, that is, outside the OMX IL component.

     I see there is an OMX flag named OMX_BUFFERFLAG_SYNCFRAME, but I did
     not find it set when an intra frame was passed in
     SendInputBuffertoOMXComponent.

     Thanks a lot.
 
--
Thanks,
Andy
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




Re: libopencorehw implementation

2009-06-16 Thread Andy Quan
Ravi, could you help look at line 1045 of this file?
Line 1045:
sp<MemoryHeapBase> master = (MemoryHeapBase *) fd;

fd is a uint32 that holds a file descriptor from the OMX unit. But this line
crashes as soon as it is reached. I guess the crash is caused by the
definition of sp<>.

Lines 314-321 of include/utils/RefBase.h (release-1.0,
http://android.git.kernel.org/?p=platform/frameworks/base.git;a=blob;f=include/utils/RefBase.h;h=e37b56f5bf93f3e14fce37aa06edbd66a6f381c2;hb=release-1.0#l314):

template<typename T>
sp<T>& sp<T>::operator = (T* other)
{
    if (other) other->incStrong(this);
    if (m_ptr) m_ptr->decStrong(this);
    m_ptr = other;
    return *this;
}

It seems that fd->incStrong() is called, but fd is only a file descriptor,
not a RefBase object...

Do you have any comment on this problem? Did I misunderstand anything? Thank
you.
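
The crash is consistent with the cast itself: assigning a raw integer
reinterpreted as a pointer to an sp<> makes operator= call incStrong() on a
garbage address. A minimal sketch of the safer pattern, assuming the
MemoryHeapBase(fd, size) constructor available in this tree:

#include <utils/MemoryHeapBase.h>  // header path in the 1.x tree
using android::sp;
using android::MemoryHeapBase;

// Wrap the descriptor in a real RefBase-derived heap object instead of
// casting the integer to a pointer.
sp<MemoryHeapBase> wrapPmemFd(int fd, size_t size) {
    // Wrong (line 1045): sp<MemoryHeapBase> m = (MemoryHeapBase*)fd;
    // operator= would call incStrong() on the integer value of fd -> crash.
    return new MemoryHeapBase(fd, size);
}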

On Tue, Jun 16, 2009 at 1:38 PM, Ravi yend...@pv.com wrote:


 Look at the code in release-1.0.

 http://android.git.kernel.org/?p=platform/external/opencore.git;a=blob;f=android/android_surface_output.cpp;h=aa63c05d94ae265056a671aff9548769c139778b;hb=release-1.0

 This was definitely a working piece of code. A very close (if not the
 same) version was used in the first device release. But the code has
 changed quite a bit.

 The code under samples is a version spun off from the original, and
 is merely to demonstrate the usage.

 -Ravi

 On Jun 15, 11:57 pm, Andy Quan androidr...@gmail.com wrote:
  Hi, I find there are some sample files under
  external/opencore/android/samples demonstrating how a child class of
  android_surface_output should be created. However, I did not find it
  actually used in the open-source git. So my question is: is this sample
  file the same as the one used in the G1/HTC release?
  I reused this file but ran into some problems with PMEM usage, so I wonder
  whether this is verified source code or simply a demonstration. Thank you.
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




Re: libopencorehw implementation

2009-06-16 Thread Andy Quan
Thanks. Dave, could you please help?

On Tue, Jun 16, 2009 at 9:53 PM, Ravi yend...@pv.com wrote:


 Are you using this code for your own hardware?

 Did you check if fd was valid? Looks like fd can be uninitialized.

 Dave might be able to help you. I am not too familiar with the device
 specific code.

 -Ravi

 On Jun 16, 3:18 am, Andy Quan androidr...@gmail.com wrote:
  Ravi, could you help look at line 1045 of this file?
  Line 1045:
  sp<MemoryHeapBase> master = (MemoryHeapBase *) fd;

  fd is a uint32 that holds a file descriptor from the OMX unit. But this
  line crashes as soon as it is reached. I guess the crash is caused by the
  definition of sp<> (lines 314-321 of include/utils/RefBase.h):

  template<typename T>
  sp<T>& sp<T>::operator = (T* other)
  {
      if (other) other->incStrong(this);
      if (m_ptr) m_ptr->decStrong(this);
      m_ptr = other;
      return *this;
  }

  It seems that fd->incStrong() is called, but fd is only a file descriptor,
  not a RefBase object...

  Do you have any comment on this problem? Did I misunderstand anything?
  Thank you.
 
 
 
  On Tue, Jun 16, 2009 at 1:38 PM, Ravi yend...@pv.com wrote:
 
   Look at the code in release-1.0.
 
  http://android.git.kernel.org/?p=platform/external/opencore.git;a=blo.
 ..
 
   This was definitely a working piece of code. A very close (if not the
   same) version was used in the first device release. But the code has
   changed quite a bit.
 
   The code under samples is a version spun off from the original, and
   is merely to demonstrate the usage.
 
   -Ravi
 
   On Jun 15, 11:57 pm, Andy Quan androidr...@gmail.com wrote:
    Hi, I find there are some sample files under
    external/opencore/android/samples demonstrating how a child class of
    android_surface_output should be created. However, I did not find it
    actually used in the open-source git. So my question is: is this sample
    file the same as the one used in the G1/HTC release?
    I reused this file but ran into some problems with PMEM usage, so I
    wonder whether this is verified source code or simply a demonstration.
    Thank you.
 
--
Thanks,
Andy
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




libopencorehw implementation

2009-06-15 Thread Andy Quan
Hi, I find there are some sample files under
external/opencore/android/samples demonstrating how a child class of
android_surface_output should be created. However, I did not find it
actually used in the open-source git. So my question is: is this sample file
the same as the one used in the G1/HTC release?
I reused this file but ran into some problems with PMEM usage, so I wonder
whether this is verified source code or simply a demonstration. Thank you.

-- 
Thanks,
Andy




pmem usage in opencore of Android1.5

2009-06-09 Thread Andy Quan
I have a question about pmem usage during video playback in OpenCORE.
In android_surface_output_fb.cpp, there is a special path for the YVU
semiplanar format, where a private struct pointer is passed from the video
OMX unit, i.e. data_header_info.private_data_ptr. Two local functions,
getPmemFd and getOffset, are provided to retrieve the file descriptor and
offset.

My question is: does the video OMX unit pass only the fd and offset to the
MIO, or does it pass a MemoryHeapBase pointer? The latter would mean that
the MemoryHeapBase is initially constructed inside the video OMX unit. I
just cannot find where that MemoryHeapBase is created. Can anyone help me
understand? Thanks in advance!
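
For context, a minimal sketch of the handoff being described. The struct
layout below is hypothetical (the real one is vendor-specific); it only
illustrates the fd+offset mechanism:

#include <cstdint>

// Assumed layout, not the actual OpenCORE/vendor struct.
struct PmemBufferInfo {
    int32_t  pmem_fd;  // file descriptor of the shared pmem region
    uint32_t offset;   // offset of this frame inside the region
};

static int32_t getPmemFd(const void* private_data_ptr) {
    return static_cast<const PmemBufferInfo*>(private_data_ptr)->pmem_fd;
}

static uint32_t getOffset(const void* private_data_ptr) {
    return static_cast<const PmemBufferInfo*>(private_data_ptr)->offset;
}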

-- 
Thanks,
Andy




a/v sync timing in Android 1.5

2009-06-07 Thread Andy Quan
Hi, I am seeing severe frame drops during high-motion movie playback on
Android 1.5. I know there are early and late margins in
pv_player_engine_tunables.h. I'd like to know whether these are the only two
parameters I should tune for this problem. Are there any other parameters I
should pay attention to? Thanks in advance.
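
For reference, a minimal sketch of the check those two margins control
(hypothetical names and values; the real defaults live in
pv_player_engine_tunables.h). A frame is rendered only if its timestamp lies
within the margins around the playback clock; widening the late margin
trades sync accuracy for fewer drops:

#include <cstdint>

static const int64_t kEarlyMarginMs = 50;  // render at most this early
static const int64_t kLateMarginMs  = 50;  // drop if later than this

enum FrameAction { RENDER, HOLD, DROP };

FrameAction classifyFrame(int64_t frameTsMs, int64_t clockMs) {
    int64_t delta = frameTsMs - clockMs;       // >0: frame early, <0: late
    if (delta > kEarlyMarginMs) return HOLD;   // too early, keep waiting
    if (delta < -kLateMarginMs) return DROP;   // too late, drop to catch up
    return RENDER;
}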

-- 
Thanks,
Andy




Re: low performance of audio out thread in android 1.5

2009-05-29 Thread Andy Quan
Hi folks, any ideas on this problem, i.e. AudioTrack::obtainBuffer()? I'd
appreciate any information that could help me out. :)

On Tue, May 26, 2009 at 5:29 PM, Andy Quan androidr...@gmail.com wrote:

  Hi, I'm seeing a problem where the audio-out thread of media playback has
  an unexpectedly high CPU load (8% of a 624 MHz CPU) during video+audio
  playback. It does not happen for audio-only playback. Has anybody seen a
  similar situation?

  I found that the problem is that the audio-out thread is not able to
  obtain an audio write buffer in time. I use a buffer count of 8 in track
  creation, so I don't really understand why this happens.

  One thing I did find is that there has been a change in the following
  function:
  status_t AudioTrack::obtainBuffer(Buffer* audioBuffer, int32_t waitCount).
  It uses an interval of 10 ms (WAIT_PERIOD_MS) as the timeout for checking
  the availability of the audio write buffer. In an older code base, from a
  couple of months ago, the timeout was 1 second. I changed it back to 1
  second and the big performance cost was gone. So I suspect the cost comes
  from the condition wait timing out many times.

  Could anyone shed some light on why the timeout changed from 1 s to 10 ms?
  I am also not sure whether I used the correct method to solve this
  problem, so please correct me if I have misunderstood anything. Thanks.

 --
 Thanks,
 Andy




-- 
Thanks,
Andy




low performance of audio out thread in android 1.5

2009-05-26 Thread Andy Quan
Hi, I'm seeing a problem where the audio-out thread of media playback has an
unexpectedly high CPU load (8% of a 624 MHz CPU) during video+audio
playback. It does not happen for audio-only playback. Has anybody seen a
similar situation?

I found that the problem is that the audio-out thread is not able to obtain
an audio write buffer in time. I use a buffer count of 8 in track creation,
so I don't really understand why this happens.

One thing I did find is that there has been a change in the following
function:
status_t AudioTrack::obtainBuffer(Buffer* audioBuffer, int32_t waitCount).
It uses an interval of 10 ms (WAIT_PERIOD_MS) as the timeout for checking
the availability of the audio write buffer. In an older code base, from a
couple of months ago, the timeout was 1 second. I changed it back to 1
second and the big performance cost was gone. So I suspect the cost comes
from the condition wait timing out many times.

Could anyone shed some light on why the timeout changed from 1 s to 10 ms? I
am also not sure whether I used the correct method to solve this problem, so
please correct me if I have misunderstood anything. Thanks.
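
For illustration (not the real AudioTrack code), a minimal sketch of why the
wait period matters: with a 10 ms timeout the thread can wake up to 100
times per second even when no buffer frees up, and every futile wakeup burns
scheduler and cache cycles; a 1 s timeout mostly sleeps until signaled.

#include <chrono>
#include <condition_variable>
#include <mutex>

std::mutex gLock;
std::condition_variable gCv;
int gFreeBuffers = 0;  // incremented by the mixer when a buffer is consumed

bool obtainBufferSketch(std::chrono::milliseconds waitPeriod, int maxWaits) {
    std::unique_lock<std::mutex> lock(gLock);
    int waits = 0;
    while (gFreeBuffers == 0) {
        // Each timeout is a wakeup that accomplishes nothing.
        if (gCv.wait_for(lock, waitPeriod) == std::cv_status::timeout &&
            ++waits >= maxWaits) {
            return false;  // give up, analogous to waitCount expiring
        }
    }
    --gFreeBuffers;
    return true;
}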

-- 
Thanks,
Andy




Re: MediaRecorder / MediaPlayer simultaneous use?

2009-05-20 Thread Andy Quan
Dave, as for the restriction you mentioned, would I run into this violation
if I first minimize my video player (push it to the background, like the
music player), and then open the camcorder?

On Tue, May 19, 2009 at 11:03 PM, Dave Sparks davidspa...@android.com wrote:


 The media framework should be able to support it if you don't overtax
 the DSP, but I believe there's a restriction in SurfaceFlinger that
 prevents you from having two push-buffer SurfaceViews at the same
 time.

 On May 18, 5:34 pm, John Bresnik jbres...@gmail.com wrote:
  Anyone have experience doing simultaneous playback / recording, i.e. full
  duplex? I couldn't find a definitive answer on whether it is supported or
  not.
  Thanks
 



-- 
Thanks,
Andy




OpenCORE rtsp streaming in android1.5

2009-05-20 Thread Andy Quan
Hi, I am working on RTSP streaming with OpenCORE in Android 1.5. I want to
check the timestamp information of each data packet at the very beginning of
the OpenCORE framework. Could anybody tell me which part of the source code
I should look at? I guess it should be the RTP packet receiver at the head
of the OpenCORE pipeline.

-- 
Thanks,
Andy




opencore in donut

2009-05-10 Thread Andy Quan
Hi, it is said that OpenCORE v2.x is under test on the donut branch. Though
we are currently spending most of our effort on the cupcake release with
OpenCORE v1.0, I'd like to know whether there is a timeline for learning
whether v2.x will eventually be used in donut.
We did some experimental work on v2.x, and according to the release notes
there are many bug fixes and improvements. But for developers and vendors,
my personal view is that it is still high risk to put more effort into this
version before the donut decision is finally announced.
Any heads-up or comment is welcome!

-- 
Thanks,
Andy




aac header info in PVAuthor

2009-04-26 Thread Andy Quan
Hi,
I have a few questions regarding AAC encoder OMX integration in OpenCORE
v2.x.

1. If the output format is PVMF_MIME_AAC_MP4FF, is it the OMX IL component's
responsibility to generate the AudioSpecificConfig defined in the ISO spec?

2. In FillBufferDoneProcessing, it is indicated that the first output buffer
of the AAC encoder should be the codec-specific header info. Is this codec
header info exactly the AudioSpecificConfig struct? Is any more information
involved?
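
For reference, for plain AAC-LC the AudioSpecificConfig of ISO/IEC 14496-3
is just two bytes: a 5-bit audio object type, a 4-bit sampling frequency
index, and a 4-bit channel configuration (the remaining three
GASpecificConfig flag bits are typically zero). A minimal sketch:

#include <cstdint>

// Build the 2-byte AudioSpecificConfig for plain AAC-LC (no SBR/extensions).
void buildAudioSpecificConfig(uint8_t objectType,     // 2 = AAC-LC
                              uint8_t freqIndex,      // 4 = 44100 Hz
                              uint8_t channelConfig,  // 2 = stereo
                              uint8_t out[2]) {
    out[0] = (uint8_t)((objectType << 3) | (freqIndex >> 1));
    out[1] = (uint8_t)(((freqIndex & 1) << 7) | (channelConfig << 3));
}

For example, objectType 2, freqIndex 4, channelConfig 2 yields the familiar
pair 0x12 0x10 for 44.1 kHz stereo AAC-LC.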

-- 
Thanks,
Andy




encoder support in OpenCORE/cupcake

2009-04-21 Thread Andy Quan
I find that in ./engines/author/src/single_core/pvaenodefactoryutility.h,
PVMFOMXVideoEncNodeFactory::CreateVideoEncNode() is called to create the
video node. In my understanding, this means OMX is expected to be used in
PVAuthor. However, I did not find any encoder OMX component provided by PV
under the opencore folder...

My question is: has the encoding path in cupcake's opencore been tested? I
think a 3rd-party OMX component should theoretically work in that node, but
I have no idea whether the node itself works correctly.

Many thanks in advance!

-- 
Thanks,
Andy




Re: encoder support in OpenCORE/cupcake

2009-04-21 Thread Andy Quan
Thanks, Ravi.

In Cupcake, the OMX encoder component is hard-coded into the OMX
encoder node. It's not an issue because there are no software OMX
encoders. 

What does this mean? Hardware OMX component for internal use?

On Tue, Apr 21, 2009 at 8:32 PM, Ravi yend...@pv.com wrote:


 Repost ...


 http://groups.google.com/group/android-framework/browse_thread/thread/ef0d6aaab8390ea3/


 On Apr 21, 7:27 am, Andy Quan androidr...@gmail.com wrote:
  I find that in ./engines/author/src/single_core/pvaenodefactoryutility.h,
  PVMFOMXVideoEncNodeFactory::CreateVideoEncNode() is called to create the
  video node. In my understanding, this means OMX is expected to be used in
  PVAuthor. However, I did not find any encoder OMX component provided by PV
  under the opencore folder...

  My question is: has the encoding path in cupcake's opencore been tested? I
  think a 3rd-party OMX component should theoretically work in that node,
  but I have no idea whether the node itself works correctly.

  Many thanks in advance!
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




camcorder question

2009-04-14 Thread Andy Quan
Hi,
I noticed that at the end of video recording there is some video decoding
activity that results in a still picture being displayed as a background
picture. I guess this behavior can be enabled/disabled in Java code... Could
anybody tell me how to disable this decoding behavior? Below is the PV
logger output for what I described. Thanks.
PVLOG:TID(0x17260):Time=161:PORT OMXVideoDecIn Connected
PVLOG:TID(0x17260):Time=161:PORT PVMFMP4FFParOut(Vide Connected
PVLOG:TID(0x17260):Time=161:PORT MediaOutIn(Video) Connected
PVLOG:TID(0x17260):Time=162:PORT OMXVideoDecOut Connected
PVLOG:TID(0x17260):Time=219:PVMediaOutputNodePort::setParametersSync - FSI -
Fmt=X-YUV-420
PVLOG:TID(0x17260):Time=219:PVMediaOutputNodePort::ConfigMIO
setParametersSync of PVMF_FORMAT_SPECIFIC_INFO_KEY_VIDEO failed
PVLOG:TID(0x17260):Time=226:PVMFMP4FFParserNode::DoSetDataSourcePosition()
*actualMediaDataTS 0
PVLOG:TID(0x17260):Time=226:PVMFMP4FFParserNode::DoSetDataSourcePosition()
getOffsetByTime ret 0 Track 1 NPT 0 Offset 36
PVLOG:TID(0x17260):Time=226:PVMFMP4FFParserNode::DoSetDataSourcePosition()
Video targetNPT = 0 returns actualNPT=0 for trackId=1
PVLOG:TID(0x17260):Time=226:PVMFMP4FFParserNode::DoSetDataSourcePosition:
targetNPT=0, actualNPT=0, actualTS=0
PVLOG:TID(0x17260):Time=226:PORT OMXVideoDecIn In Msg Received  MediaCmd
FmtId BOS, SeqNum 0, SId 0, TS 0, Q-depth 1/10
PVLOG:TID(0x17260):Time=226:PORT PVMFMP4FFParOut(Vide Msg Sent Directly
MediaCmd FmtId BOS, SeqNum 0, SId 0, TS 0, Q-depth 0/10
PVLOG:TID(0x17260):Time=226:PVMFMP4FFParserNode::SendBeginOfMediaStreamCommand()
BOS sent - Mime=video/MP4V-ES, StreamId=0, TS=0
PVLOG:TID(0x17260):Time=226:MEMP PVMFMP4FFPar(Video) Chunk allocated, 1/8 in
use
PVLOG:TID(0x17260):Time=226:MEMP PVMFMP4FFPar(Video) Chunk freed, 1/8 in use
PVLOG:TID(0x17260):Time=226:PORT OMXVideoDecIn In Msg Received  MediaCmd
FmtId EOS, SeqNum 0, SId 0, TS 0, Q-depth 2/10
PVLOG:TID(0x17260):Time=227:PORT PVMFMP4FFParOut(Vide Msg Sent Directly
MediaCmd FmtId EOS, SeqNum 0, SId 0, TS 0, Q-depth 0/10
PVLOG:TID(0x17260):Time=227:PVMFMP4FFParserNode::SendEndOfTrackCommand -
Mime=video/MP4V-ES, StreamID=0, TrackID=1, TS=0, SEQNUM=0
PVLOG:TID(0x17260):Time=227:PVMediaOutputNodePort::SetSkipTimeStamp: TS=0,
Fmt=X-YUV-420
PVLOG:TID(0x17260):Time=228:PORT OMXVideoDecIn In Msg De-Q'd MediaCmd FmtId
BOS, SeqNum 0, SId 0, TS 0, Q-depth 1/10
PVLOG:TID(0x17260):Time=230:PORT MediaOutIn(Video) In Msg Received  MediaCmd
FmtId BOS, SeqNum 0, SId 0, TS 0, Q-depth 1/10
PVLOG:TID(0x17260):Time=230:PORT MediaOutIn(Video) In Msg De-Q'd MediaCmd
FmtId BOS, SeqNum 0, SId 0, TS 0, Q-depth 0/10
PVLOG:TID(0x17260):Time=230:PVMediaOutputNodePort::HPA: BOS Recvd -
Fmt=X-YUV-420, TS=0, StreamID=0, Qs=0
PVLOG:TID(0x17260):Time=230:PORT OMXVideoDecOut Msg Sent Directly MediaCmd
FmtId BOS, SeqNum 0, SId 0, TS 0, Q-depth 0/10
PVLOG:TID(0x17260):Time=230:PORT OMXVideoDecIn In Msg De-Q'd MediaCmd FmtId
EOS, SeqNum 0, SId 0, TS 0, Q-depth 0/10
PVLOG:TID(0x17260):Time=230:PORT MediaOutIn(Video) In Msg Received  MediaCmd
FmtId EOS, SeqNum 0, SId 0, TS 0, Q-depth 1/10
PVLOG:TID(0x17260):Time=230:PVMediaOutputNodePort::HPA: PVMFInfoStartOfData
- Fmt=X-YUV-420
PVLOG:TID(0x17260):Time=230:PORT MediaOutIn(Video) In Msg De-Q'd MediaCmd
FmtId EOS, SeqNum 0, SId 0, TS 0, Q-depth 0/10
PVLOG:TID(0x17260):Time=230:PVMediaOutputNodePort::HPA - EOS Recvd -
StreamID=0, Seq=0, TS=0, Fmt=X-YUV-420, Qs=0
PVLOG:TID(0x17260):Time=230:PVMediaOutputNodePort::SendEndOfData -
AsyncWrite - Fmt=X-YUV-420, Seq=0, TS=0, ClnUpQSize=0
PVLOG:TID(0x17260):Time=230:PORT OMXVideoDecOut Msg Sent Directly MediaCmd
FmtId EOS, SeqNum 0, SId 0, TS 0, Q-depth 0/10
PVLOG:TID(0x17260):Time=231:PVMediaOutputNodePort::writeComplete For EOS -
Fmt=X-YUV-420
PVLOG:TID(0x17260):Time=234:PORT OMXVideoDecIn Disconnected
PVLOG:TID(0x17260):Time=234:PORT PVMFMP4FFParOut(Vide Disconnected
PVLOG:TID(0x17260):Time=234:PORT MediaOutIn(Video) Disconnected
PVLOG:TID(0x17260):Time=234:PORT OMXVideoDecOut Disconnected
PVLOG:TID(0x17260):Time=234:PORT MediaOutIn(Video) Deleted
PVLOG:TID(0x17260):Time=234:PORT OMXVideoDecIn Deleted
PVLOG:TID(0x17260):Time=234:PORT PVMFMP4FFParOut(Vide Deleted
PVLOG:TID(0x17260):Time=235:PORT OMXVideoDecOut Deleted
-- 
Thanks,
Andy




OpenCORE on Cupcake

2009-03-26 Thread Andy Quan
Hi,
I happened to find that OpenCORE on the cupcake tree is still an older
version rather than v2.x.
1. Will v2.x be merged into the cupcake tree soon?
2. If I simply replace the old folder with the v2.x external/opencore
folder, will it work?

-- 
Thanks,
Andy




Re: integrate OMX core in 2.0

2009-03-19 Thread Andy Quan
In my case the macro difference mattered at build time: with
LOCAL_STATIC_LIBRARIES the linker drops archive members whose symbols are
not referenced inside the shared library, which discards "dead" code such as
the OMX_* entry points that are only resolved later via dlsym.
LOCAL_WHOLE_STATIC_LIBRARIES links the archives with --whole-archive, so
every object file is kept. So I guess it would not help in your case if you
have already got the application running.
As for functionality failures, I guess it is most likely that the nodes or
the renderer are not working correctly.

2009/3/19 Henry sdch...@gmail.com


  What is the difference between LOCAL_STATIC_LIBRARIES and
  LOCAL_WHOLE_STATIC_LIBRARIES? Do you use the Marvell IPP lib? I am doing
  the same work now. I can get the audio of the video, but the video picture
  is wrong. Can you give me some information?

  On Mar 12, 11:11 am, Andy Quan androidr...@gmail.com wrote:
  Ravi,
  I have found the answer myself. It is because of the macro
  LOCAL_STATIC_LIBRARIES... I am now using LOCAL_WHOLE_STATIC_LIBRARIES
  instead. Thanks.
 
 
 
 
 
   On Wed, Mar 11, 2009 at 9:11 PM, rktb yend...@pv.com wrote:
 
   Which toolchain are your static libraries built from?
 
   On Mar 10, 9:15 pm, Andy Quan androidr...@gmail.com wrote:
    Hi,
    I am working on OMX core integration in the 2.0 framework. I used static
    linking before the 2.0 release, with exactly the same core and IL libs
    inside the framework, and they worked correctly. This time I basically
    followed the steps in the integration guide documentation. I use the
    Google toolchain to put all my prebuilt (non-Google-toolchain) static
    OMX libs into a single libOmx.so. But I am stuck on a problem: my
    libOmx.so can be dlopen'ed in the interface class, but none of the OMX
    core methods can be found by dlsym.

    The weird thing is that the symbols for the OMX core methods are not
    present in my libOmx.so when I try "grep OMX_Init libOmx.so"... But I
    can grep other symbols located in IL components, like the AAC OMX
    component, from libOmx.so. My guess is that the object file containing
    my core methods is somehow dropped during the static-to-dynamic lib
    conversion, and therefore cannot be grepped.

    The first listing below is my Android.mk generating libOmx.so. The
    second pushes my prebuilt (non-Google-toolchain) static libraries into
    the $out directory. The OMX core methods, i.e. OMX_xxx, are located in
    lib_il_basecore.a. Could anybody help with this? Thanks!

    --------------------------------------------------------------------
    LOCAL_PATH := $(call my-dir)
    include $(CLEAR_VARS)
    LOCAL_SRC_FILES := \
        src/OmxComponentRegistry.cpp
    LOCAL_CFLAGS := $(PV_CFLAGS)
    LOCAL_CFLAGS += -g
    LOCAL_ARM_MODE := arm
    LOCAL_STATIC_LIBRARIES := \
        lib_il_aacdec \
        lib_il_mp3dec \
        lib_il_h264dec \
        lib_il_mpeg4aspdec \
        lib_il_omxmem \
        lib_il_basecore \
        miscGen
    LOCAL_LDLIBS += -lpthread -ldl
    LOCAL_MODULE := libOmx
    LOCAL_C_INCLUDES := \
        $(PV_TOP)/extern_libs_v2/khronos/openmax/include \
        $(PV_INCLUDES)
    -include $(PV_TOP)/Android_platform_extras.mk
    -include $(PV_TOP)/Android_system_extras.mk
    include $(BUILD_SHARED_LIBRARY)
    include $(PV_TOP)/codecs_v2/omx/omx_vendor/lib/Android.mk

    --------------------------------------------------------------------
    LOCAL_PATH := $(call my-dir)
    $(call add-prebuilt-files, STATIC_LIBRARIES, lib_il_basecore.a)
    $(call add-prebuilt-files, STATIC_LIBRARIES, miscGen.a)
    $(call add-prebuilt-files, STATIC_LIBRARIES, lib_il_aacdec.a)
    $(call add-prebuilt-files, STATIC_LIBRARIES, lib_il_mp3dec.a)
    $(call add-prebuilt-files, STATIC_LIBRARIES, lib_il_h264dec.a)
    $(call add-prebuilt-files, STATIC_LIBRARIES, lib_il_mpeg4aspdec.a)
    $(call add-prebuilt-files, STATIC_LIBRARIES, lib_il_omxmem.a)
    --------------------------------------------------------------------
 
--
Thanks,
Andy
 
  --
  Thanks,
  Andy
  



-- 
Thanks,
Andy




how to dump logger info from PVAuthorEngineTest

2009-03-17 Thread Andy Quan
Hi,
I am running the native PV author engine test application in OpenCORE 2.0
on the emulator. I find that a pvaelogger.txt whose first uncommented line
is LOGTOFILE triggers the creation of pvauthorlog.txt in the same working
folder, but the file is empty... Could anybody give an example of the
correct pvaelogger.txt content that triggers dumping of PV logger
information from the various nodes during this test application?

-- 
Thanks,
Andy




Re: how to dump logger info from PVAuthorEngineTest

2009-03-17 Thread Andy Quan
Thanks, Ravi. It's working now.

On Wed, Mar 18, 2009 at 12:14 AM, rktb yend...@pv.com wrote:


 LOGTOFILE
 #ALLNODES;PVLOGMSG_DEBUG -- For all logging, uncomment this line
 #PVAuthorEngine;PVLOGMSG_DEBUG -- For only the engine logging,
 uncomment this line



 On Mar 17, 9:08 am, Andy Quan androidr...@gmail.com wrote:
  Hi,
  I am running native pv author engine test application in OpenCORE 2.0 on
  emulator. I find that pvaelogger.txt with 1st uncommented line as
  LOGTOFILE will trigger the creation of pvauthorlog.txt in the same
  working folder. But it is empty... Could anybody give an example of the
  correct content inside pvaelogger.txt which can trigger dumping PV
 logger
  information from various nodes during this test application?
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




dynamic lib link problem

2009-03-09 Thread Andy Quan
Hi all,
I may have a stupid question about how to link a prebuilt *.so into another
locally built *.so inside Android.

I use $(call add-prebuilt-files, SHARED_LIBRARIES, xx.so) to push my
prebuilt dynamic lib, and link against it with LOCAL_SHARED_LIBRARIES += xx.
The problem is that the build eventually searches $(out)/obj/lib instead of
$(out)/obj/SHARED_LIBRARIES and simply cannot find the pushed file. How can
I solve this problem? Thank you so much!

-- 
Thanks,
Andy




system_server during camera preview

2009-03-07 Thread Andy Quan
On real hardware, I find that CPU usage is really high during camera
preview: system_server consumes around 300 MHz when I open the camera
application from the launcher for preview at only 30 fps. Does anybody know
what system_server does during camera preview? I don't see system_server
spike in other applications such as the media player. Some materials mention
that it has something to do with Java. Could anybody help with this?

-- 
Thanks,
Andy




Re: system_server during camera preview

2009-03-07 Thread Andy Quan
Dave,
Do you mean that SurfaceFlinger is shown as system_server in the process
list?

Yes, you are right: I am doing software YUV-to-RGB conversion before calling
postBuffer. I had thought that system_server was limited to the Java level,
and that low-level work was accounted to processes like mediaserver or
surfaceflinger. :) Could you give some hints on this? Thanks.
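
For context on the cost: a software YUV420-to-RGB565 path touches every
pixel on the CPU each frame. A minimal per-pixel sketch (integer BT.601
approximation); at 640x480 and 30 fps this is roughly 9.2 million
conversions per second, which is why a software blit dominates profiles:

#include <cstdint>

static inline uint8_t clamp8(int v) {
    if (v < 0) return 0;
    if (v > 255) return 255;
    return (uint8_t)v;
}

uint16_t yuvToRgb565(uint8_t y, uint8_t u, uint8_t v) {
    int c = y - 16, d = u - 128, e = v - 128;
    uint8_t r = clamp8((298 * c + 409 * e + 128) >> 8);
    uint8_t g = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);
    uint8_t b = clamp8((298 * c + 516 * d + 128) >> 8);
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}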

On Sun, Mar 8, 2009 at 2:08 AM, Dave Sparks davidspa...@android.com wrote:


 Hmm... could that be SurfaceFlinger? I thought it ran in its own
 process. If so, it seems likely that you are using the software
 blitter instead of a hardware blitter. You didn't mention what hardware
 you are using.

 On Mar 7, 7:41 am, Andy Quan androidr...@gmail.com wrote:
  On real hardware, I happened to find that CPU usage is really high with
  system_server using around 300MHz when I opened camera application in
  the launcher panel for preview at 30 fps only. Anybody knows what that
  system_server is for during camera preview? I dont find that
 system_server
  roars in other applications like media player. Some materials mention
 that
  it has something to do with Java. Could anybody help on this?
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




Re: Announcing OpenCORE 2.1 release

2009-03-03 Thread Andy Quan
A question on AAC encoder OMX support:
- Added OMX AAC encoder support in the OMX encoder.

It seems the OMX wrapper for the AAC encoder was already there in the
OpenCORE 2.0 release, so is this only a slight modification of the wrapper
code? Also, are we going to get the encoder itself in the near future?
Thanks.

On Thu, Feb 26, 2009 at 10:27 PM, GregS sherw...@pv.com wrote:


 OpenCORE 2.1 has been released and is now available on the master
 branch.  There is a git tag v2.1 marking the version.  It contains a
 number of new features and improvements on top of the OpenCORE 2.0
 release that happened about 1 month ago.  Here's a link to the
 previous announcement in case you missed it:

 http://groups.google.com/group/android-framework/browse_thread/thread/92e10684c6f09e16
 .
 Below is a description of the changes that have been introduced since
 then as part of OpenCORE 2.1:


 New Features
 * OpenMAX Codec-related:
  - Introduced the OMXConfigParser API to help in determining which OpenMAX
    components can support the input bitstream. It is used to narrow the
    list of candidate OpenMAX components to be used for playback. See the
    OpenMAX Core Integration Guide document in the doc directory for more
    information.
  - Added OMX AAC encoder support in the OMX encoder.
  - Modified to use separate component roles for AMR-NB and AMR-WB as
    described in the OpenMAX IL spec version 1.1.2.
  - Added support for a new buffer format for H.264/AVC decode and encode
    to allow passing multiple NALs in a single buffer. The format uses the
    OMX_OTHER_EXTRADATA structure defined in section 4.2.33 of the OpenMAX
    IL spec version 1.1.2 to pass NAL lengths. See the OpenMAX Core
    Integration Guide document in the doc directory for more information.
 * Author-related:
  - Added support for authoring files with AAC audio.
  - Added support for authoring AMR-WB audio to MP4/3GP files and IETF
    storage format.
  - Added support for writing to an open file descriptor as an option
    instead of simply providing a filename. The file descriptor option is
    useful for cases where another process needs to open the file because
    of permissions.
 * Added large file support in OSCL (i.e., 64-bit file size/offset support)
   to handle files greater than 2 GiB on filesystems that support it.
 * Added rotation support in the 32-bit color-conversion class.

 Improvements
 * Removed dynamically loaded modules from the prelink map to avoid clutter
   and to make both the prelink map and loadable modules easier to manage.
   There may be an issue if a single instance of a process tries to load
   libraries not in the prelink map more than 256 times (see
   http://code.google.com/p/android/issues/detail?id=2042).
 * Updated the MP3 decoder to fix a security issue (oCERT_2009-002,
   CVE-2009-0475).
 * Renamed the OSCL config directory linux_nj to android to match the
   platform name. Replaced all references of nj with android in the
   codebase.
 * General security improvements found from static analysis in the
   following areas:
  - Buffer and type overruns and underruns
  - Null pointer references
 * Refactored the jitter buffer node into a more modular architecture for
   better support of different streaming use-cases and protocols.
 * Fixed an issue in the MP3 decoder when decoding very long durations
   (over 2 GiB of data).
 * General improvements found during 3GPP packet-switched streaming
   interoperability testing.
 * General improvements and resolution of issues found from module-level
   and engine (player, author, 2-way) level unit testing.

 New APIs / behaviors
 * Added support in the player engine to cancel a single pending command
   using the CancelCommand API. See the player engine API document for
   details.
 * Renumbered the author test cases to avoid issues with preprocessor
   conditionals changing the test numbers based on settings. Now the test
   numbers shouldn't change.
 * In the case of 3rd-party OMX components that support multiple roles, an
   error is returned if the component cannot set the role parameter.
 * OMX components need to explicitly set the nPortIndex parameter for all
   appropriate parameters.
 * Added a fix for buffering percentage notification in the streaming
   scenario (see https://review.source.android.com/Gerrit#change,8699).
 * Updated the OMX shared library build configuration to separate component
   registration from the component build.
 * Added methods in baselibs to serialize and deserialize UTF-16, UTF-16LE,
   and UTF-16BE strings.
 * Removed the iUseCPMPluginRegistry flag from the source data that was
   previously used to enable the content policy manager. Since the CPM
   plugins are dynamically loaded, the flag is not needed and was removed.
   See playerdriver.cpp for details of the change.



-- 
Thanks,
Andy


Re: OpenCORE capability

2009-02-24 Thread Andy Quan
I see... I shall look forward to that. :)

On Tue, Feb 24, 2009 at 9:46 PM, rktb yend...@pv.com wrote:


 Hi,

 Yes, I agree that OpenCORE might not be supporting various media
 formats. However, the framework is out in the open for anyone to add
 new modules/support. We are trying to furnish more documentation to
 ease the contributor's work. We definitely would appreciate feedback
 or reasonable requests for any further documentation.

 We do have more features coming up in our roadmap. I will check with
 folks internally to see what can be shared with you.

 -Ravi

  On Feb 24, 4:03 am, Andy Quan androidr...@gmail.com wrote:
   Hi Ravi (or any other PacketVideo engineer),
   I have a straightforward question about OpenCORE. In my understanding,
   and based on much of the feedback here, though OpenCORE is an awesome
   OMX solution, its capability of supporting various media formats is
   somewhat limited :) compared with other multimedia frameworks like
   GStreamer. It is not only a problem of codecs but also of some missing
   features in the parser components. I just want to know whether extending
   support for more media formats in the near future is on your roadmap.
   I'd appreciate a detailed heads-up if possible. Thank you.
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




Re: a/v sync in mediarecorder

2009-02-23 Thread Andy Quan
Dave,
Thanks for the information.
I'd like to know:

   1. Is it common practice to have no a/v sync in the camcorder use case?
   2. Do you mean that if there were sync, it would be Android rather than
   OpenCORE that should take care of it?
   3. Is making sync happen on your roadmap? I guess the monotonic system
   clock alone still brings no sync (see the sketch below).

Correct me if I have misunderstood anything.
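
As background for question 3, here is a minimal sketch (illustrative, not
the OpenCORE code) of the two independent clock bases described in the
quoted message below: audio timestamps derived from the captured sample
count, video timestamps from a system tick counter. Nothing reconciles
drift between the two:

#include <cstdint>

struct RecorderClocks {
    uint64_t audioSamples;  // samples captured so far
    uint32_t sampleRate;    // e.g. 8000 Hz

    // Audio MIO style: timestamp follows the sample count.
    uint64_t audioTsMs() const {
        return audioSamples * 1000ull / sampleRate;
    }

    // Camera MIO style: timestamp follows a tick-count offset.
    static uint64_t videoTsMs(uint64_t tickCount, uint64_t startTick,
                              uint32_t msPerTick) {
        return (tickCount - startTick) * msPerTick;
    }
};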

On Tue, Feb 24, 2009 at 7:06 AM, Dave Sparks davidspa...@android.com wrote:


 Since the typical use case for mobile is short clips, we haven't
 bothered with sync yet. We did recently make a change to have the
 video base its timestamp on the monotonic system clock, but I don't
 think that change is pushed out yet.

  On Feb 23, 8:14 am, Andy Quan androidr...@gmail.com wrote:
   Hi,
   Can anybody shed light on where I can find the a/v sync code in
   PVMediaRecorder? I see in opencore/android/ that the audio input takes
   the audio sample count as its timestamp, while the camera input uses an
   OsclTickCount offset as its reference. Is there supposed to be
   synchronization of these two clocks, as there is in the player? If not,
   are they meant to be two separate clocks in PV?
 
  --
  Thanks,
  Andy
 



-- 
Thanks,
Andy




Re: MP4 file open error

2009-02-12 Thread Andy Quan
Dave,
I have it tested both on real arm hardware and SDK r2 emulator. They are
opened with native Videoplayer as local file on sdcard. I traced the log
information and found they failed faraway before running into OMX decoding.
Some of them are of MPEG4 ASP but I dont think this matters as errors come
up in an early stage. As for the size, is there any limitation? We used to
have them as test streams for other MMF like GStreamer.

As I remember there are some discussions here last year about limitations of
MP4 container itself in PV framework, but I can not find them. Could you
please help? Correct me if I misunderstand anything.

-- 
Thanks,
Andy

On Thu, Feb 12, 2009 at 11:38 AM, Dave Sparks davidspa...@android.com wrote:


 There are limitations that may prevent a file from playing.

 What platform are you testing on?
 Is the streaming or local file playback?
 If streaming, HTTP or RTSP?
 What are the video and audio stream parameters (codec, frame size, bit
 rate, sample rate, etc)?

 On Feb 11, 7:24 pm, Andy Quan androidr...@gmail.com wrote:
  Hi,
  Is there any known issue on OpenCORE mp4 parser? I find that once a while
  some MP4 files fail before they are sent to decoding. However these files
  can be played in other MMF like GStreamer. Usually it fails in
 prepareAsync.
 
  --
  Thanks,
  Andy
 





Re: MP4 file open error

2009-02-12 Thread Andy Quan
Ravi,
Very clear explanation, thanks. I guess ASP still will not be supported in
the OpenCORE 2.0 release, right?

In my understanding, the bytes handled by the config parser are raw data
from the container. So are you expecting codec vendors to provide container
parsing capability, i.e. do you want that function to be provided by the
vendors themselves?
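
For reference, the profile that the config parser keys on is advertised at
the very start of the MPEG-4 video elementary stream: the byte following
the visual_object_sequence_start_code (0x000001B0) is
profile_and_level_indication (ISO/IEC 14496-2). A minimal sketch of reading
it; the profile ranges in the comment are from Annex G and should be
double-checked against the spec:

#include <cstdint>
#include <cstddef>

// Returns profile_and_level_indication, or -1 if no VOS header is found.
int mpeg4ProfileLevel(const uint8_t* data, size_t size) {
    for (size_t i = 0; i + 4 < size; ++i) {
        if (data[i] == 0x00 && data[i + 1] == 0x00 &&
            data[i + 2] == 0x01 && data[i + 3] == 0xB0) {
            return data[i + 4];
        }
    }
    return -1;
}

// Illustrative classification (verify against ISO/IEC 14496-2 Table G-1):
// 0x01-0x03, 0x08 -> Simple Profile; 0xF0-0xF5 -> Advanced Simple Profile.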


On 2/13/09, rktb yend...@pv.com wrote:


 Hi Andy,

 Your expectation is correct. But, there is a small deviation. Once the
 parser succeeds in parsing data, config data is passed to the OMX
 decoder node to verify whether the decoder can support it. To validate
 this, we have in place, something called the config parser which, in
 brief, parses the bitstream to check whether our OMX component can
 handle/support it. It is here that we see that the clip is advertising
 ASP, and the track is rejected.

 Ideally, this config parser should be provided by the decoder
 supplier so that when validating a XYZ_component, the corresponding
 XYZ_config_parser is used to verify whether a codec can support it or
 not. We had internal discussions and were planning on providing this
 API in the OpenCORE 2.0 time frame, but got delayed. If you have any
 suggestions regarding the same, we would be glad to discuss it with
 you.

 Regarding the specific clip, you can modify the video config parser to
 choose the tools that your codec supports and then use it to verify
 the functionality of your codec.

 -Ravi

 On Feb 12, 9:54 pm, Andy Quan androidr...@gmail.com wrote:
   Ravi,
   Your information is very helpful, but I don't understand this: if the
   parser supports ASP, then I would expect to see at least some activity
   in the OMX pipeline. So is it because the parser recognized ASP and
   refused to go on? An equivalent question: if I replace the OMX component
   with an ASP-capable one, can I get that file played? My experiment
   showed that it still did not work.
 
  On 2/13/09, rktb yend...@pv.com wrote:
 
 
 
 
 
   Hi Andy,
 
   On Feb 12, 7:32 pm, Andy Quan androidr...@gmail.com wrote:
Ravi,
Thank you! Let me make this clear: Is it the parser or OMX of
 OpenCORE
   that
can not handle MPEG4 ASP?
   It is the codec that is not capable of handling the ASP.
 
1. And if OpenCORE has a limitation of media contents support, no
 matter
   it
is within parser or OMX, I'd like to know where I can get a
 full  feature
list information.
   I don't have a ready list of supported feature set or tools that the
   codecs can support. I shall try to collect it and post the details.
 
2. If there is no such kind of list, are there any known issues
 regarding
   to
media format/profile/contents support in the git code?
   I did not get this question.
 
     I think a big problem for many Android developers here is that we get
     "video can't be played" information but never know why. Better errors
     would help reduce many questions here.
 
   Yes. We acknowledge this issue. We are trying to come up with more
   meaningful errors in future versions.
 
Many thanks again for your help!
--
Thanks,
Andy
 
On 2/13/09, rktb yend...@pv.com wrote:
 
 It looks like the clip is not simple profile and that's the reason
 we
 reject it.
 
 -Ravi
 
 On Feb 12, 9:41 am, Andy Quan androidr...@gmail.com wrote:
   Test stream is attached as well. Thank you!
 
  I/ActivityManager( 44): Starting activity: Intent {
  action=android.intent.acti
 
  on.VIEW data=content://media/external/video/media/18
 type=video/mp4
 comp={
  com.an droid.music/com.android.music.MovieView} }
 
  W/SensorService( 44): could not enable sensor 2
 
  I/MediaPlayer-JNI( 171): prepareAsync: surface=0x1b0698 (id=1)
 
  V/VideoMIO( 31): AndroidAudioSurfaceOutput surface=0x1c6a8
 
  W/PlayerDriver( 31): PVMFInfoErrorHandlingComplete
 
  I/ActivityManager( 44): Displayed activity
   com.android.music/.MovieView:
 236
  ms
 
  E/MediaPlayer( 171): Error (-1,0)
 
  D/VideoView( 171): Error: -1,0
 
  On 2/12/09, rktb yend...@pv.com wrote:
 
   Can you provide us the actual content that you are using so it
 is
   easier to see what is going wrong instead of speculating?
 
   What log did you trace? Can you share that log?
 
   -Ravi
 
   On Feb 12, 8:05 am, Andy Quan androidr...@gmail.com wrote:
Dave,
I have it tested both on real arm hardware and SDK r2
 emulator.
   They
 are
opened with native Videoplayer as local file on sdcard. I
 traced
   the
 log
information and found they failed faraway before running into
 OMX
   decoding.
Some of them are of MPEG4 ASP but I dont think this matters
 as
   errors
   come
up in an early stage. As for the size, is there any
 limitation?
   We
 used
   to
have them as test streams for other MMF like GStreamer.
 
As I remember there are some discussions here last year about
 limitations

MP4 file open error

2009-02-11 Thread Andy Quan
Hi,
Is there any known issue with the OpenCORE MP4 parser? I find that once in
a while some MP4 files fail before they are sent for decoding. However,
these files can be played by other multimedia frameworks like GStreamer.
Usually the failure occurs in prepareAsync.


-- 
Thanks,
Andy




audio flinger performance

2009-02-08 Thread Andy Quan
Hi,
I am working on Android multimedia on an ARMv5TE-compatible chip with
hardware SIMD acceleration (CPU 600 MHz, 32K I$/D$, no L2$, 130 MHz external
DDR, 128 MB DRAM). With some performance calibration code, I find that the
audio flinger thread costs around 50-60 MHz when the music player is playing
any 44-48 kHz stereo audio (single track). This is much higher than my
expectation (20-30 MHz).

In my understanding, the major costs in audio flinger should be resampling,
mixing, ALSA rendering, and some data copying.
1. I have confirmed that my test case does not require resampling; that is,
a simple mix-and-render path is taken.
2. I have confirmed that on this platform ALSA rendering does not need as
many cycles as I observed.

Can anybody help me understand why this audio flinger thread costs so many
MHz? Or did I misunderstand anything? I guess I have not found the
bottleneck of this audio flinger. :)

BTW, my code base dates from the end of last year, so I am not sure whether
there has been any performance-related update in the current release.
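
As a rough sanity check, here is a minimal sketch of the per-sample work a
single-track 16-bit mix involves (Q12 volume multiply, accumulate,
saturate). At 44.1 kHz stereo that is about 88 K samples per second; even
at tens of cycles per sample this is only a few MHz, so the 50-60 MHz
observed presumably comes from elsewhere (copies, cache misses, or an
unexpected resample path):

#include <cstdint>
#include <cstddef>

static inline int16_t clamp16(int32_t s) {
    if (s > 32767) return 32767;
    if (s < -32768) return -32768;
    return (int16_t)s;
}

// Mix one track into a 32-bit accumulator with Q12 fixed-point volume.
void mixTrack(const int16_t* in, int32_t* accum, size_t n, int16_t volQ12) {
    for (size_t i = 0; i < n; ++i)
        accum[i] += (in[i] * volQ12) >> 12;
}

// Saturate the accumulator down to the 16-bit output buffer.
void downmixToOutput(const int32_t* accum, int16_t* out, size_t n) {
    for (size_t i = 0; i < n; ++i)
        out[i] = clamp16(accum[i]);
}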

-- 
Thanks,
Andy




Re: audio flinger performance

2009-02-08 Thread Andy Quan
Bas,
Thank you for your information. Could you please let me know where is the
allocation of 24 buffers in mediaplayer service? I can't find it... :(


On 2/9/09, baskar rajagopal baskars...@gmail.com wrote:

 Hi Andy,

 check the below

 1. How much memory are you allocating for a client? In the original code
 they might allocate 1 MB of memory in the client heap, i.e. for 32 tracks
 with 8 buffers each of 4 KB. If you are using only one track you can
 reduce this memory allocation.

 2. In the current design, even though 8 buffers of 4 KB per track are
 mentioned, the mediaplayer service makes it 24 buffers of 4 KB per track
 (this is for the emulator); if you are building for a target, you can make
 it 8 buffers per track.


 regards,
 bas.


  On Sun, Feb 8, 2009 at 4:34 PM, Andy Quan androidr...@gmail.com wrote:

  Hi,
  I am working on Android multimedia on an ARMv5TE-compatible chip with
  hardware SIMD acceleration (CPU 600 MHz, 32K I$/D$, no L2$, 130 MHz
  external DDR, 128 MB DRAM). With some performance calibration code, I
  find that the audio flinger thread costs around 50-60 MHz when the music
  player is playing any 44-48 kHz stereo audio (single track). This is much
  higher than my expectation (20-30 MHz).

  In my understanding, the major costs in audio flinger should be
  resampling, mixing, ALSA rendering, and some data copying.
  1. I have confirmed that my test case does not require resampling; that
  is, a simple mix-and-render path is taken.
  2. I have confirmed that on this platform ALSA rendering does not need as
  many cycles as I observed.

  Can anybody help me understand why this audio flinger thread costs so
  many MHz? Or did I misunderstand anything? I guess I have not found the
  bottleneck of this audio flinger. :)

  BTW, my code base dates from the end of last year, so I am not sure
  whether there has been any performance-related update in the current
  release.

 --
 Thanks,
 Andy

 



-- 
Thanks,
Andy
