Briefly, one reason you're probably not seeing much response to this
is that, for support reasons, TI is encouraging that I/O take place via
ARM-side device drivers.  This is to avoid things like ARM/DSP
peripheral ownership collisions - e.g. what happens if someone reads
from /dev/dsp on the ARM at the same time your DSP-side driver is
reading from it?  (I realize you can constrain your system to avoid
this.)

Codec Engine itself doesn't support I/O as a side effect of calling its
"VISA" APIs.  That is, when an ARM-side application calls, for example,
AUDENC_process(), it's a blocking call that returns the encoded data
back to the ARM-side app.  No I/O occurs within AUDENC_process().
There's nothing like an "AUDENC_readAndEncodeNextBlockOfData()" API,
which is what your diagram may be suggesting.
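
For reference, a call to the ARM-side VISA audio-encode API looks
roughly like the sketch below.  This is from memory, not from the demo
sources, so check the headers in your Codec Engine release for the
exact types and return codes:

    #include <xdc/std.h>
    #include <ti/sdo/ce/audio/audenc.h>

    /* The application has already filled inBufs with raw PCM that it
     * acquired itself (e.g. by read()ing /dev/dsp); outBufs receives
     * the encoded bitstream.  AUDENC_process() blocks until the remote
     * DSP-side codec returns -- it performs no device or file I/O.    */
    static XDAS_Int32 encodeFrame(AUDENC_Handle enc,
                                  XDM_BufDesc *inBufs,
                                  XDM_BufDesc *outBufs,
                                  AUDENC_InArgs *inArgs,
                                  AUDENC_OutArgs *outArgs)
    {
        return AUDENC_process(enc, inBufs, outBufs, inArgs, outArgs);
    }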

If you _really_ need to do I/O on the DSP side, you might want to
consider using your own DSP Link-based framework to move data from the
ARM to the DSP or vice versa.  Note that you can still use Codec Engine
APIs _on the DSP side_ to invoke "local" DSP-side codecs.  This is not
the traditional use case exemplified in the demos, but it can be
supported by the software in the DVEVM/DVSDK.  The data flow might look
like this for a "player" (a rough sketch of the DSP-side task follows
the list):

   1. ARM-side app reads encoded data (from disk?)
   2. ARM-side app uses Link APIs (like MSGQ_put) to send data from ARM
to DSP.
   3. DSP-side app uses Link APIs (like MSGQ_get) to receive data from
ARM.
   4. DSP-side app uses CE APIs (like AUDDEC_process) on a locally
configured codec to decode data.
   5. DSP-side app plays decoded data out using a DSP-side device
driver.
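
A very rough sketch of the DSP-side task (steps 3-5) is below.  The
MSGQ and VISA calls are the real DSP/BIOS and Codec Engine APIs, but
the message layout, queue handle, and the playback helper are
hypothetical placeholders - check the DSP Link and CE documentation in
your release for the exact signatures.  Steps 1-2 on the ARM mirror
this with the GPP-side DSP Link calls (MSGQ_alloc()/MSGQ_put()):

    #include <std.h>                       /* DSP/BIOS types              */
    #include <sys.h>                       /* SYS_OK, SYS_FOREVER         */
    #include <msgq.h>                      /* DSP/BIOS MSGQ module        */
    #include <ti/sdo/ce/audio/auddec.h>    /* DSP-side ("local") VISA API */

    typedef struct AudioMsg {              /* hypothetical app message    */
        MSGQ_MsgHeader header;             /* required MSGQ header field  */
        XDAS_Int8     *encBuf;             /* encoded data from the ARM   */
        XDAS_Int32     encBytes;
    } AudioMsg;

    static Void playerTask(MSGQ_Queue dspQueue, AUDDEC_Handle dec,
                           XDM_BufDesc *inBufs, XDM_BufDesc *outBufs,
                           AUDDEC_InArgs *inArgs, AUDDEC_OutArgs *outArgs)
    {
        AudioMsg *msg;

        for (;;) {
            /* 3. receive the buffer the ARM sent with MSGQ_put()        */
            if (MSGQ_get(dspQueue, (MSGQ_Msg *)&msg, SYS_FOREVER) != SYS_OK)
                break;

            inBufs->bufs[0]     = msg->encBuf;
            inBufs->bufSizes[0] = msg->encBytes;

            /* 4. decode with a locally configured codec via the CE API   */
            AUDDEC_process(dec, inBufs, outBufs, inArgs, outArgs);

            /* 5. hand the PCM to your DSP-side driver (e.g. an IOM/SIO   */
            /*    McBSP+AIC33 driver you write or port) -- placeholder:   */
            /* playPcm(outBufs->bufs[0], ...);                            */

            MSGQ_free((MSGQ_Msg)msg);      /* return msg to its pool      */
        }
    }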

This will be more work, and I don't know of any examples of it, but
it's possible.  I'd encourage you to revisit your requirement that I/O
occur on the DSP side - if it can take place on the ARM, I think you're
well enabled.

A few direct replies are also inlined below...

Chris  

> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Andy Ngo
> Sent: Thursday, December 21, 2006 5:53 PM
> To: [email protected]
> Subject: Re: Sample AIC33 loopback test
> 
> In trying to figure out the topic I posted below, I searched 
> through the DaVinci audio driver (in
> /opt/mv_pro_4.0/montavista/pro/devkit/lsp/ti-davinci/sound/oss)
> and noticed that it doesn't use
> any CE API.  Does this mean all the audio processing is done 
> on the ARM side?

Audio I/O is done on the ARM (e.g. this OSS driver).  An app
acquires/plays raw data using these drivers.
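
For example, an app typically opens and configures the OSS device
along these lines (standard OSS ioctls; 8 kHz / mono / 16-bit shown to
match the rates you mention):

    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <sys/soundcard.h>

    /* Open /dev/dsp for capture and configure 8 kHz, mono, 16-bit. */
    int openCapture(void)
    {
        int fd = open("/dev/dsp", O_RDONLY);
        int fmt = AFMT_S16_LE, ch = 1, rate = 8000;

        if (fd < 0)
            return -1;
        ioctl(fd, SNDCTL_DSP_SETFMT, &fmt);
        ioctl(fd, SNDCTL_DSP_CHANNELS, &ch);
        ioctl(fd, SNDCTL_DSP_SPEED, &rate);
        return fd;          /* read() on this fd returns raw PCM */
    }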

Audio processing is typically done on the DSP: the same application
uses the CE APIs to encode or decode a buffer of data.

>  The project
> I'm working on requires me to offload all the audio 
> processing (reading samples in and writing
> samples out) to the DSP side; the ARM side is reserved for a 
> very critical real-time application
> so I need to do all audio processing on the DSP side.  Does 
> this mean I can't use the Linux
> driver for my requirements?

Where does the encoded data live?  On an ARM-side disk drive or device?
If so, you'll have to do at least half the I/O (reading/writing the
encoded data to/from disk) on the ARM.

> How do the demo programs do it; 
> for example for the g711
> encoding, does it use the ARM side to sample the audio (via 
> the audio driver), send it via CE 
> API to the DSP side to encode the data into g711 and the 
> resulting encoded file is sent back 
> to the ARM side and saved on the filesystem?  If that's the 
> case, for g711 decoding, the
> ARM side gives the encoded data to the DSP side via CE API, 
> which decodes it and gives
> it back to the ARM side to play the decoded audio via the audio 
> driver; am I correct?

Exactly.
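
To make that concrete, the ARM-side "player" half might look roughly
like this (a sketch only - buffer allocation, Engine_open(), and
AUDDEC_create() are omitted; for G.711 the decoded 16-bit PCM is
simply twice the size of the encoded 8-bit block):

    #include <unistd.h>
    #include <ti/sdo/ce/audio/auddec.h>

    /* Read encoded G.711 from a file, decode it on the DSP via the
     * VISA call, and play the returned PCM through the OSS driver.  */
    static void playG711(AUDDEC_Handle dec, int encFd, int dspFd,
                         XDM_BufDesc *inBufs, XDM_BufDesc *outBufs,
                         AUDDEC_InArgs *inArgs, AUDDEC_OutArgs *outArgs,
                         int blockBytes)
    {
        ssize_t n;

        while ((n = read(encFd, inBufs->bufs[0], blockBytes)) > 0) {
            inBufs->bufSizes[0] = n;

            /* blocking call: encoded data goes to the DSP, decoded
             * PCM comes back in outBufs -- no I/O inside the call   */
            if (AUDDEC_process(dec, inBufs, outBufs, inArgs, outArgs)
                    != AUDDEC_EOK)
                break;

            /* G.711 expands 8-bit samples to 16-bit linear PCM */
            write(dspFd, outBufs->bufs[0], 2 * n);
        }
    }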

> For my
> project, the ARM side is only used to exchange encoded data 
> with the DSP side, which does
> all the processing (sample and encode, decode and playback).  
> Any advice/comment is
> appreciated!
> 
> Regards,
> Andy
> 
> ----- Original Message ----
> From: Andy Ngo <[EMAIL PROTECTED]>
> To: [email protected]
> Sent: Thursday, December 21, 2006 2:49:10 PM
> Subject: Sample AIC33 loopback test
> 
> 
> Hi, 
> 
> Does anyone have or know if there's a sample DSP loopback 
> program that samples the AIC33 mic input (like say 8Khz, mono, 16-bit)
> and then plays it through the AIC33 headphone output? I know 
> I can do this on the ARM side by accessing the /dev/dsp device, but I
> need to access the mic input and headphone output directly 
> from the DSP side. Basically, I want to create a DSP codec 
> that reads from
> AIC33 mic input and sends the data to the ARM side (via CE 
> API) and vice versa: gets data from the ARM side and writes 
> it to the AIC33
> headphone output. I looked at the encode/decode examples in 
> the DVSDK but still don't know how to access or make use of the
> AIC33.  Sorry, I'm very new to the TI DSP development 
> environment and don't know much; I spent the last week just 
> learning about
> xDAIS and DSP/BIOS by going through the CCS 3.2 tutorial.  
> Your help is much appreciated.  Thanks.
> 
> 
> Basic flow diagram: 
> 
> DSP side 
> 
> AIC33 headphone output  <------ DSP codec <------ CE API <------ simple ARM app
> 
> AIC33 microphone input  ------> DSP codec ------> CE API ------> simple ARM app
> 
> 
> Regards,
> Andy Ngo
_______________________________________________
Davinci-linux-open-source mailing list
[email protected]
http://linux.davincidsp.com/mailman/listinfo/davinci-linux-open-source
