Hi Core Audio experts!

I'm currently developing a professional, research-oriented audio application,
but I have questions about moving the playhead (a seek function).

The application itself will be written in Swift, but following Chris Adamson's
advice I decided to build all of the Core Audio code in C/C++.

The application will have one or more sources that the user can mix down to a
single output, and I will use only custom DSP code.
So I thought of building an AUGraph like this:


FileAU-1 --------------> |            |
...                      |            |
                         | Mixer unit |
FileAU-n --------------> |            |
RenderCallback --------> |            |


Where:
- each 'FileAU' would be a source generator playing audio from disk, using
ExtAudioFile and the properties
'kAudioUnitProperty_ScheduledFileIDs',
'kAudioUnitProperty_ScheduledFileRegion',
'kAudioUnitProperty_ScheduledFilePrime' and
'kAudioUnitProperty_ScheduleStartTimeStamp' (a scheduling sketch follows the list below);

- the render callback would be my custom DSP processing (I use a personal
library in which I can chain several DSP units, so a single render callback
with the right DSP-unit initialization can do the job); see the graph sketch
just after this list.
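
For reference, here is a minimal sketch, in plain C, of how I picture building
that graph. 'MyRenderCallback' and 'myState' are stand-ins for my DSP library's
entry point and state (hypothetical names), and I use Apple's file player,
multichannel mixer and default output units; error checking is omitted:

#include <AudioToolbox/AudioToolbox.h>

/* Hypothetical DSP entry point: standard AURenderCallback signature,
   filling ioData with inNumberFrames frames from my own DSP chain. */
static OSStatus MyRenderCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    /* ... run my DSP units and write into ioData ... */
    return noErr;
}

static void BuildGraph(AUGraph *outGraph, AudioUnit *outFileAU, void *myState)
{
    AudioComponentDescription fileDesc  = { kAudioUnitType_Generator,
                                            kAudioUnitSubType_AudioFilePlayer,
                                            kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription mixerDesc = { kAudioUnitType_Mixer,
                                            kAudioUnitSubType_MultiChannelMixer,
                                            kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription outDesc   = { kAudioUnitType_Output,
                                            kAudioUnitSubType_DefaultOutput, /* RemoteIO on iOS */
                                            kAudioUnitManufacturer_Apple, 0, 0 };

    AUGraph graph;
    AUNode fileNode, mixerNode, outNode;
    NewAUGraph(&graph);
    AUGraphAddNode(graph, &fileDesc,  &fileNode);
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphAddNode(graph, &outDesc,   &outNode);
    AUGraphOpen(graph);

    /* FileAU -> mixer input bus 0, mixer -> output.
       (If more input buses are needed, set kAudioUnitProperty_ElementCount
       on the mixer's input scope before initializing.) */
    AUGraphConnectNodeInput(graph, fileNode,  0, mixerNode, 0);
    AUGraphConnectNodeInput(graph, mixerNode, 0, outNode,   0);

    /* my custom DSP feeds another mixer input bus through a render callback */
    AURenderCallbackStruct cb = { MyRenderCallback, myState };
    AUGraphSetNodeInputCallback(graph, mixerNode, 1, &cb);

    /* keep the file player AudioUnit around for scheduling, then initialize;
       playback is started later with AUGraphStart(graph) */
    AUGraphNodeInfo(graph, fileNode, NULL, outFileAU);
    AUGraphInitialize(graph);

    *outGraph = graph;
}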

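And here is roughly how I set those four scheduling properties on each FileAU
(again only a sketch, error checking omitted; 'fileAU' comes from the graph
sketch above and 'fileURL' is a CFURLRef for the file on disk). One detail:
these properties take an AudioFileID from the AudioFile API rather than an
ExtAudioFileRef, so I open the file with AudioFileOpenURL here:

/* Open the file on disk; the Scheduled* properties expect an AudioFileID. */
AudioFileID audioFile;
AudioFileOpenURL(fileURL, kAudioFileReadPermission, 0, &audioFile);

/* 1. tell the file player which file it will play */
AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFileIDs,
                     kAudioUnitScope_Global, 0, &audioFile, sizeof(audioFile));

/* 2. schedule a region: here the whole file, from its first frame */
ScheduledAudioFileRegion region = {0};
region.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
region.mTimeStamp.mSampleTime = 0;      /* position in the player's own timeline */
region.mAudioFile     = audioFile;
region.mLoopCount     = 0;
region.mStartFrame    = 0;              /* first file frame to play (the playhead) */
region.mFramesToPlay  = (UInt32)-1;     /* play to the end of the file */
AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFileRegion,
                     kAudioUnitScope_Global, 0, &region, sizeof(region));

/* 3. prime (preload) the player; 0 means the default prime amount */
UInt32 primeFrames = 0;
AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFilePrime,
                     kAudioUnitScope_Global, 0, &primeFrames, sizeof(primeFrames));

/* 4. start on the next render cycle (-1 sample time = "now") */
AudioTimeStamp startTime = {0};
startTime.mFlags = kAudioTimeStampSampleTimeValid;
startTime.mSampleTime = -1;
AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduleStartTimeStamp,
                     kAudioUnitScope_Global, 0, &startTime, sizeof(startTime));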

My questions are:

1°) Will AUGraph, ExtAudioFileRef, AudioUnit and all the other low-level APIs
be deprecated, and do I need to use the AV layer (AVAudioEngine, AVAudioUnit
and so on) instead? If not, what are the differences between AUGraph and
AVAudioEngine? They look very similar to me.

2°) How can I perform a seek in this setup? I cannot manage to do something as
simple as a seek. Is it possible to do a global seek using only the AUGraph?
Or must I use a ring buffer fed from a render callback, as in a few examples I
have seen, even though my design doesn't currently use one?
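
To make question 2 concrete: is the expected per-player seek something like
the sequence below, i.e. reset the FileAU and reschedule a region whose
mStartFrame is the new playhead position? (This is only a guess of mine,
reusing the names from the sketches above; I am not sure it is the right
approach, which is exactly what I am asking.)

/* My guess at a seek: flush the player, then reschedule from the new frame. */
static OSStatus SeekFilePlayer(AudioUnit fileAU, AudioFileID audioFile,
                               SInt64 newStartFrame)
{
    OSStatus err = AudioUnitReset(fileAU, kAudioUnitScope_Global, 0);
    if (err != noErr) return err;

    ScheduledAudioFileRegion region = {0};
    region.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
    region.mTimeStamp.mSampleTime = 0;
    region.mAudioFile    = audioFile;
    region.mStartFrame   = newStartFrame;    /* new playhead, in file frames */
    region.mFramesToPlay = (UInt32)-1;       /* play to the end of the file */
    err = AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFileRegion,
                               kAudioUnitScope_Global, 0, &region, sizeof(region));
    if (err != noErr) return err;

    UInt32 primeFrames = 0;
    err = AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFilePrime,
                               kAudioUnitScope_Global, 0, &primeFrames, sizeof(primeFrames));
    if (err != noErr) return err;

    AudioTimeStamp startTime = {0};
    startTime.mFlags = kAudioTimeStampSampleTimeValid;
    startTime.mSampleTime = -1;              /* resume on the next render cycle */
    return AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduleStartTimeStamp,
                                kAudioUnitScope_Global, 0, &startTime, sizeof(startTime));
}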