Re: AUGraph deprecation

2018-07-15 Thread Paul Sanders
Interesting, seems the post Greg quotes got pulled. I wonder why. I'd 
have deleted the lot.


Anyway, AQ, I remember you; you're always pulling stunts like this, and 
it just clutters up the board. My only question (and I suggest you 
go away and think long and hard about this) is **why**?


Paul Sanders, occasional poster.


On 15/07/2018 07:17, Greg Weston wrote:



*I'm* out of line?


Yes. Very much so. Take it to a Usenet advocacy group, where ignorant 
and useless arguments are the norm.



...


___
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list  (Coreaudio-api@lists.apple.com)
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/coreaudio-api/archive%40mail-archive.com

This email sent to arch...@mail-archive.com


Re: AUGraph deprecation

2018-07-15 Thread Greg Weston

> *I'm* out of line?

Yes. Very much so. Take it to a Usenet advocacy group, where ignorant and 
useless arguments are the norm.

> (And again, why doesn't this *ever* happen in Microsoft's ecosystem?)

Speaking as someone who has been a Windows developer for 25 years, it 
absolutely *does* happen in the Windows ecosystem.

https://docs.microsoft.com/en-us/windows/desktop/shell/deprecated-api
https://msdn.microsoft.com/en-us/library/bb546234.aspx
https://docs.microsoft.com/en-us/windows/desktop/apiindex/windows-api-list#deprecated-or-legacy-apis

Those are a small subset of examples. In fact, since you specifically mentioned 
1995, I could point out that in 1995 Win95 and WinNT were both current, 
supported platforms that couldn't reliably run each other's software, because 
each implemented a different portion of the Win32 spec and neither's was a 
superset of the other's.




Re: AUGraph deprecation

2018-07-14 Thread Vieira Damiani, Luis F
Let us please watch language and tone!

On Jul 13, 2018, at 4:25 PM, Admiral Quality <a...@admiralquality.com> wrote:

And you should strongly consider not being planned obsolescence
flogging asshole hypocrites.

- AQ

On Fri, Jul 13, 2018 at 4:22 PM, Doug Wyatt <dwy...@apple.com> wrote:


On Jul 13, 2018, at 13:17, Admiral Quality <a...@admiralquality.com> wrote:

Meanwhile we can still run Windows software written in 1995 on their latest OS.

I can't wait to see you people go out of business.

- AQ

Deprecation doesn't mean removal.

Carbon components have been deprecated for many years but still work.

Deprecation just functions as a warning that you should strongly consider 
another API in new code, and a deterrent to complaints about missing features. 
For example, Audio Unit extensions are not supported by AUGraph, and won't be 
(couldn't be, without new API).

Doug




Re: AUGraph deprecation

2018-07-13 Thread Benjamin Federer
Thanks for the info, Doug. That takes the pressure off a bit. The general 
feeling seems to be that AVAudioEngine is on a good path but "not quite there" 
yet, so more feature parity sounds exactly right.

Could you guys put some more info up about 
real time manual rendering mode? Unless it’s there and I’m just too blind to 
find it, of course. 

Benjamin 

> On 13.07.2018 at 22:15, Doug Wyatt wrote:
> 
> I don't think we ever really knew when the deprecation would be official. There 
> are still a few things we want to add to AVAudioEngine for more feature parity, 
> and we don't want folks to be seeing warnings until that's done.
> 
> I can say that there is now an availability mechanism so that we can mark 
> APIs as "to be deprecated", and we have marked up AUGraph with that, to 
> appear ... someday.
> 
> Doug
> 
> 
>> On Jun 8, 2018, at 7:59 , Benjamin Federer  wrote:
>> 
>> Last year at WWDC it was announced that AUGraph would be deprecated in 2018. 
>> I just browsed the documentation 
>> (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major)
>>  but found 
>> Audio Unit Processing Graph Services not marked for deprecation. The AUGraph 
>> header files rolled out with Xcode 10 beta also have no mention of a 
>> deprecation in 10.14. I searched for audio-specific sessions at this year’s 
>> WWDC but wasn’t able to find anything relevant. Has anyone come across new 
>> information regarding this?
>> 
Judging by how many changes and features Apple seems to be holding back 
until next year, I dare ask: has the AUGraph deprecation been moved to a 
later date?
>> 
>> Benjamin
> 


Re: AUGraph deprecation

2018-07-13 Thread Admiral Quality
*I'm* out of line? Just... wow. And oh no, I might get removed from
this list and (maybe, if I care to bother) have to sign up again from
a different email? Horror of horrors!

We don't all play the forced-upgrade game. There are some honest
businesses left out here, and we don't make any money from this extra
work you force on us for no good reason other than to milk
customers into buying the same products over and over again. (And again,
why doesn't this *ever* happen in Microsoft's ecosystem?)

Anyway, the public is catching on, finally. Apple don't have much more
time left if they don't smarten up soon. With Jobs thankfully dead
you'd think that would have happened already, but apparently he
wasn't the only c-word higher up there. Your corporate culture is
poison and rotting from the head down.

https://www.youtube.com/watch?v=AUaJ8pDlxi8

https://www.youtube.com/watch?v=UVzigC33sGc

https://www.youtube.com/watch?v=qNghvlAqzaA

You know I'm right. All you Apple employees and remaining loyal
fanboys should be ashamed of yourselves.

Repent, before Darwin gets you! (And I don't mean the kernel code you stole.)

- AQ


On Fri, Jul 13, 2018 at 4:28 PM, Chris Adamson  wrote:
> Dude, seriously, chill. You’re way, way out of line with the Terms of Use,
> and basic professional behavior.
>
> https://discussions.apple.com/docs/DOC-5952
>
> Be polite. Everyone should feel comfortable reading Submissions and
> participating in discussions. Apple will not tolerate flames or other
> inappropriate statements, material, or links. Most often, a "flame" is
> simply a statement that is taunting and thus arbitrarily inflammatory.
> However, this also includes those which are libelous, defamatory, indecent,
> harmful, harassing, intimidating, threatening, hateful, objectionable,
> discriminatory, abusive, vulgar, obscene, pornographic, sexually explicit,
> or offensive in a sexual, racial, cultural, or ethnic context
>
>
> —Chris
>
>
> On Jul 13, 2018, at 4:25 PM, Admiral Quality  wrote:
>
> And you should strongly consider not being planned obsolescence
> flogging asshole hypocrites.
>
> - AQ
>
> On Fri, Jul 13, 2018 at 4:22 PM, Doug Wyatt  wrote:
>
>
>
> On Jul 13, 2018, at 13:17 , Admiral Quality  wrote:
>
> Meanwhile we can still run Windows software written in 1995 on their latest
> OS.
>
> I can't wait to see you people go out of business.
>
> - AQ
>
>
> Deprecation doesn't mean removal.
>
> Carbon components have been deprecated for many years but still work.
>
> Deprecation just functions as a warning that you should strongly consider
> another API in new code, and a deterrent to complaints about missing
> features. For example, Audio Unit extensions are not supported by AUGraph,
> and won't be (couldn't be, without new API).
>
> Doug
>
>


Re: AUGraph deprecation

2018-07-13 Thread Chris Adamson
Dude, seriously, chill. You’re way, way out of line with the Terms of Use, and 
basic professional behavior.

https://discussions.apple.com/docs/DOC-5952 


> Be polite. Everyone should feel comfortable reading Submissions and 
> participating in discussions. Apple will not tolerate flames or other 
> inappropriate statements, material, or links. Most often, a "flame" is simply 
> a statement that is taunting and thus arbitrarily inflammatory. However, this 
> also includes those which are libelous, defamatory, indecent, harmful, 
> harassing, intimidating, threatening, hateful, objectionable, discriminatory, 
> abusive, vulgar, obscene, pornographic, sexually explicit, or offensive in a 
> sexual, racial, cultural, or ethnic context


—Chris


> On Jul 13, 2018, at 4:25 PM, Admiral Quality  wrote:
> 
> And you should strongly consider not being planned obsolescence
> flogging asshole hypocrites.
> 
> - AQ
> 
> On Fri, Jul 13, 2018 at 4:22 PM, Doug Wyatt  wrote:
>> 
>> 
>>> On Jul 13, 2018, at 13:17 , Admiral Quality  wrote:
>>> 
>>> Meanwhile we can still run Windows software written in 1995 on their latest 
>>> OS.
>>> 
>>> I can't wait to see you people go out of business.
>>> 
>>> - AQ
>> 
>> Deprecation doesn't mean removal.
>> 
>> Carbon components have been deprecated for many years but still work.
>> 
>> Deprecation just functions as a warning that you should strongly consider 
>> another API in new code, and a deterrent to complaints about missing 
>> features. For example, Audio Unit extensions are not supported by AUGraph, 
>> and won't be (couldn't be, without new API).
>> 
>> Doug
>> 



Re: AUGraph deprecation

2018-07-13 Thread Admiral Quality
And you should strongly consider not being planned obsolescence
flogging asshole hypocrites.

- AQ

On Fri, Jul 13, 2018 at 4:22 PM, Doug Wyatt  wrote:
>
>
>> On Jul 13, 2018, at 13:17 , Admiral Quality  wrote:
>>
>> Meanwhile we can still run Windows software written in 1995 on their latest 
>> OS.
>>
>> I can't wait to see you people go out of business.
>>
>> - AQ
>
> Deprecation doesn't mean removal.
>
> Carbon components have been deprecated for many years but still work.
>
> Deprecation just functions as a warning that you should strongly consider 
> another API in new code, and a deterrent to complaints about missing 
> features. For example, Audio Unit extensions are not supported by AUGraph, 
> and won't be (couldn't be, without new API).
>
> Doug
>


Re: AUGraph deprecation

2018-07-13 Thread Doug Wyatt



> On Jul 13, 2018, at 13:17 , Admiral Quality  wrote:
> 
> Meanwhile we can still run Windows software written in 1995 on their latest 
> OS.
> 
> I can't wait to see you people go out of business.
> 
> - AQ

Deprecation doesn't mean removal.

Carbon components have been deprecated for many years but still work.

Deprecation just functions as a warning that you should strongly consider 
another API in new code, and a deterrent to complaints about missing features. 
For example, Audio Unit extensions are not supported by AUGraph, and won't be 
(couldn't be, without new API).

Doug



Re: AUGraph deprecation

2018-07-13 Thread Admiral Quality
Meanwhile we can still run Windows software written in 1995 on their latest OS.

I can't wait to see you people go out of business.

- AQ

On Fri, Jul 13, 2018 at 4:15 PM, Doug Wyatt  wrote:
> I don't think we ever really knew when the deprecation would be official. There 
> are still a few things we want to add to AVAudioEngine for more feature parity, 
> and we don't want folks to be seeing warnings until that's done.
>
> I can say that there is now an availability mechanism so that we can mark 
> APIs as "to be deprecated", and we have marked up AUGraph with that, to 
> appear ... someday.
>
> Doug
>
>
>> On Jun 8, 2018, at 7:59 , Benjamin Federer  wrote:
>>
>> Last year at WWDC it was announced that AUGraph would be deprecated in 2018. 
>> I just browsed the documentation 
>> (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major)
>>  but found
>> Audio Unit Processing Graph Services not marked for deprecation. The AUGraph 
>> header files rolled out with Xcode 10 beta also have no mention of a 
>> deprecation in 10.14. I searched for audio-specific sessions at this year’s 
>> WWDC but wasn’t able to find anything relevant. Has anyone come across new 
>> information regarding this?
>>
>> Judging by how many changes and features Apple seems to be holding back 
>> until next year, I dare ask: has the AUGraph deprecation been moved to a 
>> later date?
>>
>> Benjamin
>


Re: AUGraph deprecation

2018-07-13 Thread Doug Wyatt
I don't think we ever really knew when the deprecation would be official. There 
are still a few things we want to add to AVAudioEngine for more feature parity, 
and we don't want folks to be seeing warnings until that's done.

I can say that there is now an availability mechanism so that we can mark APIs 
as "to be deprecated", and we have marked up AUGraph with that, to appear ... 
someday.

Doug


> On Jun 8, 2018, at 7:59 , Benjamin Federer  wrote:
> 
> Last year at WWDC it was announced that AUGraph would be deprecated in 2018. 
> I just browsed the documentation 
> (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major) 
> but found 
> Audio Unit Processing Graph Services not marked for deprecation. The AUGraph 
> header files rolled out with Xcode 10 beta also have no mention of a 
> deprecation in 10.14. I searched for audio-specific sessions at this year’s 
> WWDC but wasn’t able to find anything relevant. Has anyone come across new 
> information regarding this?
> 
> Judging by how many changes and features Apple seems to be holding back until 
> next year, I dare ask: has the AUGraph deprecation been moved to a later date?
> 
> Benjamin



Re: AUGraph deprecation

2018-07-11 Thread Benjamin Federer
Arshia, Laurent,

have you seen this WWDC session video which demonstrates AVAudioEngine's real 
time manual rendering mode?

https://developer.apple.com/videos/play/wwdc2017-501/?time=942 


Unfortunately the resources only provide sample code for offline manual 
rendering mode and I have a lot more questions than answers after watching that 
video. For example, at least on iOS each engine has only one input node. Does 
that mean there can only be one input process? Can there be more than one input 
node on macOS? Where does that C++ code come from? How can the input block 
*not* be Swift or ObjC while still being called in a realtime context?

If anyone knows of a working code sample online or succeeds in doing this, 
please post.

As a bonus, this video officially rules out Swift for realtime processing: "... 
it is not safe to use Objective-C or Swift runtime from a real-time context." 
I hadn't seen or read this from Apple before.

Benjamin


> On 11.07.2018 at 15:30, Laurent Noudohounsi wrote:
> 
> Thanks Benjamin for the precision. I thought that `installTapOnBus` was the 
> successor of `RenderCallback`. 
> For me it was not natural to mix old api like 
> `kAudioUnitProperty_SetRenderCallback` in AVAudioEngine.
> 
> So as Arshia said, I'm also looking for a way to use real-time processing 
> with AVAudioEngine.
> 
> On Wed, 11 Jul 2018 at 15:05, Arshia Cont wrote:
> Interesting thread here!
> 
> Has anyone achieved low-latency processing with AVAudioEngine?
> 
> The RenderCallback seems natural to me (which is the good “old” way of doing 
> it with AUGraph). But I’m curious to hear if anyone has done/achieved real 
> stuff here with AVAudioEngine real-time processing and how.
> 
> 
> Arshia 
> 
> 
>> On 11 Jul 2018, at 15:00, Benjamin Federer wrote:
>> 
>> Laurent,
>> 
>> `installTapOnBus` is not intended for realtime processing as a tap only 
>> provides the current frame buffer but does not pass it back into the signal 
>> chain. The documentation reads `Installs an audio tap on the bus to record. 
>> monitor, and observe the output of the node`.
>> 
>> Although I have not done that myself yet my understanding is that for 
>> realtime processing you can still retrieve the underlying audio unit from an 
>> AVAudioNode (or at least some nodes?) and attach an input render callback 
>> via AudioUnitSetProperty with kAudioUnitProperty_SetRenderCallback.
>> 
>> I assume the other way would be to subclass AUAudioUnit and wrap that into 
>> an AVAudioUnit which is a subclass of AVAudioNode. Yes, it confuses me, too. 
>> Random Google result with further information: 
>> https://forums.developer.apple.com/thread/72674 
>> 
>> 
>> Benjamin
>> 
>> 
>>> On 11.07.2018 at 14:34, Laurent Noudohounsi <laurent.noudohou...@gmail.com> wrote:
>>> 
>>> Hi all,
>>> 
>>> I'm interested in this topic since I've not found any information about it 
>>> yet.
>>> 
>>> Correct me if I'm wrong, but AVAudioEngine is not able to go lower than 
>>> 100 ms latency. That's what I see in the header file of `AVAudioNode` with 
>>> its method `installTapOnBus`: 
>>> 
>>> @param bufferSize the requested size of the incoming buffers in sample 
>>> frames. Supported range is [100, 400] ms.
>>> 
>>> Maybe I'm wrong but I don't see any other way to have a lower latency audio 
>>> processing in an AVAudioNode.
>>> 
>>> Best,
>>> Laurent
>>> 
>>> On Wed, 11 Jul 2018 at 13:57, Arshia Cont wrote:
>>> Benjamin and list,
>>> 
>>> I second Benjamin’s request. It would be great if someone from the 
>>> CoreAudio Team could respond to the question.
>>> 
>>> Two years ago, after basic tests I realised that AVAudioEngine was not 
>>> ready for Low Latency Audio analysis on iOS. So we used AUGraph. I have a 
>>> feeling that this is no longer the case on iOS and we can move to 
>>> AVAudioEngine for low-latency audio processing. Anyone can share experience 
>>> here? We do real-time spectral analysis and resynthesis of sound and go as 
>>> low as 64 samples per cycle if the device allows.
>>> 
>>> Thanks in advance.
>>> 
>>> 
>>> Arshia
>>> 
>>> 
>>> PS: I actually brought up the deprecation of AUGraph at a local Apple 
>>> Dev meeting where the EU director of developer relations was present. 
>>> According to him, when Apple announces a deprecation, it WILL happen. My 
>>> interpretation of the conversation is that AUGraph is no longer maintained 
>>> but provided as is.
>>> 
 On 11 Jul 2018, at 12:36, Benjamin Federer wrote:
 
 Since it was mentioned in another email (thread) I’m giving this topic a 
 bump. Would be great if someone at Apple, or anyone else in the know, 
 could take the time to respond. The documentation at the link cited below 
 still has no 

Re: AUGraph deprecation

2018-07-11 Thread Bartosz Nowotny
Arshia,

Thank you for clearing that up.

On Wed, Jul 11, 2018 at 4:10 PM, Arshia Cont wrote:

> Bartosz,
>
> Laurent was mentioning the installTapOnBus. Your published code would not
> need that. You are just playing MIDI. You would be concerned if you had to
> do custom real-time audio processing on the audio output of your MIDI
> device (such as FFT analysis).
>
> Arshia
>
> On 11 Jul 2018, at 16:04, Bartosz Nowotny wrote:
>
> Laurent,
>
> What you said about not being able to achieve latency lower than 100ms is
> worrisome. I need a realtime MIDI synth, low latency is absolutely crucial.
> Does the limitation you mention apply only to signal processing or other
> applications of the API as well, in particular MIDI synthesis?
>
> Regards,
> Bartosz
>
>
> On Wed, Jul 11, 2018 at 3:30 PM, Laurent Noudohounsi <
> laurent.noudohou...@gmail.com> wrote:
>
>> Thanks Benjamin for the precision. I thought that `installTapOnBus` was
>> the successor of `RenderCallback`.
>> For me it was not natural to mix old api like `kAudioUnitProperty_
>> SetRenderCallback` in AVAudioEngine.
>>
>> So as Arshia said, I'm also looking for a way to use real-time processing
>> with AVAudioEngine.
>>
>> On Wed, 11 Jul 2018 at 15:05, Arshia Cont wrote:
>>
>>> Interesting thread here!
>>>
>>> Has anyone achieved low-latency processing with AVAudioEngine?
>>>
>>> The RenderCallback seems natural to me (which is the good “old” way of
>>> doing it with AUGraph). But I’m curious to hear if anyone has done/achieved
>>> real stuff here with AVAudioEngine real-time processing and how.
>>>
>>>
>>> Arshia
>>>
>>>
>>> On 11 Jul 2018, at 15:00, Benjamin Federer  wrote:
>>>
>>> Laurent,
>>>
>>> `installTapOnBus` is not intended for realtime processing as a tap only
>>> provides the current frame buffer but does not pass it back into the signal
>>> chain. The documentation reads `Installs an audio tap on the bus to record.
>>> monitor, and observe the output of the node`.
>>>
>>> Although I have not done that myself yet my understanding is that for
>>> realtime processing you can still retrieve the underlying audio unit from
>>> an AVAudioNode (or at least some nodes?) and attach an input render
>>> callback via AudioUnitSetProperty with kAudioUnitProperty_SetRenderCa
>>> llback.
>>>
>>> I assume the other way would be to subclass AUAudioUnit and wrap that
>>> into an AVAudioUnit which is a subclass of AVAudioNode. Yes, it confuses
>>> me, too. Random Google result with further information:
>>> https://forums.developer.apple.com/thread/72674
>>>
>>> Benjamin
>>>
>>>
>>> On 11.07.2018 at 14:34, Laurent Noudohounsi <laurent.noudohou...@gmail.com> wrote:
>>>
>>> Hi all,
>>>
>>> I'm interested in this topic since I've not found any information about
>>> it yet.
>>>
>>> Correct me if I'm wrong, but AVAudioEngine is not able to go lower than
>>> 100 ms latency. That's what I see in the header file of `AVAudioNode` with
>>> its method `installTapOnBus`:
>>>
>>> @param bufferSize the requested size of the incoming buffers in sample
>>> frames. Supported range is [100, 400] ms.
>>>
>>> Maybe I'm wrong but I don't see any other way to have a lower latency
>>> audio processing in an AVAudioNode.
>>>
>>> Best,
>>> Laurent
>>>
>>> On Wed, 11 Jul 2018 at 13:57, Arshia Cont wrote:
>>>
 Benjamin and list,

 I second Benjamin’s request. It would be great if someone from the
 CoreAudio Team could respond to the question.

 Two years ago, after basic tests I realised that AVAudioEngine was not
 ready for Low Latency Audio analysis on iOS. So we used AUGraph. I have a
 feeling that this is no longer the case on iOS and we can move to
 AVAudioEngine for low-latency audio processing. Anyone can share experience
 here? We do real-time spectral analysis and resynthesis of sound and go as
 low as 64 samples per cycle if the device allows.

 Thanks in advance.


 Arshia


 PS: I actually brought up the deprecation of AUGraph at a local
 Apple Dev meeting where the EU director of developer relations was present.
 According to him, when Apple announces a deprecation, it WILL happen. My
 interpretation of the conversation is that AUGraph is no longer maintained
 but provided as is.

 On 11 Jul 2018, at 12:36, Benjamin Federer  wrote:

 Since it was mentioned in another email (thread) I’m giving this topic
 a bump. Would be great if someone at Apple, or anyone else in the know,
 could take the time to respond. The documentation at the link cited below
 still has no indication of deprecation. Will it come with one of the next
 Xcode Beta releases?

 On another note I am really interested in how transitioning over to
 AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS.
 What I am interested in is any macOS specifics or hardships.

 From my experience AVAudioEngine is relatively robust in 

Re: AUGraph deprecation

2018-07-11 Thread Arshia Cont
Bartosz,

Laurent was mentioning the installTapOnBus. Your published code would not need 
that. You are just playing MIDI. You would be concerned if you had to do custom 
real-time audio processing on the audio output of your MIDI device (such as FFT 
analysis).

Arshia

> On 11 Jul 2018, at 16:04, Bartosz Nowotny  wrote:
> 
> Laurent,
> 
> What you said about not being able to achieve latency lower than 100ms is 
> worrisome. I need a realtime MIDI synth, low latency is absolutely crucial. 
> Does the limitation you mention apply only to signal processing or other 
> applications of the API as well, in particular MIDI synthesis?
> 
> Regards,
> Bartosz
> 
> 
> On Wed, Jul 11, 2018 at 3:30 PM, Laurent Noudohounsi <laurent.noudohou...@gmail.com> wrote:
> Thanks Benjamin for the precision. I thought that `installTapOnBus` was the 
> successor of `RenderCallback`. 
> For me it was not natural to mix old api like 
> `kAudioUnitProperty_SetRenderCallback` in AVAudioEngine.
> 
> So as Arshia said, I'm also looking for a way to use real-time processing 
> with AVAudioEngine.
> 
> On Wed, 11 Jul 2018 at 15:05, Arshia Cont wrote:
> Interesting thread here!
> 
> Has anyone achieved low-latency processing with AVAudioEngine?
> 
> The RenderCallback seems natural to me (which is the good “old” way of doing 
> it with AUGraph). But I’m curious to hear if anyone has done/achieved real 
> stuff here with AVAudioEngine real-time processing and how.
> 
> 
> Arshia 
> 
> 
>> On 11 Jul 2018, at 15:00, Benjamin Federer wrote:
>> 
>> Laurent,
>> 
>> `installTapOnBus` is not intended for realtime processing as a tap only 
>> provides the current frame buffer but does not pass it back into the signal 
>> chain. The documentation reads `Installs an audio tap on the bus to record. 
>> monitor, and observe the output of the node`.
>> 
>> Although I have not done that myself yet my understanding is that for 
>> realtime processing you can still retrieve the underlying audio unit from an 
>> AVAudioNode (or at least some nodes?) and attach an input render callback 
>> via AudioUnitSetProperty with kAudioUnitProperty_SetRenderCallback.
>> 
>> I assume the other way would be to subclass AUAudioUnit and wrap that into 
>> an AVAudioUnit which is a subclass of AVAudioNode. Yes, it confuses me, too. 
>> Random Google result with further information: 
>> https://forums.developer.apple.com/thread/72674 
>> 
>> 
>> Benjamin
>> 
>> 
>>> On 11.07.2018 at 14:34, Laurent Noudohounsi <laurent.noudohou...@gmail.com> wrote:
>>> 
>>> Hi all,
>>> 
>>> I'm interested in this topic since I've not found any information about it 
>>> yet.
>>> 
>>> Correct me if I'm wrong, but AVAudioEngine is not able to go lower than 
>>> 100 ms latency. That's what I see in the header file of `AVAudioNode` with 
>>> its method `installTapOnBus`: 
>>> 
>>> @param bufferSize the requested size of the incoming buffers in sample 
>>> frames. Supported range is [100, 400] ms.
>>> 
>>> Maybe I'm wrong but I don't see any other way to have a lower latency audio 
>>> processing in an AVAudioNode.
>>> 
>>> Best,
>>> Laurent
>>> 
>>> On Wed, 11 Jul 2018 at 13:57, Arshia Cont wrote:
>>> Benjamin and list,
>>> 
>>> I second Benjamin’s request. It would be great if someone from the 
>>> CoreAudio Team could respond to the question.
>>> 
>>> Two years ago, after basic tests I realised that AVAudioEngine was not 
>>> ready for Low Latency Audio analysis on iOS. So we used AUGraph. I have a 
>>> feeling that this is no longer the case on iOS and we can move to 
>>> AVAudioEngine for low-latency audio processing. Anyone can share experience 
>>> here? We do real-time spectral analysis and resynthesis of sound and go as 
>>> low as 64 samples per cycle if the device allows.
>>> 
>>> Thanks in advance.
>>> 
>>> 
>>> Arshia
>>> 
>>> 
>>> PS: I actually brought up the deprecation of AUGraph at a local Apple 
>>> Dev meeting where the EU director of developer relations was present. 
>>> According to him, when Apple announces a deprecation, it WILL happen. My 
>>> interpretation of the conversation is that AUGraph is no longer maintained 
>>> but provided as is.
>>> 
 On 11 Jul 2018, at 12:36, Benjamin Federer wrote:
 
 Since it was mentioned in another email (thread) I’m giving this topic a 
 bump. Would be great if someone at Apple, or anyone else in the know, 
 could take the time to respond. The documentation at the link cited below 
 still has no indication of deprecation. Will it come with one of the next 
 Xcode Beta releases?
 
 On another note I am really interested in how transitioning over to 
 AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS. 
 What I am interested in is any macOS specifics or hardships.

Re: AUGraph deprecation

2018-07-11 Thread Laurent Noudohounsi
Thanks Benjamin for the precision. I thought that `installTapOnBus` was the
successor of `RenderCallback`.
For me it was not natural to mix old api like
`kAudioUnitProperty_SetRenderCallback`
in AVAudioEngine.

So as Arshia said, I'm also looking for a way to use real-time processing
with AVAudioEngine.

On Wed, 11 Jul 2018 at 15:05, Arshia Cont wrote:

> Interesting thread here!
>
> Has anyone achieved low-latency processing with AVAudioEngine?
>
> The RenderCallback seems natural to me (which is the good “old” way of
> doing it with AUGraph). But I’m curious to hear if anyone has done/achieved
> real stuff here with AVAudioEngine real-time processing and how.
>
>
> Arshia
>
>
> On 11 Jul 2018, at 15:00, Benjamin Federer  wrote:
>
> Laurent,
>
> `installTapOnBus` is not intended for realtime processing as a tap only
> provides the current frame buffer but does not pass it back into the signal
> chain. The documentation reads `Installs an audio tap on the bus to record,
> monitor, and observe the output of the node`.
>
> Although I have not done that myself yet my understanding is that for
> realtime processing you can still retrieve the underlying audio unit from
> an AVAudioNode (or at least some nodes?) and attach an input render
> callback via AudioUnitSetProperty with kAudioUnitProperty_SetRenderCallback.
>
> I assume the other way would be to subclass AUAudioUnit and wrap that into
> an AVAudioUnit which is a subclass of AVAudioNode. Yes, it confuses me,
> too. Random Google result with further information:
> https://forums.developer.apple.com/thread/72674
>
> Benjamin
>
>
> On 11.07.2018 at 14:34, Laurent Noudohounsi wrote:
>
> Hi all,
>
> I'm interested in this topic since I've not found any information about it
> yet.
>
> Correct me if I'm wrong, but AVAudioEngine is not able to go lower than 100 ms
> latency. It's what I see in the header file of `AVAudioNode` with its
> method `installTapOnBus`:
>
> @param bufferSize the requested size of the incoming buffers in sample
> frames. Supported range is [100, 400] ms.
>
> Maybe I'm wrong, but I don't see any other way to get lower-latency
> audio processing in an AVAudioNode.
>
> Best,
> Laurent
>
> On Wed, 11 Jul 2018 at 13:57, Arshia Cont wrote:
>
>> Benjamin and list,
>>
>> I second Benjamin’s request. It would be great if someone from the
>> CoreAudio Team could respond to the question.
>>
>> Two years ago, after basic tests I realised that AVAudioEngine was not
>> ready for Low Latency Audio analysis on iOS. So we used AUGraph. I have a
>> feeling that this is no longer the case on iOS and we can move to
>> AVAudioEngine for low-latency audio processing. Anyone can share experience
>> here? We do real-time spectral analysis and resynthesis of sound and go as
>> low as 64 samples per cycle if the device allows.
>>
>> Thanks in advance.
>>
>>
>> Arshia
>>
>>
>> PS: I actually brought up the deprecation issue of AUGraph at a local Apple
>> Dev meeting where the EU director of developer relations was present.
>> According to him, when Apple announces a deprecation, it WILL happen. My
>> interpretation of the conversation is that AUGraph is no longer maintained
>> but provided as is.
>>
>> On 11 Jul 2018, at 12:36, Benjamin Federer  wrote:
>>
>> Since it was mentioned in another email (thread) I’m giving this topic a
>> bump. Would be great if someone at Apple, or anyone else in the know, could
>> take the time to respond. The documentation at the link cited below still
>> has no indication of deprecation. Will it come with one of the next Xcode
>> Beta releases?
>>
>> On another note I am really interested in how transitioning over to
>> AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS.
>> What I am interested in is any macOS specifics or hardships.
>>
>> From my experience AVAudioEngine is relatively robust in handling
>> multiple graphs, i.e. separate chains of audio units. I had some issues
>> with the AVAudioPlayerNode connecting to multiple destinations in that
>> scenario. Also connect:toConnectionPoints:fromBus:format: did not work for
>> me as it only connected to one of the destination points. Anyone else
>> experienced problems in that regard?
>>
>> Thanks
>>
>> Benjamin
>>
>>
>> On 08.06.2018 at 16:59, Benjamin Federer wrote:
>>
>> Last year at WWDC it was announced that AUGraph would be deprecated in
>> 2018. I just browsed the documentation (
>> https://developer.apple.com/documentation/audiotoolbox?changes=latest_major)
>> but found
>> Audio Unit Processing Graph Services not marked for deprecation.
>> The AUGraph header files rolled out with Xcode 10 beta also have no mention
>> of a deprecation in 10.14. I searched for audio-specific sessions at this
>> year’s WWDC but wasn’t able to find anything relevant. Has anyone come
>> across new information regarding this?
>>
>> Judging by how many changes and features Apple seems to be holding back
>> until next year I dare ask: Has AUGraph API deprecation been moved to a
>> later time?

Re: AUGraph deprecation

2018-07-11 Thread Arshia Cont
Interesting thread here!

Has anyone achieved low-latency processing with AVAudioEngine?

The RenderCallback seems natural to me (which is the good “old” way of doing it 
with AUGraph). But I’m curious to hear if anyone has done/achieved real stuff 
here with AVAudioEngine real-time processing and how.


Arshia 

> On 11 Jul 2018, at 15:00, Benjamin Federer  wrote:
> 
> Laurent,
> 
> `installTapOnBus` is not intended for realtime processing as a tap only 
> provides the current frame buffer but does not pass it back into the signal 
> chain. The documentation reads `Installs an audio tap on the bus to record, 
> monitor, and observe the output of the node`.
> 
> Although I have not done that myself yet my understanding is that for 
> realtime processing you can still retrieve the underlying audio unit from an 
> AVAudioNode (or at least some nodes?) and attach an input render callback via 
> AudioUnitSetProperty with kAudioUnitProperty_SetRenderCallback.
> 
> I assume the other way would be to subclass AUAudioUnit and wrap that into an 
> AVAudioUnit which is a subclass of AVAudioNode. Yes, it confuses me, too. 
> Random Google result with further information: 
> https://forums.developer.apple.com/thread/72674 
> 
> 
> Benjamin
> 
> 
>> On 11.07.2018 at 14:34, Laurent Noudohounsi wrote:
>> 
>> Hi all,
>> 
>> I'm interested in this topic since I've not found any information about it 
>> yet.
>> 
>> Correct me if I'm wrong, but AVAudioEngine is not able to go lower than 100 ms 
>> latency. It's what I see in the header file of `AVAudioNode` with its method 
>> `installTapOnBus`: 
>> 
>> @param bufferSize the requested size of the incoming buffers in sample 
>> frames. Supported range is [100, 400] ms.
>> 
>> Maybe I'm wrong, but I don't see any other way to get lower-latency audio 
>> processing in an AVAudioNode.
>> 
>> Best,
>> Laurent
>> 
>> On Wed, 11 Jul 2018 at 13:57, Arshia Cont wrote:
>> Benjamin and list,
>> 
>> I second Benjamin’s request. It would be great if someone from the CoreAudio 
>> Team could respond to the question.
>> 
>> Two years ago, after basic tests I realised that AVAudioEngine was not ready 
>> for Low Latency Audio analysis on iOS. So we used AUGraph. I have a feeling 
>> that this is no longer the case on iOS and we can move to AVAudioEngine for 
>> low-latency audio processing. Anyone can share experience here? We do 
>> real-time spectral analysis and resynthesis of sound and go as low as 64 
>> samples per cycle if the device allows.
>> 
>> Thanks in advance.
>> 
>> 
>> Arshia
>> 
>> 
>> PS: I actually brought up the deprecation issue of AUGraph at a local Apple Dev 
>> meeting where the EU director of developer relations was present. According 
>> to him, when Apple announces a deprecation, it WILL happen. My 
>> interpretation of the conversation is that AUGraph is no longer maintained 
>> but provided as is.
>> 
>>> On 11 Jul 2018, at 12:36, Benjamin Federer wrote:
>>> 
>>> Since it was mentioned in another email (thread) I’m giving this topic a 
>>> bump. Would be great if someone at Apple, or anyone else in the know, could 
>>> take the time to respond. The documentation at the link cited below still 
>>> has no indication of deprecation. Will it come with one of the next Xcode 
>>> Beta releases?
>>> 
>>> On another note I am really interested in how transitioning over to 
>>> AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS. 
>>> What I am interested in is any macOS specifics or hardships.
>>> 
>>> From my experience AVAudioEngine is relatively robust in handling multiple 
>>> graphs, i.e. separate chains of audio units. I had some issues with the 
>>> AVAudioPlayerNode connecting to multiple destinations in that scenario. 
>>> Also connect:toConnectionPoints:fromBus:format: did not work for me as it 
>>> only connected to one of the destination points. Anyone else experienced 
>>> problems in that regard?
>>> 
>>> Thanks
>>> 
>>> Benjamin
>>> 
>>> 
 On 08.06.2018 at 16:59, Benjamin Federer wrote:
 
 Last year at WWDC it was announced that AUGraph would be deprecated in 
 2018. I just browsed the documentation 
 (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major) 
 but found Audio Unit Processing Graph Services not marked for deprecation. The 
 AUGraph header files rolled out with Xcode 10 beta also have no mention of 
 a deprecation in 10.14. I searched for audio-specific sessions at this 
 year’s WWDC but wasn’t able to find anything relevant. Has anyone come 
 across new information regarding this?
 
 Judging by how many changes and features Apple seems to be holding back 
 until next year I dare ask: Has AUGraph API deprecation been moved to a 
 later time?

Re: AUGraph deprecation

2018-07-11 Thread Benjamin Federer
Laurent,

`installTapOnBus` is not intended for realtime processing as a tap only 
provides the current frame buffer but does not pass it back into the signal 
chain. The documentation reads `Installs an audio tap on the bus to record, 
monitor, and observe the output of the node`.

Although I have not done that myself yet my understanding is that for realtime 
processing you can still retrieve the underlying audio unit from an AVAudioNode 
(or at least some nodes?) and attach an input render callback via 
AudioUnitSetProperty with kAudioUnitProperty_SetRenderCallback.
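Untested sketch of what I have in mind, assuming the node in question is an AVAudioUnit that exposes its underlying audioUnit (the callback body and bus number are illustrative, not a tested recipe):

```swift
import AVFoundation
import AudioToolbox

// Illustrative render callback. Must be real-time safe:
// no allocation, locks, or Objective-C calls on this path.
let renderCallback: AURenderCallback = { _, _, _, _, inNumberFrames, ioData in
    guard let abl = ioData else { return noErr }
    for buffer in UnsafeMutableAudioBufferListPointer(abl) {
        // Inspect or modify `inNumberFrames` samples in place.
        let samples = buffer.mData?.assumingMemoryBound(to: Float32.self)
        _ = (samples, inNumberFrames)
    }
    return noErr
}

// Sketch: attach the callback to the input scope of the node's
// underlying AudioUnit (not every AVAudioNode exposes one).
func attachRenderCallback(to avUnit: AVAudioUnit) -> OSStatus {
    var callbackStruct = AURenderCallbackStruct(inputProc: renderCallback,
                                                inputProcRefCon: nil)
    return AudioUnitSetProperty(avUnit.audioUnit,
                                kAudioUnitProperty_SetRenderCallback,
                                kAudioUnitScope_Input,
                                0, // input bus 0, illustrative
                                &callbackStruct,
                                UInt32(MemoryLayout<AURenderCallbackStruct>.size))
}
```

Whether mixing the C-level property API into an AVAudioEngine graph keeps working is exactly the open question of this thread, so treat this as a starting point only.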

I assume the other way would be to subclass AUAudioUnit and wrap that into an 
AVAudioUnit which is a subclass of AVAudioNode. Yes, it confuses me, too. 
Random Google result with further information: 
https://forums.developer.apple.com/thread/72674

Benjamin


> On 11.07.2018 at 14:34, Laurent Noudohounsi wrote:
> 
> Hi all,
> 
> I'm interested in this topic since I've not found any information about it 
> yet.
> 
> Correct me if I'm wrong, but AVAudioEngine is not able to go lower than 100 ms 
> latency. It's what I see in the header file of `AVAudioNode` with its method 
> `installTapOnBus`: 
> 
> @param bufferSize the requested size of the incoming buffers in sample 
> frames. Supported range is [100, 400] ms.
> 
> Maybe I'm wrong, but I don't see any other way to get lower-latency audio 
> processing in an AVAudioNode.
> 
> Best,
> Laurent
> 
> On Wed, 11 Jul 2018 at 13:57, Arshia Cont wrote:
> Benjamin and list,
> 
> I second Benjamin’s request. It would be great if someone from the CoreAudio 
> Team could respond to the question.
> 
> Two years ago, after basic tests I realised that AVAudioEngine was not ready 
> for Low Latency Audio analysis on iOS. So we used AUGraph. I have a feeling 
> that this is no longer the case on iOS and we can move to AVAudioEngine for 
> low-latency audio processing. Anyone can share experience here? We do 
> real-time spectral analysis and resynthesis of sound and go as low as 64 
> samples per cycle if the device allows.
> 
> Thanks in advance.
> 
> 
> Arshia
> 
> 
> PS: I actually brought up the deprecation issue of AUGraph at a local Apple Dev 
> meeting where the EU director of developer relations was present. According to 
> him, when Apple announces a deprecation, it WILL happen. My interpretation of 
> the conversation is that AUGraph is no longer maintained but provided as is.
> 
>> On 11 Jul 2018, at 12:36, Benjamin Federer wrote:
>> 
>> Since it was mentioned in another email (thread) I’m giving this topic a 
>> bump. Would be great if someone at Apple, or anyone else in the know, could 
>> take the time to respond. The documentation at the link cited below still 
>> has no indication of deprecation. Will it come with one of the next Xcode 
>> Beta releases?
>> 
>> On another note I am really interested in how transitioning over to 
>> AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS. What 
>> I am interested in is any macOS specifics or hardships.
>> 
>> From my experience AVAudioEngine is relatively robust in handling multiple 
>> graphs, i.e. separate chains of audio units. I had some issues with the 
>> AVAudioPlayerNode connecting to multiple destinations in that scenario. Also 
>> connect:toConnectionPoints:fromBus:format: did not work for me as it only 
>> connected to one of the destination points. Anyone else experienced problems 
>> in that regard?
>> 
>> Thanks
>> 
>> Benjamin
>> 
>> 
>>> On 08.06.2018 at 16:59, Benjamin Federer wrote:
>>> 
>>> Last year at WWDC it was announced that AUGraph would be deprecated in 
>>> 2018. I just browsed the documentation 
>>> (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major) 
>>> but found 
>>> Audio Unit Processing Graph Services not marked for deprecation. The 
>>> AUGraph header files rolled out with Xcode 10 beta also have no mention of 
>>> a deprecation in 10.14. I searched for audio-specific sessions at this 
>>> year’s WWDC but wasn’t able to find anything relevant. Has anyone come 
>>> across new information regarding this?
>>> 
>>> Judging by how many changes and features Apple seems to be holding back 
>>> until next year I dare ask: Has AUGraph API deprecation been moved to a 
>>> later time?
>>> 
>>> Benjamin
>> 
>> ___
>> Do not post admin requests to the list. They will be ignored.
>> Coreaudio-api mailing list  (Coreaudio-api@lists.apple.com)
>> Help/Unsubscribe/Update your Subscription:
>> https://lists.apple.com/mailman/options/coreaudio-api/arshiacont%40antescofo.com
>> 
>> This email sent to 

Re: AUGraph deprecation

2018-07-11 Thread Laurent Noudohounsi
Hi all,

I'm interested in this topic since I've not found any information about it
yet.

Correct me if I'm wrong, but AVAudioEngine is not able to go lower than 100 ms
latency. It's what I see in the header file of `AVAudioNode` with its
method `installTapOnBus`:

@param bufferSize the requested size of the incoming buffers in sample
frames. Supported range is [100, 400] ms.

Maybe I'm wrong, but I don't see any other way to get lower-latency audio
processing in an AVAudioNode.
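For reference, here is what a tap looks like, just a sketch: as the header comment above says, the requested size is only honored within [100, 400] ms, and the tap only observes the signal rather than feeding processed audio back into the chain, so I don't see how to go lower this way.

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// 4800 frames at 48 kHz = 100 ms, the documented lower bound for taps.
// Smaller requests may simply be clamped by the framework.
input.installTap(onBus: 0, bufferSize: 4800, format: format) { buffer, time in
    // Analysis / recording only: e.g. copy samples out for inspection.
    _ = buffer.frameLength
}
```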

Best,
Laurent

On Wed, 11 Jul 2018 at 13:57, Arshia Cont wrote:

> Benjamin and list,
>
> I second Benjamin’s request. It would be great if someone from the
> CoreAudio Team could respond to the question.
>
> Two years ago, after basic tests I realised that AVAudioEngine was not
> ready for Low Latency Audio analysis on iOS. So we used AUGraph. I have a
> feeling that this is no longer the case on iOS and we can move to
> AVAudioEngine for low-latency audio processing. Anyone can share experience
> here? We do real-time spectral analysis and resynthesis of sound and go as
> low as 64 samples per cycle if the device allows.
>
> Thanks in advance.
>
>
> Arshia
>
>
> PS: I actually brought up the deprecation issue of AUGraph at a local Apple
> Dev meeting where the EU director of developer relations was present.
> According to him, when Apple announces a deprecation, it WILL happen. My
> interpretation of the conversation is that AUGraph is no longer maintained
> but provided as is.
>
> On 11 Jul 2018, at 12:36, Benjamin Federer  wrote:
>
> Since it was mentioned in another email (thread) I’m giving this topic a
> bump. Would be great if someone at Apple, or anyone else in the know, could
> take the time to respond. The documentation at the link cited below still
> has no indication of deprecation. Will it come with one of the next Xcode
> Beta releases?
>
> On another note I am really interested in how transitioning over to
> AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS.
> What I am interested in is any macOS specifics or hardships.
>
> From my experience AVAudioEngine is relatively robust in handling multiple
> graphs, i.e. separate chains of audio units. I had some issues with the
> AVAudioPlayerNode connecting to multiple destinations in that scenario.
> Also connect:toConnectionPoints:fromBus:format: did not work for me as it
> only connected to one of the destination points. Anyone else experienced
> problems in that regard?
>
> Thanks
>
> Benjamin
>
>
> On 08.06.2018 at 16:59, Benjamin Federer wrote:
>
> Last year at WWDC it was announced that AUGraph would be deprecated in
> 2018. I just browsed the documentation (
> https://developer.apple.com/documentation/audiotoolbox?changes=latest_major)
> but found
> Audio Unit Processing Graph Services not marked for deprecation.
> The AUGraph header files rolled out with Xcode 10 beta also have no mention
> of a deprecation in 10.14. I searched for audio-specific sessions at this
> year’s WWDC but wasn’t able to find anything relevant. Has anyone come
> across new information regarding this?
>
> Judging by how many changes and features Apple seems to be holding back
> until next year I dare ask: Has AUGraph API deprecation been moved to a
> later time?
>
> Benjamin
>
>


Re: AUGraph deprecation

2018-07-11 Thread Arshia Cont
Benjamin and list,

I second Benjamin’s request. It would be great if someone from the CoreAudio 
Team could respond to the question.

Two years ago, after basic tests I realised that AVAudioEngine was not ready 
for Low Latency Audio analysis on iOS. So we used AUGraph. I have a feeling 
that this is no longer the case on iOS and we can move to AVAudioEngine for 
low-latency audio processing. Anyone can share experience here? We do real-time 
spectral analysis and resynthesis of sound and go as low as 64 samples per 
cycle if the device allows.
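For what it's worth, on iOS the small hardware cycle is requested through AVAudioSession rather than through the graph API itself; roughly like this (a sketch only, and the system is free to grant a larger buffer than asked for):

```swift
import AVFoundation

// Sketch: request a ~64-sample I/O cycle at 48 kHz on iOS.
func configureLowLatencySession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .measurement)
    // 64 frames / 48 000 Hz ≈ 1.33 ms; treated by the system as a hint.
    try session.setPreferredIOBufferDuration(64.0 / 48_000.0)
    try session.setActive(true)
    // Check what was actually granted before trusting the latency figure.
    print("granted I/O buffer duration:", session.ioBufferDuration)
}
```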

Thanks in advance.


Arshia


PS: I actually brought up the deprecation issue of AUGraph at a local Apple Dev 
meeting where the EU director of developer relations was present. According to 
him, when Apple announces a deprecation, it WILL happen. My interpretation of 
the conversation is that AUGraph is no longer maintained but provided as is.

> On 11 Jul 2018, at 12:36, Benjamin Federer  wrote:
> 
> Since it was mentioned in another email (thread) I’m giving this topic a 
> bump. Would be great if someone at Apple, or anyone else in the know, could 
> take the time to respond. The documentation at the link cited below still has 
> no indication of deprecation. Will it come with one of the next Xcode Beta 
> releases?
> 
> On another note I am really interested in how transitioning over to 
> AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS. What 
> I am interested in is any macOS specifics or hardships.
> 
> From my experience AVAudioEngine is relatively robust in handling multiple 
> graphs, i.e. separate chains of audio units. I had some issues with the 
> AVAudioPlayerNode connecting to multiple destinations in that scenario. Also 
> connect:toConnectionPoints:fromBus:format: did not work for me as it only 
> connected to one of the destination points. Anyone else experienced problems 
> in that regard?
> 
> Thanks
> 
> Benjamin
> 
> 
>> On 08.06.2018 at 16:59, Benjamin Federer wrote:
>> 
>> Last year at WWDC it was announced that AUGraph would be deprecated in 2018. 
>> I just browsed the documentation 
>> (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major) 
>> but found 
>> Audio Unit Processing Graph Services not marked for deprecation. The AUGraph 
>> header files rolled out with Xcode 10 beta also have no mention of a 
>> deprecation in 10.14. I searched for audio-specific sessions at this year’s 
>> WWDC but wasn’t able to find anything relevant. Has anyone come across new 
>> information regarding this?
>> 
>> Judging by how many changes and features Apple seems to be holding back 
>> until next year I dare ask: Has AUGraph API deprecation been moved to a 
>> later time?
>> 
>> Benjamin
> 


Re: AUGraph deprecation

2018-07-11 Thread Benjamin Federer
Since it was mentioned in another email (thread) I’m giving this topic a bump. 
Would be great if someone at Apple, or anyone else in the know, could take the 
time to respond. The documentation at the link cited below still has no 
indication of deprecation. Will it come with one of the next Xcode Beta 
releases?

On another note I am really interested in how transitioning over to 
AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS. What I 
am interested in is any macOS specifics or hardships.

From my experience AVAudioEngine is relatively robust in handling multiple 
graphs, i.e. separate chains of audio units. I had some issues with the 
AVAudioPlayerNode connecting to multiple destinations in that scenario. Also 
connect:toConnectionPoints:fromBus:format: did not work for me as it only 
connected to one of the destination points. Anyone else experienced problems in 
that regard?
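For reference, the setup where I saw this, reduced to a sketch (node names are made up; in my tests only one of the two connection points actually received audio):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let mixerA = AVAudioMixerNode()
let mixerB = AVAudioMixerNode()

[player, mixerA, mixerB].forEach(engine.attach)

// Fan the player out to two destinations at once.
let points = [
    AVAudioConnectionPoint(node: mixerA, bus: mixerA.nextAvailableInputBus),
    AVAudioConnectionPoint(node: mixerB, bus: mixerB.nextAvailableInputBus),
]
// connect:toConnectionPoints:fromBus:format: is
// connect(_:to:fromBus:format:) in Swift.
engine.connect(player, to: points, fromBus: 0, format: nil)
```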

Thanks

Benjamin


> On 08.06.2018 at 16:59, Benjamin Federer wrote:
> 
> Last year at WWDC it was announced that AUGraph would be deprecated in 2018. 
> I just browsed the documentation 
> (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major) 
> but found 
> Audio Unit Processing Graph Services not marked for deprecation. The AUGraph 
> header files rolled out with Xcode 10 beta also have no mention of a 
> deprecation in 10.14. I searched for audio-specific sessions at this year’s 
> WWDC but wasn’t able to find anything relevant. Has anyone come across new 
> information regarding this?
> 
> Judging by how many changes and features Apple seems to be holding back until 
> next year I dare ask: Has AUGraph API deprecation been moved to a later time?
> 
> Benjamin
