Outstanding, so I need to do a deeper dive into this issue.
It’s so odd.  Last night I added a render notify callback (dropping the timer 
code) and everything worked as expected, even though the callback did 
absolutely nothing.
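
(For reference, the exact setup isn't shown here; a minimal, hypothetical sketch of attaching a do-nothing render notify callback to the engine's output unit, assuming AudioUnitAddRenderNotify is the API in question, could look like this:)

import AVFoundation
import AudioToolbox

// Hypothetical sketch only: attach a render notify callback that does nothing.
// Assumes the engine is already running and its output audio unit is available.
func addNoOpRenderNotify(to engine: AVAudioEngine) {
    guard let outputUnit = engine.outputNode.audioUnit else { return }

    // The callback must be C-compatible (no captures); it simply returns noErr.
    let noOpCallback: AURenderCallback = { _, _, _, _, _, _ in
        return noErr
    }

    // Pass the engine as an unused refCon so the callback receives a valid pointer.
    let refCon = Unmanaged.passUnretained(engine).toOpaque()
    let status = AudioUnitAddRenderNotify(outputUnit, noOpCallback, refCon)
    if status != noErr {
        print("AudioUnitAddRenderNotify failed: \(status)")
    }
}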

Thank you VERY much.
You might not believe it, but it was a tremendous help just knowing.


W.

From: Fulbert Boussaton [mailto:[email protected]]
Sent: Friday, July 07, 2017 3:11 AM
To: Waverly Edwards <[email protected]>
Cc: [email protected]
Subject: Re: No sound but buffer is playing????

Hi Waverly,

If I do the following on my MBP:

let playSine = SinePlayer() // should hear tone after only instantiating the class
sleep(4)


I’m hearing 4 seconds of your tone, so it works here without the call to executeTimerCode.



Cheers.
F.

On 4 Jul 2017, at 20:11, Waverly Edwards <[email protected]> wrote:

I've encountered a very odd thing.  I created a class in which I play the contents of an audio buffer, but that buffer can *only* be heard if I execute some timer code.  The code is supposed to play a continuous tone, yet the tone is only heard if I schedule an objc method (via the timer), and once that method has executed the sound is no longer heard, even though it should be continuous.

Note, I don't want the timer code at all.  I only created it after pondering a technote saying that blocking the main queue would cause AVAudioEngine to not allocate resources; however, I am not blocking the main queue.

I cleaned the project, deleted Xcode's derived data, restarted Xcode, and restarted my machine, but the behavior continues, which leads me to think I've done something incorrectly, though I am not seeing what that is.

This is test code.  My actual code suffers in the same way; this example is the simplest code I could come up with to reproduce the problem and the potential workaround.

Any ideas on what I could be doing incorrectly and how to resolve this issue?


W.

import Foundation
import AVFoundation

class SinePlayer{
    var _engine:AVAudioEngine
    var _player:AVAudioPlayerNode
    var _buffer: AVAudioPCMBuffer

    @objc func myPerformeCode()
    {
        print("We executed myPerformeCode()")
    }

    // We don't want to use executeTimerCode at all; it just demonstrates a workaround.
    func executeTimerCode(index: Int)
    {
        let kTimeInSeconds = 5.0

        switch index {
        case 1  :
            _ = Timer.scheduledTimer(timeInterval: kTimeInSeconds, target: self, selector: #selector(myPerformeCode), userInfo: nil, repeats: false)
            print( "Using Timer.scheduledTimer -- produces sound.")

        case 2 :

            DispatchQueue.main.asyncAfter(deadline: .now() + kTimeInSeconds) {
                self.myPerformeCode()
            }
            print( "Using DispatchQueue.main.asyncAfter, calling objc method -- 
produces sound.")

        case 3  :
            DispatchQueue.main.asyncAfter(deadline: .now() + kTimeInSeconds) {
                print( "Only a print statement in dispatch queue.")
            }
            print( "Using DispatchQueue.main.asyncAfter, objc method not called -- no sound output.")

        default :
            print( "Invalid value: index values 1, 2, or 3 are the only valid values.")
        }
    }


    init(){
        _engine = AVAudioEngine()
        _player = AVAudioPlayerNode()
        _buffer = AVAudioPCMBuffer(pcmFormat: _player.outputFormat(forBus: 0), frameCapacity: 100)
        _buffer.frameLength = 100

        // generate sine wave
        let sampleRate  = Float(_engine.mainMixerNode.outputFormat(forBus: 0).sampleRate)
        let numChannels = _engine.mainMixerNode.outputFormat(forBus: 0).channelCount

        for i in stride(from: 0, to: Int(_buffer.frameLength), by: Int(numChannels)) {
            let val = sinf(441.0*Float(i)*2*Float(Double.pi)/sampleRate)
            _buffer.floatChannelData?.pointee[i] = val * 0.5
        }

        _engine.attach(_player)
        _engine.connect(_player, to: _engine.mainMixerNode, format: nil)
        _engine.connect(_engine.mainMixerNode, to: _engine.outputNode, format: nil)

        do{
            try _engine.start()
            print("started")
        } catch let error as NSError {
            print("Error start:\(error)")
        }

//        _player.scheduleBuffer(_buffer, at: nil, options: [], completionHandler: nil)
        _player.scheduleBuffer(_buffer, at: nil, options: .loops, completionHandler: nil)
        _player.play()

    }
}

let playSine = SinePlayer() // should hear tone after only instantiating the class
playSine.executeTimerCode(index: 1)
// playSine.executeTimerCode(index: 2)
// playSine.executeTimerCode(index: 3)
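
(For comparison, Fulbert's test above relies only on keeping the process alive after instantiation, with no call to executeTimerCode; a hypothetical variant of this driver along those lines would be:)

// Hypothetical variant mirroring Fulbert's test: keep the command-line
// process alive for a few seconds instead of relying on the timer code.
let playSine = SinePlayer() // tone should be heard as soon as the class is instantiated
sleep(4)                    // let the engine render the looping buffer for a while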
