Hi,

I am using AudioQueue (for code-sharing reasons between macOS and iOS) to record the user's voice.
Here is the code snippet I use to configure and start the recording:

    init() {
        var formatFlags = AudioFormatFlags()
        formatFlags |= kLinearPCMFormatFlagIsSignedInteger
        formatFlags |= kLinearPCMFormatFlagIsPacked
        p_audioDataFormat = AudioStreamBasicDescription(
            mSampleRate: 16000.0,
            mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: formatFlags,
            mBytesPerPacket: UInt32(1*MemoryLayout<Int16>.stride),
            mFramesPerPacket: 1,
            mBytesPerFrame: UInt32(1*MemoryLayout<Int16>.stride),
            mChannelsPerFrame: 1,
            mBitsPerChannel: 16,
            mReserved: 0
        )
    }

    private let p_audioInputCallback: AudioQueueInputCallback = {
        inUserData, inQueue, inBuffer, inStartTime, inNumPackets, inPacketDesc in

        guard let userData = inUserData else { return }
        let me = Unmanaged<SnipsService>.fromOpaque(userData).takeUnretainedValue()

        let buffer = inBuffer.pointee
        let sampleCount = Int(buffer.mAudioDataByteSize) / MemoryLayout<Int16>.size
        let int16Buffer = buffer.mAudioData.bindMemory(to: Int16.self, capacity: sampleCount)
        let samples = Array(UnsafeBufferPointer(start: int16Buffer, count: sampleCount))

        try! me.snips?.appendBuffer(samples) // Do something with the samples

        // Re-enqueue so the buffer is available to the queue again
        AudioQueueEnqueueBuffer(inQueue, inBuffer, 0, nil)
    }

    private func startRecording() throws {
        let selfPointer = UnsafeMutableRawPointer(Unmanaged.passUnretained(self).toOpaque())
        let status = AudioQueueNewInput(&p_audioDataFormat, p_audioInputCallback,
                                        selfPointer, nil, nil, 0, &p_audioQueue)
        if status == 0 {
            guard let queue = p_audioQueue else { return }

            // Allocate and enqueue the capture buffers
            let numBuffers = 4
            let bufferSize = UInt32(1024 * MemoryLayout<Int16>.size)
            for _ in 0..<numBuffers {
                var buffer: AudioQueueBufferRef?
                AudioQueueAllocateBuffer(queue, bufferSize, &buffer)
                if let buffer = buffer {
                    AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
                }
            }
            AudioQueueStart(queue, nil)
        }
    }

Everything works perfectly: the audioInputCallback is called repeatedly until I start playing music with the MediaPlayer API.
As soon as playback starts, the callback is no longer called.
If I stop the music, recording works again.

Is there a special setting for AudioQueue, similar to the .playAndRecord category for AVAudioSession?
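
To illustrate what I mean, this is roughly the session-level setup I have in mind on the iOS side (a sketch only; the function name is just a placeholder, AVAudioSession does not exist on macOS, and I have not verified that this alone keeps the AudioQueue callback running while music plays):

    import AVFoundation

    // Sketch: ask iOS for simultaneous input and output, and allow other
    // audio (e.g. MediaPlayer) to keep playing while we record.
    func configureSessionForRecordingWhilePlaying() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, options: [.mixWithOthers])
        try session.setActive(true)
    }

I am looking for the equivalent behaviour when driving the input purely through the AudioQueue APIs, since the same code also has to run on macOS.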

Thanks a lot

Sébastien