Title: [277256] trunk/Source/WebKit
Revision: 277256
Author: [email protected]
Date: 2021-05-10 00:10:31 -0700 (Mon, 10 May 2021)

Log Message

Use IPC::Semaphore instead of sending an IPC message for every captured audio sample
https://bugs.webkit.org/show_bug.cgi?id=225452

Reviewed by Eric Carlson.

Previously, we sent an IPC message from the UIProcess or GPUProcess to the WebProcess for every captured microphone audio chunk.
We now use an IPC::Semaphore to signal that a new chunk is ready to be processed.
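
For illustration, here is a minimal single-process sketch of the signaling pattern (assumption: IPC::Semaphore offers signal/wait semantics equivalent to a counting semaphore, stood in for below by std::counting_semaphore; counts and names are illustrative):

    #include <cstdio>
    #include <semaphore>
    #include <thread>

    std::counting_semaphore<> chunkSemaphore { 0 };

    void producer()
    {
        // One semaphore signal per fixed-size chunk, instead of one IPC message per chunk.
        for (int chunk = 0; chunk < 4; ++chunk)
            chunkSemaphore.release();
    }

    void consumer()
    {
        for (int chunk = 0; chunk < 4; ++chunk) {
            chunkSemaphore.acquire(); // Blocks until the producer signals a chunk.
            std::printf("processing chunk %d\n", chunk);
        }
    }

    int main()
    {
        std::thread consumerThread(consumer);
        producer();
        consumerThread.join();
    }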

We no longer send each chunk's timestamp. Instead, we reconstruct it from the number of previously processed samples.
At audio storage change time, we send the start time and assume continuous timing based on sample counts from then on.
That is why we trigger a new audio storage change whenever we need to reset or the configuration changes, which should be rare in practice.
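
As a sketch, the consumer-side time reconstruction reduces to the start time plus frames read divided by the sample rate (assumption: MediaTime { value, timeScale } represents value / timeScale seconds, here simplified to plain doubles; the rate and chunk size are illustrative):

    #include <cstddef>
    #include <cstdint>
    #include <cstdio>

    int main()
    {
        const uint32_t sampleRate = 48000;
        const size_t frameChunkSize = 960; // 20 ms at 48 kHz
        const double startTimeSeconds = 0.0; // sent once, at audio storage change
        int64_t readOffset = 0; // frames consumed so far

        for (int chunk = 0; chunk < 3; ++chunk) {
            // Mirrors m_startTime + MediaTime { m_readOffset, sampleRate } from the patch.
            double currentTime = startTimeSeconds + double(readOffset) / sampleRate;
            std::printf("chunk %d starts at %.3f s\n", chunk, currentTime);
            readOffset += frameChunkSize;
        }
    }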

The WebProcess processes fixed-size chunks, which the GPUProcess/UIProcess signals.
This chunk size is sent through IPC at audio storage change time and is the maximum of 128 samples (the WebAudio render quantum) and the AudioSession preferred buffer size.
When WebAudio is in use, it should be 128 samples; otherwise, it should correspond to 20 ms of audio data.
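
A small sketch of that chunk-size choice (the 128-frame render quantum matches WebCore::AudioUtilities::renderQuantumSize used in the patch; the preferred buffer sizes passed below are illustrative):

    #include <algorithm>
    #include <cstddef>
    #include <cstdio>

    constexpr size_t renderQuantumSize = 128; // WebAudio render quantum

    size_t frameChunkSize(size_t preferredBufferSize)
    {
        // Mirrors std::max(renderQuantumSize, preferredBufferSize) from the patch.
        return std::max(renderQuantumSize, preferredBufferSize);
    }

    int main()
    {
        std::printf("WebAudio case: %zu frames\n", frameChunkSize(128)); // 128 frames
        std::printf("no WebAudio: %zu frames\n", frameChunkSize(960));   // 20 ms at 48 kHz
    }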

Covered by existing tests and manually tested.

* UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp:
(WebKit::UserMediaCaptureManagerProxy::SourceProxy::start):
(WebKit::UserMediaCaptureManagerProxy::SourceProxy::storageChanged):
* WebProcess/cocoa/RemoteCaptureSampleManager.cpp:
(WebKit::RemoteCaptureSampleManager::audioStorageChanged):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::RemoteAudio):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::~RemoteAudio):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::stopThread):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::startThread):
(WebKit::RemoteCaptureSampleManager::RemoteAudio::setStorage):
* WebProcess/cocoa/RemoteCaptureSampleManager.h:
* WebProcess/cocoa/RemoteCaptureSampleManager.messages.in:

Modified Paths

trunk/Source/WebKit/ChangeLog
trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp
trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp
trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h
trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in

Diff

Modified: trunk/Source/WebKit/ChangeLog (277255 => 277256)


--- trunk/Source/WebKit/ChangeLog	2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/ChangeLog	2021-05-10 07:10:31 UTC (rev 277256)
@@ -1,3 +1,36 @@
+2021-05-10  Youenn Fablet  <[email protected]>
+
+        Use IPC::Semaphore instead of sending an IPC message for every captured audio sample
+        https://bugs.webkit.org/show_bug.cgi?id=225452
+
+        Reviewed by Eric Carlson.
+
+        Previously, we sent an IPC message from the UIProcess or GPUProcess to the WebProcess for every captured microphone audio chunk.
+        We now use an IPC::Semaphore to signal that a new chunk is ready to be processed.
+
+        We no longer send each chunk's timestamp. Instead, we reconstruct it from the number of previously processed samples.
+        At audio storage change time, we send the start time and assume continuous timing based on sample counts from then on.
+        That is why we trigger a new audio storage change whenever we need to reset or the configuration changes, which should be rare in practice.
+
+        The WebProcess processes fixed-size chunks, which the GPUProcess/UIProcess signals.
+        This chunk size is sent through IPC at audio storage change time and is the maximum of 128 samples (the WebAudio render quantum) and the AudioSession preferred buffer size.
+        When WebAudio is in use, it should be 128 samples; otherwise, it should correspond to 20 ms of audio data.
+
+        Covered by existing tests and manually tested.
+
+        * UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp:
+        (WebKit::UserMediaCaptureManagerProxy::SourceProxy::start):
+        (WebKit::UserMediaCaptureManagerProxy::SourceProxy::storageChanged):
+        * WebProcess/cocoa/RemoteCaptureSampleManager.cpp:
+        (WebKit::RemoteCaptureSampleManager::audioStorageChanged):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::RemoteAudio):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::~RemoteAudio):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::stopThread):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::startThread):
+        (WebKit::RemoteCaptureSampleManager::RemoteAudio::setStorage):
+        * WebProcess/cocoa/RemoteCaptureSampleManager.h:
+        * WebProcess/cocoa/RemoteCaptureSampleManager.messages.in:
+
 2021-05-09  Ryosuke Niwa  <[email protected]>
 
         IPC testing API should have the ability to send and receive shared memory

Modified: trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp (277255 => 277256)


--- trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp	2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp	2021-05-10 07:10:31 UTC (rev 277256)
@@ -36,6 +36,7 @@
 #include "WebCoreArgumentCoders.h"
 #include "WebProcessProxy.h"
 #include <WebCore/AudioSession.h>
+#include <WebCore/AudioUtilities.h>
 #include <WebCore/CARingBuffer.h>
 #include <WebCore/ImageRotationSessionVT.h>
 #include <WebCore/MediaConstraints.h>
@@ -60,7 +61,6 @@
         : m_id(id)
         , m_connection(WTFMove(connection))
         , m_source(WTFMove(source))
-        , m_ringBuffer(makeUniqueRef<SharedRingBufferStorage>(std::bind(&SourceProxy::storageChanged, this, std::placeholders::_1, std::placeholders::_2, std::placeholders::_3)))
     {
         m_source->addObserver(*this);
         switch (m_source->type()) {
@@ -77,7 +77,8 @@
 
     ~SourceProxy()
     {
-        storage().invalidate();
+        if (m_ringBuffer)
+            static_cast<SharedRingBufferStorage&>(m_ringBuffer->storage()).invalidate();
 
         switch (m_source->type()) {
         case RealtimeMediaSource::Type::Audio:
@@ -93,7 +94,6 @@
     }
 
     RealtimeMediaSource& source() { return m_source; }
-    SharedRingBufferStorage& storage() { return static_cast<SharedRingBufferStorage&>(m_ringBuffer.storage()); }
     CAAudioStreamDescription& description() { return m_description; }
     int64_t numberOfFrames() { return m_numberOfFrames; }
 
@@ -108,6 +108,7 @@
 
     void start()
     {
+        m_shouldReset = true;
         m_isEnded = false;
         m_source->start();
     }
@@ -145,20 +146,35 @@
 
     // May get called on a background thread.
     void audioSamplesAvailable(const MediaTime& time, const PlatformAudioData& audioData, const AudioStreamDescription& description, size_t numberOfFrames) final {
-        DisableMallocRestrictionsForCurrentThreadScope scope;
+        if (m_description != description || m_shouldReset) {
+            DisableMallocRestrictionsForCurrentThreadScope scope;
 
-        if (m_description != description) {
+            m_shouldReset = false;
+            m_writeOffset = 0;
+            m_remainingFrameCount = 0;
+            m_startTime = time;
+            m_captureSemaphore = makeUnique<IPC::Semaphore>();
             ASSERT(description.platformDescription().type == PlatformDescription::CAAudioStreamBasicType);
             m_description = *WTF::get<const AudioStreamBasicDescription*>(description.platformDescription().description);
 
+            m_frameChunkSize = std::max(WebCore::AudioUtilities::renderQuantumSize, AudioSession::sharedSession().preferredBufferSize());
+
             // Allocate a ring buffer large enough to contain 2 seconds of audio.
             m_numberOfFrames = m_description.sampleRate() * 2;
-            m_ringBuffer.allocate(m_description.streamDescription(), m_numberOfFrames);
+            m_ringBuffer.reset();
+            auto storage = makeUniqueRef<SharedRingBufferStorage>(std::bind(&SourceProxy::storageChanged, this, std::placeholders::_1, std::placeholders::_2, std::placeholders::_3));
+            m_ringBuffer = makeUnique<CARingBuffer>(WTFMove(storage), m_description.streamDescription(), m_numberOfFrames);
         }
 
         ASSERT(is<WebAudioBufferList>(audioData));
-        m_ringBuffer.store(downcast<WebAudioBufferList>(audioData).list(), numberOfFrames, time.timeValue());
-        m_connection->send(Messages::RemoteCaptureSampleManager::AudioSamplesAvailable(m_id, time, numberOfFrames), 0);
+        m_ringBuffer->store(downcast<WebAudioBufferList>(audioData).list(), numberOfFrames, m_writeOffset);
+        m_writeOffset += numberOfFrames;
+
+        size_t framesToSend = numberOfFrames + m_remainingFrameCount;
+        size_t signalCount = framesToSend / m_frameChunkSize;
+        m_remainingFrameCount = framesToSend - (signalCount * m_frameChunkSize);
+        for (unsigned i = 0; i < signalCount; ++i)
+            m_captureSemaphore->signal();
     }
 
     void videoSampleAvailable(MediaSample& sample) final
@@ -199,7 +215,6 @@
 
     void storageChanged(SharedMemory* storage, const WebCore::CAAudioStreamDescription& format, size_t frameCount)
     {
-        DisableMallocRestrictionsForCurrentThreadScope scope;
         SharedMemory::Handle handle;
         if (storage)
             storage->createHandle(handle, SharedMemory::Protection::ReadOnly);
@@ -210,7 +225,7 @@
 #else
         uint64_t dataSize = 0;
 #endif
-        m_connection->send(Messages::RemoteCaptureSampleManager::AudioStorageChanged(m_id, SharedMemory::IPCHandle { WTFMove(handle),  dataSize }, format, frameCount), 0);
+        m_connection->send(Messages::RemoteCaptureSampleManager::AudioStorageChanged(m_id, SharedMemory::IPCHandle { WTFMove(handle),  dataSize }, format, frameCount, *m_captureSemaphore, m_startTime, m_frameChunkSize), 0);
     }
 
     bool preventSourceFromStopping()
@@ -223,12 +238,18 @@
     WeakPtr<PlatformMediaSessionManager> m_sessionManager;
     Ref<IPC::Connection> m_connection;
     Ref<RealtimeMediaSource> m_source;
-    CARingBuffer m_ringBuffer;
+    std::unique_ptr<CARingBuffer> m_ringBuffer;
     CAAudioStreamDescription m_description { };
     int64_t m_numberOfFrames { 0 };
     bool m_isEnded { false };
     std::unique_ptr<ImageRotationSessionVT> m_rotationSession;
     bool m_shouldApplyRotation { false };
+    std::unique_ptr<IPC::Semaphore> m_captureSemaphore;
+    int64_t m_writeOffset { 0 };
+    int64_t m_remainingFrameCount { 0 };
+    size_t m_frameChunkSize { 0 };
+    MediaTime m_startTime;
+    bool m_shouldReset { false };
 };
 
 UserMediaCaptureManagerProxy::UserMediaCaptureManagerProxy(UniqueRef<ConnectionProxy>&& connectionProxy)

Modified: trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp (277255 => 277256)


--- trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp	2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.cpp	2021-05-10 07:10:31 UTC (rev 277256)
@@ -125,7 +125,7 @@
     m_queue->dispatch(WTFMove(callback));
 }
 
-void RemoteCaptureSampleManager::audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier identifier, const SharedMemory::IPCHandle& ipcHandle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames)
+void RemoteCaptureSampleManager::audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier identifier, const SharedMemory::IPCHandle& ipcHandle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames, IPC::Semaphore&& semaphore, const MediaTime& mediaTime, size_t frameChunkSize)
 {
     ASSERT(!WTF::isMainRunLoop());
 
@@ -134,21 +134,9 @@
         RELEASE_LOG_ERROR(WebRTC, "Unable to find source %llu for storageChanged", identifier.toUInt64());
         return;
     }
-    iterator->value->setStorage(ipcHandle.handle, description, numberOfFrames);
+    iterator->value->setStorage(ipcHandle.handle, description, numberOfFrames, WTFMove(semaphore), mediaTime, frameChunkSize);
 }
 
-void RemoteCaptureSampleManager::audioSamplesAvailable(WebCore::RealtimeMediaSourceIdentifier identifier, MediaTime time, uint64_t numberOfFrames)
-{
-    ASSERT(!WTF::isMainRunLoop());
-
-    auto iterator = m_audioSources.find(identifier);
-    if (iterator == m_audioSources.end()) {
-        RELEASE_LOG_ERROR(WebRTC, "Unable to find source %llu for audioSamplesAvailable", identifier.toUInt64());
-        return;
-    }
-    iterator->value->audioSamplesAvailable(time, numberOfFrames);
-}
-
 void RemoteCaptureSampleManager::videoSampleAvailable(RealtimeMediaSourceIdentifier identifier, RemoteVideoSample&& sample)
 {
     ASSERT(!WTF::isMainRunLoop());
@@ -163,34 +151,68 @@
 
 RemoteCaptureSampleManager::RemoteAudio::RemoteAudio(Ref<RemoteRealtimeAudioSource>&& source)
     : m_source(WTFMove(source))
-    , m_ringBuffer(makeUnique<CARingBuffer>())
 {
 }
 
-void RemoteCaptureSampleManager::RemoteAudio::setStorage(const SharedMemory::Handle& handle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames)
+RemoteCaptureSampleManager::RemoteAudio::~RemoteAudio()
 {
-    m_description = description;
-    m_ringBuffer = makeUnique<CARingBuffer>(makeUniqueRef<ReadOnlySharedRingBufferStorage>(handle), description, numberOfFrames);
-    m_buffer = makeUnique<WebAudioBufferList>(description, numberOfFrames);
+    stopThread();
 }
 
-void RemoteCaptureSampleManager::RemoteAudio::audioSamplesAvailable(MediaTime time, uint64_t numberOfFrames)
+void RemoteCaptureSampleManager::RemoteAudio::stopThread()
 {
-    if (!m_buffer) {
-        RELEASE_LOG_ERROR(WebRTC, "buffer for audio source %llu is null", m_source->identifier().toUInt64());
+    if (!m_thread)
         return;
-    }
 
-    if (!WebAudioBufferList::isSupportedDescription(m_description, numberOfFrames)) {
-        RELEASE_LOG_ERROR(WebRTC, "Unable to support description with given number of frames for audio source %llu", m_source->identifier().toUInt64());
+    m_shouldStopThread = true;
+    m_semaphore.signal();
+    m_thread->waitForCompletion();
+    m_thread = nullptr;
+}
+
+void RemoteCaptureSampleManager::RemoteAudio::startThread()
+{
+    ASSERT(!m_thread);
+    m_shouldStopThread = false;
+    auto threadLoop = [this]() mutable {
+        m_readOffset = 0;
+        do {
+            // If waitFor fails, the semaphore on the other side was probably destroyed and we should just exit here and wait to launch a new thread.
+            if (!m_semaphore.waitFor(Seconds::infinity()))
+                break;
+            if (m_shouldStopThread)
+                break;
+
+            auto currentTime = m_startTime + MediaTime { m_readOffset, static_cast<uint32_t>(m_description.sampleRate()) };
+            m_ringBuffer->fetch(m_buffer->list(), m_frameChunkSize, m_readOffset);
+            m_readOffset += m_frameChunkSize;
+
+            m_source->remoteAudioSamplesAvailable(currentTime, *m_buffer, m_description, m_frameChunkSize);
+        } while (!m_shouldStopThread);
+    };
+    m_thread = Thread::create("RemoteAudioSourceProviderManager::RemoteAudio thread", WTFMove(threadLoop), ThreadType::Audio, Thread::QOS::UserInteractive);
+}
+
+void RemoteCaptureSampleManager::RemoteAudio::setStorage(const SharedMemory::Handle& handle, const WebCore::CAAudioStreamDescription& description, uint64_t numberOfFrames, IPC::Semaphore&& semaphore, const MediaTime& mediaTime, size_t frameChunkSize)
+{
+    stopThread();
+
+    if (!numberOfFrames) {
+        m_ringBuffer = nullptr;
+        m_buffer = nullptr;
         return;
     }
 
-    m_buffer->setSampleCount(numberOfFrames);
+    m_semaphore = WTFMove(semaphore);
+    m_description = description;
+    m_startTime = mediaTime;
+    m_frameChunkSize = frameChunkSize;
 
-    m_ringBuffer->fetch(m_buffer->list(), numberOfFrames, time.timeValue());
+    m_ringBuffer = makeUnique<CARingBuffer>(makeUniqueRef<ReadOnlySharedRingBufferStorage>(handle), description, numberOfFrames);
+    m_buffer = makeUnique<WebAudioBufferList>(description, numberOfFrames);
+    m_buffer->setSampleCount(m_frameChunkSize);
 
-    m_source->remoteAudioSamplesAvailable(time, *m_buffer, m_description, numberOfFrames);
+    startThread();
 }
 
 RemoteCaptureSampleManager::RemoteVideo::RemoteVideo(Ref<RemoteRealtimeVideoSource>&& source)

Modified: trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h (277255 => 277256)


--- trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h	2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.h	2021-05-10 07:10:31 UTC (rev 277256)
@@ -28,6 +28,7 @@
 #if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
 
 #include "Connection.h"
+#include "IPCSemaphore.h"
 #include "MessageReceiver.h"
 #include "RemoteRealtimeAudioSource.h"
 #include "RemoteRealtimeVideoSource.h"
@@ -65,7 +66,7 @@
     void dispatchToThread(Function<void()>&&) final;
 
     // Messages
-    void audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier, const SharedMemory::IPCHandle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames);
+    void audioStorageChanged(WebCore::RealtimeMediaSourceIdentifier, const SharedMemory::IPCHandle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames, IPC::Semaphore&&, const MediaTime&, size_t frameSampleSize);
     void audioSamplesAvailable(WebCore::RealtimeMediaSourceIdentifier, MediaTime, uint64_t numberOfFrames);
     void videoSampleAvailable(WebCore::RealtimeMediaSourceIdentifier, WebCore::RemoteVideoSample&&);
 
@@ -75,15 +76,25 @@
         WTF_MAKE_FAST_ALLOCATED;
     public:
         explicit RemoteAudio(Ref<RemoteRealtimeAudioSource>&&);
+        ~RemoteAudio();
 
-        void setStorage(const SharedMemory::Handle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames);
-        void audioSamplesAvailable(MediaTime, uint64_t numberOfFrames);
+        void setStorage(const SharedMemory::Handle&, const WebCore::CAAudioStreamDescription&, uint64_t numberOfFrames, IPC::Semaphore&&, const MediaTime&, size_t frameChunkSize);
 
     private:
+        void stopThread();
+        void startThread();
+
         Ref<RemoteRealtimeAudioSource> m_source;
         WebCore::CAAudioStreamDescription m_description;
+        std::unique_ptr<WebCore::WebAudioBufferList> m_buffer;
         std::unique_ptr<WebCore::CARingBuffer> m_ringBuffer;
-        std::unique_ptr<WebCore::WebAudioBufferList> m_buffer;
+        int64_t m_readOffset { 0 };
+        MediaTime m_startTime;
+        size_t m_frameChunkSize { 0 };
+
+        IPC::Semaphore m_semaphore;
+        RefPtr<Thread> m_thread;
+        std::atomic<bool> m_shouldStopThread { false };
     };
 
     class RemoteVideo {

Modified: trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in (277255 => 277256)


--- trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in	2021-05-10 01:48:09 UTC (rev 277255)
+++ trunk/Source/WebKit/WebProcess/cocoa/RemoteCaptureSampleManager.messages.in	2021-05-10 07:10:31 UTC (rev 277256)
@@ -24,8 +24,7 @@
 #if ENABLE(MEDIA_STREAM)
 
 messages -> RemoteCaptureSampleManager NotRefCounted {
-    AudioStorageChanged(WebCore::RealtimeMediaSourceIdentifier id, WebKit::SharedMemory::IPCHandle storageHandle, WebCore::CAAudioStreamDescription description, uint64_t numberOfFrames)
-    AudioSamplesAvailable(WebCore::RealtimeMediaSourceIdentifier id, MediaTime time, uint64_t numberOfFrames)
+    AudioStorageChanged(WebCore::RealtimeMediaSourceIdentifier id, WebKit::SharedMemory::IPCHandle storageHandle, WebCore::CAAudioStreamDescription description, uint64_t numberOfFrames, IPC::Semaphore captureSemaphore, MediaTime mediaTime, size_t frameChunkSize);
     VideoSampleAvailable(WebCore::RealtimeMediaSourceIdentifier id, WebCore::RemoteVideoSample sample)
 }
 