Title: [268600] trunk
Revision
268600
Author
[email protected]
Date
2020-10-16 11:31:58 -0700 (Fri, 16 Oct 2020)

Log Message

Move even more AudioContext-specific logic out of BaseAudioContext
https://bugs.webkit.org/show_bug.cgi?id=217803

Reviewed by Eric Carlson.

Source/WebCore:

Move even more AudioContext-specific logic out of BaseAudioContext and
into AudioContext. In particular, all the logic related to autoplay
restrictions, audible state, and platform media session interruptions
applies only to AudioContext, not to OfflineAudioContext.

No new tests, no Web-facing behavior change.

* Modules/webaudio/AudioContext.cpp:
(WebCore::shouldDocumentAllowWebAudioToAutoPlay):
(WebCore::AudioContext::AudioContext):
(WebCore::AudioContext::constructCommon):
(WebCore::AudioContext::~AudioContext):
(WebCore::AudioContext::willPausePlayback):
(WebCore::AudioContext::mediaState const):
(WebCore::AudioContext::mayResumePlayback):
(WebCore::AudioContext::willBeginPlayback):
(WebCore::AudioContext::visibilityStateChanged):
(WebCore::AudioContext::suspend):
(WebCore::AudioContext::resume):
(WebCore::AudioContext::suspendPlayback):
(WebCore::AudioContext::hostingDocumentIdentifier const):
(WebCore::AudioContext::isSuspended const):
(WebCore::AudioContext::pageMutedStateDidChange):
(WebCore::AudioContext::mediaCanStart):
(WebCore::AudioContext::logger const):
* Modules/webaudio/AudioContext.h:
(WebCore::AudioContext::behaviorRestrictions const):
(WebCore::AudioContext::addBehaviorRestriction):
(WebCore::AudioContext::removeBehaviorRestriction):
(WebCore::AudioContext::userGestureRequiredForAudioStart const):
(WebCore::AudioContext::pageConsentRequiredForAudioStart const):
* Modules/webaudio/BaseAudioContext.cpp:
(WebCore::BaseAudioContext::BaseAudioContext):
(WebCore::BaseAudioContext::~BaseAudioContext):
* Modules/webaudio/BaseAudioContext.h:
* Modules/webaudio/OfflineAudioContext.cpp:
(WebCore::OfflineAudioContext::startOfflineRendering):
* testing/Internals.cpp:
(WebCore::Internals::setAudioContextRestrictions):
* testing/Internals.h:
* testing/Internals.idl:

LayoutTests:

Update the test that was trying to set the autoplay restrictions on an OfflineAudioContext
to verify that they do not apply. Now that the autoplay restrictions live on AudioContext
instead of BaseAudioContext, they definitely do not apply to OfflineAudioContext, and it
is not even possible to set the restrictions on an OfflineAudioContext. I updated the
test to use webkitOfflineAudioContext when available, since WebKitOfflineAudioContext
does inherit from WebKitAudioContext.

* webaudio/offlineaudiocontext-restriction.html:

Modified Paths

Diff

Modified: trunk/LayoutTests/ChangeLog (268599 => 268600)


--- trunk/LayoutTests/ChangeLog	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/LayoutTests/ChangeLog	2020-10-16 18:31:58 UTC (rev 268600)
@@ -1,3 +1,19 @@
+2020-10-16  Chris Dumez  <[email protected]>
+
+        Move even more AudioContext-specific logic out of BaseAudioContext
+        https://bugs.webkit.org/show_bug.cgi?id=217803
+
+        Reviewed by Eric Carlson.
+
+        Update test that was trying to set the autoplay restrictions on an OfflineAudioContext
+        to make sure they did not apply. Now that the autoplay restrictions are on AudioContext
+        instead of BaseAudioContext, they definitely do not apply to OfflineAudioContext and it
+        is not even possible to set the restrictions on an OfflineAudioContext. I updated the
+        test to use webkitOfflineAudioContext when available since WebKitOfflineAudioContext
+        does inherit WebKitAudioContext.
+
+        * webaudio/offlineaudiocontext-restriction.html:
+
 2020-10-16  Youenn Fablet  <[email protected]>
 
         sdpFmtLine should be missing from RTCRtpCodecCapability instead of being an empty string

Modified: trunk/LayoutTests/webaudio/offlineaudiocontext-restriction.html (268599 => 268600)


--- trunk/LayoutTests/webaudio/offlineaudiocontext-restriction.html	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/LayoutTests/webaudio/offlineaudiocontext-restriction.html	2020-10-16 18:31:58 UTC (rev 268600)
@@ -20,9 +20,14 @@
 function runTest() {
     window.jsTestIsAsync = true;
 
-    context = new OfflineAudioContext(2, 1000, 44100);
+    if (window.webkitOfflineAudioContext)
+        context = new webkitOfflineAudioContext(2, 1000, 44100);
+    else
+        context = new OfflineAudioContext(2, 1000, 44100);
 
-    if (window.internals)
+    // It is not possible to set AudioContextRestrictions on an OfflineAudioContext since it
+    // does not subclass AudioContext / WebKitAudioContext.
+    if (window.internals && window.webkitOfflineAudioContext)
         internals.setAudioContextRestrictions(context, 'RequireUserGestureForAudioStart');
 
     shouldBe('context.state', '"suspended"');

Modified: trunk/Source/WebCore/ChangeLog (268599 => 268600)


--- trunk/Source/WebCore/ChangeLog	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/ChangeLog	2020-10-16 18:31:58 UTC (rev 268600)
@@ -1,3 +1,52 @@
+2020-10-16  Chris Dumez  <[email protected]>
+
+        Move even more AudioContext-specific logic out of BaseAudioContext
+        https://bugs.webkit.org/show_bug.cgi?id=217803
+
+        Reviewed by Eric Carlson.
+
+        Move even more AudioContext-specific logic out of BaseAudioContext and
+        into AudioContext. In particular, all the logic related to autoplay
+        restrictions, audible state, and platform media session interruptions
+        only apply to AudioContext and not OfflineAudioContext.
+
+        No new tests, no Web-facing behavior change.
+
+        * Modules/webaudio/AudioContext.cpp:
+        (WebCore::shouldDocumentAllowWebAudioToAutoPlay):
+        (WebCore::AudioContext::AudioContext):
+        (WebCore::AudioContext::constructCommon):
+        (WebCore::AudioContext::~AudioContext):
+        (WebCore::AudioContext::willPausePlayback):
+        (WebCore::AudioContext::mediaState const):
+        (WebCore::AudioContext::mayResumePlayback):
+        (WebCore::AudioContext::willBeginPlayback):
+        (WebCore::AudioContext::visibilityStateChanged):
+        (WebCore::AudioContext::suspend):
+        (WebCore::AudioContext::resume):
+        (WebCore::AudioContext::suspendPlayback):
+        (WebCore::AudioContext::hostingDocumentIdentifier const):
+        (WebCore::AudioContext::isSuspended const):
+        (WebCore::AudioContext::pageMutedStateDidChange):
+        (WebCore::AudioContext::mediaCanStart):
+        (WebCore::AudioContext::logger const):
+        * Modules/webaudio/AudioContext.h:
+        (WebCore::AudioContext::behaviorRestrictions const):
+        (WebCore::AudioContext::addBehaviorRestriction):
+        (WebCore::AudioContext::removeBehaviorRestriction):
+        (WebCore::AudioContext::userGestureRequiredForAudioStart const):
+        (WebCore::AudioContext::pageConsentRequiredForAudioStart const):
+        * Modules/webaudio/BaseAudioContext.cpp:
+        (WebCore::BaseAudioContext::BaseAudioContext):
+        (WebCore::BaseAudioContext::~BaseAudioContext):
+        * Modules/webaudio/BaseAudioContext.h:
+        * Modules/webaudio/OfflineAudioContext.cpp:
+        (WebCore::OfflineAudioContext::startOfflineRendering):
+        * testing/Internals.cpp:
+        (WebCore::Internals::setAudioContextRestrictions):
+        * testing/Internals.h:
+        * testing/Internals.idl:
+
 2020-10-16  Sam Weinig  <[email protected]>
 
         ENABLE_LEGACY_CSS_VENDOR_PREFIXES and RuntimeEnabledFeatures::legacyCSSVendorPrefixesEnabled() don't do anything

Modified: trunk/Source/WebCore/Modules/webaudio/AudioContext.cpp (268599 => 268600)


--- trunk/Source/WebCore/Modules/webaudio/AudioContext.cpp	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/Modules/webaudio/AudioContext.cpp	2020-10-16 18:31:58 UTC (rev 268600)
@@ -31,8 +31,11 @@
 #include "AudioTimestamp.h"
 #include "DOMWindow.h"
 #include "JSDOMPromiseDeferred.h"
+#include "Logging.h"
 #include "Page.h"
 #include "Performance.h"
+#include "PlatformMediaSessionManager.h"
+#include "Quirks.h"
 #include <wtf/IsoMallocInlines.h>
 
 #if ENABLE(MEDIA_STREAM)
@@ -51,6 +54,8 @@
 
 namespace WebCore {
 
+#define RELEASE_LOG_IF_ALLOWED(fmt, ...) RELEASE_LOG_IF(document() && document()->page() && document()->page()->isAlwaysOnLoggingAllowed(), Media, "%p - AudioContext::" fmt, this, ##__VA_ARGS__)
+
 #if OS(WINDOWS)
 // Don't allow more than this number of simultaneous AudioContexts talking to hardware.
 constexpr unsigned maxHardwareContexts = 4;
@@ -64,6 +69,13 @@
     return sampleRate;
 }
 
+static bool shouldDocumentAllowWebAudioToAutoPlay(const Document& document)
+{
+    if (document.processingUserGestureForMedia() || document.isCapturing())
+        return true;
+    return document.quirks().shouldAutoplayWebAudioForArbitraryUserGesture() && document.topDocument().hasHadUserInteraction();
+}
+
 void AudioContext::setDefaultSampleRateForTesting(Optional<float> sampleRate)
 {
     defaultSampleRateForTesting() = sampleRate;
@@ -96,15 +108,46 @@
 // Constructor for rendering to the audio hardware.
 AudioContext::AudioContext(Document& document, const AudioContextOptions& contextOptions)
     : BaseAudioContext(document, contextOptions)
+    , m_mediaSession(PlatformMediaSession::create(PlatformMediaSessionManager::sharedManager(), *this))
 {
+    constructCommon();
+
+    // Initialize the destination node's muted state to match the page's current muted state.
+    pageMutedStateDidChange();
+
+    document.addAudioProducer(*this);
+    document.registerForVisibilityStateChangedCallbacks(*this);
 }
 
 // Only needed for WebKitOfflineAudioContext.
 AudioContext::AudioContext(Document& document, unsigned numberOfChannels, RefPtr<AudioBuffer>&& renderTarget)
     : BaseAudioContext(document, numberOfChannels, WTFMove(renderTarget))
+    , m_mediaSession(PlatformMediaSession::create(PlatformMediaSessionManager::sharedManager(), *this))
 {
+    constructCommon();
 }
 
+void AudioContext::constructCommon()
+{
+    ASSERT(document());
+    if (document()->audioPlaybackRequiresUserGesture())
+        addBehaviorRestriction(RequireUserGestureForAudioStartRestriction);
+    else
+        m_restrictions = NoRestrictions;
+
+#if PLATFORM(COCOA)
+    addBehaviorRestriction(RequirePageConsentForAudioStartRestriction);
+#endif
+}
+
+AudioContext::~AudioContext()
+{
+    if (!isOfflineContext() && scriptExecutionContext()) {
+        document()->removeAudioProducer(*this);
+        document()->unregisterForVisibilityStateChangedCallbacks(*this);
+    }
+}
+
 double AudioContext::baseLatency()
 {
     lazyInitialize();
@@ -263,7 +306,7 @@
     if (userGestureRequiredForAudioStart()) {
         if (!document->processingUserGestureForMedia())
             return false;
-        removeBehaviorRestriction(BaseAudioContext::RequireUserGestureForAudioStartRestriction);
+        removeBehaviorRestriction(RequireUserGestureForAudioStartRestriction);
     }
 
     if (pageConsentRequiredForAudioStart()) {
@@ -272,12 +315,155 @@
             document->addMediaCanStartListener(*this);
             return false;
         }
-        removeBehaviorRestriction(BaseAudioContext::RequirePageConsentForAudioStartRestriction);
+        removeBehaviorRestriction(RequirePageConsentForAudioStartRestriction);
     }
 
-    return mediaSession()->clientWillPausePlayback();
+    return m_mediaSession->clientWillPausePlayback();
 }
 
+MediaProducer::MediaStateFlags AudioContext::mediaState() const
+{
+    if (!isStopped() && destinationNode() && destinationNode()->isPlayingAudio())
+        return MediaProducer::IsPlayingAudio;
+
+    return MediaProducer::IsNotPlaying;
+}
+
+void AudioContext::mayResumePlayback(bool shouldResume)
+{
+    if (!destinationNode() || state() == State::Closed || state() == State::Running)
+        return;
+
+    if (!shouldResume) {
+        setState(State::Suspended);
+        return;
+    }
+
+    if (!willBeginPlayback())
+        return;
+
+    lazyInitialize();
+
+    destinationNode()->resume([this, protectedThis = makeRef(*this)] {
+        setState(State::Running);
+    });
+}
+
+bool AudioContext::willBeginPlayback()
+{
+    auto* document = this->document();
+    if (!document)
+        return false;
+
+    if (userGestureRequiredForAudioStart()) {
+        if (!shouldDocumentAllowWebAudioToAutoPlay(*document)) {
+            ALWAYS_LOG(LOGIDENTIFIER, "returning false, not processing user gesture or capturing");
+            return false;
+        }
+        removeBehaviorRestriction(RequireUserGestureForAudioStartRestriction);
+    }
+
+    if (pageConsentRequiredForAudioStart()) {
+        auto* page = document->page();
+        if (page && !page->canStartMedia()) {
+            document->addMediaCanStartListener(*this);
+            ALWAYS_LOG(LOGIDENTIFIER, "returning false, page doesn't allow media to start");
+            return false;
+        }
+        removeBehaviorRestriction(RequirePageConsentForAudioStartRestriction);
+    }
+
+    auto willBegin = m_mediaSession->clientWillBeginPlayback();
+    ALWAYS_LOG(LOGIDENTIFIER, "returning ", willBegin);
+
+    return willBegin;
+}
+
+void AudioContext::visibilityStateChanged()
+{
+    // Do not suspend if audio is audible.
+    if (!document() || mediaState() == MediaProducer::IsPlayingAudio || isStopped())
+        return;
+
+    if (document()->hidden()) {
+        if (state() == State::Running) {
+            ALWAYS_LOG(LOGIDENTIFIER, "Suspending playback after going to the background");
+            m_mediaSession->beginInterruption(PlatformMediaSession::EnteringBackground);
+        }
+    } else {
+        if (state() == State::Interrupted) {
+            ALWAYS_LOG(LOGIDENTIFIER, "Resuming playback after entering foreground");
+            m_mediaSession->endInterruption(PlatformMediaSession::MayResumePlaying);
+        }
+    }
+}
+
+void AudioContext::suspend(ReasonForSuspension)
+{
+    if (state() == State::Running) {
+        m_mediaSession->beginInterruption(PlatformMediaSession::PlaybackSuspended);
+        document()->updateIsPlayingMedia();
+    }
+}
+
+void AudioContext::resume()
+{
+    if (state() == State::Interrupted) {
+        m_mediaSession->endInterruption(PlatformMediaSession::MayResumePlaying);
+        document()->updateIsPlayingMedia();
+    }
+}
+
+void AudioContext::suspendPlayback()
+{
+    if (!destinationNode() || state() == State::Closed)
+        return;
+
+    if (state() == State::Suspended) {
+        if (m_mediaSession->state() == PlatformMediaSession::Interrupted)
+            setState(State::Interrupted);
+        return;
+    }
+
+    lazyInitialize();
+
+    destinationNode()->suspend([this, protectedThis = makeRef(*this)] {
+        bool interrupted = m_mediaSession->state() == PlatformMediaSession::Interrupted;
+        setState(interrupted ? State::Interrupted : State::Suspended);
+    });
+}
+
+DocumentIdentifier AudioContext::hostingDocumentIdentifier() const
+{
+    auto* document = downcast<Document>(m_scriptExecutionContext);
+    return document ? document->identifier() : DocumentIdentifier { };
+}
+
+bool AudioContext::isSuspended() const
+{
+    return !document() || document()->activeDOMObjectsAreSuspended() || document()->activeDOMObjectsAreStopped();
+}
+
+void AudioContext::pageMutedStateDidChange()
+{
+    if (destinationNode() && document() && document()->page())
+        destinationNode()->setMuted(document()->page()->isAudioMuted());
+}
+
+void AudioContext::mediaCanStart(Document& document)
+{
+    ASSERT_UNUSED(document, &document == this->document());
+    removeBehaviorRestriction(RequirePageConsentForAudioStartRestriction);
+    mayResumePlayback(true);
+}
+
+#if !RELEASE_LOG_DISABLED
+const Logger& AudioContext::logger() const
+{
+    return BaseAudioContext::logger();
+}
+#endif
+
 #if ENABLE(VIDEO)
 
 ExceptionOr<Ref<MediaElementAudioSourceNode>> AudioContext::createMediaElementSource(HTMLMediaElement& mediaElement)

Modified: trunk/Source/WebCore/Modules/webaudio/AudioContext.h (268599 => 268600)


--- trunk/Source/WebCore/Modules/webaudio/AudioContext.h	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/Modules/webaudio/AudioContext.h	2020-10-16 18:31:58 UTC (rev 268600)
@@ -28,6 +28,10 @@
 #include "AudioContextOptions.h"
 #include "BaseAudioContext.h"
 #include "DefaultAudioDestinationNode.h"
+#include "MediaCanStartListener.h"
+#include "MediaProducer.h"
+#include "PlatformMediaSession.h"
+#include "VisibilityChangeClient.h"
 
 namespace WebCore {
 
@@ -35,11 +39,18 @@
 
 struct AudioTimestamp;
 
-class AudioContext : public BaseAudioContext {
+class AudioContext
+    : public BaseAudioContext
+    , public MediaProducer
+    , public MediaCanStartListener
+    , private PlatformMediaSessionClient
+    , private VisibilityChangeClient {
     WTF_MAKE_ISO_ALLOCATED(AudioContext);
 public:
     // Create an AudioContext for rendering to the audio hardware.
     static ExceptionOr<Ref<AudioContext>> create(Document&, AudioContextOptions&& = { });
+    ~AudioContext();
+
     WEBCORE_EXPORT static void setDefaultSampleRateForTesting(Optional<float>);
 
     void close(DOMPromiseDeferred<void>&&);
@@ -65,13 +76,66 @@
 
     void startRendering();
 
+    // Restrictions to change default behaviors.
+    enum BehaviorRestrictionFlags {
+        NoRestrictions = 0,
+        RequireUserGestureForAudioStartRestriction = 1 << 0,
+        RequirePageConsentForAudioStartRestriction = 1 << 1,
+    };
+    typedef unsigned BehaviorRestrictions;
+    BehaviorRestrictions behaviorRestrictions() const { return m_restrictions; }
+    void addBehaviorRestriction(BehaviorRestrictions restriction) { m_restrictions |= restriction; }
+    void removeBehaviorRestriction(BehaviorRestrictions restriction) { m_restrictions &= ~restriction; }
+
 protected:
     explicit AudioContext(Document&, const AudioContextOptions& = { });
     AudioContext(Document&, unsigned numberOfChannels, RefPtr<AudioBuffer>&& renderTarget);
 
+    bool willBeginPlayback();
+
+#if !RELEASE_LOG_DISABLED
+    const Logger& logger() const final;
+#endif
+
 private:
+    void constructCommon();
+
+    bool userGestureRequiredForAudioStart() const { return !isOfflineContext() && m_restrictions & RequireUserGestureForAudioStartRestriction; }
+    bool pageConsentRequiredForAudioStart() const { return !isOfflineContext() && m_restrictions & RequirePageConsentForAudioStartRestriction; }
+
     bool willPausePlayback();
 
+    // MediaProducer
+    MediaProducer::MediaStateFlags mediaState() const override;
+    void pageMutedStateDidChange() override;
+
+    // PlatformMediaSessionClient
+    PlatformMediaSession::MediaType mediaType() const override { return PlatformMediaSession::MediaType::WebAudio; }
+    PlatformMediaSession::MediaType presentationType() const override { return PlatformMediaSession::MediaType::WebAudio; }
+    void mayResumePlayback(bool shouldResume) override;
+    void suspendPlayback() override;
+    bool canReceiveRemoteControlCommands() const override { return false; }
+    void didReceiveRemoteControlCommand(PlatformMediaSession::RemoteControlCommandType, const PlatformMediaSession::RemoteCommandArgument*) override { }
+    bool supportsSeeking() const override { return false; }
+    bool shouldOverrideBackgroundPlaybackRestriction(PlatformMediaSession::InterruptionType) const override { return false; }
+    bool canProduceAudio() const final { return true; }
+    bool isSuspended() const final;
+    DocumentIdentifier hostingDocumentIdentifier() const final;
+
+    // MediaCanStartListener.
+    void mediaCanStart(Document&) override;
+
+    // VisibilityChangeClient
+    void visibilityStateChanged() final;
+
+    // ActiveDOMObject
+    void suspend(ReasonForSuspension) final;
+    void resume() final;
+
+    std::unique_ptr<PlatformMediaSession> m_mediaSession;
+
+    BehaviorRestrictions m_restrictions { NoRestrictions };
+
     // [[suspended by user]] flag in the specification:
     // https://www.w3.org/TR/webaudio/#dom-audiocontext-suspended-by-user-slot
     bool m_wasSuspendedByScript { false };

Modified: trunk/Source/WebCore/Modules/webaudio/BaseAudioContext.cpp (268599 => 268600)


--- trunk/Source/WebCore/Modules/webaudio/BaseAudioContext.cpp	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/Modules/webaudio/BaseAudioContext.cpp	2020-10-16 18:31:58 UTC (rev 268600)
@@ -75,7 +75,6 @@
 #include "PannerNode.h"
 #include "PeriodicWave.h"
 #include "PeriodicWaveOptions.h"
-#include "PlatformMediaSessionManager.h"
 #include "ScriptController.h"
 #include "ScriptProcessorNode.h"
 #include "StereoPannerNode.h"
@@ -111,8 +110,6 @@
 
 WTF_MAKE_ISO_ALLOCATED_IMPL(BaseAudioContext);
 
-#define RELEASE_LOG_IF_ALLOWED(fmt, ...) RELEASE_LOG_IF(document() && document()->page() && document()->page()->isAlwaysOnLoggingAllowed(), Media, "%p - BaseAudioContext::" fmt, this, ##__VA_ARGS__)
-    
 bool BaseAudioContext::isSupportedSampleRate(float sampleRate)
 {
     return sampleRate >= 3000 && sampleRate <= 384000;
@@ -128,22 +125,15 @@
     , m_logIdentifier(uniqueLogIdentifier())
 #endif
     , m_worklet(AudioWorklet::create(*this))
-    , m_mediaSession(PlatformMediaSession::create(PlatformMediaSessionManager::sharedManager(), *this))
 {
     // According to spec AudioContext must die only after page navigate.
     // Lets mark it as ActiveDOMObject with pending activity and unmark it in clear method.
     makePendingActivity();
 
-    constructCommon();
+    FFTFrame::initialize();
 
     m_destinationNode = DefaultAudioDestinationNode::create(*this, contextOptions.sampleRate);
 
-    // Initialize the destination node's muted state to match the page's current muted state.
-    pageMutedStateDidChange();
-
-    document.addAudioProducer(*this);
-    document.registerForVisibilityStateChangedCallbacks(*this);
-
     // Unlike OfflineAudioContext, AudioContext does not require calling resume() to start rendering.
     // Lazy initialization starts rendering so we schedule a task here to make sure lazy initialization
     // ends up happening, even if no audio node gets constructed.
@@ -164,30 +154,14 @@
 #endif
     , m_worklet(AudioWorklet::create(*this))
     , m_isOfflineContext(true)
-    , m_mediaSession(PlatformMediaSession::create(PlatformMediaSessionManager::sharedManager(), *this))
     , m_renderTarget(WTFMove(renderTarget))
 {
-    constructCommon();
+    FFTFrame::initialize();
 
     // Create a new destination for offline rendering.
     m_destinationNode = OfflineAudioDestinationNode::create(*this, numberOfChannels, m_renderTarget.copyRef());
 }
 
-void BaseAudioContext::constructCommon()
-{
-    FFTFrame::initialize();
-
-    ASSERT(document());
-    if (document()->audioPlaybackRequiresUserGesture())
-        addBehaviorRestriction(RequireUserGestureForAudioStartRestriction);
-    else
-        m_restrictions = NoRestrictions;
-
-#if PLATFORM(COCOA)
-    addBehaviorRestriction(RequirePageConsentForAudioStartRestriction);
-#endif
-}
-
 BaseAudioContext::~BaseAudioContext()
 {
 #if DEBUG_AUDIONODE_REFERENCES
@@ -203,11 +177,6 @@
         m_renderingAutomaticPullNodes.resize(m_automaticPullNodes.size());
     ASSERT(m_renderingAutomaticPullNodes.isEmpty());
     // FIXME: Can we assert that m_deferredBreakConnectionList is empty?
-
-    if (!isOfflineContext() && scriptExecutionContext()) {
-        document()->removeAudioProducer(*this);
-        document()->unregisterForVisibilityStateChangedCallbacks(*this);
-    }
 }
 
 void BaseAudioContext::lazyInitialize()
@@ -336,22 +305,6 @@
     clear();
 }
 
-void BaseAudioContext::suspend(ReasonForSuspension)
-{
-    if (state() == State::Running) {
-        m_mediaSession->beginInterruption(PlatformMediaSession::PlaybackSuspended);
-        document()->updateIsPlayingMedia();
-    }
-}
-
-void BaseAudioContext::resume()
-{
-    if (state() == State::Interrupted) {
-        m_mediaSession->endInterruption(PlatformMediaSession::MayResumePlaying);
-        document()->updateIsPlayingMedia();
-    }
-}
-
 const char* BaseAudioContext::activeDOMObjectName() const
 {
     return "AudioContext";
@@ -362,41 +315,11 @@
     return downcast<Document>(m_scriptExecutionContext);
 }
 
-DocumentIdentifier BaseAudioContext::hostingDocumentIdentifier() const
-{
-    auto* document = downcast<Document>(m_scriptExecutionContext);
-    return document ? document->identifier() : DocumentIdentifier { };
-}
-
 float BaseAudioContext::sampleRate() const
 {
     return m_destinationNode ? m_destinationNode->sampleRate() : AudioDestination::hardwareSampleRate();
 }
 
-bool BaseAudioContext::isSuspended() const
-{
-    return !document() || document()->activeDOMObjectsAreSuspended() || document()->activeDOMObjectsAreStopped();
-}
-
-void BaseAudioContext::visibilityStateChanged()
-{
-    // Do not suspend if audio is audible.
-    if (!document() || mediaState() == MediaProducer::IsPlayingAudio || m_isStopScheduled)
-        return;
-
-    if (document()->hidden()) {
-        if (state() == State::Running) {
-            RELEASE_LOG_IF_ALLOWED("visibilityStateChanged() Suspending playback after going to the background");
-            m_mediaSession->beginInterruption(PlatformMediaSession::EnteringBackground);
-        }
-    } else {
-        if (state() == State::Interrupted) {
-            RELEASE_LOG_IF_ALLOWED("visibilityStateChanged() Resuming playback after entering foreground");
-            m_mediaSession->endInterruption(PlatformMediaSession::MayResumePlaying);
-        }
-    }
-}
-
 bool BaseAudioContext::wouldTaintOrigin(const URL& url) const
 {
     if (url.protocolIsData())
@@ -993,64 +916,6 @@
     return ActiveDOMObject::scriptExecutionContext();
 }
 
-static bool shouldDocumentAllowWebAudioToAutoPlay(const Document& document)
-{
-    if (document.processingUserGestureForMedia() || document.isCapturing())
-        return true;
-    return document.quirks().shouldAutoplayWebAudioForArbitraryUserGesture() && document.topDocument().hasHadUserInteraction();
-}
-
-bool BaseAudioContext::willBeginPlayback()
-{
-    auto* document = this->document();
-    if (!document)
-        return false;
-
-    if (userGestureRequiredForAudioStart()) {
-        if (!shouldDocumentAllowWebAudioToAutoPlay(*document)) {
-            ALWAYS_LOG(LOGIDENTIFIER, "returning false, not processing user gesture or capturing");
-            return false;
-        }
-        removeBehaviorRestriction(BaseAudioContext::RequireUserGestureForAudioStartRestriction);
-    }
-
-    if (pageConsentRequiredForAudioStart()) {
-        auto* page = document->page();
-        if (page && !page->canStartMedia()) {
-            document->addMediaCanStartListener(*this);
-            ALWAYS_LOG(LOGIDENTIFIER, "returning false, page doesn't allow media to start");
-            return false;
-        }
-        removeBehaviorRestriction(BaseAudioContext::RequirePageConsentForAudioStartRestriction);
-    }
-    
-    auto willBegin = m_mediaSession->clientWillBeginPlayback();
-    ALWAYS_LOG(LOGIDENTIFIER, "returning ", willBegin);
-    
-    return willBegin;
-}
-
-void BaseAudioContext::mediaCanStart(Document& document)
-{
-    ASSERT_UNUSED(document, &document == this->document());
-    removeBehaviorRestriction(BaseAudioContext::RequirePageConsentForAudioStartRestriction);
-    mayResumePlayback(true);
-}
-
-MediaProducer::MediaStateFlags BaseAudioContext::mediaState() const
-{
-    if (!m_isStopScheduled && m_destinationNode && m_destinationNode->isPlayingAudio())
-        return MediaProducer::IsPlayingAudio;
-
-    return MediaProducer::IsNotPlaying;
-}
-
-void BaseAudioContext::pageMutedStateDidChange()
-{
-    if (m_destinationNode && document() && document()->page())
-        m_destinationNode->setMuted(document()->page()->isAudioMuted());
-}
-
 void BaseAudioContext::isPlayingAudioDidChange()
 {
     // Make sure to call Document::updateIsPlayingMedia() on the main thread, since
@@ -1123,45 +988,6 @@
     setState(State::Suspended);
 }
 
-void BaseAudioContext::suspendPlayback()
-{
-    if (!m_destinationNode || m_state == State::Closed)
-        return;
-
-    if (m_state == State::Suspended) {
-        if (m_mediaSession->state() == PlatformMediaSession::Interrupted)
-            setState(State::Interrupted);
-        return;
-    }
-
-    lazyInitialize();
-
-    m_destinationNode->suspend([this, protectedThis = makeRef(*this)] {
-        bool interrupted = m_mediaSession->state() == PlatformMediaSession::Interrupted;
-        setState(interrupted ? State::Interrupted : State::Suspended);
-    });
-}
-
-void BaseAudioContext::mayResumePlayback(bool shouldResume)
-{
-    if (!m_destinationNode || m_state == State::Closed || m_state == State::Running)
-        return;
-
-    if (!shouldResume) {
-        setState(State::Suspended);
-        return;
-    }
-
-    if (!willBeginPlayback())
-        return;
-
-    lazyInitialize();
-
-    m_destinationNode->resume([this, protectedThis = makeRef(*this)] {
-        setState(State::Running);
-    });
-}
-
 void BaseAudioContext::postTask(WTF::Function<void()>&& task)
 {
     ASSERT(isMainThread());

Modified: trunk/Source/WebCore/Modules/webaudio/BaseAudioContext.h (268599 => 268600)


--- trunk/Source/WebCore/Modules/webaudio/BaseAudioContext.h	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/Modules/webaudio/BaseAudioContext.h	2020-10-16 18:31:58 UTC (rev 268600)
@@ -34,13 +34,9 @@
 #include "AudioDestinationNode.h"
 #include "EventTarget.h"
 #include "JSDOMPromiseDeferred.h"
-#include "MediaCanStartListener.h"
-#include "MediaProducer.h"
 #include "OscillatorType.h"
 #include "PeriodicWaveConstraints.h"
-#include "PlatformMediaSession.h"
 #include "ScriptExecutionContext.h"
-#include "VisibilityChangeClient.h"
 #include <_javascript_Core/ConsoleTypes.h>
 #include <_javascript_Core/Float32Array.h>
 #include <atomic>
@@ -99,21 +95,15 @@
     : public ActiveDOMObject
     , public ThreadSafeRefCounted<BaseAudioContext>
     , public EventTargetWithInlineData
-    , public MediaCanStartListener
-    , public MediaProducer
+    , public CanMakeWeakPtr<BaseAudioContext>
 #if !RELEASE_LOG_DISABLED
     , public LoggerHelper
 #endif
-    , private PlatformMediaSessionClient
-    , private VisibilityChangeClient
 {
     WTF_MAKE_ISO_ALLOCATED(BaseAudioContext);
 public:
     virtual ~BaseAudioContext();
 
-    using WeakValueType = MediaCanStartListener::WeakValueType;
-    using MediaCanStartListener::weakPtrFactory;
-
     // Reconcile ref/deref which are defined both in ThreadSafeRefCounted and EventTarget.
     using ThreadSafeRefCounted::ref;
     using ThreadSafeRefCounted::deref;
@@ -124,8 +114,6 @@
     bool isOfflineContext() const { return m_isOfflineContext; }
     virtual bool isWebKitAudioContext() const { return false; }
 
-    DocumentIdentifier hostingDocumentIdentifier() const final;
-
     AudioDestinationNode* destination() { return m_destinationNode.get(); }
     size_t currentSampleFrame() const { return m_destinationNode ? m_destinationNode->currentSampleFrame() : 0; }
     double currentTime() const { return m_destinationNode ? m_destinationNode->currentTime() : 0.; }
@@ -259,23 +247,12 @@
 
     static unsigned s_hardwareContextCount;
 
-    // Restrictions to change default behaviors.
-    enum BehaviorRestrictionFlags {
-        NoRestrictions = 0,
-        RequireUserGestureForAudioStartRestriction = 1 << 0,
-        RequirePageConsentForAudioStartRestriction = 1 << 1,
-    };
-    typedef unsigned BehaviorRestrictions;
-    BehaviorRestrictions behaviorRestrictions() const { return m_restrictions; }
-    void addBehaviorRestriction(BehaviorRestrictions restriction) { m_restrictions |= restriction; }
-    void removeBehaviorRestriction(BehaviorRestrictions restriction) { m_restrictions &= ~restriction; }
-
     void isPlayingAudioDidChange();
 
     virtual void nodeWillBeginPlayback() { }
 
 #if !RELEASE_LOG_DISABLED
-    const Logger& logger() const final { return m_logger.get(); }
+    const Logger& logger() const override { return m_logger.get(); }
     const void* logIdentifier() const final { return m_logIdentifier; }
     WTFLogChannel& logChannel() const final;
     const void* nextAudioNodeLogIdentifier() { return childLogIdentifier(m_logIdentifier, ++m_nextAudioNodeIdentifier); }
@@ -333,8 +310,6 @@
 
     AudioDestinationNode* destinationNode() const { return m_destinationNode.get(); }
 
-    bool willBeginPlayback();
-
     virtual void uninitialize();
 
 #if !RELEASE_LOG_DISABLED
@@ -346,29 +321,15 @@
 
     virtual void didFinishOfflineRendering(ExceptionOr<Ref<AudioBuffer>>&&) { }
 
-    bool userGestureRequiredForAudioStart() const { return !isOfflineContext() && m_restrictions & RequireUserGestureForAudioStartRestriction; }
-    bool pageConsentRequiredForAudioStart() const { return !isOfflineContext() && m_restrictions & RequirePageConsentForAudioStartRestriction; }
-
-    PlatformMediaSession* mediaSession() const { return m_mediaSession.get(); }
 private:
-    void constructCommon();
-
     void clear();
 
     void scheduleNodeDeletion();
 
-    void mediaCanStart(Document&) override;
-
     // EventTarget
     void dispatchEvent(Event&) final;
 
-    // MediaProducer
-    MediaProducer::MediaStateFlags mediaState() const override;
-    void pageMutedStateDidChange() override;
-
     // ActiveDOMObject API.
-    void suspend(ReasonForSuspension) final;
-    void resume() final;
     void stop() override;
     const char* activeDOMObjectName() const override;
 
@@ -376,20 +337,6 @@
     // Make sure to dereference them here.
     void derefUnfinishedSourceNodes();
 
-    // PlatformMediaSessionClient
-    PlatformMediaSession::MediaType mediaType() const override { return PlatformMediaSession::MediaType::WebAudio; }
-    PlatformMediaSession::MediaType presentationType() const override { return PlatformMediaSession::MediaType::WebAudio; }
-    void mayResumePlayback(bool shouldResume) override;
-    void suspendPlayback() override;
-    bool canReceiveRemoteControlCommands() const override { return false; }
-    void didReceiveRemoteControlCommand(PlatformMediaSession::RemoteControlCommandType, const PlatformMediaSession::RemoteCommandArgument*) override { }
-    bool supportsSeeking() const override { return false; }
-    bool shouldOverrideBackgroundPlaybackRestriction(PlatformMediaSession::InterruptionType) const override { return false; }
-    bool canProduceAudio() const final { return true; }
-    bool isSuspended() const final;
-
-    void visibilityStateChanged() final;
-
     void handleDirtyAudioSummingJunctions();
     void handleDirtyAudioNodeOutputs();
 
@@ -438,8 +385,6 @@
     Vector<AudioNode*> m_deferredBreakConnectionList;
     Vector<Vector<DOMPromiseDeferred<void>>> m_stateReactions;
 
-    std::unique_ptr<PlatformMediaSession> m_mediaSession;
-
     RefPtr<AudioBuffer> m_renderTarget;
     RefPtr<AudioDestinationNode> m_destinationNode;
     RefPtr<AudioListener> m_listener;
@@ -462,8 +407,6 @@
     // Number of AudioBufferSourceNodes that are active (playing).
     std::atomic<int> m_activeSourceCount { 0 };
 
-    BehaviorRestrictions m_restrictions { NoRestrictions };
-
     State m_state { State::Suspended };
     RefPtr<PendingActivity<BaseAudioContext>> m_pendingActivity;
 

Modified: trunk/Source/WebCore/Modules/webaudio/OfflineAudioContext.cpp (268599 => 268600)


--- trunk/Source/WebCore/Modules/webaudio/OfflineAudioContext.cpp	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/Modules/webaudio/OfflineAudioContext.cpp	2020-10-16 18:31:58 UTC (rev 268600)
@@ -99,11 +99,6 @@
         return;
     }
 
-    if (!willBeginPlayback()) {
-        promise->reject(Exception { InvalidStateError, "Refusing to start rendering for security reasons" });
-        return;
-    }
-
     if (!renderTarget()) {
         promise->reject(Exception { InvalidStateError, "Failed to create audio buffer"_s });
         return;

Modified: trunk/Source/WebCore/testing/Internals.cpp (268599 => 268600)


--- trunk/Source/WebCore/testing/Internals.cpp	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/testing/Internals.cpp	2020-10-16 18:31:58 UTC (rev 268600)
@@ -4284,10 +4284,10 @@
 #endif // ENABLE(VIDEO)
 
 #if ENABLE(WEB_AUDIO)
-void Internals::setAudioContextRestrictions(const Variant<RefPtr<BaseAudioContext>, RefPtr<WebKitAudioContext>>& contextVariant, StringView restrictionsString)
+void Internals::setAudioContextRestrictions(const Variant<RefPtr<AudioContext>, RefPtr<WebKitAudioContext>>& contextVariant, StringView restrictionsString)
 {
-    RefPtr<BaseAudioContext> context;
-    switchOn(contextVariant, [&](RefPtr<BaseAudioContext> entry) {
+    RefPtr<AudioContext> context;
+    switchOn(contextVariant, [&](RefPtr<AudioContext> entry) {
         context = entry;
     }, [&](RefPtr<WebKitAudioContext> entry) {
         context = entry;
@@ -4296,15 +4296,15 @@
     auto restrictions = context->behaviorRestrictions();
     context->removeBehaviorRestriction(restrictions);
 
-    restrictions = BaseAudioContext::NoRestrictions;
+    restrictions = AudioContext::NoRestrictions;
 
     for (StringView restrictionString : restrictionsString.split(',')) {
         if (equalLettersIgnoringASCIICase(restrictionString, "norestrictions"))
-            restrictions |= BaseAudioContext::NoRestrictions;
+            restrictions |= AudioContext::NoRestrictions;
         if (equalLettersIgnoringASCIICase(restrictionString, "requireusergestureforaudiostart"))
-            restrictions |= BaseAudioContext::RequireUserGestureForAudioStartRestriction;
+            restrictions |= AudioContext::RequireUserGestureForAudioStartRestriction;
         if (equalLettersIgnoringASCIICase(restrictionString, "requirepageconsentforaudiostart"))
-            restrictions |= BaseAudioContext::RequirePageConsentForAudioStartRestriction;
+            restrictions |= AudioContext::RequirePageConsentForAudioStartRestriction;
     }
     context->addBehaviorRestriction(restrictions);
 }

Modified: trunk/Source/WebCore/testing/Internals.h (268599 => 268600)


--- trunk/Source/WebCore/testing/Internals.h	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/testing/Internals.h	2020-10-16 18:31:58 UTC (rev 268600)
@@ -48,8 +48,8 @@
 namespace WebCore {
 
 class AnimationTimeline;
+class AudioContext;
 class AudioTrack;
-class BaseAudioContext;
 class CacheStorageConnection;
 class DOMRect;
 class DOMRectList;
@@ -669,7 +669,7 @@
 #endif
 
 #if ENABLE(WEB_AUDIO)
-    void setAudioContextRestrictions(const Variant<RefPtr<BaseAudioContext>, RefPtr<WebKitAudioContext>>&, StringView restrictionsString);
+    void setAudioContextRestrictions(const Variant<RefPtr<AudioContext>, RefPtr<WebKitAudioContext>>&, StringView restrictionsString);
     void useMockAudioDestinationCocoa();
 #endif
 

Modified: trunk/Source/WebCore/testing/Internals.idl (268599 => 268600)


--- trunk/Source/WebCore/testing/Internals.idl	2020-10-16 18:29:40 UTC (rev 268599)
+++ trunk/Source/WebCore/testing/Internals.idl	2020-10-16 18:31:58 UTC (rev 268600)
@@ -691,7 +691,7 @@
     [Conditional=VIDEO, MayThrowException] undefined setMediaSessionRestrictions(DOMString mediaType, DOMString restrictions);
     [Conditional=VIDEO, MayThrowException] DOMString mediaSessionRestrictions(DOMString mediaType);
     [Conditional=VIDEO] undefined setMediaElementRestrictions(HTMLMediaElement element, DOMString restrictions);
-    [Conditional=WEB_AUDIO] undefined setAudioContextRestrictions((BaseAudioContext or WebKitAudioContext) context, DOMString restrictions);
+    [Conditional=WEB_AUDIO] undefined setAudioContextRestrictions((AudioContext or WebKitAudioContext) context, DOMString restrictions);
     [Conditional=VIDEO, MayThrowException] undefined postRemoteControlCommand(DOMString command, optional unrestricted float argument = 0);
     [Conditional=WIRELESS_PLAYBACK_TARGET] undefined setMockMediaPlaybackTargetPickerEnabled(boolean enabled);
     [Conditional=WIRELESS_PLAYBACK_TARGET, MayThrowException] undefined setMockMediaPlaybackTargetPickerState(DOMString deviceName, DOMString deviceState);