Title: [213880] trunk
Revision: 213880
Author: [email protected]
Date: 2017-03-13 17:30:48 -0700 (Mon, 13 Mar 2017)

Log Message

[MediaStream] Move paintCurrentFrameInContext from RealtimeMediaSources to MediaPlayer
https://bugs.webkit.org/show_bug.cgi?id=169474
<rdar://problem/30976747>

Reviewed by Youenn Fablet.

Source/WebCore:

Every video capture source has extremely similar code to render the current frame to
a graphics context. Because the media player gets every video sample buffer, have it
hang onto the most recent frame so it can implement paintCurrentFrameInContext directly.
Fix an existing race condition that occasionally caused the readyState to advance to
"have enough data" before a video was ready to paint by defining a MediaStreamTrackPrivate
readyState and observing that.

No new tests, covered by existing tests. These changes uncovered a bug in
fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html, which
was updated.
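
The caching approach described above can be sketched with hypothetical stand-in types (`MediaSample`, `Image`, and `GraphicsContext` here are simplified placeholders, not the real WebCore classes): the player hangs onto the newest sample, invalidates any cached image, and converts lazily only when asked to paint.

```cpp
#include <cassert>
#include <optional>

// Hypothetical stand-ins for WebCore's MediaSample / CGImage / GraphicsContext.
struct MediaSample { int frameId = 0; };
struct Image { int fromFrameId = 0; };
struct GraphicsContext { std::optional<Image> lastPainted; };

// Sketch of the pattern the patch introduces: keep the most recent sample,
// drop the cached image when a newer sample arrives, and only convert the
// sample to an image on demand, when paintCurrentFrameInContext is called.
class PlayerFramePainter {
public:
    void enqueueVideoSample(const MediaSample& sample)
    {
        m_mediaSample = sample; // hang onto the newest frame
        m_cachedImage.reset();  // invalidate an image made from an older frame
    }

    void paintCurrentFrameInContext(GraphicsContext& context)
    {
        if (!m_mediaSample)
            return; // nothing received yet; nothing to paint
        if (!m_cachedImage) // convert pixel data -> image lazily
            m_cachedImage = Image { m_mediaSample->frameId };
        context.lastPainted = *m_cachedImage;
    }

private:
    std::optional<MediaSample> m_mediaSample;
    std::optional<Image> m_cachedImage;
};
```

In the real patch this lives in `MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter`, with a `PixelBufferConformerCV` doing the sample-to-`CGImageRef` conversion.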

* Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp:
(WebCore::CanvasCaptureMediaStreamTrack::Source::captureCanvas):
(WebCore::CanvasCaptureMediaStreamTrack::Source::paintCurrentFrameInContext): Deleted.
(WebCore::CanvasCaptureMediaStreamTrack::Source::currentFrameImage): Deleted.
* Modules/mediastream/CanvasCaptureMediaStreamTrack.h:

* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
Drive-by change - don't pass status to parent callback, it is a property of the layer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::isAvailable): Drive-by cleanup - we don't
use AVSampleBufferRenderSynchronizer so don't fail if it isn't available.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample): Hang onto new frame,
invalidate cached image, update readyState.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange): No more "updatePausedImage".
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Drive-by cleanup - Add an early
 return if there is no need for a layer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): renderingModeChanged -> updateRenderingMode.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode): Minor cleanup.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode): Renamed from renderingModeChanged,
add a bool return to signal when the mode changes.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): No more m_haveEverPlayed. Update display
mode immediately.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): No more paused image.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState): Only return HaveNothing, HaveMetadata,
or HaveEnoughData. Don't return HaveEnoughData until all enabled tracks are providing data and never
drop back to HaveMetadata.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode): Renamed from renderingModeChanged.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::characteristicsChanged): Update intrinsic
size directly.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): No more m_hasReceivedMedia.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::readyStateChanged): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack): Reset imagePainter
when active video track changes.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateCurrentFrameImage): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext): Paint current
frame image.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset): New.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updatePausedImage): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::renderingModeChanged): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSamplesAvailable): Deleted.

* platform/mediastream/MediaStreamPrivate.cpp:
(WebCore::MediaStreamPrivate::paintCurrentFrameInContext): Deleted.
(WebCore::MediaStreamPrivate::currentFrameImage): Deleted.
* platform/mediastream/MediaStreamPrivate.h:

* platform/mediastream/MediaStreamTrackPrivate.cpp:
(WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate):
(WebCore::MediaStreamTrackPrivate::endTrack): Update readyState.
(WebCore::MediaStreamTrackPrivate::clone): Clone readyState.
(WebCore::MediaStreamTrackPrivate::sourceStopped): Update readyState.
(WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Ditto.
(WebCore::MediaStreamTrackPrivate::audioSamplesAvailable): Ditto.
(WebCore::MediaStreamTrackPrivate::updateReadyState): New, update readyState and notify observers.
(WebCore::MediaStreamTrackPrivate::paintCurrentFrameInContext): Deleted.
* platform/mediastream/MediaStreamTrackPrivate.h:

* platform/mediastream/RealtimeMediaSource.h:
(WebCore::RealtimeMediaSource::currentFrameImage): Deleted.
(WebCore::RealtimeMediaSource::paintCurrentFrameInContext): Deleted.

* platform/mediastream/mac/AVMediaCaptureSource.mm:
(-[WebCoreAVMediaCaptureSourceObserver disconnect]): Drive-by fix - clear m_callback
after calling removeNotificationObservers.
(-[WebCoreAVMediaCaptureSourceObserver removeNotificationObservers]): Drive-by fix - remove
the correct listener.
(-[WebCoreAVMediaCaptureSourceObserver endSessionInterrupted:]):

* platform/mediastream/mac/AVVideoCaptureSource.h:
* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::currentFrameImage): Deleted.
(WebCore::AVVideoCaptureSource::currentFrameCGImage): Deleted.
(WebCore::AVVideoCaptureSource::paintCurrentFrameInContext): Deleted.

* platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
(WebCore::drawImage): Deleted.
(WebCore::RealtimeIncomingVideoSource::currentFrameImage): Deleted.
(WebCore::RealtimeIncomingVideoSource::paintCurrentFrameInContext): Deleted.
* platform/mediastream/mac/RealtimeIncomingVideoSource.h:

* platform/mock/MockRealtimeVideoSource.cpp:
(WebCore::MockRealtimeVideoSource::paintCurrentFrameInContext): Deleted.
(WebCore::MockRealtimeVideoSource::currentFrameImage): Deleted.
* platform/mock/MockRealtimeVideoSource.h:
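
The readyState policy described for currentReadyState above can be sketched as follows. This is a simplified model with hypothetical types, not the real WebCore interface: report only HaveNothing, HaveMetadata, or HaveEnoughData; don't report HaveEnoughData until every enabled track is providing data; and never regress once it is reached.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

enum class ReadyState { HaveNothing, HaveMetadata, HaveEnoughData };

struct Track {
    bool enabled;
    bool hasProducedData;
};

// Sketch of the monotonic readyState computation: the reported state only
// ever moves forward, even if a track later stops providing data.
class ReadyStateTracker {
public:
    ReadyState update(const std::vector<Track>& tracks)
    {
        ReadyState current = ReadyState::HaveMetadata;
        bool allEnabledProviding = std::all_of(tracks.begin(), tracks.end(),
            [](const Track& t) { return !t.enabled || t.hasProducedData; });
        if (!tracks.empty() && allEnabledProviding)
            current = ReadyState::HaveEnoughData;

        // Never drop back from HaveEnoughData to HaveMetadata.
        m_readyState = std::max(m_readyState, current);
        return m_readyState;
    }

private:
    ReadyState m_readyState { ReadyState::HaveNothing };
};
```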

LayoutTests:

* fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt:
* fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html: Fix
bug uncovered by patch.
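
The fixed test leans on a bounded-retry pattern: poll a pixel-check predicate up to N times before declaring failure. A synchronous sketch of that pattern (the real layout test is asynchronous, waiting 50 ms between attempts via setTimeout):

```cpp
#include <cassert>
#include <functional>

// Poll `check` up to `maxAttempts` times; succeed as soon as it holds.
// In the real test, a 50 ms timeout elapses between attempts so the video
// has time to render the expected (black or white) frame.
bool attempt(int maxAttempts, const std::function<bool()>& check)
{
    for (int i = 0; i < maxAttempts; ++i) {
        if (check())
            return true;
    }
    return false;
}
```

The bug the patch uncovered was that the first round used only a single attempt (`attempt(1, ...)`), so a slow first frame failed the test spuriously; the fix raises it to ten.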

Modified Paths

Diff

Modified: trunk/LayoutTests/ChangeLog (213879 => 213880)


--- trunk/LayoutTests/ChangeLog	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/LayoutTests/ChangeLog	2017-03-14 00:30:48 UTC (rev 213880)
@@ -1,3 +1,15 @@
+2017-03-13  Eric Carlson  <[email protected]>
+
+        [MediaStream] Move paintCurrentFrameInContext from RealtimeMediaSources to MediaPlayer
+        https://bugs.webkit.org/show_bug.cgi?id=169474
+        <rdar://problem/30976747>
+
+        Reviewed by Youenn Fablet.
+
+        * fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt:
+        * fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html: Fix 
+        bug uncovered by patch.
+
 2017-03-13  Ryan Haddad  <[email protected]>
 
         Skip WebGPU tests on ios-simulator.

Modified: trunk/LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt (213879 => 213880)


--- trunk/LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt	2017-03-14 00:30:48 UTC (rev 213880)
@@ -7,25 +7,26 @@
 video.src = ""
 
  === beginning round of pixel tests ===
-PASS pixel was black
+PASS pixel was black.
 
  === all video tracks disabled ===
 PASS pixel was black.
 
- === video track reenabled ===
-PASS pixel was white.
+ === video track reenabled, should NOT render current frame ===
+PASS pixel was black.
 
  ===== play video =====
 video.play()
 
  === beginning round of pixel tests ===
-PASS pixel was black
+PASS pixel was white.
 
  === all video tracks disabled ===
 PASS pixel was black.
 
- === video track reenabled ===
+ === video track reenabled, should render current frame ===
 PASS pixel was white.
+
 PASS successfullyParsed is true
 
 TEST COMPLETE

Modified: trunk/LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html (213879 => 213880)


--- trunk/LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html	2017-03-14 00:30:48 UTC (rev 213880)
@@ -33,8 +33,13 @@
         return pixel[0] === 255 && pixel[1] === 255 && pixel[2] === 255 && pixel[3] === 255;
     }
 
-    function attempt(numberOfTries, call, callback, successMessage)
+    function canvasShouldBeBlack()
     {
+        return !(mediaStream.getVideoTracks()[0].enabled && havePlayed);
+    }
+    
+    function attempt(numberOfTries, call, callback)
+    {
         if (numberOfTries <= 0) {
             testFailed('Pixel check did not succeed after multiple tries.');
             return;
@@ -42,13 +47,13 @@
 
         let attemptSucceeded = call();
         if (attemptSucceeded) {
-            testPassed(successMessage);
+            testPassed(canvasShouldBeBlack() ? 'pixel was black.' : 'pixel was white.');
             callback();
 
             return;
         }
         
-        setTimeout(() => { attempt(--numberOfTries, call, callback, successMessage); }, 50);
+        setTimeout(() => { attempt(--numberOfTries, call, callback); }, 50);
     }
 
     function repeatWithVideoPlayingAndFinishTest()
@@ -56,20 +61,23 @@
         if (video.paused) {
             debug('<br> ===== play video =====');
             evalAndLog('video.play()');
+            havePlayed = true;
             beginTestRound();
-        } else
+        } else {
+            debug('');
             video.pause();
             finishJSTest();
+        }
     }
 
     function reenableTrack()
     {
         mediaStream.getVideoTracks()[0].enabled = true;
-        debug('<br> === video track reenabled ===');
+        debug(`<br> === video track reenabled, should${havePlayed ? "" : " NOT"} render current frame ===`);
 
         // The video is not guaranteed to render non-black frames before the canvas is drawn to and the pixels are checked.
         // A timeout is used to ensure that the pixel check is done after the video renders non-black frames.
-        attempt(10, checkPixels, repeatWithVideoPlayingAndFinishTest, 'pixel was white.');
+        attempt(10, checkPixels, repeatWithVideoPlayingAndFinishTest);
     }
 
     function checkPixels()
@@ -76,14 +84,13 @@
     {
         context.clearRect(0, 0, canvas.width, canvas.height);
         buffer = context.getImageData(30, 242, 1, 1).data;
-        if(!isPixelTransparent(buffer)) {
+        if (!isPixelTransparent(buffer))
             testFailed('pixel was not transparent after clearing canvas.');
-        }
 
         context.drawImage(video, 0, 0, canvas.width, canvas.height);
         buffer = context.getImageData(30, 242, 1, 1).data;
 
-        if (mediaStream.getVideoTracks()[0].enabled && havePlayed)
+        if (!canvasShouldBeBlack())
             return isPixelWhite(buffer);
         else
             return isPixelBlack(buffer);
@@ -96,13 +103,13 @@
         
         // The video is not guaranteed to render black frames before the canvas is drawn to and the pixels are checked.
         // A timeout is used to ensure that the pixel check is done after the video renders black frames.
-        attempt(10, checkPixels, reenableTrack, 'pixel was black.');
+        attempt(10, checkPixels, reenableTrack);
     }
 
     function beginTestRound()
     {
         debug('<br> === beginning round of pixel tests ===');
-        attempt(1, checkPixels, disableAllTracks, 'pixel was black');
+        attempt(10, checkPixels, disableAllTracks);
     }
 
     function canplay()

Modified: trunk/Source/WebCore/ChangeLog (213879 => 213880)


--- trunk/Source/WebCore/ChangeLog	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/ChangeLog	2017-03-14 00:30:48 UTC (rev 213880)
@@ -1,3 +1,112 @@
+2017-03-13  Eric Carlson  <[email protected]>
+
+        [MediaStream] Move paintCurrentFrameInContext from RealtimeMediaSources to MediaPlayer
+        https://bugs.webkit.org/show_bug.cgi?id=169474
+        <rdar://problem/30976747>
+
+        Reviewed by Youenn Fablet.
+
+        Every video capture source has extremely similar code to render the current frame to
+        a graphics context. Because the media player gets every video sample buffer, have it
+        hang onto the most recent frame so it can implement paintCurrentFrameInContext directly.
+        Fix an existing race condition that occasionally caused the readyState to advance to 
+        "have enough data" before a video was ready to paint by defining a MediaStreamTrackPrivate
+        readyState and observing that.
+
+        No new tests, covered by existing tests. These changes uncovered a bug in 
+        fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html, which 
+        was updated.
+
+        * Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp:
+        (WebCore::CanvasCaptureMediaStreamTrack::Source::captureCanvas):
+        (WebCore::CanvasCaptureMediaStreamTrack::Source::paintCurrentFrameInContext): Deleted.
+        (WebCore::CanvasCaptureMediaStreamTrack::Source::currentFrameImage): Deleted.
+        * Modules/mediastream/CanvasCaptureMediaStreamTrack.h:
+
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+        (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
+        Drive-by change - don't pass status to parent callback, it is a property of the layer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::isAvailable): Drive-by cleanup - we don't
+        use AVSampleBufferRenderSynchronizer so don't fail if it isn't available.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample): Hang onto new frame,
+        invalidate cached image, update readyState.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange): No more "updatePausedImage".
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Drive-by cleanup - Add an early
+         return if there is no need for a layer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): renderingModeChanged -> updateRenderingMode.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode): Minor cleanup.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode): Renamed from renderingModeChanged,
+        add a bool return to signal when the mode changes.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): No more m_haveEverPlayed. Update display
+        mode immediately.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): No more paused image.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState): Only return HaveNothing, HaveMetadata,
+        or HaveEnoughData. Don't return HaveEnoughData until all enabled tracks are providing data and never
+        drop back to HaveMetadata.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode): Renamed from renderingModeChanged.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::characteristicsChanged): Update intrinsic
+        size directly.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): No more m_hasReceivedMedia.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::readyStateChanged): Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack): Reset imagePainter
+        when active video track changes.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateCurrentFrameImage): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext): Paint current
+        frame image.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset): New.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updatePausedImage): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::renderingModeChanged): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSamplesAvailable): Deleted.
+
+        * platform/mediastream/MediaStreamPrivate.cpp:
+        (WebCore::MediaStreamPrivate::paintCurrentFrameInContext): Deleted.
+        (WebCore::MediaStreamPrivate::currentFrameImage): Deleted.
+        * platform/mediastream/MediaStreamPrivate.h:
+
+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
+        (WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate):
+        (WebCore::MediaStreamTrackPrivate::endTrack): Update readyState.
+        (WebCore::MediaStreamTrackPrivate::clone): Clone readyState.
+        (WebCore::MediaStreamTrackPrivate::sourceStopped): Update readyState.
+        (WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Ditto.
+        (WebCore::MediaStreamTrackPrivate::audioSamplesAvailable): Ditto.
+        (WebCore::MediaStreamTrackPrivate::updateReadyState): New, update readyState and notify observers.
+        (WebCore::MediaStreamTrackPrivate::paintCurrentFrameInContext): Deleted.
+        * platform/mediastream/MediaStreamTrackPrivate.h:
+
+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
+        (WebCore::MediaStreamTrackPrivate::paintCurrentFrameInContext): Deleted.
+        * platform/mediastream/RealtimeMediaSource.h:
+        (WebCore::RealtimeMediaSource::currentFrameImage): Deleted.
+        (WebCore::RealtimeMediaSource::paintCurrentFrameInContext): Deleted.
+
+        * platform/mediastream/mac/AVMediaCaptureSource.mm:
+        (-[WebCoreAVMediaCaptureSourceObserver disconnect]): Drive-by fix - clear m_callback
+        after calling removeNotificationObservers.
+        (-[WebCoreAVMediaCaptureSourceObserver removeNotificationObservers]): Drive-by fix - remove 
+        the correct listener.
+        (-[WebCoreAVMediaCaptureSourceObserver endSessionInterrupted:]):
+
+        * platform/mediastream/mac/AVVideoCaptureSource.h:
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::currentFrameImage): Deleted.
+        (WebCore::AVVideoCaptureSource::currentFrameCGImage): Deleted.
+        (WebCore::AVVideoCaptureSource::paintCurrentFrameInContext): Deleted.
+
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
+        (WebCore::drawImage): Deleted.
+        (WebCore::RealtimeIncomingVideoSource::currentFrameImage): Deleted.
+        (WebCore::RealtimeIncomingVideoSource::paintCurrentFrameInContext): Deleted.
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.h:
+
+        * platform/mock/MockRealtimeVideoSource.cpp:
+        (WebCore::MockRealtimeVideoSource::paintCurrentFrameInContext): Deleted.
+        (WebCore::MockRealtimeVideoSource::currentFrameImage): Deleted.
+        * platform/mock/MockRealtimeVideoSource.h:
+
 2017-03-13  Carlos Alberto Lopez Perez  <[email protected]>
 
         [GTK][SOUP] Fix build after r213877

Modified: trunk/Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp (213879 => 213880)


--- trunk/Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp	2017-03-14 00:30:48 UTC (rev 213880)
@@ -142,9 +142,6 @@
     if (!m_canvas->originClean())
         return;
 
-    // FIXME: This is probably not efficient.
-    m_currentImage = m_canvas->copiedImage();
-
     auto sample = m_canvas->toMediaSample();
     if (!sample)
         return;
@@ -152,31 +149,6 @@
     videoSampleAvailable(*sample);
 }
 
-void CanvasCaptureMediaStreamTrack::Source::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (!m_canvas)
-        return;
-
-    if (context.paintingDisabled())
-        return;
-
-    auto image = currentFrameImage();
-    if (!image)
-        return;
-
-    FloatRect fullRect(0, 0, m_canvas->width(), m_canvas->height());
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    context.drawImage(*image, rect);
 }
 
-RefPtr<Image> CanvasCaptureMediaStreamTrack::Source::currentFrameImage()
-{
-    return m_currentImage;
-}
-
-}
-
 #endif // ENABLE(MEDIA_STREAM)

Modified: trunk/Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.h (213879 => 213880)


--- trunk/Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -64,8 +64,6 @@
         bool isProducingData() const { return m_isProducingData; }
         RefPtr<RealtimeMediaSourceCapabilities> capabilities() const final { return nullptr; }
         const RealtimeMediaSourceSettings& settings() const final { return m_settings; }
-        RefPtr<Image> currentFrameImage() final;
-        void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
         bool applySize(const IntSize&) final { return true; }
 
         void captureCanvas();

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h (213879 => 213880)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -49,6 +49,7 @@
 class AVVideoCaptureSource;
 class Clock;
 class MediaSourcePrivateClient;
+class PixelBufferConformerCV;
 class VideoTrackPrivateMediaStream;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
@@ -77,7 +78,7 @@
     void ensureLayer();
     void destroyLayer();
 
-    void layerStatusDidChange(AVSampleBufferDisplayLayer*, NSNumber*);
+    void layerStatusDidChange(AVSampleBufferDisplayLayer*);
 
 private:
     // MediaPlayerPrivateInterface
@@ -135,7 +136,6 @@
     MediaTime calculateTimelineOffset(const MediaSample&, double);
     
     void enqueueVideoSample(MediaStreamTrackPrivate&, MediaSample&);
-    bool shouldEnqueueVideoSampleBuffer() const;
     void flushAndRemoveVideoSampleBuffers();
     void requestNotificationWhenReadyForVideoData();
 
@@ -161,9 +161,8 @@
     MediaPlayer::ReadyState currentReadyState();
     void updateReadyState();
 
-    void updateIntrinsicSize(const FloatSize&);
     void updateTracks();
-    void renderingModeChanged();
+    void updateRenderingMode();
     void checkSelectedVideoTrack();
 
     void scheduleDeferredTask(Function<void ()>&&);
@@ -175,8 +174,8 @@
         LivePreview,
     };
     DisplayMode currentDisplayMode() const;
-    void updateDisplayMode();
-    void updatePausedImage();
+    bool updateDisplayMode();
+    void updateCurrentFrameImage();
 
     // MediaStreamPrivate::Observer
     void activeStatusChanged() override;
@@ -190,7 +189,7 @@
     void trackSettingsChanged(MediaStreamTrackPrivate&) override { };
     void trackEnabledChanged(MediaStreamTrackPrivate&) override { };
     void sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample&) override;
-    void audioSamplesAvailable(MediaStreamTrackPrivate&) override;
+    void readyStateChanged(MediaStreamTrackPrivate&) override;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     void setVideoFullscreenLayer(PlatformLayer*, std::function<void()> completionHandler) override;
@@ -212,8 +211,17 @@
     std::unique_ptr<Clock> m_clock;
 
     MediaTime m_pausedTime;
-    RetainPtr<CGImageRef> m_pausedImage;
 
+    struct CurrentFramePainter {
+        CurrentFramePainter() = default;
+        void reset();
+
+        RetainPtr<CGImageRef> cgImage;
+        RefPtr<MediaSample> mediaSample;
+        std::unique_ptr<PixelBufferConformerCV> pixelBufferConformer;
+    };
+    CurrentFramePainter m_imagePainter;
+
     HashMap<String, RefPtr<AudioTrackPrivateMediaStreamCocoa>> m_audioTrackMap;
     HashMap<String, RefPtr<VideoTrackPrivateMediaStream>> m_videoTrackMap;
     PendingSampleQueue m_pendingVideoSampleQueue;
@@ -220,17 +228,16 @@
 
     MediaPlayer::NetworkState m_networkState { MediaPlayer::Empty };
     MediaPlayer::ReadyState m_readyState { MediaPlayer::HaveNothing };
+    MediaPlayer::ReadyState m_previousReadyState { MediaPlayer::HaveNothing };
     FloatSize m_intrinsicSize;
     float m_volume { 1 };
     DisplayMode m_displayMode { None };
     bool m_playing { false };
     bool m_muted { false };
-    bool m_haveEverPlayed { false };
     bool m_ended { false };
     bool m_hasEverEnqueuedVideoFrame { false };
-    bool m_hasReceivedMedia { false };
-    bool m_isFrameDisplayed { false };
     bool m_pendingSelectedTrackCheck { false };
+    bool m_shouldDisplayFirstVideoFrame { false };
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     std::unique_ptr<VideoFullscreenLayerManager> m_videoFullscreenLayerManager;

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm (213879 => 213880)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm	2017-03-14 00:30:48 UTC (rev 213880)
@@ -31,11 +31,11 @@
 #import "AVFoundationSPI.h"
 #import "AudioTrackPrivateMediaStreamCocoa.h"
 #import "Clock.h"
-#import "CoreMediaSoftLink.h"
-#import "GraphicsContext.h"
+#import "GraphicsContextCG.h"
 #import "Logging.h"
 #import "MediaStreamPrivate.h"
 #import "MediaTimeAVFoundation.h"
+#import "PixelBufferConformerCV.h"
 #import "VideoTrackPrivateMediaStream.h"
 #import <AVFoundation/AVSampleBufferDisplayLayer.h>
 #import <QuartzCore/CALayer.h>
@@ -50,14 +50,13 @@
 
 #pragma mark - Soft Linking
 
+#import "CoreMediaSoftLink.h"
+#import "CoreVideoSoftLink.h"
+
 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
 
 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer)
 
-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString*)
-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString*)
-
 #define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral()
 #define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed()
 
@@ -134,11 +133,11 @@
         ASSERT(_layers.contains(layer.get()));
         ASSERT([keyPath isEqualToString:@"status"]);
 
-        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer), status = WTFMove(status)] {
+        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer)] {
             if (!protectedSelf->_parent)
                 return;
 
-            protectedSelf->_parent->layerStatusDidChange(layer.get(), status.get());
+            protectedSelf->_parent->layerStatusDidChange(layer.get());
         });
 
     } else
@@ -201,11 +200,6 @@
     if (!AVFoundationLibrary() || !isCoreMediaFrameworkAvailable() || !getAVSampleBufferDisplayLayerClass())
         return false;
 
-#if PLATFORM(MAC)
-    if (!getAVSampleBufferRenderSynchronizerClass())
-        return false;
-#endif
-
     return true;
 }
 
@@ -281,9 +275,14 @@
     if (&track != m_mediaStreamPrivate->activeVideoTrack())
         return;
 
-    m_hasReceivedMedia = true;
-    updateReadyState();
-    if (m_displayMode != LivePreview || (m_displayMode == PausedImage && m_isFrameDisplayed))
+    if (!m_imagePainter.mediaSample || m_displayMode != PausedImage) {
+        m_imagePainter.mediaSample = &sample;
+        m_imagePainter.cgImage = nullptr;
+        if (m_readyState < MediaPlayer::ReadyState::HaveEnoughData)
+            updateReadyState();
+    }
+
+    if (m_displayMode != LivePreview || (m_displayMode == PausedImage && m_imagePainter.mediaSample))
         return;
 
     auto videoTrack = m_videoTrackMap.get(track.id());
@@ -306,11 +305,8 @@
         [m_sampleBufferDisplayLayer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
     }
 
-    m_isFrameDisplayed = true;
     if (!m_hasEverEnqueuedVideoFrame) {
         m_hasEverEnqueuedVideoFrame = true;
-        if (m_displayMode == PausedImage)
-            updatePausedImage();
         m_player->firstVideoFrameAvailable();
     }
 }
@@ -338,11 +334,12 @@
     return nullptr;
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer, NSNumber* status)
+void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer)
 {
-    if (status.integerValue != AVQueuedSampleBufferRenderingStatusRendering)
+    LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(%p) - status = %d", this, (int)layer.status);
+
+    if (layer.status != AVQueuedSampleBufferRenderingStatusRendering)
         return;
-
     if (!m_sampleBufferDisplayLayer || !m_activeVideoTrack || layer != m_sampleBufferDisplayLayer)
         return;
 
@@ -357,21 +354,9 @@
         [m_sampleBufferDisplayLayer flush];
 }
 
-bool MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer() const
-{
-    if (m_displayMode == LivePreview)
-        return true;
-
-    if (m_displayMode == PausedImage && !m_isFrameDisplayed)
-        return true;
-
-    return false;
-}
-
 void MediaPlayerPrivateMediaStreamAVFObjC::flushAndRemoveVideoSampleBuffers()
 {
     [m_sampleBufferDisplayLayer flushAndRemoveImage];
-    m_isFrameDisplayed = false;
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
@@ -379,7 +364,15 @@
     if (m_sampleBufferDisplayLayer)
         return;
 
+    if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->activeVideoTrack() || !m_mediaStreamPrivate->activeVideoTrack()->enabled())
+        return;
+
     m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
+    if (!m_sampleBufferDisplayLayer) {
+        LOG_ERROR("MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers: +[AVSampleBufferDisplayLayer alloc] failed.");
+        return;
+    }
+
 #ifndef NDEBUG
     [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
 #endif
@@ -386,7 +379,7 @@
     m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
     [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
 
-    renderingModeChanged();
+    updateRenderingMode();
     
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
@@ -405,7 +398,7 @@
         [m_sampleBufferDisplayLayer flush];
     }
 
-    renderingModeChanged();
+    updateRenderingMode();
     
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     m_videoFullscreenLayerManager->didDestroyVideoLayer();
@@ -480,8 +473,10 @@
     if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer)
         return None;
 
-    if (m_mediaStreamPrivate->activeVideoTrack() && !m_mediaStreamPrivate->activeVideoTrack()->enabled())
-        return PaintItBlack;
+    if (auto* track = m_mediaStreamPrivate->activeVideoTrack()) {
+        if (!m_shouldDisplayFirstVideoFrame || !track->enabled() || track->muted())
+            return PaintItBlack;
+    }
 
     if (m_playing) {
         if (!m_mediaStreamPrivate->isProducingData())
@@ -492,30 +487,19 @@
     return PausedImage;
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode()
+bool MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode()
 {
     DisplayMode displayMode = currentDisplayMode();
 
     if (displayMode == m_displayMode)
-        return;
+        return false;
+
     m_displayMode = displayMode;
 
     if (m_displayMode < PausedImage && m_sampleBufferDisplayLayer)
         flushAndRemoveVideoSampleBuffers();
-}
 
-void MediaPlayerPrivateMediaStreamAVFObjC::updatePausedImage()
-{
-    if (m_displayMode < PausedImage)
-        return;
-
-    RefPtr<Image> image = m_mediaStreamPrivate->currentFrameImage();
-    ASSERT(image);
-    if (!image)
-        return;
-
-    m_pausedImage = image->nativeImage();
-    ASSERT(m_pausedImage);
+    return true;
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::play()
@@ -532,9 +516,10 @@
     for (const auto& track : m_audioTrackMap.values())
         track->play();
 
-    m_haveEverPlayed = true;
+    m_shouldDisplayFirstVideoFrame = true;
+    updateDisplayMode();
+
     scheduleDeferredTask([this] {
-        updateDisplayMode();
         updateReadyState();
     });
 }
@@ -553,7 +538,6 @@
         track->pause();
 
     updateDisplayMode();
-    updatePausedImage();
     flushRenderers();
 }
 
@@ -632,20 +616,26 @@
 
 MediaPlayer::ReadyState MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState()
 {
-    if (!m_mediaStreamPrivate)
+    if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->active() || !m_mediaStreamPrivate->tracks().size())
         return MediaPlayer::ReadyState::HaveNothing;
 
-    // https://w3c.github.io/mediacapture-main/ Change 8. from July 4, 2013.
-    // FIXME: Only update readyState to HAVE_ENOUGH_DATA when all active tracks have sent a sample buffer.
-    if (m_mediaStreamPrivate->active() && m_hasReceivedMedia)
-        return MediaPlayer::ReadyState::HaveEnoughData;
+    bool allTracksAreLive = true;
+    for (auto& track : m_mediaStreamPrivate->tracks()) {
+        if (!track->enabled() || track->readyState() != MediaStreamTrackPrivate::ReadyState::Live) {
+            allTracksAreLive = false;
+            break;
+        }
 
-    updateDisplayMode();
+        if (track == m_mediaStreamPrivate->activeVideoTrack() && !m_imagePainter.mediaSample) {
+            allTracksAreLive = false;
+            break;
+        }
+    }
 
-    if (m_displayMode == PausedImage)
-        return MediaPlayer::ReadyState::HaveCurrentData;
+    if (!allTracksAreLive && m_previousReadyState == MediaPlayer::ReadyState::HaveNothing)
+        return MediaPlayer::ReadyState::HaveMetadata;
 
-    return MediaPlayer::ReadyState::HaveMetadata;
+    return MediaPlayer::ReadyState::HaveEnoughData;
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::updateReadyState()
@@ -674,17 +664,11 @@
     });
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize(const FloatSize& size)
+void MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode()
 {
-    if (size == m_intrinsicSize)
+    if (!updateDisplayMode())
         return;
 
-    m_intrinsicSize = size;
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::renderingModeChanged()
-{
-    updateDisplayMode();
     scheduleDeferredTask([this] {
         if (m_player)
             m_player->client().mediaPlayerRenderingModeChanged(m_player);
@@ -698,7 +682,7 @@
 
     FloatSize intrinsicSize = m_mediaStreamPrivate->intrinsicSize();
     if (intrinsicSize.height() != m_intrinsicSize.height() || intrinsicSize.width() != m_intrinsicSize.width()) {
-        updateIntrinsicSize(intrinsicSize);
+        m_intrinsicSize = intrinsicSize;
         sizeChanged = true;
     }
 
@@ -734,12 +718,7 @@
     ASSERT(mediaSample.platformSample().type == PlatformSample::CMSampleBufferType);
     ASSERT(m_mediaStreamPrivate);
 
-    if (!m_hasReceivedMedia) {
-        m_hasReceivedMedia = true;
-        updateReadyState();
-    }
-
-    if (!m_playing || streamTime().toDouble() < 0)
+    if (streamTime().toDouble() < 0)
         return;
 
     switch (track.type()) {
@@ -755,12 +734,8 @@
     }
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::audioSamplesAvailable(MediaStreamTrackPrivate&)
+void MediaPlayerPrivateMediaStreamAVFObjC::readyStateChanged(MediaStreamTrackPrivate&)
 {
-    if (m_hasReceivedMedia)
-        return;
-    m_hasReceivedMedia = true;
-
     scheduleDeferredTask([this] {
         updateReadyState();
     });
@@ -834,6 +809,7 @@
 
     m_pendingSelectedTrackCheck = true;
     scheduleDeferredTask([this] {
+        auto oldVideoTrack = m_activeVideoTrack;
         bool hideVideoLayer = true;
         m_activeVideoTrack = nullptr;
         if (m_mediaStreamPrivate->activeVideoTrack()) {
@@ -847,9 +823,12 @@
             }
         }
 
+        if (oldVideoTrack != m_activeVideoTrack)
+            m_imagePainter.reset();
         ensureLayer();
         m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer;
         m_pendingSelectedTrackCheck = false;
+        updateDisplayMode();
     });
 }
 
@@ -914,25 +893,40 @@
     paintCurrentFrameInContext(context, rect);
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
+void MediaPlayerPrivateMediaStreamAVFObjC::updateCurrentFrameImage()
 {
+    if (m_imagePainter.cgImage || !m_imagePainter.mediaSample)
+        return;
+
+    if (!m_imagePainter.pixelBufferConformer)
+        m_imagePainter.pixelBufferConformer = std::make_unique<PixelBufferConformerCV>((CFDictionaryRef)@{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) });
+
+    ASSERT(m_imagePainter.pixelBufferConformer);
+    if (!m_imagePainter.pixelBufferConformer)
+        return;
+
+    auto pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_imagePainter.mediaSample->platformSample().sample.cmSampleBuffer));
+    m_imagePainter.cgImage = m_imagePainter.pixelBufferConformer->createImageFromPixelBuffer(pixelBuffer);
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& destRect)
+{
     if (m_displayMode == None || !metaDataAvailable() || context.paintingDisabled())
         return;
 
-    if (m_displayMode == LivePreview)
-        m_mediaStreamPrivate->paintCurrentFrameInContext(context, rect);
-    else {
-        GraphicsContextStateSaver stateSaver(context);
-        context.translate(rect.x(), rect.y() + rect.height());
-        context.scale(FloatSize(1, -1));
-        IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-        context.setImageInterpolationQuality(InterpolationLow);
+    GraphicsContextStateSaver stateSaver(context);
 
-        if (m_displayMode == PausedImage && m_pausedImage)
-            CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), m_pausedImage.get());
-        else
-            context.fillRect(paintRect, Color::black);
+    if (m_displayMode != PaintItBlack && m_imagePainter.mediaSample)
+        updateCurrentFrameImage();
+
+    if (m_displayMode == PaintItBlack || !m_imagePainter.cgImage || !m_imagePainter.mediaSample) {
+        context.fillRect(IntRect(IntPoint(), IntSize(destRect.width(), destRect.height())), Color::black);
+        return;
     }
+
+    auto image = m_imagePainter.cgImage.get();
+    FloatRect imageRect(0, 0, CGImageGetWidth(image), CGImageGetHeight(image));
+    context.drawNativeImage(image, imageRect.size(), destRect, imageRect);
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged()
@@ -959,6 +953,7 @@
     if (m_readyState == readyState)
         return;
 
+    m_previousReadyState = m_readyState;
     m_readyState = readyState;
     characteristicsChanged();
 
@@ -985,6 +980,13 @@
     });
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset()
+{
+    cgImage = nullptr;
+    mediaSample = nullptr;
+    pixelBufferConformer = nullptr;
 }
 
+}
+
 #endif

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp	2017-03-14 00:30:48 UTC (rev 213880)
@@ -239,30 +239,6 @@
     return size;
 }
 
-void MediaStreamPrivate::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled())
-        return;
-
-    if (active() && m_activeVideoTrack)
-        m_activeVideoTrack->paintCurrentFrameInContext(context, rect);
-    else {
-        GraphicsContextStateSaver stateSaver(context);
-        context.translate(rect.x(), rect.y() + rect.height());
-        context.scale(FloatSize(1, -1));
-        IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-        context.fillRect(paintRect, Color::black);
-    }
-}
-
-RefPtr<Image> MediaStreamPrivate::currentFrameImage()
-{
-    if (!active() || !m_activeVideoTrack)
-        return nullptr;
-
-    return m_activeVideoTrack->source().currentFrameImage();
-}
-
 void MediaStreamPrivate::updateActiveVideoTrack()
 {
     m_activeVideoTrack = nullptr;

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.h (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -93,9 +93,6 @@
     void stopProducingData();
     bool isProducingData() const;
 
-    RefPtr<Image> currentFrameImage();
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
-
     bool hasVideo() const;
     bool hasAudio() const;
     bool muted() const;

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp	2017-03-14 00:30:48 UTC (rev 213880)
@@ -49,8 +49,6 @@
 MediaStreamTrackPrivate::MediaStreamTrackPrivate(Ref<RealtimeMediaSource>&& source, String&& id)
     : m_source(WTFMove(source))
     , m_id(WTFMove(id))
-    , m_isEnabled(true)
-    , m_isEnded(false)
 {
     m_source->addObserver(*this);
 }
@@ -115,6 +113,7 @@
     // only track using the source and it does stop, we will only call each observer's
     // trackEnded method once.
     m_isEnded = true;
+    updateReadyState();
 
     m_source->requestStop(this);
 
@@ -127,6 +126,7 @@
     auto clonedMediaStreamTrackPrivate = create(m_source.copyRef());
     clonedMediaStreamTrackPrivate->m_isEnabled = this->m_isEnabled;
     clonedMediaStreamTrackPrivate->m_isEnded = this->m_isEnded;
+    clonedMediaStreamTrackPrivate->updateReadyState();
 
     return clonedMediaStreamTrackPrivate;
 }
@@ -146,21 +146,6 @@
     return m_source->capabilities();
 }
 
-void MediaStreamTrackPrivate::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled() || m_source->type() != RealtimeMediaSource::Type::Video || ended())
-        return;
-
-    if (!m_source->muted())
-        m_source->paintCurrentFrameInContext(context, rect);
-    else {
-        GraphicsContextStateSaver stateSaver(context);
-        context.translate(rect.x(), rect.y() + rect.height());
-        IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-        context.fillRect(paintRect, Color::black);
-    }
-}
-
 void MediaStreamTrackPrivate::applyConstraints(const MediaConstraints& constraints, RealtimeMediaSource::SuccessHandler successHandler, RealtimeMediaSource::FailureHandler failureHandler)
 {
     m_source->applyConstraints(constraints, successHandler, failureHandler);
@@ -177,6 +162,7 @@
         return;
 
     m_isEnded = true;
+    updateReadyState();
 
     for (auto& observer : m_observers)
         observer->trackEnded(*this);
@@ -208,6 +194,11 @@
 
 void MediaStreamTrackPrivate::videoSampleAvailable(MediaSample& mediaSample)
 {
+    if (!m_haveProducedData) {
+        m_haveProducedData = true;
+        updateReadyState();
+    }
+
     mediaSample.setTrackID(id());
     for (auto& observer : m_observers)
         observer->sampleBufferUpdated(*this, mediaSample);
@@ -215,10 +206,33 @@
 
 void MediaStreamTrackPrivate::audioSamplesAvailable(const MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t)
 {
+    if (!m_haveProducedData) {
+        m_haveProducedData = true;
+        updateReadyState();
+    }
+
     for (auto& observer : m_observers)
         observer->audioSamplesAvailable(*this);
 }
 
+
+void MediaStreamTrackPrivate::updateReadyState()
+{
+    ReadyState state = ReadyState::None;
+
+    if (m_isEnded)
+        state = ReadyState::Ended;
+    else if (m_haveProducedData)
+        state = ReadyState::Live;
+
+    if (state == m_readyState)
+        return;
+
+    m_readyState = state;
+    for (auto& observer : m_observers)
+        observer->readyStateChanged(*this);
+}
+
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_STREAM)

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -49,6 +49,7 @@
         virtual void trackEnabledChanged(MediaStreamTrackPrivate&) = 0;
         virtual void sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample&) { };
         virtual void audioSamplesAvailable(MediaStreamTrackPrivate&) { };
+        virtual void readyStateChanged(MediaStreamTrackPrivate&) { };
     };
 
     static Ref<MediaStreamTrackPrivate> create(Ref<RealtimeMediaSource>&&);
@@ -93,6 +94,9 @@
 
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
 
+    enum class ReadyState { None, Live, Ended };
+    ReadyState readyState() const { return m_readyState; }
+
 private:
     MediaStreamTrackPrivate(Ref<RealtimeMediaSource>&&, String&& id);
 
@@ -105,12 +109,16 @@
     void videoSampleAvailable(MediaSample&) final;
     void audioSamplesAvailable(const MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t) final;
 
+    void updateReadyState();
+
     Vector<Observer*> m_observers;
     Ref<RealtimeMediaSource> m_source;
 
     String m_id;
-    bool m_isEnabled;
-    bool m_isEnded;
+    ReadyState m_readyState { ReadyState::None };
+    bool m_isEnabled { true };
+    bool m_isEnded { false };
+    bool m_haveProducedData { false };
 };
 
 typedef Vector<RefPtr<MediaStreamTrackPrivate>> MediaStreamTrackPrivateVector;

Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -141,9 +141,6 @@
 
     virtual AudioSourceProvider* audioSourceProvider() { return nullptr; }
 
-    virtual RefPtr<Image> currentFrameImage() { return nullptr; }
-    virtual void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) { }
-
     void setWidth(int);
     void setHeight(int);
     const IntSize& size() const { return m_size; }

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm	2017-03-14 00:30:48 UTC (rev 213880)
@@ -347,8 +347,8 @@
 - (void)disconnect
 {
     [NSObject cancelPreviousPerformRequestsWithTarget:self];
-    m_callback = 0;
     [self removeNotificationObservers];
+    m_callback = nullptr;
 }
 
 - (void)addNotificationObservers
@@ -368,8 +368,7 @@
 - (void)removeNotificationObservers
 {
 #if PLATFORM(IOS)
-    ASSERT(m_callback);
-    [[NSNotificationCenter defaultCenter] removeObserver:m_callback->session()];
+    [[NSNotificationCenter defaultCenter] removeObserver:self];
 #endif
 }
 
@@ -426,7 +425,7 @@
 
 - (void)endSessionInterrupted:(NSNotification*)notification
 {
-    LOG(Media, "WebCoreAVMediaCaptureSourceObserver::endSessionInterrupted(%p) ", self);
+    LOG(Media, "WebCoreAVMediaCaptureSourceObserver::endSessionInterrupted(%p)", self);
 
     if (m_callback)
         m_callback->captureSessionEndInterruption(notification);

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -77,11 +77,6 @@
     void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) final;
     void processNewFrame(RetainPtr<CMSampleBufferRef>);
 
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
-
-    RetainPtr<CGImageRef> currentFrameCGImage();
-    RefPtr<Image> currentFrameImage() final;
-
     RetainPtr<NSString> m_pendingPreset;
     RetainPtr<CMSampleBufferRef> m_buffer;
     RetainPtr<CGImageRef> m_lastImage;

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm	2017-03-14 00:30:48 UTC (rev 213880)
@@ -103,7 +103,11 @@
 
 namespace WebCore {
 
+#if PLATFORM(MAC)
 const OSType videoCaptureFormat = kCVPixelFormatType_420YpCbCr8Planar;
+#else
+const OSType videoCaptureFormat = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange;
+#endif
 
 RefPtr<AVMediaCaptureSource> AVVideoCaptureSource::create(AVCaptureDeviceTypedef* device, const AtomicString& id, const MediaConstraints* constraints, String& invalidConstraint)
 {
@@ -436,58 +440,6 @@
     });
 }
 
-RefPtr<Image> AVVideoCaptureSource::currentFrameImage()
-{
-    if (!currentFrameCGImage())
-        return nullptr;
-
-    FloatRect imageRect(0, 0, m_width, m_height);
-    std::unique_ptr<ImageBuffer> imageBuffer = ImageBuffer::create(imageRect.size(), Unaccelerated);
-
-    if (!imageBuffer)
-        return nullptr;
-
-    paintCurrentFrameInContext(imageBuffer->context(), imageRect);
-
-    return ImageBuffer::sinkIntoImage(WTFMove(imageBuffer));
-}
-
-RetainPtr<CGImageRef> AVVideoCaptureSource::currentFrameCGImage()
-{
-    if (m_lastImage)
-        return m_lastImage;
-
-    if (!m_buffer)
-        return nullptr;
-
-    CVPixelBufferRef pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
-    ASSERT(CVPixelBufferGetPixelFormatType(pixelBuffer) == videoCaptureFormat);
-
-    if (!m_pixelBufferConformer)
-        m_pixelBufferConformer = std::make_unique<PixelBufferConformerCV>((CFDictionaryRef)@{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) });
-
-    ASSERT(m_pixelBufferConformer);
-    if (!m_pixelBufferConformer)
-        return nullptr;
-
-    m_lastImage = m_pixelBufferConformer->createImageFromPixelBuffer(pixelBuffer);
-
-    return m_lastImage;
-}
-
-void AVVideoCaptureSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled() || !currentFrameCGImage())
-        return;
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.translate(rect.x(), rect.y() + rect.height());
-    context.scale(FloatSize(1, -1));
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), m_lastImage.get());
-}
-
 NSString* AVVideoCaptureSource::bestSessionPresetForVideoDimensions(std::optional<int> width, std::optional<int> height) const
 {
     if (!width && !height)

Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp	2017-03-14 00:30:48 UTC (rev 213880)
@@ -150,52 +150,6 @@
     videoSampleAvailable(MediaSampleAVFObjC::create(sample));
 }
 
-static inline void drawImage(ImageBuffer& imageBuffer, CGImageRef image, const FloatRect& rect)
-{
-    auto& context = imageBuffer.context();
-    GraphicsContextStateSaver stateSaver(context);
-    context.translate(rect.x(), rect.y() + rect.height());
-    context.scale(FloatSize(1, -1));
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), image);
-}
-
-RefPtr<Image> RealtimeIncomingVideoSource::currentFrameImage()
-{
-    if (!m_buffer)
-        return nullptr;
-
-    FloatRect rect(0, 0, m_currentSettings.width(), m_currentSettings.height());
-    auto imageBuffer = ImageBuffer::create(rect.size(), Unaccelerated);
-
-    auto pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
-    drawImage(*imageBuffer, m_conformer.createImageFromPixelBuffer(pixelBuffer).get(), rect);
-
-    return ImageBuffer::sinkIntoImage(WTFMove(imageBuffer));
-}
-
-void RealtimeIncomingVideoSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled())
-        return;
-
-    if (!m_buffer)
-        return;
-
-    // FIXME: Can we optimize here the painting?
-    FloatRect fullRect(0, 0, m_currentSettings.width(), m_currentSettings.height());
-    auto imageBuffer = ImageBuffer::create(fullRect.size(), Unaccelerated);
-
-    auto pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
-    drawImage(*imageBuffer, m_conformer.createImageFromPixelBuffer(pixelBuffer).get(), fullRect);
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    context.drawImage(*imageBuffer->copyImage(DontCopyBackingStore), rect);
-}
-
 RefPtr<RealtimeMediaSourceCapabilities> RealtimeIncomingVideoSource::capabilities() const
 {
     return m_capabilities;

Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h (213879 => 213880)


--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -63,10 +63,7 @@
     RealtimeMediaSourceSupportedConstraints& supportedConstraints();
 
     void processNewSample(CMSampleBufferRef, unsigned, unsigned);
-    RefPtr<Image> currentFrameImage() final;
 
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
-
     bool isProducingData() const final { return m_isProducingData && m_buffer; }
     bool applySize(const IntSize&) final { return true; }
 

Modified: trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.cpp (213879 => 213880)


--- trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.cpp	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.cpp	2017-03-14 00:30:48 UTC (rev 213880)
@@ -357,26 +357,6 @@
     return m_imageBuffer.get();
 }
 
-void MockRealtimeVideoSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled() || !imageBuffer())
-        return;
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-
-    context.drawImage(*m_imageBuffer->copyImage(DontCopyBackingStore), rect);
-}
-
-RefPtr<Image> MockRealtimeVideoSource::currentFrameImage()
-{
-    if (!imageBuffer())
-        return nullptr;
-
-    return m_imageBuffer->copyImage(DontCopyBackingStore);
-}
-
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_STREAM)

Modified: trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.h (213879 => 213880)


--- trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.h	2017-03-14 00:18:16 UTC (rev 213879)
+++ trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.h	2017-03-14 00:30:48 UTC (rev 213880)
@@ -76,9 +76,6 @@
     bool applyFacingMode(RealtimeMediaSourceSettings::VideoFacingMode) override { return true; }
     bool applyAspectRatio(double) override { return true; }
 
-    RefPtr<Image> currentFrameImage() override;
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
-
     void generateFrame();
 
     float m_baseFontSize { 0 };