Title: [210621] trunk/Source
Revision: 210621
Author: [email protected]
Date: 2017-01-11 21:22:32 -0800 (Wed, 11 Jan 2017)

Log Message

[MediaStream, Mac] Render media stream audio buffers
https://bugs.webkit.org/show_bug.cgi?id=159836
<rdar://problem/27380390>

Reviewed by Jer Noble.

No new tests; it isn't possible to test audio rendering directly. A follow-up patch will
add a mock audio source that will enable audio testing.

* WebCore.xcodeproj/project.pbxproj: Remove references to the deleted RealtimeMediaSourcePreview.h.

* platform/Logging.h: Add MediaCaptureSamples.

* platform/MediaSample.h: Add outputPresentationTime and outputDuration.
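
In the MediaSample base class the new accessors simply fall back to the regular presentation
values (taken from the MediaSample.h hunk in the diff below):

    virtual MediaTime outputPresentationTime() const { return presentationTime(); }
    virtual MediaTime outputDuration() const { return duration(); }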

* platform/cf/CoreMediaSoftLink.cpp: Add CMSampleBufferGetOutputDuration, CMSampleBufferGetOutputPresentationTimeStamp,
CMTimeConvertScale, CMTimebaseGetEffectiveRate, CMAudioSampleBufferCreateWithPacketDescriptions,
CMSampleBufferSetDataBufferFromAudioBufferList, CMSampleBufferSetDataReady,
CMAudioFormatDescriptionCreate, CMClockGetHostTimeClock, and CMClockGetTime.
* platform/cf/CoreMediaSoftLink.h:
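
Each of these is wired up with the usual WebKit soft-link macros; for example, the
CMClockGetTime entries added by this patch look like this (source and header side,
respectively):

    SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock))

    SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock))
    #define CMClockGetTime softLink_CoreMedia_CMClockGetTime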

Create and use an AVSampleBufferAudioRenderer for each audio stream track, when it is available,
to render audio samples. Store the offset between the output presentation time of the first
sample received from a track and the synchronizer time so we can adjust sample timestamps to be
relative to the synchronizer's timeline regardless of their source. Remove the use of source
previews because not all sources will have them.
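
The heart of that adjustment is the new calculateTimelineOffset(), reproduced here from the
MediaPlayerPrivateMediaStreamAVFObjC.mm changes below (rendererLatency is the new 0.02 second
constant; the comments are added for this summary):

    MediaTime MediaPlayerPrivateMediaStreamAVFObjC::calculateTimelineOffset(const MediaSample& sample, double latency)
    {
        // Prefer the output presentation time stamp; fall back to the presentation time.
        MediaTime sampleTime = sample.outputPresentationTime();
        if (!sampleTime || !sampleTime.isValid())
            sampleTime = sample.presentationTime();

        // Offset = synchronizer (stream) time - sample time + a small renderer latency allowance.
        MediaTime timelineOffset = streamTime() - sampleTime + MediaTime::createWithDouble(latency);

        // Convert to the sample's time scale so later additions stay exact.
        if (timelineOffset.timeScale() != sampleTime.timeScale())
            timelineOffset = toMediaTime(CMTimeConvertScale(toCMTime(timelineOffset), sampleTime.timeScale(), kCMTimeRoundingMethod_Default));
        return timelineOffset;
    }

The offset is computed once per track, cached in the track's timelineOffset, and then applied to
every subsequent sample with offsetTimestampsBy() (see updateSampleTimes()) before the sample is
enqueued.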

* platform/graphics/avfoundation/MediaSampleAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:

Add an ObjC helper to observe layer and renderer status changes.
(-[WebAVSampleBufferStatusChangeListener initWithParent:]):
(-[WebAVSampleBufferStatusChangeListener dealloc]):
(-[WebAVSampleBufferStatusChangeListener invalidate]):
(-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]):
(-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]):
(-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]):
(-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]):
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateSampleTimes):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForMediaData): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBuffer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::prepareVideoSampleBufferFromTrack): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::internalSetVolume): Deleted.

* platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm:
(WebCore::MediaSampleAVFObjC::outputPresentationTime): New.
(WebCore::MediaSampleAVFObjC::outputDuration): New.
(WebCore::MediaSampleAVFObjC::dump): Log outputPresentationTime.
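
The new accessors presumably just wrap the CoreMedia getters soft-linked above; a minimal sketch,
assuming the sample buffer is held in a CMSampleBufferRef member (the member name here is
illustrative, the actual bodies are in the .mm hunk not reproduced in this excerpt):

    MediaTime MediaSampleAVFObjC::outputPresentationTime() const
    {
        // Assumption: forwards to the newly soft-linked CMSampleBufferGetOutputPresentationTimeStamp.
        return toMediaTime(CMSampleBufferGetOutputPresentationTimeStamp(m_sample.get()));
    }

    MediaTime MediaSampleAVFObjC::outputDuration() const
    {
        return toMediaTime(CMSampleBufferGetOutputDuration(m_sample.get()));
    }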

* platform/mediastream/AudioTrackPrivateMediaStream.h: Add timelineOffset.

* platform/mediastream/MediaStreamTrackPrivate.cpp:
(WebCore::MediaStreamTrackPrivate::setEnabled): No more m_preview.
(WebCore::MediaStreamTrackPrivate::endTrack): Ditto.
(WebCore::MediaStreamTrackPrivate::preview): Deleted.
* platform/mediastream/MediaStreamTrackPrivate.h:

* platform/mediastream/RealtimeMediaSource.h:
(WebCore::RealtimeMediaSource::preview): Deleted.

* platform/mediastream/RealtimeMediaSourcePreview.h: Removed.

* platform/mediastream/VideoTrackPrivateMediaStream.h: Add timelineOffset.

* platform/mediastream/mac/AVAudioCaptureSource.h:
* platform/mediastream/mac/AVAudioCaptureSource.mm:
(WebCore::AVAudioCaptureSource::updateSettings):
(WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Pass the
sample buffer up the chain.
(WebCore::AVAudioSourcePreview::create): Deleted.
(WebCore::AVAudioSourcePreview::AVAudioSourcePreview): Deleted.
(WebCore::AVAudioSourcePreview::invalidate): Deleted.
(WebCore::AVAudioSourcePreview::play): Deleted.
(WebCore::AVAudioSourcePreview::pause): Deleted.
(WebCore::AVAudioSourcePreview::setEnabled): Deleted.
(WebCore::AVAudioSourcePreview::setVolume): Deleted.
(WebCore::AVAudioSourcePreview::updateState): Deleted.
(WebCore::AVAudioCaptureSource::createPreview): Deleted.

* platform/mediastream/mac/AVMediaCaptureSource.h:
(WebCore::AVMediaSourcePreview): Deleted.
(WebCore::AVMediaCaptureSource::createWeakPtr): Deleted.

* platform/mediastream/mac/AVMediaCaptureSource.mm:
(WebCore::AVMediaCaptureSource::AVMediaCaptureSource): No more preview.
(WebCore::AVMediaCaptureSource::reset):
(WebCore::AVMediaCaptureSource::preview): Deleted.
(WebCore::AVMediaCaptureSource::removePreview): Deleted.
(WebCore::AVMediaSourcePreview::AVMediaSourcePreview): Deleted.
(WebCore::AVMediaSourcePreview::~AVMediaSourcePreview): Deleted.
(WebCore::AVMediaSourcePreview::invalidate): Deleted.

* platform/mediastream/mac/AVVideoCaptureSource.h:
* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::processNewFrame): Don't set the "display immediately" attachment.
(WebCore::AVVideoSourcePreview::create): Deleted.
(WebCore::AVVideoSourcePreview::AVVideoSourcePreview): Deleted.
(WebCore::AVVideoSourcePreview::backgroundLayerBoundsChanged): Deleted.
(WebCore::AVVideoSourcePreview::invalidate): Deleted.
(WebCore::AVVideoSourcePreview::play): Deleted.
(WebCore::AVVideoSourcePreview::pause): Deleted.
(WebCore::AVVideoSourcePreview::setPaused): Deleted.
(WebCore::AVVideoSourcePreview::setEnabled): Deleted.
(WebCore::AVVideoCaptureSource::createPreview): Deleted.
(-[WebCoreAVVideoCaptureSourceObserver setParent:]): Deleted.
(-[WebCoreAVVideoCaptureSourceObserver observeValueForKeyPath:ofObject:change:context:]): Deleted.

* platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
(WebCore::MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer): Use a more typical video
time scale. Set the sample decode time.
(WebCore::MockRealtimeVideoSourceMac::pixelBufferFromCGImage): Use a static for the colorspace
instead of creating it for every frame.
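
A hedged sketch of the colorspace caching mentioned above (the helper name is illustrative, not
taken from the patch):

    static CGColorSpaceRef deviceRGBColorSpace()
    {
        // Create the CGColorSpace once and reuse it for every generated frame.
        static CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        return colorSpace;
    }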

* platform/mock/mediasource/MockSourceBufferPrivate.cpp: Add outputPresentationTime and outputDuration.

Modified Paths

Removed Paths

Diff

Modified: trunk/Source/WebCore/ChangeLog (210620 => 210621)


--- trunk/Source/WebCore/ChangeLog	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/ChangeLog	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,3 +1,146 @@
+2017-01-11  Eric Carlson  <[email protected]>
+
+        [MediaStream, Mac] Render media stream audio buffers
+        https://bugs.webkit.org/show_bug.cgi?id=159836
+        <rdar://problem/27380390>
+
+        Reviewed by Jer Noble.
+
+        No new tests, it isn't possible to test audio rendering directly. A follow-up patch will
+        add a mock audio source that will enable audio testing.
+
+        * platform/cf/CoreMediaSoftLink.cpp: Include new functions used.
+        * platform/cf/CoreMediaSoftLink.h:
+
+        * WebCore.xcodeproj/project.pbxproj: Remove references to the deleted previews.
+
+        * platform/Logging.h: Add MediaCaptureSamples.
+
+        * platform/MediaSample.h: Add outputPresentationTime and outputDuration.
+
+        * platform/cf/CoreMediaSoftLink.cpp: Add CMSampleBufferGetOutputDuration, CMSampleBufferGetOutputPresentationTimeStamp,
+        CMTimeConvertScale, CMTimebaseGetEffectiveRate, CMAudioSampleBufferCreateWithPacketDescriptions, 
+        CMSampleBufferSetDataBufferFromAudioBufferList, CMSampleBufferSetDataReady, 
+        CMAudioFormatDescriptionCreate, CMClockGetHostTimeClock, and CMClockGetTime.
+        * platform/cf/CoreMediaSoftLink.h:
+
+        Create and use an AVSampleBufferAudioRenderer each audio stream track, when it is available,
+        to render for audio samples. Store the offset between the first sample received from a track's
+        output presentation and the synchronizer time so we can adjust sample timestamps to be 
+        relative to the synchronizer's timeline regardless of their source. Remove the use of source
+        previews because not all sources will have them.
+
+        * platform/graphics/avfoundation/MediaSampleAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+        
+        Add an ObjC helper to catch renderer status changes.
+        (-[WebAVSampleBufferStatusChangeListener initWithParent:]): 
+        (-[WebAVSampleBufferStatusChangeListener dealloc]):
+        (-[WebAVSampleBufferStatusChangeListener invalidate]):
+        (-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]):
+        (-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]):
+        (-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]):
+        (-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]):
+        (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateSampleTimes):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForMediaData): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBuffer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::prepareVideoSampleBufferFromTrack): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::internalSetVolume): Deleted.
+
+        * platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm:
+        (WebCore::MediaSampleAVFObjC::outputPresentationTime): New.
+        (WebCore::MediaSampleAVFObjC::outputDuration): New.
+        (WebCore::MediaSampleAVFObjC::dump): Log outputPresentationTime.
+
+        * platform/mediastream/AudioTrackPrivateMediaStream.h: Add timelineOffset.
+
+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
+        (WebCore::MediaStreamTrackPrivate::setEnabled): No more m_preview.
+        (WebCore::MediaStreamTrackPrivate::endTrack): Ditto.
+        (WebCore::MediaStreamTrackPrivate::preview): Deleted.
+        * platform/mediastream/MediaStreamTrackPrivate.h:
+
+        * platform/mediastream/RealtimeMediaSource.h:
+        (WebCore::RealtimeMediaSource::preview): Deleted.
+
+        * platform/mediastream/RealtimeMediaSourcePreview.h: Removed.
+
+        * platform/mediastream/VideoTrackPrivateMediaStream.h: Add timelineOffset.
+
+        * platform/mediastream/mac/AVAudioCaptureSource.h:
+        * platform/mediastream/mac/AVAudioCaptureSource.mm:
+        (WebCore::AVAudioCaptureSource::updateSettings):
+        (WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Pass the
+        sample buffer up the chain.
+        (WebCore::AVAudioSourcePreview::create): Deleted.
+        (WebCore::AVAudioSourcePreview::AVAudioSourcePreview): Deleted.
+        (WebCore::AVAudioSourcePreview::invalidate): Deleted.
+        (WebCore::AVAudioSourcePreview::play): Deleted.
+        (WebCore::AVAudioSourcePreview::pause): Deleted.
+        (WebCore::AVAudioSourcePreview::setEnabled): Deleted.
+        (WebCore::AVAudioSourcePreview::setVolume): Deleted.
+        (WebCore::AVAudioSourcePreview::updateState): Deleted.
+        (WebCore::AVAudioCaptureSource::createPreview): Deleted.
+
+        * platform/mediastream/mac/AVMediaCaptureSource.h:
+        (WebCore::AVMediaSourcePreview): Deleted.
+        (WebCore::AVMediaCaptureSource::createWeakPtr): Deleted.
+
+        * platform/mediastream/mac/AVMediaCaptureSource.mm:
+        (WebCore::AVMediaCaptureSource::AVMediaCaptureSource): No more preview.
+        (WebCore::AVMediaCaptureSource::reset):
+        (WebCore::AVMediaCaptureSource::preview): Deleted.
+        (WebCore::AVMediaCaptureSource::removePreview): Deleted.
+        (WebCore::AVMediaSourcePreview::AVMediaSourcePreview): Deleted.
+        (WebCore::AVMediaSourcePreview::~AVMediaSourcePreview): Deleted.
+        (WebCore::AVMediaSourcePreview::invalidate): Deleted.
+
+        * platform/mediastream/mac/AVVideoCaptureSource.h:
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::processNewFrame): Don't set the "display immediately" attachment.
+        (WebCore::AVVideoSourcePreview::create): Deleted.
+        (WebCore::AVVideoSourcePreview::AVVideoSourcePreview): Deleted.
+        (WebCore::AVVideoSourcePreview::backgroundLayerBoundsChanged): Deleted.
+        (WebCore::AVVideoSourcePreview::invalidate): Deleted.
+        (WebCore::AVVideoSourcePreview::play): Deleted.
+        (WebCore::AVVideoSourcePreview::pause): Deleted.
+        (WebCore::AVVideoSourcePreview::setPaused): Deleted.
+        (WebCore::AVVideoSourcePreview::setEnabled): Deleted.
+        (WebCore::AVVideoCaptureSource::createPreview): Deleted.
+        (-[WebCoreAVVideoCaptureSourceObserver setParent:]): Deleted.
+        (-[WebCoreAVVideoCaptureSourceObserver observeValueForKeyPath:ofObject:change:context:]): Deleted.
+
+        * platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
+        (WebCore::MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer): Use a more typical video
+        time scale. Set the sample decode time.
+        (WebCore::MockRealtimeVideoSourceMac::pixelBufferFromCGImage): Use a static for colorspace
+        instead of fetching it for every frame.
+
+        * platform/mock/mediasource/MockSourceBufferPrivate.cpp: Add outputPresentationTime and outputDuration.
+
 2017-01-11  Youenn Fablet  <[email protected]>
 
         Remove request.formData property until it gets implemented

Modified: trunk/Source/WebCore/Modules/webaudio/ScriptProcessorNode.cpp (210620 => 210621)


--- trunk/Source/WebCore/Modules/webaudio/ScriptProcessorNode.cpp	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/Modules/webaudio/ScriptProcessorNode.cpp	2017-01-12 05:22:32 UTC (rev 210621)
@@ -213,6 +213,9 @@
             m_isRequestOutstanding = true;
 
             callOnMainThread([this] {
+                if (!m_hasAudioProcessListener)
+                    return;
+
                 fireProcessEvent();
 
                 // De-reference to match the ref() call in process().

Modified: trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj (210620 => 210621)


--- trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj	2017-01-12 05:22:32 UTC (rev 210621)
@@ -279,7 +279,6 @@
 		07C1C0E21BFB600100BD2256 /* MediaTrackSupportedConstraints.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C1C0E01BFB600100BD2256 /* MediaTrackSupportedConstraints.h */; };
 		07C1C0E51BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C1C0E41BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h */; settings = {ATTRIBUTES = (Private, ); }; };
 		07CE77D516712A6A00C55A47 /* InbandTextTrackPrivateClient.h in Headers */ = {isa = PBXBuildFile; fileRef = 07CE77D416712A6A00C55A47 /* InbandTextTrackPrivateClient.h */; settings = {ATTRIBUTES = (Private, ); }; };
-		07D1503B1DDB6965008F7598 /* RealtimeMediaSourcePreview.h in Headers */ = {isa = PBXBuildFile; fileRef = 07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */; settings = {ATTRIBUTES = (Private, ); }; };
 		07D637401BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = 07D6373E1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h */; };
 		07D637411BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm in Sources */ = {isa = PBXBuildFile; fileRef = 07D6373F1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm */; };
 		07D6A4EF1BECF2D200174146 /* MockRealtimeMediaSource.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 07D6A4ED1BECF2D200174146 /* MockRealtimeMediaSource.cpp */; };
@@ -7253,7 +7252,6 @@
 		07C8AD111D073D630087C5CE /* AVFoundationMIMETypeCache.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVFoundationMIMETypeCache.mm; sourceTree = "<group>"; };
 		07C8AD121D073D630087C5CE /* AVFoundationMIMETypeCache.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVFoundationMIMETypeCache.h; sourceTree = "<group>"; };
 		07CE77D416712A6A00C55A47 /* InbandTextTrackPrivateClient.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = InbandTextTrackPrivateClient.h; sourceTree = "<group>"; };
-		07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RealtimeMediaSourcePreview.h; sourceTree = "<group>"; };
 		07D6373E1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebAudioSourceProviderAVFObjC.h; sourceTree = "<group>"; };
 		07D6373F1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebAudioSourceProviderAVFObjC.mm; sourceTree = "<group>"; };
 		07D6A4ED1BECF2D200174146 /* MockRealtimeMediaSource.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = MockRealtimeMediaSource.cpp; sourceTree = "<group>"; };
@@ -15200,7 +15198,6 @@
 				4A4F656D1AA997F100E38CDD /* RealtimeMediaSourceCapabilities.h */,
 				4A0FFA9F1AAF5EA20062803B /* RealtimeMediaSourceCenter.cpp */,
 				4A0FFAA01AAF5EA20062803B /* RealtimeMediaSourceCenter.h */,
-				07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */,
 				4A4F656E1AA997F100E38CDD /* RealtimeMediaSourceSettings.cpp */,
 				4A4F656F1AA997F100E38CDD /* RealtimeMediaSourceSettings.h */,
 				2EC41DE21C0410A300D294FE /* RealtimeMediaSourceSupportedConstraints.cpp */,
@@ -27226,7 +27223,6 @@
 				4A4F65721AA997F100E38CDD /* RealtimeMediaSourceCapabilities.h in Headers */,
 				4A0FFAA21AAF5EA20062803B /* RealtimeMediaSourceCenter.h in Headers */,
 				4A0FFAA61AAF5EF60062803B /* RealtimeMediaSourceCenterMac.h in Headers */,
-				07D1503B1DDB6965008F7598 /* RealtimeMediaSourcePreview.h in Headers */,
 				4A4F65741AA997F100E38CDD /* RealtimeMediaSourceSettings.h in Headers */,
 				07C1C0E51BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h in Headers */,
 				BC4368E80C226E32005EFB5F /* Rect.h in Headers */,

Modified: trunk/Source/WebCore/platform/Logging.h (210620 => 210621)


--- trunk/Source/WebCore/platform/Logging.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/Logging.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -62,6 +62,7 @@
     M(Media) \
     M(MediaSource) \
     M(MediaSourceSamples) \
+    M(MediaCaptureSamples) \
     M(MemoryPressure) \
     M(Network) \
     M(NotYetImplemented) \

Modified: trunk/Source/WebCore/platform/MediaSample.h (210620 => 210621)


--- trunk/Source/WebCore/platform/MediaSample.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/MediaSample.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -54,8 +54,10 @@
     virtual ~MediaSample() { }
 
     virtual MediaTime presentationTime() const = 0;
+    virtual MediaTime outputPresentationTime() const { return presentationTime(); }
     virtual MediaTime decodeTime() const = 0;
     virtual MediaTime duration() const = 0;
+    virtual MediaTime outputDuration() const { return duration(); }
     virtual AtomicString trackID() const = 0;
     virtual void setTrackID(const String&) = 0;
     virtual size_t sizeInBytes() const = 0;

Modified: trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp (210620 => 210621)


--- trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp	2017-01-12 05:22:32 UTC (rev 210621)
@@ -85,8 +85,11 @@
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetImageBuffer, CVImageBufferRef, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetOutputDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetOutputPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleAttachmentsArray, CFArrayRef, (CMSampleBufferRef sbuf, Boolean createIfNecessary), (sbuf, createIfNecessary))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfoArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount timingArrayEntries, CMSampleTimingInfo *timingArrayOut, CMItemCount *timingArrayEntriesNeededOut), (sbuf, timingArrayEntries, timingArrayOut, timingArrayEntriesNeededOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeConvertScale, CMTime, (CMTime time, int32_t newTimescale, CMTimeRoundingMethod method), (time, newTimescale, method))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetTotalSampleSize, size_t, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSetAttachment, void, (CMAttachmentBearerRef target, CFStringRef key, CFTypeRef value, CMAttachmentMode attachmentMode), (target, key, value, attachmentMode))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseCreateWithMasterClock, OSStatus, (CFAllocatorRef allocator, CMClockRef masterClock, CMTimebaseRef *timebaseOut), (allocator, masterClock, timebaseOut))
@@ -93,6 +96,7 @@
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetTime, CMTime, (CMTimebaseRef timebase), (timebase))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetRate, OSStatus, (CMTimebaseRef timebase, Float64 rate), (timebase, rate))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef* outDesc), (allocator, imageBuffer, outDesc))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionGetDimensions, CMVideoDimensions, (CMVideoFormatDescriptionRef videoDesc), (videoDesc))
@@ -114,6 +118,13 @@
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCallBlockForEachSample, OSStatus, (CMSampleBufferRef sbuf, OSStatus (^handler)(CMSampleBufferRef, CMItemCount)), (sbuf, handler))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCopySampleBufferForRange, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef sbuf, CFRange sampleRange, CMSampleBufferRef* sBufOut), (allocator, sbuf, sampleRange, sBufOut))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleSizeArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount sizeArrayEntries, size_t* sizeArrayOut, CMItemCount* sizeArrayEntriesNeededOut), (sbuf, sizeArrayEntries, sizeArrayOut, sizeArrayEntriesNeededOut))
+
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioSampleBufferCreateWithPacketDescriptions, OSStatus, (CFAllocatorRef allocator, CMBlockBufferRef dataBuffer, Boolean dataReady, CMSampleBufferMakeDataReadyCallback makeDataReadyCallback, void *makeDataReadyRefcon, CMFormatDescriptionRef formatDescription, CMItemCount numSamples, CMTime sbufPTS, const AudioStreamPacketDescription *packetDescriptions, CMSampleBufferRef *sBufOut), (allocator, dataBuffer, dataReady, makeDataReadyCallback, makeDataReadyRefcon, formatDescription, numSamples, sbufPTS, packetDescriptions, sBufOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferSetDataBufferFromAudioBufferList, OSStatus, (CMSampleBufferRef sbuf, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, const AudioBufferList *bufferList), (sbuf, bbufStructAllocator, bbufMemoryAllocator, flags, bufferList))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), ())
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock))
 #endif // PLATFORM(COCOA)
 
 #if PLATFORM(IOS)

Modified: trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h (210620 => 210621)


--- trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -47,6 +47,8 @@
 #define CMSampleBufferGetFormatDescription softLink_CoreMedia_CMSampleBufferGetFormatDescription
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfo, OSStatus, (CMSampleBufferRef sbuf, CMItemIndex sampleIndex, CMSampleTimingInfo* timingInfoOut), (sbuf, sampleIndex, timingInfoOut))
 #define CMSampleBufferGetSampleTimingInfo softLink_CoreMedia_CMSampleBufferGetSampleTimingInfo
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeConvertScale, CMTime, (CMTime time, int32_t newTimescale, CMTimeRoundingMethod method), (time, newTimescale, method))
+#define CMTimeConvertScale softLink_CoreMedia_CMTimeConvertScale
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeAdd, CMTime, (CMTime time1, CMTime time2), (time1, time2))
 #define CMTimeAdd softLink_CoreMedia_CMTimeAdd
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCompare, int32_t, (CMTime time1, CMTime time2), (time1, time2))
@@ -137,6 +139,10 @@
 #define CMSampleBufferGetImageBuffer softLink_CoreMedia_CMSampleBufferGetImageBuffer
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 #define CMSampleBufferGetPresentationTimeStamp softLink_CoreMedia_CMSampleBufferGetPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetOutputDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+#define CMSampleBufferGetOutputDuration softLink_CoreMedia_CMSampleBufferGetOutputDuration
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetOutputPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+#define CMSampleBufferGetOutputPresentationTimeStamp softLink_CoreMedia_CMSampleBufferGetOutputPresentationTimeStamp
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleAttachmentsArray, CFArrayRef, (CMSampleBufferRef sbuf, Boolean createIfNecessary), (sbuf, createIfNecessary))
 #define CMSampleBufferGetSampleAttachmentsArray softLink_CoreMedia_CMSampleBufferGetSampleAttachmentsArray
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfoArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount timingArrayEntries, CMSampleTimingInfo *timingArrayOut, CMItemCount *timingArrayEntriesNeededOut), (sbuf, timingArrayEntries, timingArrayOut, timingArrayEntriesNeededOut))
@@ -153,6 +159,8 @@
 #define CMTimebaseSetRate softLink_CoreMedia_CMTimebaseSetRate
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
 #define CMTimebaseSetTime softLink_CoreMedia_CMTimebaseSetTime
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
+#define CMTimebaseGetEffectiveRate softLink_CoreMedia_CMTimebaseGetEffectiveRate
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 #define CMTimeCopyAsDictionary softLink_CoreMedia_CMTimeCopyAsDictionary
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef *outDesc), (allocator, imageBuffer, outDesc))
@@ -193,6 +201,18 @@
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleSizeArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount sizeArrayEntries, size_t* sizeArrayOut, CMItemCount* sizeArrayEntriesNeededOut), (sbuf, sizeArrayEntries, sizeArrayOut, sizeArrayEntriesNeededOut))
 #define CMSampleBufferGetSampleSizeArray softLink_CoreMedia_CMSampleBufferGetSampleSizeArray
 
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioSampleBufferCreateWithPacketDescriptions, OSStatus, (CFAllocatorRef allocator, CMBlockBufferRef dataBuffer, Boolean dataReady, CMSampleBufferMakeDataReadyCallback makeDataReadyCallback, void *makeDataReadyRefcon, CMFormatDescriptionRef formatDescription, CMItemCount numSamples, CMTime sbufPTS, const AudioStreamPacketDescription *packetDescriptions, CMSampleBufferRef *sBufOut), (allocator, dataBuffer, dataReady, makeDataReadyCallback, makeDataReadyRefcon, formatDescription, numSamples, sbufPTS, packetDescriptions, sBufOut))
+#define CMAudioSampleBufferCreateWithPacketDescriptions softLink_CoreMedia_CMAudioSampleBufferCreateWithPacketDescriptions
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferSetDataBufferFromAudioBufferList, OSStatus, (CMSampleBufferRef sbuf, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, const AudioBufferList *bufferList), (sbuf, bbufStructAllocator, bbufMemoryAllocator, flags, bufferList))
+#define CMSampleBufferSetDataBufferFromAudioBufferList softLink_CoreMedia_CMSampleBufferSetDataBufferFromAudioBufferList
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf))
+#define CMSampleBufferSetDataReady softLink_CoreMedia_CMSampleBufferSetDataReady
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc))
+#define CMAudioFormatDescriptionCreate softLink_CoreMedia_CMAudioFormatDescriptionCreate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), ())
+#define CMClockGetHostTimeClock  softLink_CoreMedia_CMClockGetHostTimeClock
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock))
+#define CMClockGetTime  softLink_CoreMedia_CMClockGetTime
 #endif // PLATFORM(COCOA)
 
 #if PLATFORM(IOS)

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h (210620 => 210621)


--- trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -54,8 +54,10 @@
     virtual ~MediaSampleAVFObjC() { }
 
     MediaTime presentationTime() const override;
+    MediaTime outputPresentationTime() const override;
     MediaTime decodeTime() const override;
     MediaTime duration() const override;
+    MediaTime outputDuration() const override;
 
     AtomicString trackID() const override { return m_id; }
     void setTrackID(const String& id) override { m_id = id; }

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h (210620 => 210621)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -39,6 +39,8 @@
 OBJC_CLASS AVSampleBufferDisplayLayer;
 OBJC_CLASS AVSampleBufferRenderSynchronizer;
 OBJC_CLASS AVStreamSession;
+OBJC_CLASS NSNumber;
+OBJC_CLASS WebAVSampleBufferStatusChangeListener;
 typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;
 
 namespace WebCore {
@@ -53,6 +55,10 @@
 class VideoFullscreenLayerManager;
 #endif
 
+#if __has_include(<AVFoundation/AVSampleBufferRenderSynchronizer.h>)
+#define USE_RENDER_SYNCHRONIZER 1
+#endif
+
 class MediaPlayerPrivateMediaStreamAVFObjC final : public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer {
 public:
     explicit MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer*);
@@ -75,6 +81,9 @@
     void ensureLayer();
     void destroyLayer();
 
+    void rendererStatusDidChange(AVSampleBufferAudioRenderer*, NSNumber*);
+    void layerStatusDidChange(AVSampleBufferDisplayLayer*, NSNumber*);
+
 private:
     // MediaPlayerPrivateInterface
 
@@ -97,7 +106,6 @@
     bool paused() const override;
 
     void setVolume(float) override;
-    void internalSetVolume(float, bool);
     void setMuted(bool) override;
     bool supportsMuting() const override { return true; }
 
@@ -122,14 +130,27 @@
 
     void setSize(const IntSize&) override { /* No-op */ }
 
-    void enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&);
+    void flushRenderers();
 
-    void prepareVideoSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&);
-    void enqueueVideoSampleBuffer(MediaSample&);
+    using PendingSampleQueue = Deque<Ref<MediaSample>>;
+    void addSampleToPendingQueue(PendingSampleQueue&, MediaSample&);
+    void removeOldSamplesFromPendingQueue(PendingSampleQueue&);
+
+    void updateSampleTimes(MediaSample&, const MediaTime&, const char*);
+    MediaTime calculateTimelineOffset(const MediaSample&, double);
+    
+    void enqueueVideoSample(MediaStreamTrackPrivate&, MediaSample&);
     bool shouldEnqueueVideoSampleBuffer() const;
     void flushAndRemoveVideoSampleBuffers();
-    void requestNotificationWhenReadyForMediaData();
+    void requestNotificationWhenReadyForVideoData();
 
+    void enqueueAudioSample(MediaStreamTrackPrivate&, MediaSample&);
+    void createAudioRenderer(AtomicString);
+    void destroyAudioRenderer(AVSampleBufferAudioRenderer*);
+    void destroyAudioRenderer(AtomicString);
+    void destroyAudioRenderers();
+    void requestNotificationWhenReadyForAudioData(AtomicString);
+
     void paint(GraphicsContext&, const FloatRect&) override;
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
     bool metaDataAvailable() const { return m_mediaStreamPrivate && m_readyState >= MediaPlayer::HaveMetadata; }
@@ -155,6 +176,7 @@
     void updateIntrinsicSize(const FloatSize&);
     void updateTracks();
     void renderingModeChanged();
+    void checkSelectedVideoTrack();
 
     void scheduleDeferredTask(Function<void ()>&&);
 
@@ -186,26 +208,36 @@
     void setVideoFullscreenFrame(FloatRect) override;
 #endif
 
-    bool haveVideoLayer() const { return m_sampleBufferDisplayLayer || m_videoPreviewPlayer; }
+    MediaTime streamTime() const;
 
+#if USE(RENDER_SYNCHRONIZER)
+    AudioSourceProvider* audioSourceProvider() final;
+#endif
+
     MediaPlayer* m_player { nullptr };
     WeakPtrFactory<MediaPlayerPrivateMediaStreamAVFObjC> m_weakPtrFactory;
     RefPtr<MediaStreamPrivate> m_mediaStreamPrivate;
 
-    RefPtr<RealtimeMediaSourcePreview> m_videoPreviewPlayer;
-    RefPtr<MediaStreamTrackPrivate> m_videoTrack;
+    RefPtr<MediaStreamTrackPrivate> m_activeVideoTrack;
 
+    RetainPtr<WebAVSampleBufferStatusChangeListener> m_statusChangeListener;
     RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
-#if PLATFORM(MAC)
+#if USE(RENDER_SYNCHRONIZER)
+    HashMap<String, RetainPtr<AVSampleBufferAudioRenderer>> m_audioRenderers;
     RetainPtr<AVSampleBufferRenderSynchronizer> m_synchronizer;
+#else
+    std::unique_ptr<Clock> m_clock;
 #endif
+
+    MediaTime m_pausedTime;
     RetainPtr<CGImageRef> m_pausedImage;
-    double m_pausedTime { 0 };
-    std::unique_ptr<Clock> m_clock;
 
     HashMap<String, RefPtr<AudioTrackPrivateMediaStream>> m_audioTrackMap;
     HashMap<String, RefPtr<VideoTrackPrivateMediaStream>> m_videoTrackMap;
-    Deque<Ref<MediaSample>> m_sampleQueue;
+    PendingSampleQueue m_pendingVideoSampleQueue;
+#if USE(RENDER_SYNCHRONIZER)
+    PendingSampleQueue m_pendingAudioSampleQueue;
+#endif
 
     MediaPlayer::NetworkState m_networkState { MediaPlayer::Empty };
     MediaPlayer::ReadyState m_readyState { MediaPlayer::HaveNothing };
@@ -219,6 +251,7 @@
     bool m_hasEverEnqueuedVideoFrame { false };
     bool m_hasReceivedMedia { false };
     bool m_isFrameDisplayed { false };
+    bool m_pendingSelectedTrackCheck { false };
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     std::unique_ptr<VideoFullscreenLayerManager> m_videoFullscreenLayerManager;

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm (210620 => 210621)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
+ * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -52,18 +52,149 @@
 
 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
 
+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferAudioRenderer)
 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer)
 
+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString*)
+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString*)
+
+#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral()
+#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed()
+
+using namespace WebCore;
+
+@interface WebAVSampleBufferStatusChangeListener : NSObject {
+    MediaPlayerPrivateMediaStreamAVFObjC* _parent;
+    Vector<RetainPtr<AVSampleBufferDisplayLayer>> _layers;
+    Vector<RetainPtr<AVSampleBufferAudioRenderer>> _renderers;
+}
+
+- (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback;
+- (void)invalidate;
+- (void)beginObservingLayer:(AVSampleBufferDisplayLayer *)layer;
+- (void)stopObservingLayer:(AVSampleBufferDisplayLayer *)layer;
+- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
+- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
+@end
+
+@implementation WebAVSampleBufferStatusChangeListener
+
+- (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)parent
+{
+    if (!(self = [super init]))
+        return nil;
+
+    _parent = parent;
+    return self;
+}
+
+- (void)dealloc
+{
+    [self invalidate];
+    [super dealloc];
+}
+
+- (void)invalidate
+{
+    for (auto& layer : _layers)
+        [layer removeObserver:self forKeyPath:@"status"];
+    _layers.clear();
+
+    for (auto& renderer : _renderers)
+        [renderer removeObserver:self forKeyPath:@"status"];
+    _renderers.clear();
+
+    [[NSNotificationCenter defaultCenter] removeObserver:self];
+
+    _parent = nullptr;
+}
+
+- (void)beginObservingLayer:(AVSampleBufferDisplayLayer*)layer
+{
+    ASSERT(_parent);
+    ASSERT(!_layers.contains(layer));
+
+    _layers.append(layer);
+    [layer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr];
+}
+
+- (void)stopObservingLayer:(AVSampleBufferDisplayLayer*)layer
+{
+    ASSERT(_parent);
+    ASSERT(_layers.contains(layer));
+
+    [layer removeObserver:self forKeyPath:@"status"];
+    _layers.remove(_layers.find(layer));
+}
+
+- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
+{
+    ASSERT(_parent);
+    ASSERT(!_renderers.contains(renderer));
+
+    _renderers.append(renderer);
+    [renderer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr];
+}
+
+- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
+{
+    ASSERT(_parent);
+    ASSERT(_renderers.contains(renderer));
+
+    [renderer removeObserver:self forKeyPath:@"status"];
+    _renderers.remove(_renderers.find(renderer));
+}
+
+- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
+{
+    UNUSED_PARAM(context);
+    UNUSED_PARAM(keyPath);
+    ASSERT(_parent);
+
+    RetainPtr<WebAVSampleBufferStatusChangeListener> protectedSelf = self;
+    if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) {
+        RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object;
+        RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
+
+        ASSERT(_layers.contains(layer.get()));
+        ASSERT([keyPath isEqualToString:@"status"]);
+
+        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer), status = WTFMove(status)] {
+            protectedSelf->_parent->layerStatusDidChange(layer.get(), status.get());
+        });
+
+    } else if ([object isKindOfClass:getAVSampleBufferAudioRendererClass()]) {
+        RetainPtr<AVSampleBufferAudioRenderer> renderer = (AVSampleBufferAudioRenderer *)object;
+        RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
+
+        ASSERT(_renderers.contains(renderer.get()));
+        ASSERT([keyPath isEqualToString:@"status"]);
+
+        callOnMainThread([protectedSelf = WTFMove(protectedSelf), renderer = WTFMove(renderer), status = WTFMove(status)] {
+            protectedSelf->_parent->rendererStatusDidChange(renderer.get(), status.get());
+        });
+    } else
+        ASSERT_NOT_REACHED();
+}
+@end
+
 namespace WebCore {
 
 #pragma mark -
 #pragma mark MediaPlayerPrivateMediaStreamAVFObjC
 
+static const double rendererLatency = 0.02;
+
 MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer* player)
     : m_player(player)
     , m_weakPtrFactory(this)
+    , m_statusChangeListener(adoptNS([[WebAVSampleBufferStatusChangeListener alloc] initWithParent:this]))
+#if USE(RENDER_SYNCHRONIZER)
+    , m_synchronizer(adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]))
+#else
     , m_clock(Clock::create())
+#endif
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     , m_videoFullscreenLayerManager(VideoFullscreenLayerManager::create())
 #endif
@@ -81,10 +212,13 @@
             track->removeObserver(*this);
     }
 
+    destroyLayer();
+#if USE(RENDER_SYNCHRONIZER)
+    destroyAudioRenderers();
+#endif
+
     m_audioTrackMap.clear();
     m_videoTrackMap.clear();
-
-    destroyLayer();
 }
 
 #pragma mark -
@@ -127,34 +261,107 @@
 #pragma mark -
 #pragma mark AVSampleBuffer Methods
 
-void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&)
+void MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue(PendingSampleQueue& queue)
 {
-    // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836
+    MediaTime now = streamTime();
+    while (!queue.isEmpty()) {
+        if (queue.first()->decodeTime() > now)
+            break;
+        queue.removeFirst();
+    };
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForMediaData()
+void MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue(PendingSampleQueue& queue, MediaSample& sample)
 {
-    [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
-        [m_sampleBufferDisplayLayer stopRequestingMediaData];
+    removeOldSamplesFromPendingQueue(queue);
+    queue.append(sample);
+}
 
-        while (!m_sampleQueue.isEmpty()) {
-            if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
-                requestNotificationWhenReadyForMediaData();
-                return;
-            }
+void MediaPlayerPrivateMediaStreamAVFObjC::updateSampleTimes(MediaSample& sample, const MediaTime& timelineOffset, const char* loggingPrefix)
+{
+    LOG(MediaCaptureSamples, "%s(%p): original sample = %s", loggingPrefix, this, toString(sample).utf8().data());
+    sample.offsetTimestampsBy(timelineOffset);
+    LOG(MediaCaptureSamples, "%s(%p): adjusted sample = %s", loggingPrefix, this, toString(sample).utf8().data());
 
-            auto sample = m_sampleQueue.takeFirst();
-            enqueueVideoSampleBuffer(sample.get());
-        }
-    }];
+#if !LOG_DISABLED
+    MediaTime now = streamTime();
+    double delta = (sample.presentationTime() - now).toDouble();
+    if (delta < 0)
+        LOG(Media, "%s(%p): *NOTE* audio sample at time %s is %f seconds late", loggingPrefix, this, toString(now).utf8().data(), -delta);
+    else if (delta < .01)
+        LOG(Media, "%s(%p): *NOTE* audio sample at time %s is only %s seconds early", loggingPrefix, this, toString(now).utf8().data(), delta);
+    else if (delta > .3)
+        LOG(Media, "%s(%p): *NOTE* audio sample at time %s is %s seconds early!", loggingPrefix, this, toString(now).utf8().data(), delta);
+#else
+    UNUSED_PARAM(loggingPrefix);
+#endif
+
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBuffer(MediaSample& sample)
+MediaTime MediaPlayerPrivateMediaStreamAVFObjC::calculateTimelineOffset(const MediaSample& sample, double latency)
 {
+    MediaTime sampleTime = sample.outputPresentationTime();
+    if (!sampleTime || !sampleTime.isValid())
+        sampleTime = sample.presentationTime();
+    MediaTime timelineOffset = streamTime() - sampleTime + MediaTime::createWithDouble(latency);
+    if (timelineOffset.timeScale() != sampleTime.timeScale())
+        timelineOffset = toMediaTime(CMTimeConvertScale(toCMTime(timelineOffset), sampleTime.timeScale(), kCMTimeRoundingMethod_Default));
+    return timelineOffset;
+}
+
+#if USE(RENDER_SYNCHRONIZER)
+void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample(MediaStreamTrackPrivate& track, MediaSample& sample)
+{
+    ASSERT(m_audioTrackMap.contains(track.id()));
+    ASSERT(m_audioRenderers.contains(sample.trackID()));
+
+    auto audioTrack = m_audioTrackMap.get(track.id());
+    MediaTime timelineOffset = audioTrack->timelineOffset();
+    if (timelineOffset == MediaTime::invalidTime()) {
+        timelineOffset = calculateTimelineOffset(sample, rendererLatency);
+        audioTrack->setTimelineOffset(timelineOffset);
+        LOG(MediaCaptureSamples, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample: timeline offset for track %s set to (%lld/%d)", track.id().utf8().data(), timelineOffset.timeValue(), timelineOffset.timeScale());
+    }
+
+    updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample");
+
+    auto renderer = m_audioRenderers.get(sample.trackID());
+    if (![renderer isReadyForMoreMediaData]) {
+        addSampleToPendingQueue(m_pendingAudioSampleQueue, sample);
+        requestNotificationWhenReadyForAudioData(sample.trackID());
+        return;
+    }
+
+    [renderer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
+}
+#endif
+
+void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPrivate& track, MediaSample& sample)
+{
+    ASSERT(m_videoTrackMap.contains(track.id()));
+
+    if (&track != m_mediaStreamPrivate->activeVideoTrack())
+        return;
+
+    m_hasReceivedMedia = true;
+    updateReadyState();
+    if (m_displayMode != LivePreview || (m_displayMode == PausedImage && m_isFrameDisplayed))
+        return;
+
+    auto videoTrack = m_videoTrackMap.get(track.id());
+    MediaTime timelineOffset = videoTrack->timelineOffset();
+    if (timelineOffset == MediaTime::invalidTime()) {
+        timelineOffset = calculateTimelineOffset(sample, rendererLatency);
+        videoTrack->setTimelineOffset(timelineOffset);
+        LOG(MediaCaptureSamples, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample: timeline offset for track %s set to %f", track.id().utf8().data(), timelineOffset.toDouble());
+    }
+
+    updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample");
+
     if (m_sampleBufferDisplayLayer) {
         if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
-            m_sampleQueue.append(sample);
-            requestNotificationWhenReadyForMediaData();
+            addSampleToPendingQueue(m_pendingVideoSampleQueue, sample);
+            requestNotificationWhenReadyForVideoData();
             return;
         }
 
@@ -164,19 +371,144 @@
     m_isFrameDisplayed = true;
     if (!m_hasEverEnqueuedVideoFrame) {
         m_hasEverEnqueuedVideoFrame = true;
+        if (m_displayMode == PausedImage)
+            updatePausedImage();
         m_player->firstVideoFrameAvailable();
-        updatePausedImage();
     }
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::prepareVideoSampleBufferFromTrack(MediaStreamTrackPrivate& track, MediaSample& sample)
+void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData()
 {
-    if (&track != m_mediaStreamPrivate->activeVideoTrack() || !shouldEnqueueVideoSampleBuffer())
+    [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
+        [m_sampleBufferDisplayLayer stopRequestingMediaData];
+
+        while (!m_pendingVideoSampleQueue.isEmpty()) {
+            if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
+                requestNotificationWhenReadyForVideoData();
+                return;
+            }
+
+            auto sample = m_pendingVideoSampleQueue.takeFirst();
+            enqueueVideoSample(*m_activeVideoTrack.get(), sample.get());
+        }
+    }];
+}
+
+#if USE(RENDER_SYNCHRONIZER)
+void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData(AtomicString trackID)
+{
+    if (!m_audioRenderers.contains(trackID))
         return;
 
-    enqueueVideoSampleBuffer(sample);
+    auto renderer = m_audioRenderers.get(trackID);
+    [renderer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
+        [renderer stopRequestingMediaData];
+
+        auto audioTrack = m_audioTrackMap.get(trackID);
+        while (!m_pendingAudioSampleQueue.isEmpty()) {
+            if (![renderer isReadyForMoreMediaData]) {
+                requestNotificationWhenReadyForAudioData(trackID);
+                return;
+            }
+
+            auto sample = m_pendingAudioSampleQueue.takeFirst();
+            enqueueAudioSample(audioTrack->streamTrack(), sample.get());
+        }
+    }];
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer(AtomicString trackID)
+{
+    ASSERT(!m_audioRenderers.contains(trackID));
+    auto renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]);
+    [renderer setAudioTimePitchAlgorithm:(m_player->preservesPitch() ? AVAudioTimePitchAlgorithmSpectral : AVAudioTimePitchAlgorithmVarispeed)];
+    m_audioRenderers.set(trackID, renderer);
+    [m_synchronizer addRenderer:renderer.get()];
+    [m_statusChangeListener beginObservingRenderer:renderer.get()];
+    if (m_audioRenderers.size() == 1)
+        renderingModeChanged();
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AVSampleBufferAudioRenderer* renderer)
+{
+    [m_statusChangeListener stopObservingRenderer:renderer];
+    [renderer flush];
+    [renderer stopRequestingMediaData];
+
+    CMTime now = CMTimebaseGetTime([m_synchronizer timebase]);
+    [m_synchronizer removeRenderer:renderer atTime:now withCompletionHandler:^(BOOL) { }];
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AtomicString trackID)
+{
+    if (!m_audioRenderers.contains(trackID))
+        return;
+
+    destroyAudioRenderer(m_audioRenderers.get(trackID).get());
+    m_audioRenderers.remove(trackID);
+    if (!m_audioRenderers.size())
+        renderingModeChanged();
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers()
+{
+    m_pendingAudioSampleQueue.clear();
+    for (auto& renderer : m_audioRenderers.values())
+        destroyAudioRenderer(renderer.get());
+    m_audioRenderers.clear();
+}
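
Each audio track gets its own AVSampleBufferAudioRenderer, and every renderer is attached to the one AVSampleBufferRenderSynchronizer so all tracks advance on a shared timebase. Tearing a renderer down is the mirror image, with removal scheduled at the synchronizer's current timebase time so the remaining renderers are undisturbed. A condensed sketch of the pair of operations, assuming the player's synchronizer is in scope and reusing the soft-link alloc helper from the code above:

    // Create: one renderer per audio track, attached to the shared synchronizer.
    RetainPtr<AVSampleBufferAudioRenderer> renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]);
    [synchronizer addRenderer:renderer.get()];

    // Destroy: flush, stop asking for data, then detach at the current timebase time.
    [renderer flush];
    [renderer stopRequestingMediaData];
    CMTime now = CMTimebaseGetTime([synchronizer timebase]);
    [synchronizer removeRenderer:renderer.get() atTime:now withCompletionHandler:^(BOOL) { }];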
+
+AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider()
+{
+    // FIXME: This should return a mix of all audio tracks - https://bugs.webkit.org/show_bug.cgi?id=160305
+    for (const auto& track : m_audioTrackMap.values()) {
+        if (track->streamTrack().ended() || !track->streamTrack().enabled() || track->streamTrack().muted())
+            continue;
+
+        return track->streamTrack().audioSourceProvider();
+    }
+    return nullptr;
+}
+#endif
+
+void MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange(AVSampleBufferAudioRenderer* renderer, NSNumber* status)
+{
+#if USE(RENDER_SYNCHRONIZER)
+    String trackID;
+    for (auto& pair : m_audioRenderers) {
+        if (pair.value == renderer) {
+            trackID = pair.key;
+            break;
+        }
+    }
+    ASSERT(!trackID.isEmpty());
+    if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering)
+        m_audioTrackMap.get(trackID)->setTimelineOffset(MediaTime::invalidTime());
+#else
+    UNUSED_PARAM(renderer);
+    UNUSED_PARAM(status);
+#endif
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer, NSNumber* status)
+{
+    ASSERT_UNUSED(layer, layer == m_sampleBufferDisplayLayer);
+    ASSERT(m_activeVideoTrack);
+    if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering)
+        m_videoTrackMap.get(m_activeVideoTrack->id())->setTimelineOffset(MediaTime::invalidTime());
+}
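
Both status callbacks reset the affected track's timelineOffset to MediaTime::invalidTime() once a renderer or the display layer reports AVQueuedSampleBufferRenderingStatusRendering, so the next enqueued sample recomputes its offset against the synchronizer's timebase. The listener feeding them is plain key-value observation of "status"; the sketch below is illustrative only (the real WebAVSampleBufferStatusChangeListener forwards to the player rather than acting directly) and would live inside the listener's @implementation:

    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
    {
        UNUSED_PARAM(change);
        UNUSED_PARAM(context);
        if (![keyPath isEqualToString:@"status"])
            return;

        RetainPtr<id> target = object;
        RetainPtr<NSNumber> status = [object valueForKey:@"status"];
        callOnMainThread([target, status] {
            // Forward to rendererStatusDidChange() / layerStatusDidChange() here.
        });
    }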
+
+void MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers()
+{
+    if (m_sampleBufferDisplayLayer)
+        [m_sampleBufferDisplayLayer flush];
+
+#if USE(RENDER_SYNCHRONIZER)
+    for (auto& renderer : m_audioRenderers.values())
+        [renderer flush];
+#endif
+}
+
 bool MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer() const
 {
     if (m_displayMode == LivePreview)
@@ -196,48 +528,38 @@
 
 void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
 {
-    if (!m_mediaStreamPrivate || haveVideoLayer())
+    if (m_sampleBufferDisplayLayer)
         return;
 
-    CALayer *videoLayer = nil;
-    if (m_mediaStreamPrivate->activeVideoTrack()) {
-        m_videoPreviewPlayer = m_mediaStreamPrivate->activeVideoTrack()->preview();
-        if (m_videoPreviewPlayer)
-            videoLayer = m_videoPreviewPlayer->platformLayer();
-    }
-
-    if (!videoLayer) {
-        m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
-        videoLayer = m_sampleBufferDisplayLayer.get();
+    m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
 #ifndef NDEBUG
-        [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
+    [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
 #endif
-        m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
+    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
+    [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
 
-#if PLATFORM(MAC)
-        m_synchronizer = adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]);
-        [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
+#if USE(RENDER_SYNCHRONIZER)
+    [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
 #endif
-    }
 
     renderingModeChanged();
     
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
-    m_videoFullscreenLayerManager->setVideoLayer(videoLayer, snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
+    m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
 #endif
 }
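
Because capture sources no longer provide preview layers, the video path always goes through an AVSampleBufferDisplayLayer owned by the player, and under USE(RENDER_SYNCHRONIZER) that layer is attached to the synchronizer like any other renderer so audio and video advance on one timebase. A minimal sketch of that setup, assuming the soft-link alloc helpers used in this file:

    RetainPtr<AVSampleBufferDisplayLayer> displayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
    displayLayer.get().backgroundColor = cachedCGColor(Color::black);

    RetainPtr<AVSampleBufferRenderSynchronizer> synchronizer = adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]);
    [synchronizer addRenderer:displayLayer.get()];

    // Every renderer attached to the synchronizer follows its rate and timebase.
    [synchronizer setRate:1];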
 
 void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer()
 {
-    if (!haveVideoLayer())
+    if (!m_sampleBufferDisplayLayer)
         return;
 
-    m_videoPreviewPlayer = nullptr;
-
     if (m_sampleBufferDisplayLayer) {
+        m_pendingVideoSampleQueue.clear();
+        [m_statusChangeListener stopObservingLayer:m_sampleBufferDisplayLayer.get()];
         [m_sampleBufferDisplayLayer stopRequestingMediaData];
         [m_sampleBufferDisplayLayer flush];
-#if PLATFORM(MAC)
+#if USE(RENDER_SYNCHRONIZER)
         CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]);
         [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL) {
             // No-op.
@@ -305,14 +627,12 @@
 
 PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::platformLayer() const
 {
-    if (!haveVideoLayer() || m_displayMode == None)
+    if (!m_sampleBufferDisplayLayer || m_displayMode == None)
         return nullptr;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     return m_videoFullscreenLayerManager->videoInlineLayer();
 #else
-    if (m_videoPreviewPlayer)
-        return m_videoPreviewPlayer->platformLayer();
 
     return m_sampleBufferDisplayLayer.get();
 #endif
@@ -320,7 +640,7 @@
 
 MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode() const
 {
-    if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !haveVideoLayer())
+    if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer)
         return None;
 
     if (m_mediaStreamPrivate->activeVideoTrack() && !m_mediaStreamPrivate->activeVideoTrack()->enabled())
@@ -368,23 +688,15 @@
     if (!metaDataAvailable() || m_playing || m_ended)
         return;
 
-    m_clock->start();
     m_playing = true;
-
-    if (m_videoPreviewPlayer)
-        m_videoPreviewPlayer->play();
-#if PLATFORM(MAC)
-    else
-        [m_synchronizer setRate:1];
+#if USE(RENDER_SYNCHRONIZER)
+    if (!m_synchronizer.get().rate)
+        [m_synchronizer setRate:1 ]; // streamtime
+#else
+    if (!m_clock->isRunning())
+        m_clock->start();
 #endif
 
-    for (const auto& track : m_audioTrackMap.values()) {
-        if (!track->enabled() || !track->streamTrack().preview())
-            continue;
-
-        track->streamTrack().preview()->play();
-    }
-
     m_haveEverPlayed = true;
     scheduleDeferredTask([this] {
         updateDisplayMode();
@@ -399,25 +711,12 @@
     if (!metaDataAvailable() || !m_playing || m_ended)
         return;
 
-    m_pausedTime = m_clock->currentTime();
+    m_pausedTime = currentMediaTime();
     m_playing = false;
 
-    if (m_videoPreviewPlayer)
-        m_videoPreviewPlayer->pause();
-#if PLATFORM(MAC)
-    else
-        [m_synchronizer setRate:0];
-#endif
-
-    for (const auto& track : m_audioTrackMap.values()) {
-        if (!track->enabled() || !track->streamTrack().preview())
-            continue;
-
-        track->streamTrack().preview()->pause();
-    }
-
     updateDisplayMode();
     updatePausedImage();
+    flushRenderers();
 }
 
 bool MediaPlayerPrivateMediaStreamAVFObjC::paused() const
@@ -425,27 +724,21 @@
     return !m_playing;
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::internalSetVolume(float volume, bool internal)
+void MediaPlayerPrivateMediaStreamAVFObjC::setVolume(float volume)
 {
-    if (!internal)
-        m_volume = volume;
+    LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::setVolume(%p)", this);
 
-    if (!metaDataAvailable())
+    if (m_volume == volume)
         return;
 
-    for (const auto& track : m_audioTrackMap.values()) {
-        if (!track->enabled() || !track->streamTrack().preview())
-            continue;
+    m_volume = volume;
 
-        track->streamTrack().preview()->setVolume(volume);
-    }
+#if USE(RENDER_SYNCHRONIZER)
+    for (auto& renderer : m_audioRenderers.values())
+        [renderer setVolume:volume];
+#endif
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::setVolume(float volume)
-{
-    internalSetVolume(volume, false);
-}
-
 void MediaPlayerPrivateMediaStreamAVFObjC::setMuted(bool muted)
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::setMuted(%p)", this);
@@ -455,7 +748,10 @@
 
     m_muted = muted;
     
-    internalSetVolume(muted ? 0 : m_volume, true);
+#if USE(RENDER_SYNCHRONIZER)
+    for (auto& renderer : m_audioRenderers.values())
+        [renderer setMuted:muted];
+#endif
 }
 
 bool MediaPlayerPrivateMediaStreamAVFObjC::hasVideo() const
@@ -481,9 +777,21 @@
 
 MediaTime MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime() const
 {
-    return MediaTime::createWithDouble(m_playing ? m_clock->currentTime() : m_pausedTime);
+    if (!m_playing)
+        return m_pausedTime;
+
+    return streamTime();
 }
 
+MediaTime MediaPlayerPrivateMediaStreamAVFObjC::streamTime() const
+{
+#if USE(RENDER_SYNCHRONIZER)
+    return toMediaTime(CMTimebaseGetTime([m_synchronizer timebase]));
+#else
+    return MediaTime::createWithDouble(m_clock->currentTime());
+#endif
+}
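
With the synchronizer in use, the element's current time is read straight from the synchronizer's timebase rather than from a wall clock, so it always matches what the attached renderers are consuming. A minimal sketch of that read (toMediaTime() is the existing WebKit CMTime conversion helper used above):

    // Read the shared playback position from the synchronizer's timebase.
    CMTime now = CMTimebaseGetTime([synchronizer timebase]);
    MediaTime position = toMediaTime(now);
    // Equivalent without the WebKit helper: Float64 seconds = CMTimeGetSeconds(now);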
+
 MediaPlayer::NetworkState MediaPlayerPrivateMediaStreamAVFObjC::networkState() const
 {
     return m_networkState;
@@ -598,25 +906,36 @@
     ASSERT(mediaSample.platformSample().type == PlatformSample::CMSampleBufferType);
     ASSERT(m_mediaStreamPrivate);
 
+    if (!m_hasReceivedMedia) {
+        m_hasReceivedMedia = true;
+        updateReadyState();
+    }
+
+    if (!m_playing || streamTime().toDouble() < 0)
+        return;
+
+#if USE(RENDER_SYNCHRONIZER)
+    if (!CMTimebaseGetEffectiveRate([m_synchronizer timebase]))
+        return;
+#endif
+
     switch (track.type()) {
     case RealtimeMediaSource::None:
         // Do nothing.
         break;
     case RealtimeMediaSource::Audio:
-        // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836
+#if USE(RENDER_SYNCHRONIZER)
+        enqueueAudioSample(track, mediaSample);
+#endif
         break;
     case RealtimeMediaSource::Video:
-        prepareVideoSampleBufferFromTrack(track, mediaSample);
-        m_hasReceivedMedia = true;
-        scheduleDeferredTask([this] {
-            updateReadyState();
-        });
+        if (&track == m_activeVideoTrack.get())
+            enqueueVideoSample(track, mediaSample);
         break;
     }
 }
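
The new early returns drop incoming samples while the element is not playing, while the stream time is still negative, and (in the synchronizer configuration) while the timebase's effective rate is zero, so buffers never accumulate in renderers that are not consuming them. A compact sketch of that guard, assuming the same synchronizer and a MediaTime stream time:

    // Only enqueue when the shared timeline is actually advancing.
    static bool shouldEnqueue(AVSampleBufferRenderSynchronizer *synchronizer, bool playing, const MediaTime& streamTime)
    {
        if (!playing || streamTime < MediaTime::zeroTime())
            return false;
        return CMTimebaseGetEffectiveRate([synchronizer timebase]) != 0;
    }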
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
-
 void MediaPlayerPrivateMediaStreamAVFObjC::setVideoFullscreenLayer(PlatformLayer *videoFullscreenLayer, std::function<void()> completionHandler)
 {
     m_videoFullscreenLayerManager->setVideoFullscreenLayer(videoFullscreenLayer, completionHandler);
@@ -626,11 +945,16 @@
 {
     m_videoFullscreenLayerManager->setVideoFullscreenFrame(frame);
 }
-
 #endif
 
-template <typename RefT, typename PassRefT>
-void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), MediaPlayer* player, void (MediaPlayer::*removedFunction)(PassRefT), void (MediaPlayer::*addedFunction)(PassRefT), std::function<void(RefT, int)> configureCallback, MediaStreamTrackPrivate::Observer* trackObserver)
+typedef enum {
+    Add,
+    Remove,
+    Configure
+} TrackState;
+
+template <typename RefT>
+void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), const Function<void(RefT, int, TrackState)>& configureTrack)
 {
     Vector<RefT> removedTracks;
     Vector<RefT> addedTracks;
@@ -660,18 +984,42 @@
     }
 
     int index = 0;
+    for (auto& track : removedTracks)
+        configureTrack(track, index++, TrackState::Remove);
+
+    index = 0;
+    for (auto& track : addedTracks)
+        configureTrack(track, index++, TrackState::Add);
+
+    index = 0;
     for (const auto& track : trackMap.values())
-        configureCallback(track, index++);
+        configureTrack(track, index++, TrackState::Configure);
+}
 
-    for (auto& track : removedTracks) {
-        (player->*removedFunction)(*track);
-        track->streamTrack().removeObserver(*trackObserver);
-    }
+void MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack()
+{
+    if (m_pendingSelectedTrackCheck)
+        return;
 
-    for (auto& track : addedTracks) {
-        (player->*addedFunction)(*track);
-        track->streamTrack().addObserver(*trackObserver);
-    }
+    m_pendingSelectedTrackCheck = true;
+    scheduleDeferredTask([this] {
+        bool hideVideoLayer = true;
+        m_activeVideoTrack = nullptr;
+        if (m_mediaStreamPrivate->activeVideoTrack()) {
+            for (const auto& track : m_videoTrackMap.values()) {
+                if (&track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack()) {
+                    m_activeVideoTrack = m_mediaStreamPrivate->activeVideoTrack();
+                    if (track->selected())
+                        hideVideoLayer = false;
+                    break;
+                }
+            }
+        }
+
+        ensureLayer();
+        m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer;
+        m_pendingSelectedTrackCheck = false;
+    });
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::updateTracks()
@@ -678,23 +1026,58 @@
 {
     MediaStreamTrackPrivateVector currentTracks = m_mediaStreamPrivate->tracks();
 
-    std::function<void(RefPtr<AudioTrackPrivateMediaStream>, int)> enableAudioTrack = [this](auto track, int index)
+    Function<void(RefPtr<AudioTrackPrivateMediaStream>, int, TrackState)>  setAudioTrackState = [this](auto track, int index, TrackState state)
     {
-        track->setTrackIndex(index);
-        track->setEnabled(track->streamTrack().enabled() && !track->streamTrack().muted());
+        switch (state) {
+        case TrackState::Remove:
+            track->streamTrack().removeObserver(*this);
+            m_player->removeAudioTrack(*track);
+#if USE(RENDER_SYNCHRONIZER)
+            destroyAudioRenderer(track->id());
+#endif
+            break;
+        case TrackState::Add:
+            track->streamTrack().addObserver(*this);
+            m_player->addAudioTrack(*track);
+#if USE(RENDER_SYNCHRONIZER)
+            createAudioRenderer(track->id());
+#endif
+            break;
+        case TrackState::Configure:
+            track->setTrackIndex(index);
+            bool enabled = track->streamTrack().enabled() && !track->streamTrack().muted();
+            track->setEnabled(enabled);
+#if USE(RENDER_SYNCHRONIZER)
+            auto renderer = m_audioRenderers.get(track->id());
+            ASSERT(renderer);
+            renderer.get().muted = !enabled;
+#endif
+            break;
+        }
     };
-    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack, enableAudioTrack, (MediaStreamTrackPrivate::Observer*) this);
+    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, setAudioTrackState);
 
-    std::function<void(RefPtr<VideoTrackPrivateMediaStream>, int)> enableVideoTrack = [this](auto track, int index)
+    Function<void(RefPtr<VideoTrackPrivateMediaStream>, int, TrackState)> setVideoTrackState = [&](auto track, int index, TrackState state)
     {
-        track->setTrackIndex(index);
-        bool selected = &track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack();
-        track->setSelected(selected);
-
-        if (selected)
-            ensureLayer();
+        switch (state) {
+        case TrackState::Remove:
+            track->streamTrack().removeObserver(*this);
+            m_player->removeVideoTrack(*track);
+            checkSelectedVideoTrack();
+            break;
+        case TrackState::Add:
+            track->streamTrack().addObserver(*this);
+            m_player->addVideoTrack(*track);
+            break;
+        case TrackState::Configure:
+            track->setTrackIndex(index);
+            bool selected = &track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack();
+            track->setSelected(selected);
+            checkSelectedVideoTrack();
+            break;
+        }
     };
-    updateTracksOfType(m_videoTrackMap, RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack, enableVideoTrack, (MediaStreamTrackPrivate::Observer*) this);
+    updateTracksOfType(m_videoTrackMap, RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, setVideoTrackState);
 }
 
 std::unique_ptr<PlatformTimeRanges> MediaPlayerPrivateMediaStreamAVFObjC::seekable() const

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm (210620 => 210621)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm	2017-01-12 05:22:32 UTC (rev 210621)
@@ -37,6 +37,11 @@
     return toMediaTime(CMSampleBufferGetPresentationTimeStamp(m_sample.get()));
 }
 
+MediaTime MediaSampleAVFObjC::outputPresentationTime() const
+{
+    return toMediaTime(CMSampleBufferGetOutputPresentationTimeStamp(m_sample.get()));
+}
+
 MediaTime MediaSampleAVFObjC::decodeTime() const
 {
     return toMediaTime(CMSampleBufferGetDecodeTimeStamp(m_sample.get()));
@@ -47,6 +52,11 @@
     return toMediaTime(CMSampleBufferGetDuration(m_sample.get()));
 }
 
+MediaTime MediaSampleAVFObjC::outputDuration() const
+{
+    return toMediaTime(CMSampleBufferGetOutputDuration(m_sample.get()));
+}
+
 size_t MediaSampleAVFObjC::sizeInBytes() const
 {
     return CMSampleBufferGetTotalSampleSize(m_sample.get());
@@ -111,7 +121,7 @@
 
 void MediaSampleAVFObjC::dump(PrintStream& out) const
 {
-    out.print("{PTS(", presentationTime(), "), DTS(", decodeTime(), "), duration(", duration(), "), flags(", (int)flags(), "), presentationSize(", presentationSize().width(), "x", presentationSize().height(), ")}");
+    out.print("{PTS(", presentationTime(), "), OPTS(", outputPresentationTime(), "), DTS(", decodeTime(), "), duration(", duration(), "), flags(", (int)flags(), "), presentationSize(", presentationSize().width(), "x", presentationSize().height(), ")}");
 }
 
 void MediaSampleAVFObjC::offsetTimestampsBy(const MediaTime& offset)
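
The two new accessors expose CoreMedia's output timing for a sample: the output presentation timestamp normally equals the ordinary presentation timestamp unless the buffer has been retimed (for example via CMSampleBufferSetOutputPresentationTimeStamp), which is why the player keys its timeline offsets off the output values. A small sketch, assuming a CMSampleBufferRef named sampleBuffer:

    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CMTime outputPts = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer);
    // Any retiming shows up as a difference between the two.
    MediaTime delta = toMediaTime(outputPts) - toMediaTime(pts);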

Modified: trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -50,11 +50,15 @@
 
     MediaStreamTrackPrivate& streamTrack() { return m_streamTrack.get(); }
 
+    MediaTime timelineOffset() const { return m_timelineOffset; }
+    void setTimelineOffset(const MediaTime& offset) { m_timelineOffset = offset; }
+
 private:
     AudioTrackPrivateMediaStream(MediaStreamTrackPrivate& track)
         : m_streamTrack(track)
         , m_id(track.id())
         , m_label(track.label())
+        , m_timelineOffset(MediaTime::invalidTime())
     {
     }
 
@@ -62,6 +66,7 @@
     AtomicString m_id;
     AtomicString m_label;
     int m_index { 0 };
+    MediaTime m_timelineOffset;
 };
 
 }
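
The per-track timelineOffset starts out as MediaTime::invalidTime() and is recomputed from the first sample that arrives after the track's renderer starts (or restarts) rendering; every subsequent sample is shifted by that offset so its timestamps land on the synchronizer's timeline regardless of the source's own clock. A sketch of the idea, assuming an AudioTrackPrivateMediaStream& track, a MediaSampleAVFObjC& sample, and the player's synchronizer; the actual updateSampleTimes() implementation is outside this excerpt:

    MediaTime offset = track.timelineOffset();
    if (offset.isInvalid()) {
        // First sample after the renderer (re)started rendering: anchor this track's
        // output timestamps to the synchronizer's current position.
        offset = toMediaTime(CMTimebaseGetTime([synchronizer timebase])) - sample.outputPresentationTime();
        track.setTimelineOffset(offset);
    }
    sample.offsetTimestampsBy(offset);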

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp	2017-01-12 05:22:32 UTC (rev 210621)
@@ -100,9 +100,6 @@
     // Always update the enabled state regardless of the track being ended.
     m_isEnabled = enabled;
 
-    if (m_preview)
-        m_preview->setEnabled(enabled);
-
     for (auto& observer : m_observers)
         observer->trackEnabledChanged(*this);
 }
@@ -117,7 +114,6 @@
     // trackEnded method once.
     m_isEnded = true;
 
-    m_preview = nullptr;
     m_source->requestStop(this);
 
     for (auto& observer : m_observers)
@@ -163,15 +159,6 @@
     }
 }
 
-RealtimeMediaSourcePreview* MediaStreamTrackPrivate::preview()
-{
-    if (m_preview)
-        return m_preview.get();
-
-    m_preview = m_source->preview();
-    return m_preview.get();
-}
-
 void MediaStreamTrackPrivate::applyConstraints(const MediaConstraints& constraints, RealtimeMediaSource::SuccessHandler successHandler, RealtimeMediaSource::FailureHandler failureHandler)
 {
     m_source->applyConstraints(constraints, successHandler, failureHandler);

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -91,7 +91,6 @@
     AudioSourceProvider* audioSourceProvider();
 
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
-    RealtimeMediaSourcePreview* preview();
 
 private:
     MediaStreamTrackPrivate(Ref<RealtimeMediaSource>&&, String&& id);
@@ -105,7 +104,6 @@
 
     Vector<Observer*> m_observers;
     Ref<RealtimeMediaSource> m_source;
-    RefPtr<RealtimeMediaSourcePreview> m_preview;
 
     String m_id;
     bool m_isEnabled;

Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -42,7 +42,6 @@
 #include "MediaSample.h"
 #include "PlatformLayer.h"
 #include "RealtimeMediaSourceCapabilities.h"
-#include "RealtimeMediaSourcePreview.h"
 #include <wtf/RefCounted.h>
 #include <wtf/Vector.h>
 #include <wtf/WeakPtr.h>
@@ -129,7 +128,6 @@
 
     virtual RefPtr<Image> currentFrameImage() { return nullptr; }
     virtual void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) { }
-    virtual RefPtr<RealtimeMediaSourcePreview> preview() { return nullptr; }
 
     void setWidth(int);
     void setHeight(int);

Deleted: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSourcePreview.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSourcePreview.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSourcePreview.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,69 +0,0 @@
-/*
- * Copyright (C) 2016 Apple Inc. All rights reserved.
- *
- * Redistribution and use in source and binary forms, with or without
- * modification, are permitted provided that the following conditions
- * are met:
- *
- * 1. Redistributions of source code must retain the above copyright
- *    notice, this list of conditions and the following disclaimer.
- * 2. Redistributions in binary form must reproduce the above copyright
- *    notice, this list of conditions and the following disclaimer
- *    in the documentation and/or other materials provided with the
- *    distribution.
- * 3. Neither the name of Ericsson nor the names of its contributors
- *    may be used to endorse or promote products derived from this
- *    software without specific prior written permission.
- *
- * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
- * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
- * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
- * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
- * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
- * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
- * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
- * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
- * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
- * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
- * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
- */
-
-#pragma once
-
-#if ENABLE(MEDIA_STREAM)
-
-#include "PlatformLayer.h"
-#include <wtf/RetainPtr.h>
-#include <wtf/WeakPtr.h>
-
-namespace WebCore {
-
-class RealtimeMediaSourcePreview : public RefCounted<RealtimeMediaSourcePreview> {
-public:
-    virtual ~RealtimeMediaSourcePreview() { }
-
-    virtual void play() const = 0;
-    virtual void pause() const = 0;
-    virtual void setEnabled(bool) = 0;
-
-    virtual PlatformLayer* platformLayer() const = 0;
-    virtual void setVolume(double) const = 0;
-
-    virtual void invalidate() { m_weakPtrFactory.revokeAll(); }
-
-    WeakPtr<RealtimeMediaSourcePreview> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }
-
-protected:
-    RealtimeMediaSourcePreview()
-        : m_weakPtrFactory(this)
-    {
-    }
-
-private:
-    WeakPtrFactory<RealtimeMediaSourcePreview> m_weakPtrFactory;
-};
-
-} // namespace WebCore
-
-#endif // ENABLE(MEDIA_STREAM)
-

Modified: trunk/Source/WebCore/platform/mediastream/VideoTrackPrivateMediaStream.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/VideoTrackPrivateMediaStream.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/VideoTrackPrivateMediaStream.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -40,28 +40,33 @@
         return adoptRef(*new VideoTrackPrivateMediaStream(streamTrack));
     }
 
-    Kind kind() const override { return Kind::Main; }
-    AtomicString id() const override { return m_id; }
-    AtomicString label() const override { return m_label; }
-    AtomicString language() const override { return emptyAtom; }
-    int trackIndex() const override { return m_index; }
-
     void setTrackIndex(int index) { m_index = index; }
 
     MediaStreamTrackPrivate& streamTrack() { return m_streamTrack.get(); }
 
+    MediaTime timelineOffset() const { return m_timelineOffset; }
+    void setTimelineOffset(const MediaTime& offset) { m_timelineOffset = offset; }
+
 private:
     VideoTrackPrivateMediaStream(MediaStreamTrackPrivate& track)
         : m_streamTrack(track)
         , m_id(track.id())
         , m_label(track.label())
+        , m_timelineOffset(MediaTime::invalidTime())
     {
     }
 
+    Kind kind() const final { return Kind::Main; }
+    AtomicString id() const final { return m_id; }
+    AtomicString label() const final { return m_label; }
+    AtomicString language() const final { return emptyAtom; }
+    int trackIndex() const final { return m_index; }
+
     Ref<MediaStreamTrackPrivate> m_streamTrack;
     AtomicString m_id;
     AtomicString m_label;
     int m_index { 0 };
+    MediaTime m_timelineOffset;
 };
 
 }

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -67,7 +67,6 @@
     void shutdownCaptureSession() override;
     void updateSettings(RealtimeMediaSourceSettings&) override;
     AudioSourceProvider* audioSourceProvider() override;
-    RefPtr<AVMediaSourcePreview> createPreview() final;
 
     RetainPtr<AVCaptureConnection> m_audioConnection;
 

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -30,7 +30,7 @@
 
 #import "Logging.h"
 #import "MediaConstraints.h"
-#import "NotImplemented.h"
+#import "MediaSampleAVFObjC.h"
 #import "RealtimeMediaSourceSettings.h"
 #import "SoftLinking.h"
 #import "WebAudioSourceProviderAVFObjC.h"
@@ -49,22 +49,15 @@
 typedef AVCaptureDeviceInput AVCaptureDeviceInputType;
 typedef AVCaptureOutput AVCaptureOutputType;
 
-#if !PLATFORM(IOS)
-typedef AVCaptureAudioPreviewOutput AVCaptureAudioPreviewOutputType;
-#endif
-
 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
 
 SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioChannel)
 SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioDataOutput)
-SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioPreviewOutput)
 SOFT_LINK_CLASS(AVFoundation, AVCaptureConnection)
 SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice)
 SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceInput)
 SOFT_LINK_CLASS(AVFoundation, AVCaptureOutput)
 
-#define AVCaptureAudioPreviewOutput getAVCaptureAudioPreviewOutputClass()
-
 #define AVCaptureAudioChannel getAVCaptureAudioChannelClass()
 #define AVCaptureAudioDataOutput getAVCaptureAudioDataOutputClass()
 #define AVCaptureConnection getAVCaptureConnectionClass()
@@ -80,79 +73,6 @@
 
 namespace WebCore {
 
-#if !PLATFORM(IOS)
-class AVAudioSourcePreview: public AVMediaSourcePreview {
-public:
-    static RefPtr<AVMediaSourcePreview> create(AVCaptureSession *, AVAudioCaptureSource*);
-
-private:
-    AVAudioSourcePreview(AVCaptureSession *, AVAudioCaptureSource*);
-
-    void invalidate() final;
-
-    void play() const final;
-    void pause() const final;
-    void setVolume(double) const final;
-    void setEnabled(bool) final;
-    PlatformLayer* platformLayer() const final { return nullptr; }
-
-    void updateState() const;
-
-    RetainPtr<AVCaptureAudioPreviewOutputType> m_audioPreviewOutput;
-    mutable double m_volume { 1 };
-    mutable bool m_paused { false };
-    mutable bool m_enabled { true };
-};
-
-RefPtr<AVMediaSourcePreview> AVAudioSourcePreview::create(AVCaptureSession *session, AVAudioCaptureSource* parent)
-{
-    return adoptRef(new AVAudioSourcePreview(session, parent));
-}
-
-AVAudioSourcePreview::AVAudioSourcePreview(AVCaptureSession *session, AVAudioCaptureSource* parent)
-    : AVMediaSourcePreview(parent)
-{
-    m_audioPreviewOutput = adoptNS([allocAVCaptureAudioPreviewOutputInstance() init]);
-    setVolume(1);
-    [session addOutput:m_audioPreviewOutput.get()];
-}
-
-void AVAudioSourcePreview::invalidate()
-{
-    m_audioPreviewOutput = nullptr;
-    AVMediaSourcePreview::invalidate();
-}
-
-void AVAudioSourcePreview::play() const
-{
-    m_paused = false;
-    updateState();
-}
-
-void AVAudioSourcePreview::pause() const
-{
-    m_paused = true;
-    updateState();
-}
-
-void AVAudioSourcePreview::setEnabled(bool enabled)
-{
-    m_enabled = enabled;
-    updateState();
-}
-
-void AVAudioSourcePreview::setVolume(double volume) const
-{
-    m_volume = volume;
-    m_audioPreviewOutput.get().volume = volume;
-}
-
-void AVAudioSourcePreview::updateState() const
-{
-    m_audioPreviewOutput.get().volume = (!m_enabled || m_paused) ? 0 : m_volume;
-}
-#endif
-
 RefPtr<AVMediaCaptureSource> AVAudioCaptureSource::create(AVCaptureDeviceTypedef* device, const AtomicString& id, const MediaConstraints* constraints, String& invalidConstraint)
 {
     auto source = adoptRef(new AVAudioCaptureSource(device, id));
@@ -190,7 +110,7 @@
 
 void AVAudioCaptureSource::updateSettings(RealtimeMediaSourceSettings& settings)
 {
-    // FIXME: use [AVCaptureAudioPreviewOutput volume] for volume
+    // FIXME: support volume
 
     settings.setDeviceId(id());
 }
@@ -276,6 +196,11 @@
     if (!formatDescription)
         return;
 
+    RetainPtr<CMSampleBufferRef> buffer = sampleBuffer;
+    scheduleDeferredTask([this, buffer] {
+        mediaDataUpdated(MediaSampleAVFObjC::create(buffer.get()));
+    });
+
     std::unique_lock<Lock> lock(m_lock, std::try_to_lock);
     if (!lock.owns_lock()) {
         // Failed to acquire the lock, just return instead of blocking.
@@ -304,15 +229,6 @@
     return m_audioSourceProvider.get();
 }
 
-RefPtr<AVMediaSourcePreview> AVAudioCaptureSource::createPreview()
-{
-#if !PLATFORM(IOS)
-    return AVAudioSourcePreview::create(session(), this);
-#else
-    return nullptr;
-#endif
-}
-    
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_STREAM)
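
Instead of routing audio through a preview output, the capture callback now hands every CMSampleBuffer to the generic media-data path: the buffer is retained on the capture thread and wrapped in a MediaSampleAVFObjC on the main thread, which is what eventually reaches sampleBufferUpdated() in the player above. The same hand-off in isolation, with comments on why each step exists (scheduleDeferredTask() and mediaDataUpdated() are the existing RealtimeMediaSource hooks used in the diff):

    // Capture-queue side: RetainPtr keeps the CMSampleBufferRef alive across the
    // asynchronous hop to the main thread.
    RetainPtr<CMSampleBufferRef> buffer = sampleBuffer;
    scheduleDeferredTask([this, buffer] {
        // Main thread: wrap the buffer and notify the track's observers.
        mediaDataUpdated(MediaSampleAVFObjC::create(buffer.get()));
    });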

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -47,19 +47,6 @@
 
 class AVMediaCaptureSource;
 
-class AVMediaSourcePreview: public RealtimeMediaSourcePreview {
-public:
-    virtual ~AVMediaSourcePreview();
-
-    void invalidate() override;
-
-protected:
-    AVMediaSourcePreview(AVMediaCaptureSource*);
-
-private:
-    WeakPtr<AVMediaCaptureSource> m_parent;
-};
-
 class AVMediaCaptureSource : public RealtimeMediaSource {
 public:
     virtual ~AVMediaCaptureSource();
@@ -76,10 +63,6 @@
     void stopProducingData() final;
     bool isProducingData() const final { return m_isRunning; }
 
-    RefPtr<RealtimeMediaSourcePreview> preview() final;
-    void removePreview(AVMediaSourcePreview*);
-    WeakPtr<AVMediaCaptureSource> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }
-
 protected:
     AVMediaCaptureSource(AVCaptureDevice*, const AtomicString&, RealtimeMediaSource::Type);
 
@@ -99,8 +82,6 @@
     void setVideoSampleBufferDelegate(AVCaptureVideoDataOutput*);
     void setAudioSampleBufferDelegate(AVCaptureAudioDataOutput*);
 
-    virtual RefPtr<AVMediaSourcePreview> createPreview() = 0;
-
 private:
     void setupSession();
     void reset() final;
@@ -117,8 +98,6 @@
     RefPtr<RealtimeMediaSourceCapabilities> m_capabilities;
     RetainPtr<AVCaptureSession> m_session;
     RetainPtr<AVCaptureDevice> m_device;
-    Vector<WeakPtr<RealtimeMediaSourcePreview>> m_previews;
-    WeakPtrFactory<AVMediaCaptureSource> m_weakPtrFactory;
     bool m_isRunning { false};
 };
 

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm	2017-01-12 05:22:32 UTC (rev 210621)
@@ -129,7 +129,6 @@
     : RealtimeMediaSource(id, type, emptyString())
     , m_objcObserver(adoptNS([[WebCoreAVMediaCaptureSourceObserver alloc] initWithCallback:this]))
     , m_device(device)
-    , m_weakPtrFactory(this)
 {
     setName(device.localizedName);
     setPersistentID(device.uniqueID);
@@ -240,12 +239,6 @@
     for (NSString *keyName in sessionKVOProperties())
         [m_session removeObserver:m_objcObserver.get() forKeyPath:keyName];
 
-    for (const auto& preview : m_previews) {
-        if (preview)
-            preview->invalidate();
-    }
-    m_previews.clear();
-
     shutdownCaptureSession();
     m_session = nullptr;
 }
@@ -277,45 +270,6 @@
     return nullptr;
 }
 
-RefPtr<RealtimeMediaSourcePreview> AVMediaCaptureSource::preview()
-{
-    RefPtr<AVMediaSourcePreview> preview = createPreview();
-    if (!preview)
-        return nullptr;
-
-    m_previews.append(preview->createWeakPtr());
-    return preview.leakRef();
-}
-
-void AVMediaCaptureSource::removePreview(AVMediaSourcePreview* preview)
-{
-    size_t index;
-    for (index = 0; index < m_previews.size(); ++index) {
-        if (m_previews[index].get() == preview)
-            break;
-    }
-
-    if (index < m_previews.size())
-        m_previews.remove(index);
-}
-
-AVMediaSourcePreview::AVMediaSourcePreview(AVMediaCaptureSource* parent)
-    : m_parent(parent->createWeakPtr())
-{
-}
-
-AVMediaSourcePreview::~AVMediaSourcePreview()
-{
-    if (m_parent)
-        m_parent->removePreview(this);
-}
-
-void AVMediaSourcePreview::invalidate()
-{
-    m_parent = nullptr;
-    RealtimeMediaSourcePreview::invalidate();
-}
-
 NSArray* sessionKVOProperties()
 {
     static NSArray* keys = [@[@"running"] retain];

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h	2017-01-12 05:22:32 UTC (rev 210621)
@@ -79,7 +79,6 @@
 
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
 
-    RefPtr<AVMediaSourcePreview> createPreview() final;
     RetainPtr<CGImageRef> currentFrameCGImage();
     RefPtr<Image> currentFrameImage() final;
 

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm	2017-01-12 05:22:32 UTC (rev 210621)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -38,7 +38,6 @@
 #import "NotImplemented.h"
 #import "PlatformLayer.h"
 #import "RealtimeMediaSourceCenter.h"
-#import "RealtimeMediaSourcePreview.h"
 #import "RealtimeMediaSourceSettings.h"
 #import "WebActionDisablingCALayerDelegate.h"
 #import <AVFoundation/AVCaptureDevice.h>
@@ -101,110 +100,8 @@
 
 using namespace WebCore;
 
-@interface WebCoreAVVideoCaptureSourceObserver : NSObject<CALayerDelegate> {
-    AVVideoSourcePreview *_parent;
-    BOOL _hasObserver;
-}
-
-- (void)setParent:(AVVideoSourcePreview *)parent;
-- (void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context;
-@end
-
 namespace WebCore {
 
-class AVVideoSourcePreview: public AVMediaSourcePreview {
-public:
-    static RefPtr<AVMediaSourcePreview> create(AVCaptureSession*, AVCaptureDeviceTypedef*, AVVideoCaptureSource*);
-
-    void backgroundLayerBoundsChanged();
-    PlatformLayer* platformLayer() const final { return m_previewBackgroundLayer.get(); }
-
-private:
-    AVVideoSourcePreview(AVCaptureSession*, AVCaptureDeviceTypedef*, AVVideoCaptureSource*);
-
-    void invalidate() final;
-
-    void play() const final;
-    void pause() const final;
-    void setVolume(double) const final { };
-    void setEnabled(bool) final;
-    void setPaused(bool) const;
-
-    RetainPtr<AVCaptureVideoPreviewLayerType> m_previewLayer;
-    RetainPtr<PlatformLayer> m_previewBackgroundLayer;
-    RetainPtr<AVCaptureDeviceTypedef> m_device;
-    RetainPtr<WebCoreAVVideoCaptureSourceObserver> m_objcObserver;
-};
-
-RefPtr<AVMediaSourcePreview> AVVideoSourcePreview::create(AVCaptureSession *session, AVCaptureDeviceTypedef* device, AVVideoCaptureSource* parent)
-{
-    return adoptRef(new AVVideoSourcePreview(session, device, parent));
-}
-
-AVVideoSourcePreview::AVVideoSourcePreview(AVCaptureSession *session, AVCaptureDeviceTypedef* device, AVVideoCaptureSource* parent)
-    : AVMediaSourcePreview(parent)
-    , m_objcObserver(adoptNS([[WebCoreAVVideoCaptureSourceObserver alloc] init]))
-{
-    m_device = device;
-    m_previewLayer = adoptNS([allocAVCaptureVideoPreviewLayerInstance() initWithSession:session]);
-    m_previewLayer.get().contentsGravity = kCAGravityResize;
-    m_previewLayer.get().anchorPoint = CGPointZero;
-    [m_previewLayer.get() setDelegate:[WebActionDisablingCALayerDelegate shared]];
-
-    m_previewBackgroundLayer = adoptNS([[CALayer alloc] init]);
-    m_previewBackgroundLayer.get().contentsGravity = kCAGravityResizeAspect;
-    m_previewBackgroundLayer.get().anchorPoint = CGPointZero;
-    m_previewBackgroundLayer.get().needsDisplayOnBoundsChange = YES;
-    [m_previewBackgroundLayer.get() setDelegate:[WebActionDisablingCALayerDelegate shared]];
-
-#ifndef NDEBUG
-    m_previewLayer.get().name = @"AVVideoCaptureSource preview layer";
-    m_previewBackgroundLayer.get().name = @"AVVideoSourcePreview parent layer";
-#endif
-
-    [m_previewBackgroundLayer addSublayer:m_previewLayer.get()];
-
-    [m_objcObserver.get() setParent:this];
-}
-
-void AVVideoSourcePreview::backgroundLayerBoundsChanged()
-{
-    if (m_previewBackgroundLayer && m_previewLayer)
-        [m_previewLayer.get() setBounds:m_previewBackgroundLayer.get().bounds];
-}
-
-void AVVideoSourcePreview::invalidate()
-{
-    [m_objcObserver.get() setParent:nil];
-    m_objcObserver = nullptr;
-    m_previewLayer = nullptr;
-    m_previewBackgroundLayer = nullptr;
-    m_device = nullptr;
-    AVMediaSourcePreview::invalidate();
-}
-
-void AVVideoSourcePreview::play() const
-{
-    setPaused(false);
-}
-
-void AVVideoSourcePreview::pause() const
-{
-    setPaused(true);
-}
-
-void AVVideoSourcePreview::setPaused(bool paused) const
-{
-    [m_device lockForConfiguration:nil];
-    m_previewLayer.get().connection.enabled = !paused;
-    [m_device unlockForConfiguration];
-}
-
-void AVVideoSourcePreview::setEnabled(bool enabled)
-{
-    m_previewLayer.get().hidden = !enabled;
-}
-
 const OSType videoCaptureFormat = kCVPixelFormatType_32BGRA;
 
 RefPtr<AVMediaCaptureSource> AVVideoCaptureSource::create(AVCaptureDeviceTypedef* device, const AtomicString& id, const MediaConstraints* constraints, String& invalidConstraint)
@@ -512,20 +409,7 @@
         return;
 
     updateFramerate(sampleBuffer.get());
-
-    CMSampleBufferRef newSampleBuffer = 0;
-    CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer.get(), &newSampleBuffer);
-    ASSERT(newSampleBuffer);
-
-    CFArrayRef attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(newSampleBuffer, true);
-    if (attachmentsArray) {
-        for (CFIndex i = 0; i < CFArrayGetCount(attachmentsArray); ++i) {
-            CFMutableDictionaryRef attachments = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachmentsArray, i);
-            CFDictionarySetValue(attachments, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
-        }
-    }
-
-    m_buffer = adoptCF(newSampleBuffer);
+    m_buffer = sampleBuffer;
     m_lastImage = nullptr;
 
     bool settingsChanged = false;
@@ -605,11 +489,6 @@
     CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), m_lastImage.get());
 }
 
-RefPtr<AVMediaSourcePreview> AVVideoCaptureSource::createPreview()
-{
-    return AVVideoSourcePreview::create(session(), device(), this);
-}
-
 NSString* AVVideoCaptureSource::bestSessionPresetForVideoDimensions(std::optional<int> width, std::optional<int> height) const
 {
     if (!width && !height)
@@ -656,46 +535,4 @@
 
 } // namespace WebCore
 
-@implementation WebCoreAVVideoCaptureSourceObserver
-
-static NSString * const KeyValueBoundsChangeKey = @"bounds";
-
-- (void)setParent:(AVVideoSourcePreview *)parent
-{
-    if (_parent && _hasObserver && _parent->platformLayer()) {
-        _hasObserver = false;
-        [_parent->platformLayer() removeObserver:self forKeyPath:KeyValueBoundsChangeKey];
-    }
-
-    _parent = parent;
-
-    if (_parent && _parent->platformLayer()) {
-        _hasObserver = true;
-        [_parent->platformLayer() addObserver:self forKeyPath:KeyValueBoundsChangeKey options:0 context:nullptr];
-    }
-}
-
-- (void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
-{
-    UNUSED_PARAM(context);
-
-    if (!_parent)
-        return;
-
-    if ([[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue])
-        return;
-
-#if PLATFORM(IOS)
-    WebThreadRun(^ {
-        if ([keyPath isEqual:KeyValueBoundsChangeKey] && object == _parent->platformLayer())
-            _parent->backgroundLayerBoundsChanged();
-    });
-#else
-    if ([keyPath isEqual:KeyValueBoundsChangeKey] && object == _parent->platformLayer())
-        _parent->backgroundLayerBoundsChanged();
-#endif
-}
-
-@end
-
 #endif // ENABLE(MEDIA_STREAM)

Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm (210620 => 210621)


--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm	2017-01-12 05:22:32 UTC (rev 210621)
@@ -48,6 +48,8 @@
 
 namespace WebCore {
 
+static const int videoSampleRate = 90000;
+
 RefPtr<MockRealtimeVideoSource> MockRealtimeVideoSource::create(const String& name, const MediaConstraints* constraints)
 {
     auto source = adoptRef(new MockRealtimeVideoSourceMac(name));
@@ -74,12 +76,9 @@
     if (!pixelBuffer)
         return nullptr;
 
-    CMSampleTimingInfo timingInfo;
+    CMTime sampleTime = CMTimeMake((elapsedTime() + .1) * videoSampleRate, videoSampleRate);
+    CMSampleTimingInfo timingInfo = { kCMTimeInvalid, sampleTime, sampleTime };
 
-    timingInfo.presentationTimeStamp = CMTimeMake(elapsedTime() * 1000, 1000);
-    timingInfo.decodeTimeStamp = kCMTimeInvalid;
-    timingInfo.duration = kCMTimeInvalid;
-
     CMVideoFormatDescriptionRef formatDescription = nullptr;
     OSStatus status = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, (CVImageBufferRef)pixelBuffer, &formatDescription);
     if (status != noErr) {
@@ -100,6 +99,8 @@
 
 RetainPtr<CVPixelBufferRef> MockRealtimeVideoSourceMac::pixelBufferFromCGImage(CGImageRef image) const
 {
+    static CGColorSpaceRef deviceRGBColorSpace = CGColorSpaceCreateDeviceRGB();
+
     CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
     CFDictionaryRef options = (__bridge CFDictionaryRef) @{
         (__bridge NSString *)kCVPixelBufferCGImageCompatibilityKey: @(NO),
@@ -112,8 +113,7 @@
 
     CVPixelBufferLockBaseAddress(pixelBuffer, 0);
     void* data = ""
-    auto rgbColorSpace = adoptCF(CGColorSpaceCreateDeviceRGB());
-    auto context = adoptCF(CGBitmapContextCreate(data, frameSize.width, frameSize.height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace.get(), (CGBitmapInfo) kCGImageAlphaNoneSkipFirst));
+    auto context = adoptCF(CGBitmapContextCreate(data, frameSize.width, frameSize.height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), deviceRGBColorSpace, (CGBitmapInfo) kCGImageAlphaNoneSkipFirst));
     CGContextDrawImage(context.get(), CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
     CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
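
The mock video source now stamps its frames on a 90 kHz timescale (the conventional timescale for video timestamps) with no explicit duration, and reuses a single device-RGB color space rather than creating one per frame. A small sketch of building the timing info that way; the elapsed-seconds value is hypothetical and only for illustration:

    static const int32_t videoTimescale = 90000;
    Float64 elapsedSeconds = 0.1;
    CMTime sampleTime = CMTimeMakeWithSeconds(elapsedSeconds, videoTimescale);
    // Fields are { duration, presentationTimeStamp, decodeTimeStamp }.
    CMSampleTimingInfo timingInfo = { kCMTimeInvalid, sampleTime, sampleTime };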
 

Modified: trunk/Source/WebKit2/WebProcess/com.apple.WebProcess.sb.in (210620 => 210621)


--- trunk/Source/WebKit2/WebProcess/com.apple.WebProcess.sb.in	2017-01-12 05:15:48 UTC (rev 210620)
+++ trunk/Source/WebKit2/WebProcess/com.apple.WebProcess.sb.in	2017-01-12 05:22:32 UTC (rev 210621)
@@ -448,3 +448,8 @@
         (iokit-user-client-class "IOUSBDeviceUserClientV2")
         (iokit-user-client-class "IOUSBInterfaceUserClientV2"))
     (allow device-camera))
+
+;; @@@@@
+(allow device-microphone)
+;; @@@@@
+