Revision: 289049
Author: [email protected]
Date: 2022-02-03 06:07:43 -0800 (Thu, 03 Feb 2022)

Log Message
RealtimeIncomingVideoSourceCocoa should not need to create IOSurfaces
https://bugs.webkit.org/show_bug.cgi?id=235952
Reviewed by Eric Carlson.
Source/WebCore:
Test: webrtc/vp8-then-h264.html
* platform/mediastream/mac/RealtimeIncomingVideoSourceCocoa.mm:
Instead of using an IOSurface-based buffer pool, we use a regular in-memory buffer pool.
A follow-up should probably remove the need to convert YUV420 webrtc video frames to NV12 CVPixelBuffers in RealtimeIncomingVideoSourceCocoa,
and instead defer that conversion until the webrtc video frames are copied into the shared memory buffers used for IPC.
LayoutTests:
* webrtc/vp8-then-h264-expected.txt: Added.
* webrtc/vp8-then-h264.html: Added.
Modified Paths

- trunk/LayoutTests/ChangeLog
- trunk/Source/WebCore/ChangeLog
- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSourceCocoa.mm

Added Paths

- trunk/LayoutTests/webrtc/vp8-then-h264-expected.txt
- trunk/LayoutTests/webrtc/vp8-then-h264.html

Diff
Modified: trunk/LayoutTests/ChangeLog (289048 => 289049)
--- trunk/LayoutTests/ChangeLog 2022-02-03 14:02:26 UTC (rev 289048)
+++ trunk/LayoutTests/ChangeLog 2022-02-03 14:07:43 UTC (rev 289049)
@@ -1,3 +1,13 @@
+2022-02-03 Youenn Fablet <[email protected]>
+
+ RealtimeIncomingVideoSourceCocoa should not need to create IOSurfaces
+ https://bugs.webkit.org/show_bug.cgi?id=235952
+
+ Reviewed by Eric Carlson.
+
+ * webrtc/vp8-then-h264-expected.txt: Added.
+ * webrtc/vp8-then-h264.html: Added.
+
2022-02-03 Martin Robinson <[email protected]>
Transform interpolation should blend between shared transform function primitives
Added: trunk/LayoutTests/webrtc/vp8-then-h264-expected.txt (0 => 289049)
--- trunk/LayoutTests/webrtc/vp8-then-h264-expected.txt (rev 0)
+++ trunk/LayoutTests/webrtc/vp8-then-h264-expected.txt 2022-02-03 14:07:43 UTC (rev 289049)
@@ -0,0 +1,7 @@
+Video should be running, go to black and running.
+Following, should be a snapshot of the video, a black frame and a snapshot of the video.
+
+
+PASS Setting video exchange with VP8
+PASS Setting exchange with H264 using VP8 decoded stream as input
+
Added: trunk/LayoutTests/webrtc/vp8-then-h264.html (0 => 289049)
--- trunk/LayoutTests/webrtc/vp8-then-h264.html (rev 0)
+++ trunk/LayoutTests/webrtc/vp8-then-h264.html 2022-02-03 14:07:43 UTC (rev 289049)
@@ -0,0 +1,62 @@
+<!doctype html>
+<html>
+ <head>
+ <meta charset="utf-8">
+ <title>Testing muting video</title>
+ <script src=""></script>
+ <script src=""></script>
+ </head>
+ <body>
+ <div>Video should be running, go to black and running.</div>
+ <div>Following, should be a snapshot of the video, a black frame and a snapshot of the video.</div>
+ <video id="video1" autoplay playsInline width="320" height="240"></video>
+ <video id="video2" autoplay playsInline width="320" height="240"></video>
+ <canvas id="canvas1" width="320" height="240"></canvas>
+ <canvas id="canvas2" width="320" height="240"></canvas>
+ <canvas id="canvas3" width="320" height="240"></canvas>
+ <script src=""></script>
+ <script>
+var track;
+var remoteTrack;
+var receivingConnection;
+promise_test((test) => {
+ return navigator.mediaDevices.getUserMedia({video: {width: 320, height: 240 }}).then((localStream) => {
+ return new Promise((resolve, reject) => {
+ track = localStream.getVideoTracks()[0];
+
+ createConnections((firstConnection) => {
+ firstConnection.addTrack(track, localStream);
+ firstConnection.getTransceivers()[0].setCodecPreferences([{mimeType: "video/VP8", clockRate: 90000}]);
+ }, (secondConnection) => {
+ receivingConnection = secondConnection;
+ secondConnection.ontrack = (trackEvent) => {
+ remoteTrack = trackEvent.track;
+ resolve(trackEvent.streams[0]);
+ };
+ });
+ setTimeout(() => reject("Test timed out"), 5000);
+ });
+ }).then((remoteStream) => {
+ video1.srcObject = remoteStream;
+ return video1.play();
+ });
+}, "Setting video exchange with VP8");
+
+promise_test(async () => {
+ video2.srcObject = await new Promise((resolve, reject) => {
+ createConnections((firstConnection) => {
+ firstConnection.addTrack(video1.srcObject.getVideoTracks()[0], video1.srcObject);
+ }, (secondConnection) => {
+ receivingConnection = secondConnection;
+ secondConnection.ontrack = (trackEvent) => {
+ remoteTrack = trackEvent.track;
+ resolve(trackEvent.streams[0]);
+ };
+ });
+ setTimeout(() => reject("Test timed out"), 5000);
+ });
+ await video2.play();
+}, "Setting exchange with H264 using VP8 decoded stream as input");
+ </script>
+ </body>
+</html>
Modified: trunk/Source/WebCore/ChangeLog (289048 => 289049)
--- trunk/Source/WebCore/ChangeLog 2022-02-03 14:02:26 UTC (rev 289048)
+++ trunk/Source/WebCore/ChangeLog 2022-02-03 14:07:43 UTC (rev 289049)
@@ -1,3 +1,17 @@
+2022-02-03 Youenn Fablet <[email protected]>
+
+ RealtimeIncomingVideoSourceCocoa should not need to create IOSurfaces
+ https://bugs.webkit.org/show_bug.cgi?id=235952
+
+ Reviewed by Eric Carlson.
+
+ Test: webrtc/vp8-then-h264.html
+
+ * platform/mediastream/mac/RealtimeIncomingVideoSourceCocoa.mm:
+ Instead of using an IOSurface-based buffer pool, we use a regular memory buffer pool.
+ A follow-up should probably remove the need to convert YUV420 webrtc video frames to NV12 CVPixelBuffers in RealtimeIncomingVideoSourceCocoa,
+ so as to leave that to when copying the webrtc video frames into shared memory buffers used for IPC.
+
2022-02-03 Antoine Quint <[email protected]>
Incorrect KeyframesEffect generated for background
Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSourceCocoa.mm (289048 => 289049)
--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSourceCocoa.mm 2022-02-03 14:02:26 UTC (rev 289048)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSourceCocoa.mm 2022-02-03 14:07:43 UTC (rev 289049)
@@ -99,13 +99,21 @@
break;
case webrtc::BufferType::I010:
poolBufferType = kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange;
+ break;
+ default:
+ return nullptr;
}
- if (auto pool = createIOSurfaceCVPixelBufferPool(width, height, poolBufferType)) {
- m_pixelBufferPool = WTFMove(*pool);
- m_pixelBufferPoolWidth = width;
- m_pixelBufferPoolHeight = height;
- m_pixelBufferPoolBufferType = bufferType;
+
+ auto result = createInMemoryCVPixelBufferPool(width, height, poolBufferType);
+ if (!result) {
+ RELEASE_LOG_ERROR(WebRTC, "RealtimeIncomingVideoSourceCocoa failed creating buffer pool with error %d", result.error());
+ return nullptr;
}
+
+ m_pixelBufferPool = WTFMove(*result);
+ m_pixelBufferPoolWidth = width;
+ m_pixelBufferPoolHeight = height;
+ m_pixelBufferPoolBufferType = bufferType;
}
return m_pixelBufferPool.get();
}
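The createInMemoryCVPixelBufferPool() helper called by the new code is not part of this diff. As a minimal sketch of the idea (an assumption about its shape, not the actual WebKit implementation), such a pool can be obtained from CVPixelBufferPoolCreate() by simply not requesting IOSurface backing:

    // Objective-C++ sketch of a memory-backed pixel buffer pool; the function name
    // and exact attribute set are illustrative assumptions.
    #import <Foundation/Foundation.h>
    #import <CoreVideo/CoreVideo.h>

    static CVPixelBufferPoolRef createMemoryBackedPixelBufferPool(size_t width, size_t height, OSType pixelFormat)
    {
        NSDictionary *bufferAttributes = @{
            (__bridge NSString *)kCVPixelBufferWidthKey: @(width),
            (__bridge NSString *)kCVPixelBufferHeightKey: @(height),
            // Omitting kCVPixelBufferIOSurfacePropertiesKey keeps the vended buffers in
            // regular memory rather than backing each one with an IOSurface.
            (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: @(pixelFormat)
        };

        CVPixelBufferPoolRef pool = nullptr;
        CVReturn status = CVPixelBufferPoolCreate(kCFAllocatorDefault, nullptr, (__bridge CFDictionaryRef)bufferAttributes, &pool);
        return status == kCVReturnSuccess ? pool : nullptr;
    }

The hunk above suggests the real helper returns an expected-style result, whose error code feeds the new RELEASE_LOG_ERROR() call when pool creation fails.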
@@ -121,6 +129,8 @@
return m_blackFrame.get();
}
+ // In case of in memory samples, we have non interleaved YUV data while CVPixelBuffers prefer interleaved YUV data.
+ // Maybe we should introduce a MediaSample that would represent non interleaved YUV data as an optimization.
return adoptCF(webrtc::createPixelBufferFromFrame(frame, [this](size_t width, size_t height, webrtc::BufferType bufferType) -> CVPixelBufferRef {
auto pixelBufferPool = this->pixelBufferPool(width, height, bufferType);
if (!pixelBufferPool)