Title: [287180] trunk
Revision: 287180
Author: [email protected]
Date: 2021-12-17 04:47:56 -0800 (Fri, 17 Dec 2021)

Log Message

[GStreamer][WebRTC] Huge memory leak
https://bugs.webkit.org/show_bug.cgi?id=234134

Patch by Philippe Normand <[email protected]> on 2021-12-17
Reviewed by Xabier Rodriguez-Calvar.

Source/WebCore:

The main issue was RealtimeOutgoingVideoSourceLibWebRTC leaking GstSamples. Fixing this led
me to further clean-ups in the GstSample<->LibWebRTCVideoFrame handling. Native frames
should not consume GstSamples but reference them; otherwise dangling GstSamples end up
being passed to the libwebrtc video decoder, causing crashes.
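For illustration, the ownership change boils down to the accessor switch in
GStreamerVideoFrameLibWebRTC (quoted from the header diff below): the frame buffer now
borrows the sample instead of handing its only reference away.

    // Before: the frame buffer gave up its reference, leaving a dangling sample
    // for any later consumer.
    GRefPtr<GstSample>&& takeSample() { return WTFMove(m_sample); }

    // After: the frame buffer keeps its reference for its whole lifetime and only
    // exposes a borrowed pointer; GRefPtr releases it when the frame buffer is destroyed.
    GstSample* getSample() const { return m_sample.get(); }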

Also, the video decoders were not reporting their initialization status correctly:
WEBRTC_VIDEO_CODEC_OK is 0 and therefore converts to false.
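A minimal sketch of that reporting bug, assuming a bool-returning initialization hook (the
exact WebKit signature is in the GStreamerVideoDecoderFactory.cpp hunk below); the constant
is redefined here only to keep the example self-contained:

    // Stand-in for libwebrtc's status code; the real macro also expands to 0.
    static constexpr int WEBRTC_VIDEO_CODEC_OK = 0;

    bool initializeDecoder()
    {
        // ... set up the GStreamer decode pipeline ...
        return WEBRTC_VIDEO_CODEC_OK; // 0 converts to false: success reported as failure
    }

    // The fix on the success path is simply: return true;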

* platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
(WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
(WebCore::MediaSampleGStreamer::initializeFromBuffer):
* platform/graphics/gstreamer/MediaSampleGStreamer.h:
(WebCore::MediaSampleGStreamer::createWrappedSample):
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp:
(WebCore::GStreamerVideoDecoder::pullSample):
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp:
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp:
(WebCore::convertLibWebRTCVideoFrameToGStreamerSample):
(WebCore::convertGStreamerSampleToLibWebRTCVideoFrame):
(WebCore::GStreamerVideoFrameLibWebRTC::create):
(WebCore::GStreamerSampleFromLibWebRTCVideoFrame): Deleted.
(WebCore::LibWebRTCVideoFrameFromGStreamerSample): Deleted.
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h:
(WebCore::GStreamerVideoFrameLibWebRTC::GStreamerVideoFrameLibWebRTC):
(WebCore::GStreamerVideoFrameLibWebRTC::getSample const):
(WebCore::GStreamerVideoFrameLibWebRTC::takeSample): Deleted.
* platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp:
(WebCore::RealtimeIncomingVideoSourceLibWebRTC::OnFrame):
* platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp:
(WebCore::RealtimeOutgoingVideoSourceLibWebRTC::videoSampleAvailable):

LayoutTests:

* platform/glib/TestExpectations: Update test expectations. Bug #233740 was mostly caused by
the video decoder initialization being wrongly reported as failed.
webrtc/captureCanvas-webrtc-software-h264-baseline.html no longer times out.

Modified Paths

    trunk/LayoutTests/ChangeLog
    trunk/LayoutTests/platform/glib/TestExpectations
    trunk/Source/WebCore/ChangeLog
    trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp
    trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h
    trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp
    trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp
    trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp
    trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h
    trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp
    trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp

Diff

Modified: trunk/LayoutTests/ChangeLog (287179 => 287180)


--- trunk/LayoutTests/ChangeLog	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/LayoutTests/ChangeLog	2021-12-17 12:47:56 UTC (rev 287180)
@@ -1,3 +1,14 @@
+2021-12-17  Philippe Normand  <[email protected]>
+
+        [GStreamer][WebRTC] Huge memory leak
+        https://bugs.webkit.org/show_bug.cgi?id=234134
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        * platform/glib/TestExpectations: Update test expectations. Bug #233740 was mostly caused by
+        the video decoder initialization being wrongly reported as failed.
+        webrtc/captureCanvas-webrtc-software-h264-baseline.html no longer times out.
+
 2021-12-16  Tyler Wilcock  <[email protected]>
 
         AX: Make aria-multiline.html, clipped-text-under-element.html, mixed-checkbox.html, and selection-states.html pass in isolated tree mode

Modified: trunk/LayoutTests/platform/glib/TestExpectations (287179 => 287180)


--- trunk/LayoutTests/platform/glib/TestExpectations	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/LayoutTests/platform/glib/TestExpectations	2021-12-17 12:47:56 UTC (rev 287180)
@@ -1349,8 +1349,7 @@
 
 webkit.org/b/208125 webrtc/peerconnection-new-candidate-page-cache.html [ Pass Crash ]
 
-# Was Slow Failure before libwebrtc bumped to M96 in r285577
-webkit.org/b/216538 webrtc/captureCanvas-webrtc-software-h264-baseline.html [ Timeout ]
+webkit.org/b/216538 webrtc/captureCanvas-webrtc-software-h264-baseline.html [ Slow Failure ]
 
 webkit.org/b/216763 webrtc/captureCanvas-webrtc-software-h264-high.html [ Crash Failure ]
 
@@ -1370,10 +1369,6 @@
 
 webkit.org/b/229055 http/wpt/webrtc/sframe-transform-error.html [ Failure ]
 
-webkit.org/b/233740 http/wpt/webrtc/video-script-transform-keyframe-only.html [ Failure ]
-webkit.org/b/233740 webrtc/vp9-svc.html [ Timeout ]
-webkit.org/b/233740 webrtc/vp9.html [ Timeout ]
-
 webkit.org/b/233879 fast/mediastream/get-display-media-settings.html [ Failure ]
 
 #////////////////////////////////////////////////////////////////////////////////////////

Modified: trunk/Source/WebCore/ChangeLog (287179 => 287180)


--- trunk/Source/WebCore/ChangeLog	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/ChangeLog	2021-12-17 12:47:56 UTC (rev 287180)
@@ -1,3 +1,41 @@
+2021-12-17  Philippe Normand  <[email protected]>
+
+        [GStreamer][WebRTC] Huge memory leak
+        https://bugs.webkit.org/show_bug.cgi?id=234134
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        The main issue was RealtimeOutgoingVideoSourceLibWebRTC leaking GstSamples. Fixing this led
+        me to further clean-ups in the GstSample<->LibWebRTCVideoFrame handling. Native frames
+        should not consume GstSamples but reference them; otherwise dangling GstSamples end up
+        being passed to the libwebrtc video decoder, causing crashes.
+
+        Also, the video decoders were not reporting their initialization status correctly:
+        WEBRTC_VIDEO_CODEC_OK is 0 and therefore converts to false.
+
+        * platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
+        (WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
+        (WebCore::MediaSampleGStreamer::initializeFromBuffer):
+        * platform/graphics/gstreamer/MediaSampleGStreamer.h:
+        (WebCore::MediaSampleGStreamer::createWrappedSample):
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp:
+        (WebCore::GStreamerVideoDecoder::pullSample):
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp:
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp:
+        (WebCore::convertLibWebRTCVideoFrameToGStreamerSample):
+        (WebCore::convertGStreamerSampleToLibWebRTCVideoFrame):
+        (WebCore::GStreamerVideoFrameLibWebRTC::create):
+        (WebCore::GStreamerSampleFromLibWebRTCVideoFrame): Deleted.
+        (WebCore::LibWebRTCVideoFrameFromGStreamerSample): Deleted.
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h:
+        (WebCore::GStreamerVideoFrameLibWebRTC::GStreamerVideoFrameLibWebRTC):
+        (WebCore::GStreamerVideoFrameLibWebRTC::getSample const):
+        (WebCore::GStreamerVideoFrameLibWebRTC::takeSample): Deleted.
+        * platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp:
+        (WebCore::RealtimeIncomingVideoSourceLibWebRTC::OnFrame):
+        * platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp:
+        (WebCore::RealtimeOutgoingVideoSourceLibWebRTC::videoSampleAvailable):
+
 2021-12-17  Antoine Quint  <[email protected]>
 
         ActiveDOMObject::suspendIfNeeded() should not be called within constructors

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp (287179 => 287180)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp	2021-12-17 12:47:56 UTC (rev 287180)
@@ -42,7 +42,6 @@
     , m_videoRotation(videoRotation)
     , m_videoMirrored(videoMirrored)
 {
-    const GstClockTime minimumDuration = 1000; // 1 us
     ASSERT(sample);
     GstBuffer* buffer = gst_sample_get_buffer(sample.get());
     RELEASE_ASSERT(buffer);
@@ -50,35 +49,10 @@
     if (metadata)
         buffer = webkitGstBufferSetVideoSampleMetadata(buffer, WTFMove(metadata));
 
-    if (GST_BUFFER_PTS_IS_VALID(buffer))
-        m_pts = fromGstClockTime(GST_BUFFER_PTS(buffer));
-    if (GST_BUFFER_DTS_IS_VALID(buffer) || GST_BUFFER_PTS_IS_VALID(buffer))
-        m_dts = fromGstClockTime(GST_BUFFER_DTS_OR_PTS(buffer));
-    if (GST_BUFFER_DURATION_IS_VALID(buffer)) {
-        // Sometimes (albeit rarely, so far seen only at the end of a track)
-        // frames have very small durations, so small that may be under the
-        // precision we are working with and be truncated to zero.
-        // SourceBuffer algorithms are not expecting frames with zero-duration,
-        // so let's use something very small instead in those fringe cases.
-        m_duration = fromGstClockTime(std::max(GST_BUFFER_DURATION(buffer), minimumDuration));
-    } else {
-        // Unfortunately, sometimes samples don't provide a duration. This can never happen in MP4 because of the way
-        // the format is laid out, but it's pretty common in WebM.
-        // The good part is that durations don't matter for playback, just for buffered ranges and coded frame deletion.
-        // We want to pick something small enough to not cause unwanted frame deletion, but big enough to never be
-        // mistaken for a rounding artifact.
-        m_duration = fromGstClockTime(16666667); // 1/60 seconds
-    }
-
-    m_size = gst_buffer_get_size(buffer);
     m_sample = adoptGRef(gst_sample_new(buffer, gst_sample_get_caps(sample.get()), nullptr,
         gst_sample_get_info(sample.get()) ? gst_structure_copy(gst_sample_get_info(sample.get())) : nullptr));
 
-    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DELTA_UNIT))
-        m_flags = MediaSample::None;
-
-    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DECODE_ONLY))
-        m_flags = static_cast<MediaSample::SampleFlags>(m_flags | MediaSample::IsNonDisplaying);
+    initializeFromBuffer();
 }
 
 MediaSampleGStreamer::MediaSampleGStreamer(const FloatSize& presentationSize, const AtomString& trackId)
@@ -90,6 +64,13 @@
 {
 }
 
+MediaSampleGStreamer::MediaSampleGStreamer(const GRefPtr<GstSample>& sample, VideoRotation videoRotation)
+    : m_sample(sample)
+    , m_videoRotation(videoRotation)
+{
+    initializeFromBuffer();
+}
+
 Ref<MediaSampleGStreamer> MediaSampleGStreamer::createFakeSample(GstCaps*, MediaTime pts, MediaTime dts, MediaTime duration, const FloatSize& presentationSize, const AtomString& trackId)
 {
     MediaSampleGStreamer* gstreamerMediaSample = new MediaSampleGStreamer(presentationSize, trackId);
@@ -154,6 +135,41 @@
     return create(WTFMove(sample), FloatSize(width, height), { }, videoRotation, videoMirrored);
 }
 
+void MediaSampleGStreamer::initializeFromBuffer()
+{
+    const GstClockTime minimumDuration = 1000; // 1 us
+    auto* buffer = gst_sample_get_buffer(m_sample.get());
+    RELEASE_ASSERT(buffer);
+
+    if (GST_BUFFER_PTS_IS_VALID(buffer))
+        m_pts = fromGstClockTime(GST_BUFFER_PTS(buffer));
+    if (GST_BUFFER_DTS_IS_VALID(buffer) || GST_BUFFER_PTS_IS_VALID(buffer))
+        m_dts = fromGstClockTime(GST_BUFFER_DTS_OR_PTS(buffer));
+    if (GST_BUFFER_DURATION_IS_VALID(buffer)) {
+        // Sometimes (albeit rarely, so far seen only at the end of a track)
+        // frames have very small durations, so small that may be under the
+        // precision we are working with and be truncated to zero.
+        // SourceBuffer algorithms are not expecting frames with zero-duration,
+        // so let's use something very small instead in those fringe cases.
+        m_duration = fromGstClockTime(std::max(GST_BUFFER_DURATION(buffer), minimumDuration));
+    } else {
+        // Unfortunately, sometimes samples don't provide a duration. This can never happen in MP4 because of the way
+        // the format is laid out, but it's pretty common in WebM.
+        // The good part is that durations don't matter for playback, just for buffered ranges and coded frame deletion.
+        // We want to pick something small enough to not cause unwanted frame deletion, but big enough to never be
+        // mistaken for a rounding artifact.
+        m_duration = fromGstClockTime(16666667); // 1/60 seconds
+    }
+
+    m_size = gst_buffer_get_size(buffer);
+
+    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DELTA_UNIT))
+        m_flags = MediaSample::None;
+
+    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DECODE_ONLY))
+        m_flags = static_cast<MediaSample::SampleFlags>(m_flags | MediaSample::IsNonDisplaying);
+}
+
 RefPtr<JSC::Uint8ClampedArray> MediaSampleGStreamer::getRGBAImageData() const
 {
     auto* caps = gst_sample_get_caps(m_sample.get());

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h (287179 => 287180)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h	2021-12-17 12:47:56 UTC (rev 287180)
@@ -40,6 +40,11 @@
         return adoptRef(*new MediaSampleGStreamer(WTFMove(sample), presentationSize, trackId, videoRotation, videoMirrored, WTFMove(metadata)));
     }
 
+    static Ref<MediaSampleGStreamer> createWrappedSample(const GRefPtr<GstSample>& sample, VideoRotation videoRotation = VideoRotation::None)
+    {
+        return adoptRef(*new MediaSampleGStreamer(sample, videoRotation));
+    }
+
     static Ref<MediaSampleGStreamer> createFakeSample(GstCaps*, MediaTime pts, MediaTime dts, MediaTime duration, const FloatSize& presentationSize, const AtomString& trackId);
     static Ref<MediaSampleGStreamer> createImageSample(PixelBuffer&&, const IntSize& destinationSize = { }, double frameRate = 1, VideoRotation videoRotation = VideoRotation::None, bool videoMirrored = false, std::optional<VideoSampleMetadata>&& metadata = std::nullopt);
 
@@ -66,11 +71,14 @@
 
 protected:
     MediaSampleGStreamer(GRefPtr<GstSample>&&, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation = VideoRotation::None, bool videoMirrored = false, std::optional<VideoSampleMetadata>&& = std::nullopt);
+    MediaSampleGStreamer(const GRefPtr<GstSample>&, VideoRotation = VideoRotation::None);
     virtual ~MediaSampleGStreamer() = default;
 
 private:
     MediaSampleGStreamer(const FloatSize& presentationSize, const AtomString& trackId);
 
+    void initializeFromBuffer();
+
     MediaTime m_pts;
     MediaTime m_dts;
     MediaTime m_duration;

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp (287179 => 287180)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp	2021-12-17 12:47:56 UTC (rev 287180)
@@ -150,7 +150,7 @@
             return WEBRTC_VIDEO_CODEC_ERROR;
         }
 
-        return WEBRTC_VIDEO_CODEC_OK;
+        return true;
     }
 
     int32_t RegisterDecodeCompleteCallback(webrtc::DecodedImageCallback* callback) override
@@ -240,7 +240,7 @@
         auto timestamps = m_dtsPtsMap[GST_BUFFER_PTS(buffer)];
         m_dtsPtsMap.erase(GST_BUFFER_PTS(buffer));
 
-        auto frame(LibWebRTCVideoFrameFromGStreamerSample(WTFMove(sample), webrtc::kVideoRotation_0,
+        auto frame(convertGStreamerSampleToLibWebRTCVideoFrame(sample, webrtc::kVideoRotation_0,
             timestamps.timestamp, timestamps.renderTimeMs));
 
         GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp (287179 => 287180)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp	2021-12-17 12:47:56 UTC (rev 287180)
@@ -228,7 +228,7 @@
             return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
         }
 
-        auto sample = GStreamerSampleFromLibWebRTCVideoFrame(frame);
+        auto sample = convertLibWebRTCVideoFrameToGStreamerSample(frame);
         auto buffer = gst_sample_get_buffer(sample.get());
 
         if (!GST_CLOCK_TIME_IS_VALID(m_firstFramePts)) {

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp (287179 => 287180)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp	2021-12-17 12:47:56 UTC (rev 287180)
@@ -27,13 +27,9 @@
 
 namespace WebCore {
 
-GRefPtr<GstSample> GStreamerSampleFromLibWebRTCVideoFrame(const webrtc::VideoFrame& frame)
+GRefPtr<GstSample> convertLibWebRTCVideoFrameToGStreamerSample(const webrtc::VideoFrame& frame)
 {
-    if (frame.video_frame_buffer()->type() == webrtc::VideoFrameBuffer::Type::kNative) {
-        auto* framebuffer = static_cast<GStreamerVideoFrameLibWebRTC*>(frame.video_frame_buffer().get());
-        return framebuffer->takeSample();
-    }
-
+    RELEASE_ASSERT(frame.video_frame_buffer()->type() != webrtc::VideoFrameBuffer::Type::kNative);
     auto* i420Buffer = frame.video_frame_buffer()->ToI420().release();
     int height = i420Buffer->height();
     int strides[3] = {
@@ -60,23 +56,22 @@
     return sample;
 }
 
-rtc::scoped_refptr<webrtc::VideoFrameBuffer> GStreamerVideoFrameLibWebRTC::create(GRefPtr<GstSample>&& sample)
+std::unique_ptr<webrtc::VideoFrame> convertGStreamerSampleToLibWebRTCVideoFrame(GRefPtr<GstSample>& sample, webrtc::VideoRotation rotation, int64_t timestamp, int64_t renderTimeMs)
 {
+    auto frameBuffer(GStreamerVideoFrameLibWebRTC::create(sample));
+    return std::unique_ptr<webrtc::VideoFrame>(new webrtc::VideoFrame(frameBuffer, timestamp, renderTimeMs, rotation));
+}
+
+rtc::scoped_refptr<webrtc::VideoFrameBuffer> GStreamerVideoFrameLibWebRTC::create(const GRefPtr<GstSample>& sample)
+{
     GstVideoInfo info;
 
     if (!gst_video_info_from_caps(&info, gst_sample_get_caps(sample.get())))
         ASSERT_NOT_REACHED();
 
-    return rtc::scoped_refptr<webrtc::VideoFrameBuffer>(new GStreamerVideoFrameLibWebRTC(WTFMove(sample), info));
+    return rtc::scoped_refptr<webrtc::VideoFrameBuffer>(new GStreamerVideoFrameLibWebRTC(sample, info));
 }
 
-std::unique_ptr<webrtc::VideoFrame> LibWebRTCVideoFrameFromGStreamerSample(GRefPtr<GstSample>&& sample, webrtc::VideoRotation rotation,
-    int64_t timestamp, int64_t renderTimeMs)
-{
-    auto frameBuffer(GStreamerVideoFrameLibWebRTC::create(WTFMove(sample)));
-    return std::unique_ptr<webrtc::VideoFrame>(new webrtc::VideoFrame(frameBuffer, timestamp, renderTimeMs, rotation));
-}
-
 rtc::scoped_refptr<webrtc::I420BufferInterface> GStreamerVideoFrameLibWebRTC::ToI420()
 {
     GstMappedFrame inFrame(m_sample, GST_MAP_READ);

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h (287179 => 287180)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h	2021-12-17 12:47:56 UTC (rev 287180)
@@ -31,19 +31,19 @@
 
 namespace WebCore {
 
-WARN_UNUSED_RETURN GRefPtr<GstSample> GStreamerSampleFromLibWebRTCVideoFrame(const webrtc::VideoFrame&);
+WARN_UNUSED_RETURN GRefPtr<GstSample> convertLibWebRTCVideoFrameToGStreamerSample(const webrtc::VideoFrame&);
 
-std::unique_ptr<webrtc::VideoFrame> LibWebRTCVideoFrameFromGStreamerSample(GRefPtr<GstSample>&&, webrtc::VideoRotation, int64_t timestamp, int64_t renderTimeMs);
+std::unique_ptr<webrtc::VideoFrame> convertGStreamerSampleToLibWebRTCVideoFrame(GRefPtr<GstSample>&, webrtc::VideoRotation, int64_t timestamp, int64_t renderTimeMs);
 
 class GStreamerVideoFrameLibWebRTC : public rtc::RefCountedObject<webrtc::VideoFrameBuffer> {
 public:
-    GStreamerVideoFrameLibWebRTC(GRefPtr<GstSample>&& sample, GstVideoInfo info)
-        : m_sample(WTFMove(sample))
+    GStreamerVideoFrameLibWebRTC(const GRefPtr<GstSample>& sample, GstVideoInfo info)
+        : m_sample(sample)
         , m_info(info) { }
 
-    static rtc::scoped_refptr<webrtc::VideoFrameBuffer> create(GRefPtr<GstSample>&&);
+    static rtc::scoped_refptr<webrtc::VideoFrameBuffer> create(const GRefPtr<GstSample>&);
 
-    GRefPtr<GstSample>&& takeSample() { return WTFMove(m_sample); }
+    GstSample* getSample() const { return m_sample.get(); }
     rtc::scoped_refptr<webrtc::I420BufferInterface> ToI420() final;
 
     int width() const final { return GST_VIDEO_INFO_WIDTH(&m_info); }
@@ -55,6 +55,7 @@
     GRefPtr<GstSample> m_sample;
     GstVideoInfo m_info;
 };
+
 }
 
 #endif // USE(GSTREAMER) && USE(LIBWEBRTC)

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp (287179 => 287180)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp	2021-12-17 12:47:56 UTC (rev 287180)
@@ -58,12 +58,14 @@
     if (!isProducingData())
         return;
 
-    callOnMainThread([protectedThis = Ref { *this }, frame] {
-        auto gstSample = GStreamerSampleFromLibWebRTCVideoFrame(frame);
+    if (frame.video_frame_buffer()->type() == webrtc::VideoFrameBuffer::Type::kNative) {
+        auto* framebuffer = static_cast<GStreamerVideoFrameLibWebRTC*>(frame.video_frame_buffer().get());
+        videoSampleAvailable(MediaSampleGStreamer::createWrappedSample(framebuffer->getSample(), static_cast<MediaSample::VideoRotation>(frame.rotation())), { });
+    } else {
+        auto gstSample = convertLibWebRTCVideoFrameToGStreamerSample(frame);
         auto metadata = std::make_optional(metadataFromVideoFrame(frame));
-        auto sample = MediaSampleGStreamer::create(WTFMove(gstSample), { }, { }, static_cast<MediaSample::VideoRotation>(frame.rotation()), false, WTFMove(metadata));
-        protectedThis->videoSampleAvailable(sample.get(), { });
-    });
+        videoSampleAvailable(MediaSampleGStreamer::create(WTFMove(gstSample), { }, { }, static_cast<MediaSample::VideoRotation>(frame.rotation()), false, WTFMove(metadata)), { });
+    }
 }
 
 } // namespace WebCore

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp (287179 => 287180)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp	2021-12-17 12:41:14 UTC (rev 287179)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp	2021-12-17 12:47:56 UTC (rev 287180)
@@ -69,7 +69,7 @@
 
     ASSERT(sample.platformSample().type == PlatformSample::GStreamerSampleType);
     auto& mediaSample = static_cast<MediaSampleGStreamer&>(sample);
-    auto frameBuffer = GStreamerVideoFrameLibWebRTC::create(gst_sample_ref(mediaSample.platformSample().sample.gstSample));
+    auto frameBuffer = GStreamerVideoFrameLibWebRTC::create(mediaSample.platformSample().sample.gstSample);
 
     sendFrame(WTFMove(frameBuffer));
 }