Hello Warren and thanks.

> > On this line VS says: "HEAP: Free Heap block 3043560 modified at
> > 304361c after it was freed"
>
> What does the stack trace look like when this happens?
Here's the call stack:
MFVideoGenerator Test.exe!_free_base(void * pBlock=0x039505f8) Line 109 + 0x12 bytes  C
MFVideoGenerator Test.exe!_free_dbg_nolock(void * pUserData=0x03950618, int nBlockUse=0x00000001) Line 1426 + 0x9 bytes  C++
MFVideoGenerator Test.exe!_free_dbg(void * pUserData=0x03950618, int nBlockUse=0x00000001) Line 1258 + 0xd bytes  C++
MFVideoGenerator Test.exe!operator delete(void * p=0x03950618) Line 373 + 0xb bytes  C++
MFVideoGenerator Test.exe!BasicHashTable::`scalar deleting destructor'() + 0x3c bytes  C++
MFVideoGenerator Test.exe!RTCPMemberDatabase::~RTCPMemberDatabase() Line 35 + 0x37 bytes  C++
MFVideoGenerator Test.exe!RTCPMemberDatabase::`scalar deleting destructor'() + 0x2b bytes  C++
MFVideoGenerator Test.exe!RTCPInstance::~RTCPInstance() Line 194 + 0x3a bytes  C++
MFVideoGenerator Test.exe!RTCPInstance::`scalar deleting destructor'() + 0x2b bytes  C++
MFVideoGenerator Test.exe!MediaLookupTable::remove(const char * name=0x0398bcb8) Line 151 + 0x35 bytes  C++
MFVideoGenerator Test.exe!Medium::close(UsageEnvironment & env={...}, const char * name=0x0398bcb8) Line 54  C++
MFVideoGenerator Test.exe!Medium::close(Medium * medium=0x0398bcb0) Line 59 + 0x17 bytes  C++
MFVideoGenerator Test.exe!CMulticastH264Streamer::StreamerThread() Line 232 + 0xc bytes  C++


> It sounds like the opposite of a leak, where you're either
> double-free'ing something or freeing something too early.
I posted the code in the first post of this thread. Basically, everything runs in one thread (StreamerThread from the stack), and for testing I disabled all the methods that are called from other threads (the threads that feed H264/AAC data to the sources).

> Can you run the app on Linux?  If your app has a GUI, you should be able
> to strip out the core of the app so that it will build and run on Linux
> without the GUI.  (If not, it means you have application logic mixed in
> with your display code, which is bad design from the start.)
I have a couple of abstraction layers in separate libs before the code reaches the GUI (and the GUI exists for testing purposes only): live555 is wrapped in its own lib, a MediaFoundation component uses that, another MediaFoundation component (lib) creates and runs the stream flow, and so on. Unfortunately, I cannot run the app on Linux: I have no developer experience with it, and the MediaFoundation component that encodes H264/AAC will not run on Linux anyway.


Anyway, if you could take a look, here's my code from StreamerThread:

    // Begin by setting up our usage environment:

    // Create 'groupsocks' for RTP and RTCP:

    // Create RTSP server

    // Create server media session

    // Video stream groupsocks

    RTCPInstance* rtcp = NULL;
    // Add H264 stream
    if( m_settings.VideoStream )
    {
        // Create a 'H264 Video RTP' sink from the RTP 'groupsock':

        // Create (and start) a 'RTCP instance' for this RTP sink:

        sms->addSubsession( PassiveServerMediaSubsession::createNew( *m_pRtpVideoSink, rtcp ) );
    }

    // Audio stream groupsocks

    RTCPInstance* rtcpAudio = NULL;

    // Add AAC Stream
    if( m_settings.AudioStream )
    {
        // Create (and start) a 'RTCP instance' for this RTP sink:

        sms->addSubsession( PassiveServerMediaSubsession::createNew( *m_pRtpAudioSink, rtcpAudio ) );
    }

    rtspServer->addServerMediaSession( sms );

    // Get the streaming URL
    m_streamingURL = rtspServer->rtspURL( sms );
    *m_pUsageEnv << "Play this stream using the URL \"" << m_streamingURL << "\"\n";

    if( m_settings.VideoStream )
    {
        m_pH264FramedSource = CH264FramedSource::createNew( *m_pUsageEnv, 0, 40000 );

        FramedSource* framedSource = m_pH264FramedSource;

        m_pH264DiscreteFramer = H264VideoStreamDiscreteFramer::createNew( *m_pUsageEnv, framedSource );

        m_pRtpVideoSink->startPlaying( *m_pH264DiscreteFramer, afterPlayback, this );
    }

    if( m_settings.AudioStream )
    {
        timeval pTime = { 0, 0 }; // zero-initialize: otherwise pTime is indeterminate when there is no video stream
        if( m_settings.VideoStream )
        {
            m_pH264FramedSource->GetPresentationTime(pTime);
        }
        m_pAacFrameedSource = AACFramedSource::createNew( *m_pUsageEnv, 0, 40000, pTime );

        m_pRtpAudioSink->startPlaying( *m_pAacFrameedSource, afterPlayback, this );
    }

    m_pUsageEnv->taskScheduler().doEventLoop( &m_doneFlag ); // blocks here until m_doneFlag becomes non-zero

    // Close everything
    //
    if(m_pRtpVideoSink != NULL)
    {
        m_pRtpVideoSink->stopPlaying();
    }
    if(m_pRtpAudioSink != NULL)
    {
        m_pRtpAudioSink->stopPlaying();
    }

    Medium::close( m_pRtpVideoSink );
    Medium::close( m_pRtpAudioSink );

    Medium::close( m_pH264FramedSource );
    Medium::close( m_pAacFrameedSource );

    Medium::close(rtcp);
    Medium::close(rtcpAudio);

    rtpVideoGroupsock->removeAllDestinations();
    rtcpVideoGroupsock->removeAllDestinations();
    delete rtpVideoGroupsock;
    delete rtcpVideoGroupsock;

    rtpAudioGroupsock->removeAllDestinations();
    rtcpAudioGroupsock->removeAllDestinations();
    delete rtpAudioGroupsock;
    delete rtcpAudioGroupsock;

    Medium::close(rtspServer);

    m_pUsageEnv->reclaim();

    delete scheduler;

    SetEvent( m_hCloseEvt );

Thanks.



_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
