Dear support,

I have implemented an RTSP live streaming server with multiple streams that is used in our ONVIF server. In this server we create several profiles; each profile sets up a live555 streaming server on a different port (H.264).
I use the latest Live555 (25 March 2014).
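
For context, each profile's server is set up roughly like this (a simplified sketch, not our exact code; the port, stream name, the reuseFirstSource flag and the params/parent arguments are placeholders):

    // One RTSPServer per ONVIF profile, each on its own port.
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554 /* a different port per profile */);
    ServerMediaSession* sms = ServerMediaSession::createNew(*env, "profile1", "profile1", "H.264 live stream");
    sms->addSubsession(serverMediaSubsession::createNew(*env, True /*reuseFirstSource*/, params, this));
    rtspServer->addServerMediaSession(sms);

    env->taskScheduler().doEventLoop(); // event loop for this profile's server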

The problem occurs in the release build when the client switches from one stream to another. I cannot pinpoint exactly where the system crashes, but it seems to block after a call to serverMediaSubsession::createNewRTPSink. I added some logging to understand it better and found that sometimes the system crashes in the doGetNextFrame method.
The problem is not present in the debug build.

I tried inserting various passive waits (before calling createNewStreamSource(), createNewRTPSink() and deleteStream()), and the problem occurs far less often.
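
By "passive wait" I mean a plain blocking sleep; the actual helper in our code may differ slightly, but it is essentially the following:

    // passiveWait(usec): a simple blocking sleep, argument in microseconds
    #ifdef _WIN32
    #include <windows.h>
    #else
    #include <time.h>
    #endif

    static void passiveWait(unsigned usec)
    {
    #ifdef _WIN32
        Sleep(usec / 1000);               // Sleep() takes milliseconds
    #else
        struct timespec ts;
        ts.tv_sec  = usec / 1000000;
        ts.tv_nsec = (usec % 1000000) * 1000;
        nanosleep(&ts, NULL);             // nanosleep() takes seconds + nanoseconds
    #endif
    }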

Do you have any idea what could be causing this?


Thanks in advance for your help.

Best regards

Andrea

Here is the code of my serverMediaSubsession, and of the doGetNextFrame() and deliverFrame() functions of our FramedSource subclass:


class serverMediaSubsession: public OnDemandServerMediaSubsession
{
public:
    static serverMediaSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent);

    void checkForAuxSDPLine1();
    void afterPlayingDummy1();

protected:
    serverMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent);
    virtual ~serverMediaSubsession();

    void setDoneFlag() { fDoneFlag = ~0; }

protected:
    // redefined virtual functions
    virtual char const* getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource);
    virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
    // "estBitrate" is the stream's estimated bitrate, in kbps
    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource);
    virtual void deleteStream(unsigned clientSessionId, void*& streamToken);

private:
    UsageEnvironment* m_pEnvironment;

    rtspVideoStreamer* m_pParent;
    rtspVideoStreamer::parameters fParam;

    char* fAuxSDPLine;
    char fDoneFlag; // used when setting up "fAuxSDPLine"
    RTPSink* fDummyRTPSink;
};

/*--------------------------------------------------------------------*/
rtspVideoStreamer::serverMediaSubsession* rtspVideoStreamer::serverMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent)
{
    return new serverMediaSubsession(env, reuseFirstSource, params, parent);
}

/*--------------------------------------------------------------------*/
rtspVideoStreamer::serverMediaSubsession::serverMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent)
    : OnDemandServerMediaSubsession(env, reuseFirstSource), fParam(params), m_pParent(parent)
{
    m_pEnvironment = &env;
    fAuxSDPLine = NULL;
    fDoneFlag = 0;
    fDummyRTPSink = NULL;
}

/*--------------------------------------------------------------------*/
rtspVideoStreamer::serverMediaSubsession::~serverMediaSubsession()
{
    envir().taskScheduler().unscheduleDelayedTask(nextTask());
    delete[] fAuxSDPLine;
}

/*--------------------------------------------------------------------*/
static void afterPlayingDummy(void* clientData)
{
    rtspVideoStreamer::serverMediaSubsession* subsess = (rtspVideoStreamer::serverMediaSubsession*)clientData;
    subsess->afterPlayingDummy1();
}
/*--------------------------------------------------------------------*/
void rtspVideoStreamer::serverMediaSubsession::afterPlayingDummy1()
{
    envir().taskScheduler().unscheduleDelayedTask(nextTask());
    setDoneFlag();
}

/*--------------------------------------------------------------------*/
static void checkForAuxSDPLine(void* clientData)
{
    rtspVideoStreamer::serverMediaSubsession* subsess = (rtspVideoStreamer::serverMediaSubsession*)clientData;
    subsess->checkForAuxSDPLine1();
}

/*--------------------------------------------------------------------*/
void rtspVideoStreamer::serverMediaSubsession::checkForAuxSDPLine1()
{
    char const* dasl;
    if (fAuxSDPLine != NULL) {
        setDoneFlag();
    } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
        fAuxSDPLine = strDup(dasl);
        fDummyRTPSink = NULL;

        setDoneFlag();
    } else {
        int uSecsToDelay = 100000; // 100 ms
        nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay, (TaskFunc*)checkForAuxSDPLine, this);
    }
}

/*--------------------------------------------------------------------*/
char const* rtspVideoStreamer::serverMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource)
{
    if (fAuxSDPLine != NULL) return fAuxSDPLine;

    if (fDummyRTPSink == NULL) {
        fDummyRTPSink = rtpSink;

        fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);

        checkForAuxSDPLine(this);
    }

    envir().taskScheduler().doEventLoop(&fDoneFlag);

    return fAuxSDPLine;
}

/*--------------------------------------------------------------------*/
FramedSource* rtspVideoStreamer::serverMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate)
{
    passiveWait(1000*1000);

    estBitrate = 500; // kbps
    m_pParent->lockSource();
    FramedSource* mysource = NULL;

    if (fParam.m_encoderParams.m_videoCodec != mjpeg)
    {
        mysource = genericSource::createNew(*m_pEnvironment, fParam, m_pParent);
    }
    else
    {
        mysource = new myJPEGVideoSource(*m_pEnvironment, fParam, m_pParent);
    }

    m_pParent->m_bClientConnected = true;
    m_pParent->unlockSource();

    FramedSource* videoES = mysource;

    switch (fParam.m_encoderParams.m_videoCodec)
    {
    case mpeg4:
        return MPEG4VideoStreamFramer::createNew(*m_pEnvironment, videoES);
    case h263p:
        return H263plusVideoStreamFramer::createNew(*m_pEnvironment, videoES);
    case h264:
        return H264VideoStreamFramer::createNew(*m_pEnvironment, videoES);
    case mpeg2:
    case mpeg1:
        return MPEG1or2VideoStreamFramer::createNew(*m_pEnvironment, videoES);
    case mjpeg:
        return mysource;
    default:
        return NULL;
    }
}

/*--------------------------------------------------------------------*/
RTPSink* rtspVideoStreamer::serverMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource)
{
    passiveWait(1000*1000);
    switch (fParam.m_encoderParams.m_videoCodec)
    {
    case mpeg4:
        return MPEG4ESVideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock, 96);
    case h263p:
        return H263plusVideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock, 96);
    case h264:
        return H264VideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock, rtpPayloadTypeIfDynamic /*96*/);
    case mpeg2:
    case mpeg1:
        return MPEG1or2VideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock);
    case mjpeg:
        return JPEGVideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock);
    default:
        return NULL;
    }
}

/*--------------------------------------------------------------------*/
void rtspVideoStreamer::serverMediaSubsession::deleteStream(unsigned clientSessionId, void*& streamToken)
{
    OnDemandServerMediaSubsession::deleteStream(clientSessionId, streamToken);
    passiveWait(1500*1000);

    StreamState* streamState = (StreamState*)streamToken;
    if (streamState != NULL)
        m_pParent->m_bClientConnected = true;
    else
        m_pParent->m_bClientConnected = false;
}

/*--------------------------------------------------------------------*/
void rtspVideoStreamer::genericSource::doGetNextFrame()
{
    bool ret;
    ucImage img;
    m_pParent->getImage(img);
    if (img.isValid())
    {
        // resize the grabbed image to the configured stream resolution, if needed
        if ((m_imgSize.width() != img.getWidth()) || (m_imgSize.height() != img.getHeight()))
        {
            imgResizer imgres;
            imgResizer::parameters resizeparam;
            resizeparam.m_width = m_imgSize.width();
            resizeparam.m_height = m_imgSize.height();
            imgres.setParameters(resizeparam);
            imgres.apply(img);
        }
        try
        {
            // encode the image into m_pBufferCompressed / m_iCompressedSize
            ret = m_enc.apply(img, m_pBufferCompressed, m_iCompressedSize);
        }
        catch (...)
        {
        }
    }

    try
    {
        deliverFrame();
    }
    catch (...)
    {
        LOG_DBG("rtspVideoStreamer::genericSource::doGetNextFrame()::deliverFrame() exception");
    }
    if (0 /* the source stops being readable */)
    {
        handleClosure(this);
        return;
    }
}

/*--------------------------------------------------------------------*/
void rtspVideoStreamer::genericSource::deliverFrame()
{
    if ((unsigned)m_iCompressedSize > fMaxSize)
    {
        fNumTruncatedBytes = m_iCompressedSize - fMaxSize;
        fFrameSize = fMaxSize;
        memcpy(fTo, m_pBufferCompressed, fMaxSize);
    }
    else
    {
        memcpy(fTo, m_pBufferCompressed, m_iCompressedSize);
        fFrameSize = m_iCompressedSize;
    }

    gettimeofday(&fPresentationTime, NULL);

    if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet

    FramedSource::afterGetting(this);
}

--
*Andrea Beoldo*
Project Manager/R&D
Technoaware Srl
Corso Buenos Aires 18/11, 16129 Genova (GE)
Ph. +39 010 5539239 Fax. +39 0105539240
Email: andrea.beo...@technoaware.com
Web: www.technoaware.com

