Kenneth Forsythe | 26 Aug 22:33 2014

DelayQueue infinite loop


DelayQueue::synchronize appears to be stuck in its while loop. It appears
curEntry->fDeltaTimeRemaining is always (0,0), and therefore timeSinceLastSync always compares higher.

This only happens when I am hosting the libraries within a COM DLL. What is recommended? Can I just detect
this scenario and break the loop?

This is in DelayQueue.cpp at DelayQueue::synchronize
Loop starts at line 214.

DelayQueueEntry* curEntry = head();

while (timeSinceLastSync >= curEntry->fDeltaTimeRemaining) {
  timeSinceLastSync -= curEntry->fDeltaTimeRemaining;
  curEntry->fDeltaTimeRemaining = DELAY_ZERO;
  curEntry = curEntry->fNext;
}
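One defensive option for the "detect and break" idea is to remember where the walk started and bail out once the loop wraps all the way around the circular list. The sketch below is a minimal stand-alone model, not a root-cause fix: `Entry` and `drainExpired` are stand-ins for the real DelayQueue types, and it assumes the queue is a circular list whose head sentinel normally carries an "eternity" delay that stops the loop.

```cpp
#include <cassert>

// Minimal stand-in for the circular delay-queue walk in
// DelayQueue::synchronize.
struct Entry {
  long delta;   // stands in for fDeltaTimeRemaining
  Entry* next;  // stands in for fNext
};

// Walks the list like the loop above, but stops if it wraps back to the
// starting entry -- the situation described here, where every entry's
// remaining delay is zero. Returns the number of entries visited.
int drainExpired(Entry* head, long timeSinceLastSync) {
  Entry* first = head;
  Entry* cur = head;
  int visited = 0;
  while (timeSinceLastSync >= cur->delta) {
    timeSinceLastSync -= cur->delta;
    cur->delta = 0;
    ++visited;
    cur = cur->next;
    if (cur == first) break;  // wrapped around: break instead of spinning
  }
  return visited;
}
```

This only masks the symptom; finding out why the entries end up with a (0,0) delay inside the COM DLL would still be the real fix.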


Chris Richardson (WTI) | 26 Aug 18:55 2014

High CPU usage streaming TCP and UDP with OnDemandServerMediaSubsession

Hi Ross,


I have recently updated the LIVE555 libraries I am using, ultimately to version 2014.08.23, though I originally updated to 2014.07.25 and the same problem happens.  The problem is that the library is using a large amount of CPU when streaming both UDP and TCP from the same OnDemandServerMediaSubsession.  I can reproduce this problem both on my embedded system (with my own setup code) and on an Ubuntu Linux machine running an almost un-modified version of testOnDemandRTSPServer (the only change is setting reuseFirstSource to ‘True’).


To test this yourself:


1.       Modify testOnDemandRTSPServer by setting reuseFirstSource to True.

2.       Start an instance of ‘top’ to watch the CPU usage.

3.       Start one copy of openRTSP, streaming using TCP via –t.  CPU usage should be very low.  For me it is < 1%.

4.       Start another copy of openRTSP, streaming using UDP.  CPU usage should now be very high.  For me it is > 90%.


For my testing I was reading pre-recorded H.264 data via ‘h264ESVideoTest’.


I think the core problem is that RTPInterface (via StreamState->fRTCPInstance->fRTCPInterface) is not properly handling receiving RTCP over both the TCP socket and UDP socket at the same time:


1.       Some RTCP data comes in over the UDP socket.

2.       The code eventually gets into RTPInterface::handleRead, which tries to read the data from the TCP socket, since fNextTCPReadStreamSocketNum is valid.  This read fails with EAGAIN almost all the time, and no data is processed.

3.       On the next iteration through the event loop, the UDP socket is still readable, since no data was read from it, and once again the code gets into RTPInterface::handleRead, which tries to read the data from the TCP socket.  This results in the UDP RTCP socket being kept in a continuously readable state, which then causes the event loop to spin.
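The spin in step 3 can be reproduced outside LIVE555. The following is a simplified model under assumptions, not the actual RTPInterface code: data waits on one socket while a non-blocking read is issued on the other, the read fails with EAGAIN, and the first socket stays permanently readable.

```cpp
#include <cerrno>
#include <fcntl.h>
#include <sys/socket.h>
#include <unistd.h>

// Returns true if the mismatched read fails with EAGAIN while the data
// remains unconsumed on the other socket -- the condition that keeps a
// select()-based event loop spinning.
bool demonstratesSpin() {
  int udp[2], tcp[2];
  if (socketpair(AF_UNIX, SOCK_DGRAM, 0, udp) < 0) return false;
  if (socketpair(AF_UNIX, SOCK_STREAM, 0, tcp) < 0) return false;
  fcntl(tcp[0], F_SETFL, O_NONBLOCK);

  const char pkt[] = "RTCP";
  write(udp[1], pkt, sizeof pkt);  // "RTCP over UDP" arrives on udp[0]

  char buf[64];
  ssize_t n = read(tcp[0], buf, sizeof buf);  // handler reads the TCP socket
  // Nothing was consumed from udp[0], so it stays readable forever.
  return n < 0 && errno == EAGAIN;
}
```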


I dug through the changes for the previous months and discovered that a change to RTPInterface::handleRead in the 2014.03.25 version causes this problem to occur.  The attached patch undoes the change and fixes this problem.  Applying this patch might cause other problems though, since I think you removed it for the following reason given in the change log for 2014.03.25:


“- Fixed an issue in the "RTPInterface" code that could cause "SetSpecificRRHandler()" to not

work properly when RTP/RTCP is being carried over TCP.”


Thanks for your time, and please let me know if you would like me to test anything else or send additional data.




Chris Richardson


Attachment (RTPInterface.patch): application/octet-stream, 415 bytes
live-devel mailing list
live-devel <at>
Sébastien HEUZE | 25 Aug 11:15 2014

AAC RTP problem (crackling sound)


I'm using live555 in a Swift (iOS) application, and I would like to stream AAC from this phone using RTP.

I have done a lot of tests using MP3 and it works fine (I'm using VLC and/or Wireshark to capture the stream).

But I can't get AAC to work.

I tried using ADTSAudioFileSource and MPEG4GenericRTPSink, but I get crackling noise instead when I try to capture the stream.

sessionStateAAC.source = ADTSAudioFileSource::createNew(*env, inputFileNameAAC);

sessionStateAAC.sink = MPEG4GenericRTPSink::createNew(*env, sessionStateAAC.rtpGroupsock, 96, 44100, "audio", "AAC-hbr",sessionStateAAC.source->configStr(),2);
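In case it helps: crackling often comes from a sampling-rate or channel-count mismatch between the RTP sink and the actual file. A hedged variant (assuming the file's parameters may differ from 44100 Hz / 2 channels) derives them from the source, since ADTSAudioFileSource parses them from the ADTS header:

```cpp
// Sketch only: take the RTP timestamp frequency, config string and channel
// count from the source itself rather than hardcoding 44100 and 2.
ADTSAudioFileSource* src =
    ADTSAudioFileSource::createNew(*env, inputFileNameAAC);
sessionStateAAC.source = src;
sessionStateAAC.sink = MPEG4GenericRTPSink::createNew(
    *env, sessionStateAAC.rtpGroupsock, 96,
    src->samplingFrequency(),  // the file's actual rate
    "audio", "AAC-hbr",
    src->configStr(),
    src->numChannels());       // the file's actual channel count
```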

How can I get rid of this problem? Am I doing something wrong?

Thanks; we have been trying to get it to work for three (working) days now.

SungJoo Byun | 23 Aug 12:54 2014

patch for qmake build


I've added some files to support a qmake build of live555.

It is not perfect for all platforms, but it should be helpful for win32 and Linux qmake users
(especially win32 MSVC users).

Attachment (live.add-qmake.patch): application/octet-stream, 20 KiB
James Huang | 23 Aug 03:10 2014

How to wrap an AAC-HBR RTP source into a Transport Stream correctly


I'm working on wrapping two RTP stream sources (H.264 and AAC-HBR) into an MPEG Transport Stream and then forwarding it to another 3rd-party server. The source and the target are both 3rd party, so I have no control over the input and output formats they expect.

The video is working after studying the mailing list, but I'm not able to find the correct way to convert and feed a suitable source into MPEG2TransportStreamFromESSource::addNewAudioSource().

My code looks like:

MPEG2TransportStreamFromESSource *pTS = MPEG2TransportStreamFromESSource::createNew(env);
InsertH264StartCodeFilter *pVdoFilter = InsertH264StartCodeFilter::createNew(env, H264VideoRTPSource);
pTS->addNewVideoSource(pVdoFilter, 5);
pTS->addNewAudioSource(MPEG4GenericRTPSource, 4);

MPEG4GenericRTPSource does not seem to be a proper source for addNewAudioSource() here, but I couldn't figure out how to convert it correctly.

The SDP of audio part looks like:
m=audio 0 RTP/AVP 97
a=rtpmap:97 MPEG4-GENERIC/48000/2
a=fmtp:97 streamtype=5;profile-level-id=1;mode=AAC-hbr;sizelength=13;indexlength=3;indexdeltalength=3;config=1190

Could you provide a hint on how to convert the MPEG4GenericRTPSource output into data suitable to feed into MPEG2TransportStreamFromESSource::addNewAudioSource()?
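One common approach, offered as an assumption about what the TS muxer expects rather than a confirmed LIVE555 recipe: the AAC access units coming out of MPEG4GenericRTPSource are raw, while a transport-stream muxer typically wants them ADTS-framed, so a small filter can prepend a 7-byte ADTS header to each frame. The field values below are derived from the fmtp line above (config=1190 means AAC-LC, 48 kHz is sampling-frequency index 3, 2 channels); `makeAdtsHeader` is an illustrative helper, not a LIVE555 API.

```cpp
#include <cstddef>
#include <cstdint>

// Builds the fixed+variable ADTS header (MPEG-4, no CRC) for one AAC frame.
// profile is the audio object type minus 1 (AAC-LC: AOT 2 -> profile 1).
void makeAdtsHeader(uint8_t hdr[7], unsigned profile,
                    unsigned samplingIndex, unsigned channels,
                    size_t aacFrameLen) {
  size_t frameLen = aacFrameLen + 7;  // the 7 header bytes count too
  hdr[0] = 0xFF;                      // syncword 0xFFF...
  hdr[1] = 0xF1;                      // ...MPEG-4, layer 0, no CRC
  hdr[2] = (uint8_t)((profile << 6) | (samplingIndex << 2) | (channels >> 2));
  hdr[3] = (uint8_t)(((channels & 0x3) << 6) | (frameLen >> 11));
  hdr[4] = (uint8_t)((frameLen >> 3) & 0xFF);
  hdr[5] = (uint8_t)(((frameLen & 0x7) << 5) | 0x1F);  // fullness = 0x7FF
  hdr[6] = 0xFC;                      // fullness low bits, 1 AAC frame
}
```

A FramedFilter subclass could call this in its afterGetting handler, writing the 7 header bytes before each delivered frame, and the result fed to addNewAudioSource().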

Thank you,
Barry Folse | 22 Aug 15:34 2014

Trouble with multiple streams

The purpose of my app is to generate two video streams from video rendered by the same app. The video is from two different eyepoints, using OpenGL to create the views. The views are generated and updated concurrently.

The problem is that only one stream is valid when I use VLC to view them.

In the app, I have set up a Stream class which encapsulates an input view, a thread, an RTSP server, an environment and task scheduler / event loop.

When a Stream object is created/configured, it starts the thread, which creates the environment / task scheduler / event loop. When the Stream's input view is finished rendering, it sends a triggerEvent() to the Stream's task scheduler.

Each Stream has a different destination address, RTP/RTCP Port, and RTSP port/URL.

The second Stream's event handler never gets called by its event loop.

Is it possible that one event loop is handling the triggerEvents from both Streams?

If anyone has successfully created an app like this, I would really appreciate any assistance you can give.
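For what it's worth, the usual per-thread pattern looks like the sketch below. `Stream`, `frameReadyHandler`, and `stopFlag` are hypothetical names standing in for the classes described above; this is an assumption about the setup, not tested code. The key point is that an EventTriggerId is only meaningful to the TaskScheduler that created it.

```cpp
// Per-thread setup: one TaskScheduler + UsageEnvironment per Stream, and a
// per-Stream EventTriggerId created on that Stream's own scheduler.
void streamThreadMain(Stream* s) {
  s->scheduler = BasicTaskScheduler::createNew();
  s->env = BasicUsageEnvironment::createNew(*s->scheduler);
  s->triggerId = s->scheduler->createEventTrigger(frameReadyHandler);
  // ... create this Stream's RTSPServer and session on *s->env ...
  s->scheduler->doEventLoop(&s->stopFlag);  // each thread runs its own loop
}

// From the render thread, always pair the id with its own scheduler:
//   s->scheduler->triggerEvent(s->triggerId, s);
// Using Stream A's id with Stream B's scheduler may fire the wrong handler
// or none at all, which would match the "second handler never gets called"
// symptom.
```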



Jonathan Anderson | 22 Aug 07:07 2014

Live555 Buffer Latency


My name is Jon Anderson. I am working on an application that uses Live555 to stream L16 data for real-time audio playback at the lowest possible latency. I have stripped down many buffers on the OS side and have gotten my stream to play back with a latency of about 60-70 ms. My attention is now focused on Live555.

I have noticed several buffers built into Live555, but the only one I expect to cause latency is the ReorderingPacketBuffer. For now, my plan is to get the latency as low as possible and then work it back up for the sake of audio quality. With that in mind, I set the ReorderingPacketBuffer threshold time to 0 ms. What other buffers might be causing audio playback delay?
Neerav Patel | 20 Aug 15:30 2014

live audio source with onDemandServer

I am trying to set up live555 to stream RTSP from a microphone with an OnDemandServer, but I am not sure how to do so. I have attempted this by overriding OnDemandServerMediaSubsession and FramedSource, but I am running into issues where I hear a bit of sound for half a second and then silence; in VLC, the Messages window says "buffer arrived way too early"...
I am using ffmpeg to encode the audio as MP2.
I have attached what I am doing here:

#ifndef _FRAMED_SOURCE_HH
#include "FramedSource.hh"
#endif
#include "AudioTransfer.h"

class MP2DeviceSource : public FramedSource {
public:
  static MP2DeviceSource* createNew(UsageEnvironment& env, unsigned int stream_id, AudioTransfer* audioTransfer);

public:
  EventTriggerId eventTriggerId;

protected:
  MP2DeviceSource(UsageEnvironment& env, AudioTransfer* audioTransfer);
  virtual ~MP2DeviceSource();

private:
  virtual void doGetNextFrame();

private:
  static void deliverFrame0(void* clientData);
  void deliverFrame();

private:
  AudioTransfer* audioTx;
};

#include "MP2DeviceSource.h"

MP2DeviceSource* MP2DeviceSource::createNew(UsageEnvironment& env, unsigned int stream_id, AudioTransfer* audioTransfer) {
  return new MP2DeviceSource(env, audioTransfer);
}

MP2DeviceSource::MP2DeviceSource(UsageEnvironment& env, AudioTransfer* audioTransfer)
  : FramedSource(env), audioTx(audioTransfer) {
  if (eventTriggerId == 0)
    eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
}

MP2DeviceSource::~MP2DeviceSource() {
  envir().taskScheduler().deleteEventTrigger(eventTriggerId);
  eventTriggerId = 0;
}

void MP2DeviceSource::doGetNextFrame() {
  deliverFrame();
}

void MP2DeviceSource::deliverFrame0(void* clientData) {
  ((MP2DeviceSource*)clientData)->deliverFrame();
}

// Windows replacement for gettimeofday()
static const unsigned __int64 epoch = 116444736000000000;
int gettimeofday(struct timeval* tp, struct timezone* tzp) {
  FILETIME file_time;
  SYSTEMTIME system_time;
  ULARGE_INTEGER ularge;
  GetSystemTime(&system_time);
  SystemTimeToFileTime(&system_time, &file_time);
  ularge.LowPart = file_time.dwLowDateTime;
  ularge.HighPart = file_time.dwHighDateTime;
  tp->tv_sec = (long)((ularge.QuadPart - epoch) / 10000000L);
  tp->tv_usec = (long)(system_time.wMilliseconds * 1000);
  return 0;
}

void MP2DeviceSource::deliverFrame() {
  gettimeofday(&fPresentationTime, NULL);
  audioTx->GetMP2Image(&fTo, &fFrameSize);
  fDurationInMicroseconds = 26000;
  FramedSource::afterGetting(this);
}

#ifndef _ON_DEMAND_SERVER_MEDIA_SUBSESSION_HH
#include "OnDemandServerMediaSubsession.hh"
#endif

class MP2AudioMediaSubsession : public OnDemandServerMediaSubsession {
public:
  static MP2AudioMediaSubsession* createNew(UsageEnvironment& env, unsigned int sid, Boolean reuseFirstSource, AudioTransfer* audioTransfer);

protected:
  MP2AudioMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, AudioTransfer* audioTransfer);
  virtual ~MP2AudioMediaSubsession();

protected:
  virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupSock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource);

protected:
  unsigned int id;
  AudioTransfer* audioTx;
};
#include "MP2MediaSubsession.h"
#include "MP2DeviceSource.h"
#include "MPEG1or2AudioRTPSink.hh"
#include "MPEG1or2AudioStreamFramer.hh"

MP2AudioMediaSubsession* MP2AudioMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource, AudioTransfer* audioTransfer) {
  return new MP2AudioMediaSubsession(env, reuseFirstSource, audioTransfer);
}

MP2AudioMediaSubsession::MP2AudioMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, AudioTransfer* audioTransfer)
  : OnDemandServerMediaSubsession(env, reuseFirstSource), audioTx(audioTransfer) {
}

FramedSource* MP2AudioMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
  estBitrate = 44100;
  MP2DeviceSource* source = MP2DeviceSource::createNew(envir(), id, audioTx);
  return MPEG1or2AudioStreamFramer::createNew(envir(), source);
}

RTPSink* MP2AudioMediaSubsession::createNewRTPSink(Groupsock* rtpGroupSock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
  return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupSock);
}

MP2AudioMediaSubsession::~MP2AudioMediaSubsession() {
}
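One thing worth checking in the device source above: deliverFrame() writes through &fTo, never honours fMaxSize, and sets a fixed fDurationInMicroseconds. For a live source, delivery is usually shaped more like the sketch below (an assumption about your code, with `GetMP2Frame` as a hypothetical accessor standing in for your GetMP2Image):

```cpp
void MP2DeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return;  // no downstream reader waiting yet

  // Hypothetical accessor: returns a pointer to the encoded frame and its size.
  unsigned char* frame; unsigned frameSize;
  audioTx->GetMP2Frame(&frame, &frameSize);

  if (frameSize > fMaxSize) {              // never overrun the sink's buffer
    fNumTruncatedBytes = frameSize - fMaxSize;
    frameSize = fMaxSize;
  }
  memcpy(fTo, frame, frameSize);           // copy into fTo, don't reassign it
  fFrameSize = frameSize;

  gettimeofday(&fPresentationTime, NULL);
  fDurationInMicroseconds = 0;             // live source: send as soon as delivered

  FramedSource::afterGetting(this);
}
```

Leaving fDurationInMicroseconds at 0 for a live source lets the server send each frame as it arrives instead of scheduling ahead, which may relate to the "buffer arrived way too early" messages.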
Kenneth Forsythe | 20 Aug 20:58 2014

Live555 liveMedia wrapped into a DirectShow filter



We are currently developing a streaming player application that utilizes DirectShow filters. We are interested in wrapping liveMedia into a filter so it can be plugged in as an RTSP client source filter. Have you done this, or could you help us with it?


We would also be very appreciative of a quick response.


Thank You

Dnyanesh Gate | 20 Aug 13:46 2014

RTCPInstance error: Hit limit when reading incoming packet over TCP. Increase "maxRTCPPacketSize"


I am working with the Live555 proxy server. I have one camera connected to the
proxy server and am able to open its live stream from the proxy server on the
first attempt.
But as soon as I disconnect the client (openRTSP/VLC), the proxy server
starts giving this error: RTCPInstance error: Hit limit when reading
incoming packet over TCP. Increase "maxRTCPPacketSize".
The last command sent by the proxy server was a PAUSE command, but PAUSE
is not supported by my camera, so I changed it to TEARDOWN just to
cross-check. It still gives the above error on client disconnection. I also
tried to open a live stream
("rtsp://") which supports
PAUSE, but still no success; after 2-3 attempts it again
started giving the above error.

Can you please suggest anything on this?
Thanks & Regards,
Neerav Patel | 19 Aug 04:48 2014

Live Audio RTSP Streaming


I am trying to send MP2-encoded frames via RTSP (live555) from a live source (a microphone). I am using ffmpeg to encode the audio stream, and I am sending it by overriding FramedSource; for the OnDemandServer I am using MPEG1or2AudioRTPSink.hh and MPEG1or2AudioStreamFramer.hh.

Now in VLC I get to hear maybe half a second worth of sound and then it just stops... I don't know what I am doing wrong.

Has anyone experienced this problem before? Am I doing this right, or should I be overriding a different class?

Thanks in advance.
