Hans Maes | 8 Feb 15:55 2016

support for absolute seek end time in openRTSP testprog (patch provided)


I noticed that the openRTSP testprog had support for an absolute seek start time 
(-U option, "Range" header) but not for an absolute seek end time, even though 
support for the end time was already present in the rest of the code.
I needed this for a small project I'm working on, so I created this very 
minor patch against playCommon.cpp version 2014.01.13 which adds support 
for the absolute seek end time as well.
Thought I'd share it, as it may be useful for someone.

- I got the source from the Debian package currently available in 
jessie; not sure if this is the most recent?
- I used command-line argument -E, since it seemed to still be free; not 
sure if this is OK?
- I'm no C++ expert, so the code is provided without warranty, but the 
changes are minor, so I guess it should be OK.



$ diff liblivemedia-2014.01.13-orig/testProgs/playCommon.cpp 
 > char* initialAbsoluteSeekEndTime = NULL;
<        << " [-s <initial-seek-time>]|[-U <absolute-seek-time>] [-z <scale>] [-g user-agent]"

Vilaysak Thiphavong | 8 Feb 14:29 2016

playSIP crashed after getting 200 OK for invite

Hi all,


I was trying to make a call to my IP phone using playSIP, with the following arguments:


./playSIP -a -A8 sip:Dialogphone <at>


For some reason, after answering the phone, playSIP crashed. Any ideas, please?


Below is the log:

[root <at> systemcontroller testProgs]# ./playSIP -a -A 8 sip:Dialog <at>

Sending request: INVITE sip:Dialog <at> SIP/2.0

From: Dialog <sip:Dialog <at>>;tag=541495922

Via: SIP/2.0/UDP

Max-Forwards: 70

To: sip:Dialog <at>

Contact: sip:Dialog <at>

Call-ID: 934922419 <at>


Content-Type: application/sdp

User-Agent: playSIP (LIVE555 Streaming Media v2016.01.29)

Content-Length: 117



o=- 934922419 0 IN IP4

s=playSIP session

c=IN IP4

t=0 0

m=audio 8000 RTP/AVP 8


Received INVITE response: SIP/2.0 180 Ringing

Via: SIP/2.0/UDP

Call-ID: 934922419 <at>

Contact: "Dialog" <sip:Phone <at>>


From: Dialog <sip:Dialog <at>>;tag=541495922

Supported: timer

To: sip:Dialog <at>;tag=00000f88-f0f0ff78

Server: ipDialog SipTone 1.2.0 rc Z_11 UAS


Content-Length: 0



Received INVITE response: SIP/2.0 200 OK

Via: SIP/2.0/UDP

Call-ID: 934922419 <at>

Contact: "Dialog" <sip:Phone <at>>


From: Dialog <sip:Dialog <at>>;tag=541495922

Supported: timer

To: sip:Dialog <at>;tag=00000f88-f0f0ff78

Server: ipDialog SipTone 1.2.0 rc Z_11 UAS


Content-Type: application/sdp

Content-Length: 158



o=Phone 1010280478 1010280478 IN IP4

s=Sip Call

c=IN IP4

t=0 0

m=audio 5020 RTP/AVP 8

a=rtpmap:8 PCMA/8000



Opened URL "sip:Dialog <at>", returning a SDP description:


o=- 934922419 0 IN IP4

s=playSIP session

c=IN IP4

t=0 0

m=audio 8000 RTP/AVP 8


Created receiver for "audio/PCMA" subsession (client ports 8000-8001)

Setup "audio/PCMA" subsession (client ports 8000-8001)

Segmentation fault (core dumped):1


[root <at> systemcontroller testProgs]#







live-devel mailing list
live-devel <at> lists.live555.com
IR@InterStrat.com | 7 Feb 04:59 2016

Starting up

I'd appreciate guidance on the required background for using Live555.

What are good Python sources for streaming media applications?
What level of C++ ?
Any other resources that are recommended to get going?
Thanks in advance

Ben Rush | 6 Feb 22:03 2016

A question regarding the timing of doGetNextFrame


I have been experimenting with live555 for a week now and have had a high level of success in using it to stream H264 video from a camera. Now I want to add audio, and I'm having some issues. 

I've read through a lot of the forum content related to this and I'm either doing something peculiar, or I missed something along the way (I'm using Windows, by the way). I have my server setup like this: 

    TaskScheduler* taskSchedular = BasicTaskScheduler::createNew();
    BasicUsageEnvironment* usageEnvironment = BasicUsageEnvironment::createNew(*taskSchedular);
    RTSPServer* rtspServer = RTSPServer::createNew(*usageEnvironment, 8554, NULL);
    if (rtspServer == NULL)
        *usageEnvironment << "Failed to create rtsp server ::" << usageEnvironment->getResultMsg() << "\n";

    std::string streamName = "feynman";
    ServerMediaSession* sms = ServerMediaSession::createNew(*usageEnvironment, streamName.c_str(), streamName.c_str(), "Live H264 Stream");
    H264LiveServerMediaSession *liveSubSession = H264LiveServerMediaSession::createNew(*usageEnvironment, false);
    WindowsAudioMediaSession* audioSession = WindowsAudioMediaSession::createNew(*usageEnvironment, false);
    sms->addSubsession(liveSubSession);   // (this is the line I comment out below)
    sms->addSubsession(audioSession);
    rtspServer->addServerMediaSession(sms);
    char* url = rtspServer->rtspURL(sms);
    *usageEnvironment << "Play the stream using url " << url << "\n";
    delete[] url;

When just the WindowsAudioMediaSession session is enabled (I comment out the adding of the H264LiveServerMediaSession as a subsession), audio works just fine. I can hear it, it's clear, and we're all good. doGetNextFrame for my AudioInputDevice gets called many times a second as well. Again, as expected. 

However, the moment I add back in my video stream (which works just fine on its own as well), my AudioInputDevice's doGetNextFrame() is called, maybe, once a second. Video continues to stream just fine (and my FramedSource for video is called as frequently as before), but the audio fails. 

I have seen from reading the lists that care must be taken to ensure the timing is correct between the two streams, but I believe I'm running into a bigger problem, since the server just stops calling my AudioInputDevice as frequently the moment both sessions are enabled. 

Do you have any insight into why this might be happening? What am I doing wrong? Thanks in advance. 
Eric_Hsieh | 2 Feb 18:16 2016

Why does VLC show a fixed fps when using H264VideoStreamDiscreteFramer to send H264 data?

Dear Ross,

I have a question about using H264VideoStreamDiscreteFramer to stream H264 data.
Using H264VideoStreamDiscreteFramer, I looked at the code.
In H264or5VideoStreamFramer there is an fFrameRate, and I am sure it is the correct fps (30).
But when I use VLC to play the stream, VLC always shows a fixed fps (14.985015).

So my question is: is the fixed fps that VLC shows an SDP problem, or an fFrameRate problem?
Here is the SDP: sprop-parameter-sets=Z01AKedAZBzf+BpAGjoEAABd2AAK/IGAAAAL68IAAAdzWU//+MAAAAX14QAAAwO5rKf//Ag=,aP48gA==
How do I fix this issue? Please advise. Thanks very much.

regards, eric, 02/02

Jeff Shanab | 31 Jan 18:51 2016

Client only library

   Is it possible, and allowed under the license, to build a client-only DLL to save size?

If I am pulling from RTSP sources only (to record them, for example), I do not need the server or proxy-server code and could save a lot of space.
xiaoyiz_2 | 29 Jan 02:40 2016

A question about live555's support for mp4

Hi,
I have the following question:

We currently use the live555 code to run an RTSP server sourced from a
camera. The camera generates H264 content as two streams (with different
resolutions, high-res and low-res). There is no audio. We stream the low-res
over RTP, and optionally store the high-res to file. We would like to use
live555 to store the high-res content locally in an mp4 container file.
This is on a Linux system.

Although we were partially successful with QuickTimeFileSink, the approach
is not clean. We don't have a complete rtpSource built for the high-res
stream, hence we had to skip the dereference to that object for the RTP
sequence number, as below. Attached is my example code to generate the mp4 file.

void SubsessionIOState::afterGettingFrame(unsigned packetDataSize,
                                          struct timeval presentationTime) {
  // Begin by checking whether there was a gap in the RTP stream.
  // If so, try to compensate for this (if desired):
  unsigned short rtpSeqNum = 0;
  if (fOurSubsession.rtpSource()) {
    rtpSeqNum = fOurSubsession.rtpSource()->curPacketRTPSeqNum();
  }

  if (fOurSink.fPacketLossCompensate && fPrevBuffer->bytesInUse() > 0) {
    short seqNumGap = rtpSeqNum - fLastPacketRTPSeqNum;
    for (short i = 1; i < seqNumGap; ++i) {
      // Insert a copy of the previous frame, to compensate for the loss:
      // ...
    }
  }

  // ...

  fLastPacketRTPSeqNum = rtpSeqNum;


Please let us know whether you will be able to support mp4 container
file generation from this library, and whether the change we made above is
acceptable for the time being. If you do not plan to support this, but
have different suggestions about the approach, or another alternative
library, we would really appreciate your feedback.
/* a derived MediaSubsession used to source from a local stream */
class H264VideoSubsession : public MediaSubsession {
    CSession* video_ = NULL;
    std::shared_ptr<FramedSource> src_;

public:
    H264VideoSubsession(MediaSession& parent) : MediaSubsession(parent) {
        video_ = new CSession;
        video_->openFramedSource(fParent.envir(), src_);
        fCodecName = "H264";
        fReadSource = src_.get();
        fVideoWidth = 2560;
        fVideoHeight = 1440;
        fVideoFPS = 30;
    }

    virtual Boolean createSourceObjects(int useSpecialRTPoffset) {
        return True;
    }
};

/* a derived MediaSession to fake a SDP from RTPSource */
class H264MediaSession : public MediaSession {
    H264MediaSession(UsageEnvironment& env) : MediaSession(env) {}

public:
    virtual MediaSubsession* createNewMediaSubsession() {
        return new H264VideoSubsession(*this);
    }

    static H264MediaSession* create(UsageEnvironment& env) {
        H264MediaSession* p = new H264MediaSession(env);
        if (NULL != p) {
                // sdp description (initialization call truncated in this mail)
                // ... "m=video 0 RTP/AVP 96\r\n");
        }
        return p;
    }
};
static void sessionAfterPlaying(void* /*clientData*/) {}

int main(int argc, char** argv) {
    std::string name;

    TaskScheduler* sched = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*sched);

    MediaSession* session = H264MediaSession::create(*env);

    QuickTimeFileSink* qtOut = QuickTimeFileSink::createNew(*env,
        *session, "video_T.mp4",
        100000, /// fileSinkBufferSize
        2560, 1440, 30,
        false, /// no packetLossCompensate
        false, /// no syncStreams
        false, /// no generateHintTracks
        true); /// yes generateMP4Format

    qtOut->startPlaying(sessionAfterPlaying, NULL);

    std::cout << "Press Enter Key to quit.";
    std::getline(std::cin, name);

    Medium::close(qtOut); qtOut = NULL;
    MediaSubsessionIterator iter(*session);
    MediaSubsession* subsession;
    while ((subsession = iter.next()) != NULL) {
        subsession->sink = NULL;
    }

    return 0;
}
Julian Scheel | 27 Jan 12:24 2016

[RFC PATCH] Add option to disable RTCP

Hi Ross,

we need to create an RTSP session without an RTCP instance, because of a
server that has a bug causing it not to terminate sessions after it has received
an RTCP report from the client.
I came up with the attached patch, which feels really ugly and I'm pretty sure
could have unexpected side effects. Would you mind taking a look and maybe
proposing a better way to do this?



Allow connections without a RTCP port to be configured.
 liveMedia/MediaSession.cpp        |  6 +++++-
 liveMedia/RTSPClient.cpp          | 18 ++++++++++++++----
 liveMedia/include/MediaSession.hh |  3 +++
 3 files changed, 22 insertions(+), 5 deletions(-)

diff --git a/liveMedia/MediaSession.cpp b/liveMedia/MediaSession.cpp
index a5b13d0..54e9346 100644
--- a/liveMedia/MediaSession.cpp
+++ b/liveMedia/MediaSession.cpp
@@ -821,7 +821,7 @@ Boolean MediaSubsession::initiate(int useSpecialRTPoffset) {

     // Finally, create our RTCP instance. (It starts running automatically)
-    if (fRTPSource != NULL && fRTCPSocket != NULL) {
+    if (fRTPSource != NULL && fRTCPSocket != NULL && !fNoRTCP) {
       // If bandwidth is specified, use it and add 5% for RTCP overhead.
       // Otherwise make a guess at 500 kbps.
       unsigned totSessionBandwidth
@@ -867,6 +867,10 @@ Boolean MediaSubsession::setClientPortNum(unsigned short portNum) {
   return True;

+void MediaSubsession::setNoRTCP(Boolean enable) {
+  fNoRTCP = enable;
+}
+
 char const* MediaSubsession::attrVal_str(char const* attrName) const {
   SDPAttribute* attr = (SDPAttribute*)(fAttributeTable->Lookup(attrName));
   if (attr == NULL) return "";
diff --git a/liveMedia/RTSPClient.cpp b/liveMedia/RTSPClient.cpp
index fc68042..f659ff5 100644
--- a/liveMedia/RTSPClient.cpp
+++ b/liveMedia/RTSPClient.cpp
@@ -662,9 +662,15 @@ Boolean RTSPClient::setRequestFields(RequestRecord* request,
     char const* transportFmt;
     if (strcmp(subsession.protocolName(), "UDP") == 0) {
       suffix = "";
-      transportFmt = "Transport: RAW/RAW/UDP%s%s%s=%d-%d\r\n";
+      if (subsession.noRTCP())
+        transportFmt = "Transport: RAW/RAW/UDP%s%s%s=%d\r\n";
+      else
+        transportFmt = "Transport: RAW/RAW/UDP%s%s%s=%d-%d\r\n";
     } else {
-      transportFmt = "Transport: RTP/AVP%s%s%s=%d-%d\r\n";
+      if (subsession.noRTCP())
+        transportFmt = "Transport: RTP/AVP%s%s%s=%d\r\n";
+      else
+        transportFmt = "Transport: RTP/AVP%s%s%s=%d-%d\r\n";

     cmdURL = new char[strlen(prefix) + strlen(separator) + strlen(suffix) + 1];
@@ -699,8 +705,12 @@ Boolean RTSPClient::setRequestFields(RequestRecord* request,
     unsigned transportSize = strlen(transportFmt)
       + strlen(transportTypeStr) + strlen(modeStr) + strlen(portTypeStr) + 2*5 /* max port len */;
     char* transportStr = new char[transportSize];
-    sprintf(transportStr, transportFmt,
-	    transportTypeStr, modeStr, portTypeStr, rtpNumber, rtcpNumber);
+    if (subsession.noRTCP())
+      sprintf(transportStr, transportFmt,
+	      transportTypeStr, modeStr, portTypeStr, rtpNumber);
+    else
+      sprintf(transportStr, transportFmt,
+	      transportTypeStr, modeStr, portTypeStr, rtpNumber, rtcpNumber);

     // When sending more than one "SETUP" request, include a "Session:" header in the 2nd and later commands:
     char* sessionStr = createSessionString(fLastSessionId);
diff --git a/liveMedia/include/MediaSession.hh b/liveMedia/include/MediaSession.hh
index f2ea909..107fcdc 100644
--- a/liveMedia/include/MediaSession.hh
+++ b/liveMedia/include/MediaSession.hh
@@ -168,6 +168,7 @@ public:
   char const* protocolName() const { return fProtocolName; }
   char const* controlPath() const { return fControlPath; }
   Boolean isSSM() const { return fSourceFilterAddr.s_addr != 0; }
+  Boolean noRTCP() const { return fNoRTCP; }

   unsigned short videoWidth() const { return fVideoWidth; }
   unsigned short videoHeight() const { return fVideoHeight; }
@@ -207,6 +208,7 @@ public:
       // description does not specfy a client port number - an ephemeral
       // (even) port number is chosen.)  This routine must *not* be
       // called after initiate().
+  void setNoRTCP(Boolean enable);
   void receiveRawMP3ADUs() { fReceiveRawMP3ADUs = True; } // optional hack for audio/MPA-ROBUST; must not be called after initiate()
   void receiveRawJPEGFrames() { fReceiveRawJPEGFrames = True; } // optional hack for video/JPEG; must not be called after initiate()
   char*& connectionEndpointName() { return fConnectionEndpointName; }
@@ -312,6 +314,7 @@ protected:
   char* fControlPath; // holds optional a=control: string
   struct in_addr fSourceFilterAddr; // used for SSM
   unsigned fBandwidth; // in kilobits-per-second, from b= line
+  Boolean fNoRTCP; // disable rtcp port announcement

   double fPlayStartTime;
   double fPlayEndTime;

Taimoor Alam | 27 Jan 08:45 2016

Getting RTSP stream in vector form for rendering in WebGl


I need to render an RTSP stream in a browser using WebGL; I get the binary raw data with ffmpeg, in PNaCl, in Google Chrome. Now I need to render the pixels in WebGL, but it seems that WebGL is a rasterization API which is good at rendering vectors, not pixels. I have found a class BitVector.hh. Can you please tell me the purpose of this class, and whether it can be used to convert an RTSP stream to a vectorial form, so that I can render the stream in vectorial form in WebGL?


SIMON Loic | 18 Jan 17:33 2016

Problem with sending a PAUSE command

Hi all,

I am working on a little app to record RTSP streams.

So I included a large part of openRTSP in my app, calling it in another thread (my entry point is the main function), but I got stuck very quickly in the infinite loop doEventLoop(), and I managed to make myself an exit point using a watchVariable.

But now I am trying to send a PAUSE command, unsuccessfully so far.

So my question is: how can I send a PAUSE command (and PLAY, to resume) from another thread, knowing that my openRTSP thread is in doEventLoop()?

Thanks a lot


Diego Tsutsumi | 13 Jan 17:36 2016

Problems using openRTSP

Hi guys, I'm having some problems using the openRTSP application. We are trying to record video from a remote H264 camera, which streams over the RTSP protocol. It does record, but I couldn't get it to record into an MP4 container file (I'm using the -4 option, as explained at http://www.live555.com/openRTSP/); it records into a MOV file instead. I know it is a MOV file because it doesn't play on Android phones, whereas when I record from the same camera through VLC into an MP4 file it works and plays on Android phones.

Do you guys have any hint of what is going on? Thanks
