Sergey Lvov via live-devel | 19 Sep 02:44 2014

Possible bug in RTPInterface::sendDataOverTCP

From: Sergey Lvov <serg_lvov <at> yahoo.com>
Subject: Possible bug in RTPInterface::sendDataOverTCP
Date: 2014-09-18 12:55:12 GMT

Hello everybody!

I discovered strange disconnections when I used streaming over TCP.
I recompiled live555 library with -DDEBUG and -DDEBUG_SEND and saw some diagnostic:

sendRTPorRTCPPacketOverTCP: 1448 bytes over channel 0 (socket 7)
sendDataOverTCP: resending 795-byte send (blocking)
sendDataOverTCP: blocking send() failed (delivering -1 bytes out of 795); closing socket 7
SocketDescriptor(socket 7)::deregisterRTPInterface(channel 255)
sendRTPorRTCPPacketOverTCP: failed! (errno 11)
RTSPClientConnection[0x8e80978]::handleRequestBytes() read 4 new bytes:$
RTSPClientConnection[0x8e80978]::handleRequestBytes() read 52 new bytes:?
schedule(5.170436->1411036332.468457)
RTSPClientConnection[0x8e7baf0]::handleRequestBytes() read 212 new bytes:GET_PARAMETER rtsp://192.168.0.35:8554/archive?record=541697a20c8ac43f&sessionId=35/ RTSP/1.0
CSeq: 21349
User-Agent: LibVLC/2.2.0-pre4-20140908-0202 (LIVE555 Streaming Media v2014.07.25)
Session: CED66A9C

So, errno 11 is EAGAIN, which is very strange for a socket in blocking mode.

However, I found this topic: stackoverflow.com/questions/735249/blocking-socket-returns-eagain
And I learned that it is indeed possible.

I tried to fix the problem this way:

-      sendResult = send(socketNum, (char const*)(&data[numBytesSentSoFar]), numBytesRemainingToSend, 0/*flags*/);
+      do {
+        sendResult = send(socketNum, (char const*)(&data[numBytesSentSoFar]), numBytesRemainingToSend, 0/*flags*/);
+      } while (sendResult == -1 && envir().getErrno() == EAGAIN);

And it works now!

Could you possibly investigate this problem?

Thank you for your work!

Best regards,
Sergey.
Attachment (sendDataOverTCP.patch): application/octet-stream, 1456 bytes
_______________________________________________
live-devel mailing list
live-devel <at> lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
Muhammad Ali | 18 Sep 14:04 2014

OpenRTSP stream delay increasing with time

I am using openRTSP to open an RTSP stream (from an IP camera) and send it to ffplay through a Unix pipe. The video plays roughly 1 to 1.5 seconds behind live (compared to viewing the camera's RTSP stream directly).

here is my command line

./OpenRTSP -v -Q -D 1 -n rtsp://<IP>:554/11 | ffplay -loglevel verbose -fflags nobuffer -i pipe:0

Now this delay slowly increases from 1.5 seconds up to 4-5 seconds over about 20 minutes, and I assume it will continue to grow over time.

So my questions are:
1 - What is causing the initial delay?
2 - Why does the delay keep increasing?

Is there a way to disable "pre-caching" of frames in openRTSP?

I compare the delay by watching two videos of the same IP camera: one configured at angelcam.com, and the other viewed locally via the above command line.
Surprisingly, it is always the local stream that falls behind; the Angelcam stream keeps running as it started.

Any help ?

--
Muhammad Ali
And Or Logic
Ross Finlayson | 16 Sep 10:46 2014

Re: How to set H264 and aac live frame timestamp ?

And I use ByteStreamFileSource.cpp and ADTSAudioFileSource.cpp to get the frame data.

For h264/aac sync, I use testProgs/testOnDemandRTSPServer.cpp to do:

ServerMediaSession* sms
  = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
sms->addSubsession(H264VideoFileServerMediaSubsession
  ::createNew(*env, inputFileName, reuseFirstSource));
sms->addSubsession(ADTSAudioFileServerMediaSubsession
  ::createNew(*env, inputFileName3, reuseFirstSource));

Using a byte stream as input works well when you are streaming just a single medium (audio or video).  However, if you are streaming both audio and video, and want them properly synchronized, then you *cannot* use byte streams as input (because, as you discovered, you don't get precise presentation times for each frame).

Instead - if you are streaming both audio and video - then each input source must deliver *discrete* frames (i.e., one frame at a time), with each frame being given a presentation time ("fPresentationTime") when it is encoded.

Specifically: You will need to define new subclass(es) of "FramedSource" for your audio and video inputs.  You will also need to define new subclasses of "OnDemandServerMediaSubsession" for your audio and video streams.  In particular:
- For audio, your subclass will redefine the "createNewStreamSource()" virtual function to create an instance of your new audio source class (that delivers one AAC frame at a time).
- For video, your subclass will redefine the "createNewStreamSource()" virtual function to create an instance of your new video source class (that delivers one H.264 NAL unit at a time - with each H.264 NAL unit *not* having an initial 0x00 0x00 0x00 0x01 'start code').  It should then feed this into a "H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer").  Your implementation of the "createNewRTPSink()" virtual function may be the same as in "H264VideoFileServerMediaSubsession", but you may prefer instead to use one of the alternative forms of "H264VideoRTPSink::createNew()" that take SPS and PPS NAL units as parameters.  (If you do that, then you won't need to insert SPS and PPS NAL units into your input stream.)
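The pattern described above can be modelled in isolation. Below is a minimal standalone sketch of a discrete-frame source that stamps each frame with a presentation time at the moment it is "encoded" (the member naming follows the live555 "f" convention, but nothing here depends on the library; a real implementation would subclass "FramedSource" and copy into "fTo" in "doGetNextFrame()"):

```cpp
#include <cassert>
#include <queue>
#include <string>
#include <sys/time.h>

// One discrete frame: a single H.264 NAL unit (no start code) or one AAC
// frame, plus the wall-clock time at which it was encoded.
struct Frame {
  std::string data;
  struct timeval presentationTime;
};

// Toy discrete-frame source: frames are stamped when delivered by the
// encoder, which is what gives the RTP sink accurate per-frame timing.
class DiscreteFrameSource {
public:
  // Called from the encoder callback: stamp the frame *at encode time*.
  void deliverFrame(const std::string& payload) {
    Frame f;
    f.data = payload;
    gettimeofday(&f.presentationTime, nullptr);
    fQueue.push(f);
  }

  // Called by the consumer (the framer/sink side): one frame at a time.
  bool getNextFrame(Frame& out) {
    if (fQueue.empty()) return false;
    out = fQueue.front();
    fQueue.pop();
    return true;
  }

private:
  std::queue<Frame> fQueue;
};
```

Because each frame carries its own timestamp from the encoder's clock, audio and video sources stamped from the same clock stay synchronizable at the receiver; a byte-stream source has no way to recover this per-frame information.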

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

Gerard Castillo Lasheras | 15 Sep 16:37 2014

RTSP audio & video synchronization issue

Hi all,

I have some doubts on liveMedia regarding RTSP audio & video synchronization.

We have developed a streaming application using the live555 library to stream via RTSP. It works like a charm when streaming only one stream per session (audio or video). However, problems appear when I try to stream both audio and video using the same RTSP session.

I tried playing it with VLC, and it only plays the video. In debug mode, VLC shows this message continuously:

 core audio output warning: buffer too late (-541608 us): dropped

In order to understand this behaviour I used testRTSPClient as the RTSP client, modified to also print the Normal Play Time (NPT). What I observed is that the presentation times of the audio and video streams are each internally coherent. However, after a few seconds of running, there is an abrupt jump in the presentation time of one of the streams (usually audio), which I suppose corresponds to RTCP synchronization. After that, a big gap between the video and audio NPT remains for the rest of the transmission, sometimes taking negative values.

This is the testRTSPClient output (the timestamp gap can be observed):

Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.033291 NPT: 2.911316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.033291 NPT: 2.911316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.053291 NPT: 2.931316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.053291 NPT: 2.931316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.073291 NPT: 2.951316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.073291 NPT: 2.951316
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 1780 bytes. Presentation time: 1410784970.163530 NPT: 3.041563
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 3093 bytes. Presentation time: 1410784970.163530 NPT: 3.041563
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 9393 bytes. Presentation time: 1410784970.163530 NPT: 3.041563
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 5410 bytes. Presentation time: 1410784970.163530 NPT: 3.041563
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.093291 NPT: 2.971316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.093291 NPT: 2.971316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.113291 NPT: 2.991316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.113291 NPT: 2.991316
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 1731 bytes. Presentation time: 1410784977.533251 NPT: 10.411284
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 3474 bytes. Presentation time: 1410784977.533251 NPT: 10.411284
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 11707 bytes. Presentation time: 1410784977.533251 NPT: 10.411284
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 8298 bytes. Presentation time: 1410784977.533251 NPT: 10.411284
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.133291 NPT: 3.011316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.133291 NPT: 3.011316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.153291 NPT: 3.031316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.153291 NPT: 3.031316
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 2468 bytes. Presentation time: 1410784977.574917 NPT: 10.452950
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 4043 bytes. Presentation time: 1410784977.574917 NPT: 10.452950
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 12999 bytes. Presentation time: 1410784977.574917 NPT: 10.452950
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 9043 bytes. Presentation time: 1410784977.574917 NPT: 10.452950
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.173291 NPT: 3.051316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.173291 NPT: 3.051316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 1436 bytes. Presentation time: 1410784970.193291 NPT: 3.071316
Stream "rtsp://192.168.10.77:8554/S22J/"; audio/PCMU: Received 484 bytes. Presentation time: 1410784970.193291 NPT: 3.071316
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 3153 bytes. Presentation time: 1410784977.616583 NPT: 10.494616
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 3766 bytes. Presentation time: 1410784977.616583 NPT: 10.494616
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 6609 bytes. Presentation time: 1410784977.616583 NPT: 10.494616
Stream "rtsp://192.168.10.77:8554/S22J/"; video/H264: Received 4695 bytes. Presentation time: 1410784977.616583 NPT: 10.494616


I checked the RTP timestamps on the transmission side and everything seems OK (no gaps, and coherent between streams).

This is not the behaviour I would expect, and I think it is exactly what happens when I try to play the session using VLC. So my question is: what do you think is happening here, and how can I solve this issue? I have the feeling that I am doing something wrong on the transmission side (maybe when setting the presentation time), but I can't see what.
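One common cause of this symptom is stamping the two streams from different clocks (for example, audio from a free-running sample counter and video from gettimeofday()), a discrepancy that only becomes visible once RTCP synchronization kicks in at the receiver. A minimal sketch of deriving both streams' presentation times from one shared epoch (my own illustration, assuming 8 kHz PCMU audio and 24 fps video; this is not liveMedia code):

```cpp
#include <cassert>
#include <cstdint>
#include <sys/time.h>

// Derive every presentation time from one shared epoch so audio and
// video stay on the same timeline even after RTCP resynchronization.
struct SharedClock {
  struct timeval epoch;
  SharedClock() { gettimeofday(&epoch, nullptr); }

  // Presentation time = epoch + media offset (microseconds, >= 0).
  struct timeval at(int64_t offsetUs) const {
    struct timeval t = epoch;
    t.tv_sec  += offsetUs / 1000000;
    t.tv_usec += offsetUs % 1000000;
    if (t.tv_usec >= 1000000) { t.tv_sec++; t.tv_usec -= 1000000; }
    return t;
  }
};

// Offset (us) of audio sample n at 8 kHz (PCMU) and video frame n at 24 fps.
int64_t audioOffsetUs(int64_t sampleCount) { return sampleCount * 1000000 / 8000; }
int64_t videoOffsetUs(int64_t frameCount)  { return frameCount  * 1000000 / 24; }
```

With this arrangement, one second of audio (8000 samples) and one second of video (24 frames) map to the identical presentation time, so the post-RTCP NPT values of the two streams cannot drift apart the way the log above shows.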

Thanks in advance for any hint,

Kind regards,
--------------------------------------------------------
 Gerard Castillo Lasheras
  Enginyer de Projectes
  Fundació i2CAT - Unitat Audiovisual
  SkypeID: gerardcl85
  Telf.: +34.93.553.25.48
--------------------------------------------------------
Streaming Admin | 11 Sep 15:16 2014

Configure live mpeg-ts udp input

Another approach I was thinking of: if I split the incoming stream into 5-10 second chunks with a timestamp as the filename, and run a watcher script to generate index files, could the code be amended to continue on to the next chunk? Sorry, I am not very familiar with C++ and couldn't find the source that checks for end-of-file, to see if it can be modified.
Alan Martinovic | 10 Sep 10:12 2014

Subclassing RTSPServer events

I have a streaming source that already provides an H.264 RTP stream, and I would like to use live555 just as an RTSP server.
I need neither RTP packetization nor RTCP generation.

The ideal use case would be to be able to run custom scripts on events that represent RTSP commands, and be able to send a modified SDP.


Would the recommended approach be to subclass RTSPServer, ignore ServerMediaSession (and the source-sink mechanism) entirely, and reimplement the handleCmd* functions?


Streaming Admin | 9 Sep 16:38 2014

Configure live mpeg-ts udp input

Is it possible to (i) configure a live MPEG-TS UDP input, (ii) store it as MPEG-TS, and (iii) create an index file (MPEG2TransportStreamIndexer) on the fly to allow trick play?

Thank you.
Thilina Jayanath | 8 Sep 13:42 2014

No audio when streaming mkv , mpg

I downloaded the source code and compiled it to make the live555MediaServer executable. When I stream a .mkv or .mpg file and view it using VLC, VLC does not receive an audio stream, as shown in the image.

Image -> http://lookpic.com/O/i2/1433/9s4lV8AC.jpeg

Can someone please help me with this. Thank you in advance!
Nguyen Van Long | 8 Sep 06:55 2014

RTSPClient Auto re-connect when connection lost

Hi Ross,

Based on "openRTSP", I wrote my own RTSP client class that handles my own operations, and everything works perfectly. My question is: how do I detect that the connection has been lost (cable unplugged, server down, ...), and how do I re-connect to the server?

I've thought that if something goes wrong after sending the PLAY command, the "continueAfterPlay" function will be called, and I could handle retrying there (sending the DESCRIBE command again, ...). Is that the right way to solve my issue, or do you have any other suggestions?

 

Thanks!

 

Regards,

 

Nguyen Van Long (Mr)

ESoft - Software Development

-----------------------------------------------------------------------------

ELCOM CORP                

Add: Elcom Building, Duy Tan Street, Cau Giay District, Ha Noi

Mobile: (+84) 936 369 326 | Skype: Pfiev.long | Web: www.elcom.com.vn

 

Joy Carlos Gomes | 5 Sep 07:38 2014

OpenRTSP with Open Broadcast Software

Hi, I am trying to receive an RTSP stream in Open Broadcaster Software (OBS). I currently use VLC (libVLC); however, we have some hardware issues which cause BYE messages to be sent, and VLC doesn't handle these gracefully, so I am looking for another RTSP receiver.

Question:
libVLC provides pixelData that is set on the texture. I am wondering how I can get the pixelData via openRTSP? I am new to video programming and wondering where to start.

Call to set pixel data on the OBS texture:
GetTexture()->SetImage(*pixelData, GS_IMAGEFORMAT_BGRA, _this->GetTexture()->Width() * 4);

Thanks,
~Carlos~
