Alex Anderson | 30 Jul 20:49 2015

test*Receiver RTCP estimatedSessionBandwidth of 160?

Hello,

Each of the test*Receiver apps creates an RTCPInstance and passes it an estimatedSessionBandwidth of 160. I realize this parameter doesn't have to be exact, but it should be roughly the bitrate of the stream, right? The test*Streamer apps use an estimatedSessionBandwidth of 5000, which seems more appropriate for a stream.
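
For reference, here is roughly how the test*Receiver apps set this up (abbreviated from memory, so variable names and the exact argument list may differ slightly). As I understand it, the value is given in kbps and is only used to size the RTCP traffic (about 5% of it) and hence the RTCP report interval:

    // Abbreviated sketch; env, rtcpGroupsock, CNAME and source are set up elsewhere.
    unsigned const estimatedSessionBandwidth = 160; // in kbps; used for the RTCP bandwidth share
    RTCPInstance* rtcpInstance =
        RTCPInstance::createNew(*env, &rtcpGroupsock,
                                estimatedSessionBandwidth, CNAME,
                                NULL /* no RTPSink; we're a receiver */,
                                source /* our RTPSource */);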

So, why such a small estimatedSessionBandwidth of 160 for the test*Receiver apps?

Thanks,

Alexander Anderson


Dnyanesh Gate | 30 Jul 13:34 2015

Memory leak in ProxyServerMediaSession

Hi,

I found a memory leak in the ProxyServerMediaSession class. The leak grows with the number of cameras registered with live555ProxyServer.

The PresentationTimeSessionNormalizer class is derived from Medium, so whenever a PresentationTimeSessionNormalizer object is created, a new entry is added to the HashTable.
But when ~ProxyServerMediaSession() runs, the fPresentationTimeSessionNormalizer object is freed with the delete operator instead of Medium::close(). Because of this its entry is never removed from the HashTable, and because the HashTable is not empty, the UsageEnvironment cannot be reclaimed at the end.

If live555ProxyServer has only one UsageEnvironment object this is not an issue, but if there are multiple UsageEnvironment objects (created when a new stream is registered and reclaimed when the stream disconnects), it leaks memory for every camera stream. This leak was causing a crash in our live555ProxyServer after many camera streams had been registered and de-registered.

Here are the valgrind results (live.2015.07.23.tar.gz):
1. ProxyServerMediaSession.cpp:134  : delete fPresentationTimeSessionNormalizer;
    ==13119== LEAK SUMMARY:
    ==13119==    definitely lost: 1,040 bytes in 1 blocks
    ==13119==    indirectly lost: 171 bytes in 5 blocks

(FIX)
2. ProxyServerMediaSession.cpp:134  : Medium::close(fPresentationTimeSessionNormalizer);
    ==12990== LEAK SUMMARY:
    ==12990==    definitely lost: 0 bytes in 0 blocks
    ==12990==    indirectly lost: 0 bytes in 0 blocks
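
In destructor form, the proposed change is roughly the following (the rest of ~ProxyServerMediaSession() is unchanged and abbreviated here):

    ProxyServerMediaSession::~ProxyServerMediaSession() {
      // ... existing cleanup ...

      // Was: delete fPresentationTimeSessionNormalizer;
      // Medium::close() also removes the object's entry from the environment's
      // hash table, so the UsageEnvironment can be reclaimed later:
      Medium::close(fPresentationTimeSessionNormalizer);
    }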

Please check whether this is a valid fix.
I hope it helps improve live555's stability.
--
Thanks & Regards,
DnyaneshG.
Maxim P. DEMENTIEV | 29 Jul 18:18 2015

RTPSink for a PCM audio subsession

Hello,

We need to broadcast PCM (S16LE/16000) audio as a separate channel from our RTSP server.

I've found this message about the AAC format:

    http://lists.live555.com/pipermail/live-devel/2014-January/017980.html

What is the preferred RTPSink class to use for this audio format in our OnDemandServerMediaSubsession?
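
For example, would something like the following sketch be appropriate? (The subsession class name here is just a placeholder, and I'm assuming SimpleRTPSink with the "L16" payload format; since RTP "L16" is big-endian, I suppose the S16LE samples would also need byte-swapping, e.g. with the EndianSwap16 filter, before they reach the sink.)

    // Sketch only -- 16-bit PCM, 16000 Hz, mono
    RTPSink* MyPCMAudioSubsession
    ::createNewRTPSink(Groupsock* rtpGroupsock,
                       unsigned char rtpPayloadTypeIfDynamic,
                       FramedSource* /*inputSource*/) {
      return SimpleRTPSink::createNew(envir(), rtpGroupsock,
                                      rtpPayloadTypeIfDynamic,
                                      16000 /* RTP timestamp frequency */,
                                      "audio", "L16",
                                      1 /* numChannels */);
    }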

Thank you!

Best regards,
Max

key frame detection

Hi,
I am developing an application based on testRTSPClient.cpp. I am able to store video files from the DummySink::afterGettingFrame method. What I would like is to detect key frames, so that I can close the current file and open a new one that always starts with a key frame.

I guess that this information is included in the RTP headers, but I don't know the best way to propagate it, in the same way as presentationTime.
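
For H.264, would checking the NAL unit type of each received frame be a reasonable approach? A rough sketch of what I mean, inside DummySink::afterGettingFrame() (assuming fReceiveBuffer holds one complete NAL unit without a start code, as H264VideoRTPSource delivers it):

    void DummySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
                                      struct timeval presentationTime,
                                      unsigned durationInMicroseconds) {
      if (frameSize > 0 && strcmp(fSubsession.codecName(), "H264") == 0) {
        u_int8_t nalUnitType = fReceiveBuffer[0] & 0x1F;
        Boolean isIDR = (nalUnitType == 5);                          // IDR (key frame) slice
        Boolean isParamSet = (nalUnitType == 7 || nalUnitType == 8); // SPS / PPS
        if (isIDR || isParamSet) {
          // close the current output file and open a new one here,
          // so that each file starts with a key frame (and its SPS/PPS)
        }
      }
      // ... existing code that writes fReceiveBuffer to the current file ...
      continuePlaying();
    }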

Thanks in advance,

Best regards,

Paco
Krishna Kumar K B | 28 Jul 12:54 2015

mp4 streaming in live555

Hi Ross,

I am able to stream mp3 files using live555MediaServer.

Now I want to stream an MP4 file, so I have given the testMPEG4VideoStreamer executable a try. I have modified inputFileName as below:

        #include "liveMedia.hh"
        #include "BasicUsageEnvironment.hh"
        #include "GroupsockHelper.hh"
        

        UsageEnvironment* env;
        char const* inputFileName = "myfilename.mp4";


On the client side, I am not able to receive the stream. Please let us know the right procedure for streaming MP4 files.


Regards,

Krishna

Craig Matsuura | 27 Jul 21:46 2015

liveProxy handling of a failed SETUP command

I have a question regarding error handling in the liveProxy. If the DESCRIBE command to a proxied camera succeeds (200 OK) but the following SETUP command to the camera fails (for any reason), how does the proxy handle this case? (The liveness OPTIONS command does not fail.)


I assume the answer is that the camera's RTSP server is buggy; however, shouldn't this be handled similarly to a DESCRIBE failure, with retries?


Thanks,

Craig

Ross Finlayson | 27 Jul 13:21 2015

Re: Video Stream Rendering - Live555

After rendering video from the player, we will have a video buffer. I didn't find any APIs for streaming buffer data. It looks like live555 supports only file-based streaming (e.g. file.mp3). Does live555 support only file-based streaming? If so, how can we achieve streaming from a video buffer?



This is explained in the FAQ that everyone is asked to read before posting to the mailing list. In particular:

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

Krishna Kumar K B | 27 Jul 09:57 2015

Video Stream Rendering - Live555

Hi All,

I am currently testing the live555 code base for streaming functionality. I have experimented with the live555MediaServer and openRTSP client executables.

I am able to receive the stream successfully. Currently it is stored in a file (e.g. audio-MPA-1).


Is there any support in the live555 code base for rendering the video, or do we need to write our own rendering code?

Please give a suggestion.


Regards,

Krishna

piaoyang5097 | 25 Jul 14:09 2015

Android RTSP client compiled with the NDK cannot receive any multicast video data

Hi,
I'm working on a live streaming platform. I put the live555 RTSP server on a Windows PC, then developed an Android application which uses testRTSPClient to receive the video data through JNI.
  Now the problem is:
When the server on the Windows PC uses unicast, my Android app can receive the data smoothly, but when the server uses multicast, my Android app only gets the SDP information, with no video data received.
In the multicast case, the SDP description is as follows:
07-25 19:26:43.940: I/recrtsp_jni(28905): Got a SDP description: 
07-25 19:26:43.940: I/recrtsp_jni(28905):  v=0
07-25 19:26:43.940: I/recrtsp_jni(28905): o=- 1437823591858813 1 IN IP4 192.168.3.9
07-25 19:26:43.940: I/recrtsp_jni(28905): s=without audio
07-25 19:26:43.940: I/recrtsp_jni(28905): i=group
07-25 19:26:43.940: I/recrtsp_jni(28905): t=0 0
07-25 19:26:43.940: I/recrtsp_jni(28905): a=tool:LIVE555 Streaming Media v2014.11.12
07-25 19:26:43.940: I/recrtsp_jni(28905): a=type:broadcast
07-25 19:26:43.940: I/recrtsp_jni(28905): a=control:*
07-25 19:26:43.940: I/recrtsp_jni(28905): a=source-filter: incl IN IP4 * 192.168.3.9
07-25 19:26:43.940: I/recrtsp_jni(28905): a=rtcp-unicast: reflection
07-25 19:26:43.940: I/recrtsp_jni(28905): a=range:npt=0-
07-25 19:26:43.940: I/recrtsp_jni(28905): a=x-qt-text-nam:without audio
07-25 19:26:43.940: I/recrtsp_jni(28905): a=x-qt-text-inf:group
07-25 19:26:43.940: I/recrtsp_jni(28905): m=video 6032 RTP/AVP 96
07-25 19:26:43.940: I/recrtsp_jni(28905): c=IN IP4 232.16.229.111/255
07-25 19:26:43.940: I/recrtsp_jni(28905): b=AS:12000
07-25 19:26:43.940: I/recrtsp_jni(28905): a=rtpmap:96 H264/90000
07-25 19:26:43.940: I/recrtsp_jni(28905): a=fmtp:96 packetization-mode=1;profile-level-id=4D4029;sprop-parameter-sets=Z01AKZpmCg/YC1AQEBBenAA=,aO48gA==
07-25 19:26:43.940: I/recrtsp_jni(28905): a=control:track1
I am sure that my router supports IP multicast, and when I used testRTSPClient on Windows it received the multicast video data fine.
After searching for similar topics, it seems that I first have to acquire a Wi-Fi multicast permission/lock. So I did the following:
- Put these permissions in AndroidManifest.xml:

<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE"/>  
<uses-permission android:name="android.permission.INTERNET"/>  
<uses-permission android:name="android.permission.CHANGE_WIFI_MULTICAST_STATE"/>
- Put the following Java code in the Android Activity's onCreate():

WifiManager wm = (WifiManager)getSystemService(Context.WIFI_SERVICE);  
if( wm != null ) {  
    mMCLock = wm.createMulticastLock( TAG );  
    mMCLock.acquire();  
}

But it still doesn't work. I have read the FAQ and gone through the live555 mailing list archives, finding nothing helpful. Could you help me?
Ross Finlayson | 23 Jul 19:43 2015

Important "RTSPServer" bug fix; please update to the latest version of the "LIVE555 Streaming Media" code

If you make use of our “RTSPServer” class - including the “LIVE555 Media Server” or the “LIVE555 Proxy Server” - then you should upgrade ASAP to the latest version (2015.07.23) of the “LIVE555 Streaming Media” code.  This version fixes a potential buffer overflow bug in the “RTSPServer” implementation.

(The RTSP client code is not affected.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

Maxim P. DEMENTIEV | 23 Jul 15:33 2015

Changing maxPacketSize value for H264VideoRTPSink

Hello,

Currently I'm working on improving an RTSP/RTCP/RTP server, based on the LIVE555 library, which transmits a live H.264 stream in HD resolution.

Trying to lower the server's CPU consumption (the hot spot was the MultiFramedRTPSink::sendNext callback), I increased the maxPacketSize value by calling setPacketSizes() in the subsession's createNewRTPSink() before returning. This solved the problem, but I worry about the consequences.
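
Concretely, the change looks roughly like this (a sketch: the subsession class name and the packet sizes are only examples):

    RTPSink* MyH264Subsession
    ::createNewRTPSink(Groupsock* rtpGroupsock,
                       unsigned char rtpPayloadTypeIfDynamic,
                       FramedSource* /*inputSource*/) {
      H264VideoRTPSink* sink =
          H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
      // Larger packets mean fewer sendNext() calls per frame, but anything above
      // the path MTU (~1500 bytes on Ethernet) will be IP-fragmented.
      sink->setPacketSizes(4000 /* preferred */, 8000 /* max */); // example values only
      return sink;
    }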

So, I've got some questions:

1. What is a safe range for maxPacketSize with regard to modern RTSP/RTP clients and local networks?

2. Can you give some advice about balancing network load/bandwidth against RTP packet size in the context of the LIVE555 library?

3. Are there other ways to reduce the CPU consumption by the
Fragmenter/Sink?

Thanks in advance!

Regards,
Max
