Xuan | 24 Jul 03:03 2014

How to combine live H.264 source and AAC source



From: Xuan [mailto:lkxkfl <at> hotmail.com]
Sent: 23 July 2014, 23:01
To: live-devel <at> lists.live555.com
Subject: How to combine live H.264 source and AAC source



  I am working on a test program based on live555. In my program, I use one thread to capture video frames from a webcam and another thread to capture audio frames. In each thread, the video and audio frames are encoded with ffmpeg. I then use a third thread to stream one of them.

  My problem is that streaming either of them alone works fine, but how can I stream them together? If I do the following in a single thread:

sms->addSubsession(PassiveServerMediaSubsession::createNew(*vSink, rtcp));
sms->addSubsession(PassiveServerMediaSubsession::createNew(*aSink, audioRtcp));

H264RealTimeStreamSource* naluSource =
    H264RealTimeStreamSource::createNew(*env_live, &pThis->videoCList2, frame_rate);
h264Source = H264VideoStreamDiscreteFramer::createNew(*env_live, naluSource);

adtsSource = ADTSRealTimeStreamSource::createNew(*env_live, &pThis->audioCList, 1, 44100, 2, NULL);

pThis->startVideoLive = vSink->startPlaying(*h264Source, afterPlayingLiveH264, NULL);
pThis->startAudioLive = aSink->startPlaying(*adtsSource, afterPlayingLiveAAC, NULL);

then I can only receive video frames in VLC, and the timestamp VLC displays is also rather unstable.

Am I doing something wrong? Or should I perhaps stream the H.264 and AAC in different threads rather than in one thread at the same time?


Thank you very much. Looking forward to a reply!



live-devel mailing list
live-devel <at> lists.live555.com
Jon Shemitz | 24 Jul 20:39 2014

Decode Secure RTP on Android, using MediaCodec + MediaCrypto?

I hope this isn’t spam: I’m using VLC to stream camera output over RTSP and, so far as I know, VLC relies on live555 code for that.


Anyhow (as per http://stackoverflow.com/questions/24920200/which-mediacrypto-uuid-do-you-use-with-secure-rtp) I’ve written some Android code that does a nice job decoding H.264-format RTSP video using MediaCodec. But when I have VLC use Secure RTP, I need to pass the MediaCodec a MediaCrypto instance … which requires a UUID and a byte array of “initialization data”.


Can anyone here point me in the right direction?

Marco Porsch | 24 Jul 18:34 2014

RFC4175 uncompressed video streaming: variable payload header length


I am currently trying to implement uncompressed video streaming according to RFC 4175. An issue I
struggle with is that the payload header is variable in length (see [1]). The header length depends on
the packet size, the image line length, and the current fragmentation offset. (I use some simplifications
concerning color sampling for now. Also, the payload header is not yet evaluated except for the
continuation flag.)

I can calculate the header length and write the header accordingly in my subclass'
doSpecialFrameHandling() function, which calls setSpecialHeaderBytes(). But on the receiver side I
always see image line offset artefacts, i.e. one or more lines shifted left/right.
The receiver code should be alright, as it just looks at the "continuation" flag to determine the header
length and skips the header by setting "resultSpecialHeaderSize" accordingly.

The issue seems to be in the order of events concerning doSpecialFrameHandling() and
specialHeaderSize(). My subclass' specialHeaderSize() is called from
MultiFramedRTPSink.cpp before the header is written in my doSpecialFrameHandling(). Thus, it seems I
have to predict the header size one fragment ahead? But in that case I do not yet know how large
"numBytesInFrame" is, which in turn seems to change depending on my previously written header sizes...

I also tried using setFrameSpecificHeaderBytes() and frameSpecificHeaderSize(), but the result was
just a totally garbled and twisted image on the receiver side.

Yes, this is all a bit confusing. Maybe I am just doing something totally wrong. Maybe you could clarify
how to cope with variable per-frame payload header lengths at all?
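One way to keep specialHeaderSize() and doSpecialFrameHandling() consistent is to make the header length a pure function of values known before packing — the sink's payload capacity, the bytes per image line, and the current offset into the frame — and call that same function from both places. A standalone sketch under those assumptions (not live555 API; the constants follow RFC 4175's 2-byte extended sequence number plus 6 bytes — Length, Line No, Offset — per line segment):

```cpp
#include <cassert>

// Sketch: compute the RFC 4175 payload header length for the NEXT packet
// from the payload capacity, the bytes per image line, the current offset
// within a line, and the lines still to send. Because it depends only on
// state known BEFORE packing, specialHeaderSize() and
// doSpecialFrameHandling() can both call it and will always agree.
static unsigned rfc4175HeaderSize(unsigned payloadCapacity,
                                  unsigned bytesPerLine,
                                  unsigned offsetInLine,
                                  unsigned linesRemaining) {
  unsigned const kBase = 2;   // extended RTP sequence number
  unsigned const kPerSeg = 6; // Length, Line No, Offset fields per segment
  if (payloadCapacity <= kBase) return kBase;
  unsigned avail = payloadCapacity - kBase;
  unsigned lineBytesLeft = bytesPerLine - offsetInLine;
  unsigned segments = 0;
  // Greedily count how many line segments fit in this packet: each segment
  // costs a 6-byte header entry plus at least one byte of pixel data.
  while (linesRemaining > 0 && avail > kPerSeg) {
    avail -= kPerSeg;
    unsigned take = (avail < lineBytesLeft) ? avail : lineBytesLeft;
    avail -= take;
    ++segments;
    if (take < lineBytesLeft) break; // line was cut short: packet is full
    --linesRemaining;
    lineBytesLeft = bytesPerLine;
  }
  return kBase + kPerSeg * segments;
}
```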

Marco Porsch


PS: Please let's not have a discussion on the craziness of uncompressed video streaming in high
resolution. I am aware that it is. =)
Michael Rahlff | 23 Jul 17:17 2014

Problem setting rtpmap on Flir A320 thermal camera.

Dear Sirs.
I am having trouble trying to get a thermal camera (Flir A320) to stream the raw data.

My goal is to receive raw data from a Flir A320 (rtpmap 102 [See below output] ).

I have successfully connected to and streamed MPEG4 encoded video from the camera with VLC, but as mentioned, I need to change to rtpmap 102 to get the raw data instead.

I have downloaded “live555” and looked at the testRTSPClient.cpp example, but cannot see where to set the rtpmap.

Is my understanding correct that I need to make my own class that inherits from the RTPSource class?

According to the Flir A320 manual, the transport format is as described in RFC 4175 (RTP payload format for uncompressed video).

Flir A320 SDP DESCRIBE response:
o=- 0 0 IN IP4
s=IR stream
i=Live infrared
c=IN IP4
m=video 0 RTP/AVP  96 97 98 99 100 102 103
a=rtpmap:96 MP4V-ES/90000
a=framesize:96 640-480
a=fmtp:96 profile-level-id=1;config=000001B003000001B509000001010000012002045D4C28A021E0A4C7
a=rtpmap:97 MP4V-ES/90000
a=framesize:97 320-240
a=fmtp:97 profile-level-id=1;config=000001B003000001B509000001010000012002045D4C285020F0A4C7
a=rtpmap:98 MP4V-ES/90000
a=framesize:98 160-128
a=fmtp:98 profile-level-id=1;config=000001B003000001B509000001010000012002045D4C282820A0A4C7
a=rtpmap:99 FCAM/90000
a=framesize:99 320-240
a=fmtp:99 sampling=mono; width=320; height=240; depth=16
a=rtpmap:100 FCAM/90000
a=framesize:100 160-120
a=fmtp:100 sampling=mono; width=160; height=120; depth=16
a=rtpmap:102 raw/90000
a=framesize:102 320-240
a=fmtp:102 sampling=mono; width=320; height=240; depth=16
a=rtpmap:103 raw/90000
a=framesize:103 160-120
a=fmtp:103 sampling=mono; width=160; height=120; depth=16
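Regarding where to "set" the rtpmap: the client does not set it; it chooses one of the payload formats the server's m= line offers, and as far as I recall live555's MediaSession::createNew() sets up the first format listed on the m= line — 96 here. One pragmatic workaround (a sketch under that assumption, not an official live555 API) is to edit the DESCRIBE result before handing it to MediaSession::createNew(), so the m= line offers only format 102:

```cpp
#include <cassert>
#include <string>

// Hypothetical helper (not part of live555): rewrite the SDP "m=video" line
// so that only the desired payload format remains; the edited description is
// then passed to MediaSession::createNew(), which sets up the matching
// a=rtpmap:/a=fmtp: lines for that format. (A real SDP uses CRLF line
// endings; this sketch keeps it simple with '\n'.)
static std::string keepOnlyPayloadFormat(std::string const& sdp,
                                         std::string const& fmt) {
  std::string::size_type m = sdp.find("m=video ");
  if (m == std::string::npos) return sdp;
  std::string::size_type eol = sdp.find('\n', m);
  if (eol == std::string::npos) eol = sdp.size();
  std::string line = sdp.substr(m, eol - m);
  std::string::size_type avp = line.find("RTP/AVP");
  if (avp == std::string::npos) return sdp;
  // Keep "m=video <port> RTP/AVP" and append only the wanted format.
  line = line.substr(0, avp + 7) + " " + fmt;
  return sdp.substr(0, m) + line + sdp.substr(eol);
}
```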

Thanks in advance

Best regards

Michael Rahlff
Holsteinsgade 35B
8300 Odder

Tlf: 407-408-22



RTSP range clock/npt SDP parse

Looking into the file liveMedia/MediaSession.cpp, I believe there is an
extra space in the parseRangeAttribute() method.

Instead of:
return sscanf(sdpLine, "a=range: npt = %lg - %lg", &startTime, &endTime) == 2;
I think it should read:
return sscanf(sdpLine, "a=range:npt=%lg - %lg", &startTime, &endTime) == 2;

Instead of:
int sscanfResult = sscanf(sdpLine, "a=range: clock = %[^-\r\n]-%[^\r\n]", as, ae);
I think it should read:
int sscanfResult = sscanf(sdpLine, "a=range:clock=%[^-\r\n]-%[^\r\n]", as, ae);

Could you confirm whether this is indeed the case? Or can you at least
support both forms?
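For reference, RFC 2326 writes the attribute without spaces ("a=range:npt=0-30.5"). One caveat before treating the spaces as the whole story: a literal blank in a scanf format matches any run of whitespace, including none, so the spaced format may in fact accept a spaceless line as well; still, the tightened format is unambiguous. A quick standalone check of it (not the live555 code itself):

```cpp
#include <cassert>
#include <cstdio>

// Standalone check of the tightened format string against a range line
// written the way RFC 2326 shows it, i.e. without spaces.
static int parseNptRange(char const* sdpLine, double* start, double* end) {
  return std::sscanf(sdpLine, "a=range:npt=%lg-%lg", start, end);
}
```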

Thank you,
Paulo Vitor
Cường Lê | 18 Jul 08:50 2014

RTSP Server: streaming a .mp4 file

I'm using your live555 library, and I like it very much.
I have built it and streamed .mkv, .h264, .m4v... files; they work very well.
But when I try to stream a .mp4 file, the VLC client opens the URL but cannot display it and reports an error.
Can you help me stream a .mp4 file with the RTSP server?
Thank you very much!
Alix Frombach | 17 Jul 13:26 2014

Using openRTSP for receiving MPEG TS


I am able to use the openRTSP client to receive H.264 and MP4 data from an IP camera fine, but cannot seem to find a way to receive an MPEG transport stream from the same camera.  Is this a supported feature of openRTSP?  I saw a method of converting the received H.264 video data to an MPEG transport stream using testH264VideoToTransportStream, but this is not desired, as I would like to capture video and audio simultaneously.

Thanks in advance
rajesh gupta | 16 Jul 11:21 2014

Streaming from live source

Dear All,
             Is it possible for Live555MediaServer to stream the data from my web camera?
Yann FLEUTOT | 15 Jul 17:25 2014

DoS in Media Server

Hello LIVE555 team,

Forging my own requests for testing purposes, I recently found a DoS vulnerability in the media server. Do you want me to give the details as well as an exploit script here on this mailing-list or privately?

Yann Fleutot
Stormshield Network Security developer
Arkoon Netasq
49 rue Billancourt - FR 92100 Boulogne-Billancourt

Twitter - LinkedIn - www.stormshield.eu

live-devel mailing list
live-devel <at> lists.live555.com
David Cassany Viladomat | 15 Jul 10:45 2014

use ServerMediaSubsession for RTP & RTCP streaming

Hi all,

I have some doubts on how to use the OnDemandServerMediaSubsession class. I am actually using it (we defined a very simple extension of it, adapted to our data source) to stream via RTSP without any issue; it works like a charm.

What I am now trying to achieve is to stream the same source via RTSP (to any client that connects to the RTSP URL) and also via a static RTP/RTCP session (by static I mean that I want the server to stream to a specific IP and port, passed as a command-line argument, for instance).

I want to stream to the static destination using the same subsession (this way I expect not to have to worry about concurrent RTPSinks getting frames from the same source). At the moment I am using:

subsession->getStreamParameters(clientSessionId, addr,
                                    port, port + 1, -1, 0, 0, dstAddr, destinationTTL,
                                    multicast, serverRTPPort, serverRTCPPort, streamState);

subsession->startStream(clientSessionId,
                            streamState, NULL, NULL, rtpSeqNum, rtpTimestamp, NULL, NULL);

// where addr and port are the IP address and port I want to stream to, multicast is False, TTL = 255, and streamState is just a pointer where I keep the StreamState reference for later use.

These two calls execute without failure, and stepping through them I think they run as expected, but afterwards nothing is streamed. I have the feeling I am missing something else needed to actually start streaming; does anyone have a clue?

Thanks in advance for any hint,
Marco Porsch | 14 Jul 13:51 2014

JPEG video with grayscale frames



I would like to report a bug concerning JPEG video streaming in live555.


I can successfully stream 3-component (color) images using a modified version of testOnDemandRTSPServer.cpp and custom JPEG source and subsession classes. I am able to successfully receive the stream using VLC media player. But if I encode my frames as 1-component (grayscale) images, VLC displays only nonsense.


When dumping my encoded JPEG images on the server side, they are alright. I tested the streamed data on the receiver side using “openRTSP” with the “-m” switch. The still images also contain only nonsense data.


Upon further examination I found that the JPEG headers on the receiver side claim to contain 3-component (24-bit) image data. Digging into the live555 sources, I found that in JPEGVideoRTPSource.cpp, in the function createJPEGHeader() (e.g. at line 235), the header is always constructed for 3-component images.
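To make the 3-component assumption concrete (an illustration of the JPEG format, not live555 code): the SOF0 segment that the receiver reconstructs carries the component count as a single byte (Nf) followed by one 3-byte entry per component, so a header builder hardwired to Nf = 3 cannot describe a grayscale frame, which needs Nf = 1:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Build a minimal JPEG SOF0 (baseline start-of-frame) segment. The single
// byte Nf selects between grayscale (1 component) and YCbCr (3 components),
// and the segment carries one 3-byte entry per component.
static std::vector<uint8_t> makeSOF0(uint16_t width, uint16_t height,
                                     uint8_t numComponents) {
  std::vector<uint8_t> sof;
  uint16_t len = 8 + 3 * numComponents; // segment length, incl. these 2 bytes
  sof.push_back(0xFF); sof.push_back(0xC0);           // SOF0 marker
  sof.push_back(len >> 8); sof.push_back(len & 0xFF);
  sof.push_back(8);                                   // sample precision
  sof.push_back(height >> 8); sof.push_back(height & 0xFF);
  sof.push_back(width >> 8);  sof.push_back(width & 0xFF);
  sof.push_back(numComponents);                       // Nf
  for (uint8_t c = 1; c <= numComponents; ++c) {
    sof.push_back(c);     // component id
    sof.push_back(0x11);  // 1x1 sampling (fine for grayscale)
    sof.push_back(0);     // quantization table id
  }
  return sof;
}
```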




Marco Porsch

Software engineer



Fon: +49(0)371 5347 760  ∙  Fax: +49(0)371 5347 761
m.porsch <at> intenta.de  ∙  http://www.intenta.de


Intenta GmbH  ∙  Annaberger Straße 240  ∙  09125 Chemnitz, Germany

Geschäftsführer: Dr.-Ing. Basel Fardi
HRB 26404 Amtsgericht Chemnitz

