Karol Zdancewicz | 25 Aug 08:29 2015

How to set Presentation Time correctly for audio frame in rtsp server?

Hello,
 
I have tried many approaches, but something still makes my audio freeze: it plays for about one second, then drops out for two. I want to be sure I am doing this properly:
 
I am serving audio over RTSP as PCMU/8000. Every second I supply one 8000-byte frame and send it. In Wireshark I can see those packets going out once per second, so the sending side looks OK. I am wondering whether fPresentationTime could be the problem. If I understand correctly, fDurationInMicroseconds should be one second when a frame is 8000 bytes, so I tried setting it to 1000000 because it is in microseconds. For fPresentationTime:
 
void AudioOutStreamSource::doGetNextFrame()
{
    if( m_buffFrames.Size() > 0 ) {
        deliverFrame();
    }
    else {
        gettimeofday(&m_tCurrentTime, NULL);
    }
}
 
in deliverFrame, before copying the buffer, I set: fPresentationTime = m_tCurrentTime;
 
The RTPSink is created as: SimpleRTPSink::createNew(envir(), rtpGroupsock, 0, 8000, "audio", "PCMU", 1);
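For reference, with PCMU at 8000 Hz an 8000-byte frame spans exactly one second, so fDurationInMicroseconds should be 1000000 and fPresentationTime should advance by exactly that much for each delivered frame; call gettimeofday() only for the very first frame, then increment the running value (calling gettimeofday() on every frame lets scheduling jitter leak into the timestamps). A standalone sketch of that arithmetic, assuming POSIX struct timeval (the helper name is mine, not a LIVE555 API):

```cpp
#include <sys/time.h>

// Advance a running presentation time by one frame's duration.
// For PCMU (8000 one-byte samples per second), an 8000-byte frame
// lasts exactly 1,000,000 microseconds.
static void advancePresentationTime(struct timeval& pt, unsigned durationUs) {
  pt.tv_usec += durationUs % 1000000;
  pt.tv_sec  += durationUs / 1000000;
  if (pt.tv_usec >= 1000000) {  // carry microseconds into seconds
    pt.tv_usec -= 1000000;
    ++pt.tv_sec;
  }
}
```

In deliverFrame() you would then set fPresentationTime from this running value and fDurationInMicroseconds = 1000000 before calling FramedSource::afterGetting(this).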
 
Do you see here any issues?
 
thanks in advance!
_______________________________________________
live-devel mailing list
live-devel <at> lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
王利强 | 24 Aug 12:24 2015

Excuse me

Dear Mr/ Mrs,

I have been studying live555 recently, and I have a question. I would like to use an RTSP client to actively push a video file to an RTSP server whose IP address is known. How should I use live555 for this? Could I modify testRTSPClient.cpp to do it?

Thank you and best regards.

Yours sincerely,
Liqiang Wang

    Liqiang Wang,  Algorithm Engineer
  HiRain Technologies, AE Chassis and Safety Department
  9F,Block D,Truth Plaza,No.7 Zhichun Rd.,Haidian Dist.,Beijing,P.R.China,100191
  Tel: 86-10-64840808-9276
  Fax: 86-10-64853661
E-mail: liqiang.wang <at> hirain.com
Web: http://www.hirain.com

Ross Finlayson | 22 Aug 05:01 2015

A Sneak Preview: WebRTC streaming (to web browsers, without plugins) using LIVE555

If you are using a WebRTC-compliant browser (e.g., recent versions of Chrome, Firefox, or Opera - but *not* Safari or Internet Explorer), then you may be interested in taking a look at our experimental WebRTC demo at

This is a 'sneak preview' of future LIVE555 technology for streaming real-time video (and/or audio) streams to web browsers, without requiring a browser 'plugin'.

To see the video, you will need to be running a WebRTC-compliant browser, and you must not be behind a firewall that blocks UDP packets.  (Being behind a NAT, however, should be OK, provided that it passes UDP packets.)

The technology behind the demo is described in more detail on the web page.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

Robert Smith | 20 Aug 11:57 2015

Provide multicast and unicast stream

The RTSP spec allows a server to provide multiple transport options for 
a single stream, e.g., unicast or multicast, but I can't figure out how 
to do this with the live555 library.

Currently I am creating two streams for the same (live) source and 
appending either '_unicast' or '_multicast' to the URL but I would 
prefer a single URL for the stream.

I'm surprised that I haven't been able to find anything relating to this, 
since I'd guess it is a common requirement. I did notice that the 
live555MediaServer doesn't support it, which suggests that the library 
doesn't support it either - yet openRTSP does support explicitly 
requesting a multicast stream.

Help appreciated,

Robert Smith.
Franco Miceli | 13 Aug 17:02 2015

Another H264 Live Input Issue

Subham,

I know this is an old topic, but I just found that GRID streaming with live555 does work with other players.

Have you tried with other players besides VLC?

My implementation (very similar to yours) did not play in VLC (the clock ticked but there was no image). Then I tried openRTSP.exe, and the file it saved could not be played by VLC either.

That is when I downloaded MPlayerOSX Extended and, to my surprise, it opened the file and played it. Moreover, it also played the RTSP stream.

My issue now is that the file openRTSP received displays correctly when I play it, but when I play the stream itself the frames seem to be displayed discretely: the clock advances in intervals of 6 seconds or so, and a lot of frames are shown in a single instant.

Does anyone know where I should look in order for the live feed to play correctly?

Regards

Franco Miceli
<at> Radixcast


Shubham,

At a first (quick) glance, I don't see anything obviously wrong here (apart from your use of an unprofessional email address, which is strongly discouraged on this mailing list). I notice that the first NALs that you deliver are SPS and PPS NAL units - which is important (and answers Marcin's question).

Because you're streaming from a live source, make sure that the "reuseFirstSource" parameter is True. Also, you should make sure that one (and only one) NAL unit (without start code) is delivered by your "GridSource" class in response to each "doGetNextFrame()" call. (Your code might already be doing this OK; I didn't look at it in detail.)

In any case, I suggest that you first use our "openRTSP" command-line RTSP client application (rather than VLC, which is not our software) to test your server. If everything is working correctly, "openRTSP" will give you a non-empty output file, which you can then rename to have a ".h264" filename suffix. If the file is correct, then VLC should be able to play it (as a file). Then, and only then, should you try using VLC as an RTSP client.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Usama Shah | 13 Aug 14:16 2015

Run a method when streaming starts for another client?

I need to send some extra data (`sendIframes`) when a new client connects to our RTSP server. This is what I have found and tried so far:

    env->taskScheduler().scheduleDelayedTask(5000000, (TaskFunc*)sendIframes, NULL); // failed
    // delaying is useless, as we need to send only when a new client connects

    EventTriggerId eId = env->taskScheduler().createEventTrigger((TaskFunc*)sendIframes);
    env->taskScheduler().triggerEvent(eId, NULL); // failed
    // how do we trigger it when a new client connects, or when streaming starts for that client?

    env->taskScheduler().setBackgroundHandling(0, SOCKET_READABLE|SOCKET_WRITABLE, (TaskScheduler::BackgroundHandlerProc*)&sendIframes, NULL); // how to use this?
    // this looks like the right way to do what we want, but how exactly?

    env->taskScheduler().doEventLoop(); // does not return
   
`doEventLoop()` handles new connections and sends the stream to new clients all by itself.

At the moment the only thing I can think of is to edit the Live555 source code:

    void StreamState::startPlaying(Destinations* dests, unsigned clientSessionId,
           TaskFunc* rtcpRRHandler, void* rtcpRRHandlerClientData,
           ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
           void* serverRequestAlternativeByteHandlerClientData) {
    //.....
    //run sendIframes();
    //.....
    }
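For what it's worth, rather than patching StreamState, the usual extension point is the virtual startStream() method of your OnDemandServerMediaSubsession subclass: it is invoked once per client when streaming begins, which is exactly the moment described above. A schematic of the override, with a stub base class standing in for the real OnDemandServerMediaSubsession (the real method takes many more parameters, which you would forward to the base implementation unchanged; sendIframes is the hook from the question):

```cpp
// Stub standing in for OnDemandServerMediaSubsession, whose virtual
// startStream() LIVE555 calls when a client's PLAY request is handled.
struct SubsessionBase {
  bool streaming = false;
  virtual ~SubsessionBase() {}
  virtual void startStream(unsigned /*clientSessionId*/) { streaming = true; }
};

struct MySubsession : SubsessionBase {
  unsigned lastIframeSession = 0;

  void startStream(unsigned clientSessionId) override {
    SubsessionBase::startStream(clientSessionId); // normal streaming setup first
    sendIframes(clientSessionId);                 // then the per-client extra data
  }

  void sendIframes(unsigned clientSessionId) {    // the hook from the question
    lastIframeSession = clientSessionId;
  }
};
```

This keeps the library sources untouched, and the hook naturally runs once per client session rather than once per server.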
Gerard Castillo Lasheras | 13 Aug 13:38 2015

Access to the RTPSink object in OnDemandServerMediaSubsession subclasses

Hi Ross,

I'm implementing statistics in our software (the liveMediaStreamer framework) and I'd like access to the RTPTransmissionStatsDB, but I don't see how to get the RTPSink object (which holds the RTPTransmissionStatsDB and its stats).

What is the proper way to get the RTPSink object associated with my OnDemandServerMediaSubsession subclasses? I've seen that OnDemandServerMediaSubsession has a friend class, StreamState, which holds the associated RTPSink, but I'm still not able to access it.
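One workaround (a sketch, not necessarily the intended API): since it's your OnDemandServerMediaSubsession subclass that constructs the sink inside createNewRTPSink(), you can simply record the pointer there before returning it, and read the stats database from the recorded sink later. Schematically, with a stub type in place of the real RTPSink (in LIVE555 the override also receives a Groupsock*, a payload type, and a FramedSource*):

```cpp
// Stub in place of LIVE555's RTPSink (which exposes the
// RTPTransmissionStatsDB via transmissionStatsDB()).
struct RTPSink {};

struct MySubsession {
  RTPSink* fLastCreatedSink = nullptr;  // remembered for statistics queries

  RTPSink* createNewRTPSink() {         // really e.g. SimpleRTPSink::createNew(...)
    RTPSink* sink = new RTPSink();
    fLastCreatedSink = sink;            // stash the pointer before handing it back
    return sink;
  }
};
```

Be aware that the library owns and eventually deletes the sink when the stream is torn down, so the stashed pointer must be treated as valid only while the stream is live.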


Thanks in advance,


Kind regards,
--------------------------------------------------------
 Gerard Castillo Lasheras
  Project Engineer
  Fundació i2CAT - Audiovisual Unit
  SkypeID: gerardcl85
  Telf.: +34.93.553.26.30 (530)
--------------------------------------------------------
Jeremiah Morrill | 13 Aug 03:59 2015

Static Initialization Issues

I've come across an issue where a BasicTaskScheduler0 gets statically initialized (run before main()) and the DelayQueue gets stuck in an infinite loop.

 

I've found that the problem is in DelayQueue.cpp, in the constructor of DelayQueue:

DelayQueue::DelayQueue()
  : DelayQueueEntry(ETERNITY) {
  fLastSyncTime = TimeNow();
}

 

Essentially, in a statically-initialized environment the ETERNITY constant may not have been set yet and is therefore "0". I'm not sure if this is a scenario you wish to support in your library, but one fix could look like this:

 

static DelayInterval getEternityDelayInterval()
{
    static const DelayInterval eternity(INT_MAX, MILLION - 1);
    return eternity;
}

DelayQueue::DelayQueue()
  : DelayQueueEntry(getEternityDelayInterval()) {
  fLastSyncTime = TimeNow();
}

 

In any case, if anyone comes across this issue, they'll know what is causing it. :)

 

-Jeremiah

Karol Zdancewicz | 10 Aug 15:23 2015

Question: RTSP server continuously sending packets that I supply

Hello,
I have a problem creating the server described in the subject line. What I have so far:
 
Class A, which initializes everything and starts the event loop.
Class B, which inherits from OnDemandServerMediaSubsession, with createNewRTPSink() and createNewStreamSource() redefined to return my class C.
Class C, which inherits from AudioInputDevice.
 
Class C has a method in which I receive frames.
 
Could you help me find the best way to do this? I am somewhat confused. Right now I try to copy all the data I get in doGetNextFrame() (fTo, fFrameSize, etc.) and then call FramedSource::afterGetting(this). Is this even the right way of thinking? If so, I will need a mechanism to prevent afterGetting() from running before the previous frame has finished.
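Yes, copying into fTo, setting fFrameSize, and then calling FramedSource::afterGetting(this) is the right shape; the key missing piece is that doGetNextFrame() must simply return when no frame is buffered yet, and delivery must be completed later when data arrives (the DeviceSource.cpp model in the LIVE555 sources shows this with an event trigger). A standalone sketch of that hand-off, with plain types in place of the LIVE555 classes (MockSource and its members are illustrative only):

```cpp
#include <cstring>
#include <deque>
#include <vector>

// Schematic of the DeviceSource pattern: doGetNextFrame() delivers only
// when a frame is buffered; onFrameReceived() (your capture callback)
// completes a pending request. In LIVE555 the "delivered" step would be
// FramedSource::afterGetting(this).
struct MockSource {
  std::deque<std::vector<unsigned char> > buffered; // frames from the device
  unsigned char fTo[8000];                          // destination buffer (fTo)
  unsigned fFrameSize = 0;
  bool awaitingFrame = false;
  int framesDelivered = 0;

  void doGetNextFrame() {
    if (buffered.empty()) { awaitingFrame = true; return; } // no data yet
    deliverFrame();
  }

  void onFrameReceived(std::vector<unsigned char> frame) {
    buffered.push_back(frame);
    if (awaitingFrame) { awaitingFrame = false; deliverFrame(); }
  }

  void deliverFrame() {
    const std::vector<unsigned char>& f = buffered.front();
    fFrameSize = (unsigned)f.size();   // real code: clip against fMaxSize
    std::memcpy(fTo, f.empty() ? fTo : &f[0], fFrameSize);
    buffered.pop_front();
    ++framesDelivered;                 // real code: FramedSource::afterGetting(this)
  }
};
```

In real code onFrameReceived() runs on your capture thread, so it must not call deliverFrame() directly; instead it should call envir().taskScheduler().triggerEvent() so that delivery happens on the LIVE555 event-loop thread.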

thanks in advance for any help,
 
Karol
Alex Anderson | 6 Aug 18:04 2015

Two liveMedia components sharing Groupsock. Shutting down one stops the other

Hello,

I have implemented a multicast-out component using SimpleRTPSink and RTCPInstance, and a multicast-in component using SimpleRTPSource and RTCPInstance. When the multicast-in component is receiving from a multicast-out component on the same device, I have them share Groupsocks for RTP and RTCP.

The issue I'm having is that when I shut down one component, it seems to stop the other. The code tracks how many components own each Groupsock, so I know the Groupsock isn't being deleted.

Say for example I'm shutting down the multicast out component like so:

rtp_sink->stopPlaying();
Medium::close(framed_source);
Medium::close(rtcp_instance);
Medium::close(rtp_sink);

This causes the multicast-in component to stop working and stop receiving any more streams. Even if I re-instantiate the multicast-out component using the same shared Groupsocks, the multicast-out works just fine, but the multicast-in remains stopped (unless, of course, I re-instantiate it too). Keep in mind, this happens only when the multicast-in is receiving the multicast-out on the same device and the Groupsocks are shared.

The only link between the components is the shared Groupsocks. I've looked into the Live555 code, but I can't find how this sharing of Groupsocks would cause one component to be stopped when another is shut down.

Could you point out how exactly the sharing of Groupsocks could cause this? Better still, do you know a solution that still allows the Groupsocks to be shared?
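I can't answer the LIVE555-internals part, but one defensive pattern is to put the shared Groupsocks behind explicit shared ownership, so that neither component's shutdown path tears down the socket while the other still holds a reference; teardown happens only when the last owner lets go. A generic sketch (SharedSock and Component are hypothetical, not LIVE555 API):

```cpp
#include <memory>

// Hypothetical stand-in for a shared Groupsock. The actual socket
// teardown happens only in the destructor, which std::shared_ptr
// runs when the LAST owner drops its reference.
struct SharedSock {
  bool open = true;
  ~SharedSock() { open = false; }  // real code: close the underlying socket here
};

// Each component holds a reference; its shutdown path releases that
// reference only, and never closes the socket directly.
struct Component {
  std::shared_ptr<SharedSock> sock;
  explicit Component(std::shared_ptr<SharedSock> s) : sock(s) {}
  void shutdown() { sock.reset(); }
};
```

If the stoppage turns out to come from some close path inside the library acting on the shared Groupsock, this kind of wrapper at least guarantees your own code never closes it prematurely.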

Thanks!

Alexander Anderson

Nathan | 6 Aug 14:11 2015

RTSP server: how to correlate RTSPClientConnection and RTSPClientSession

Hello
 
We rely on a now-ancient version (2009-09-04) of LIVE555 Streaming Media library as the basis for the RTSP streaming server in our [open source] product. I'm trying to assess how feasible it would be to upgrade to the latest version of LIVE555 in the near future. Obviously there are challenges, for example the changeover from the ancient synchronous API to the "new" asynchronous API. I think I should be able to figure most things out given a bit of time. However, after a day of staring at code I've found one problem that I can't see how to solve.
 
As part of the product, we have a configuration section/panel that enables administrators to see the list of "clients" connected to the server. The list includes [for each "client"]: the client session ID, the client IP address, the server session ID that the client is accessing, and some additional metadata about the server session media. The administrator can choose to "kick" (drop) client sessions if they deem it appropriate.
 
In the version of LIVE555 that we're currently using, fClientAddr and fOurSessionId are encapsulated in a single RTSPClientSession class. Therefore the correlation between the two pieces of information is obvious, and we have no problem building the list. However, in the latest code these key pieces of information are split between RTSPClientSession and RTSPClientConnection. I can't see how to correlate the instances of these classes, especially since createNewClientSession() no longer provides the clientAddr parameter.
 
So, in short, my question is: how can I get the RTSPClientConnection instance associated with a given RTSPClientSession? If it isn't possible, can you point me to an explanation of why the design was changed in such a way as to make it impossible?
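Failing a direct accessor, one self-contained workaround is to do the bookkeeping yourself: wherever your server subclass learns about a new session and its originating connection, record the pairing in your own registry keyed by session ID, and drive the admin panel from that. A generic sketch of the registry only (all names here are hypothetical, not LIVE555 API; where exactly to populate it depends on which hooks the current library version exposes):

```cpp
#include <map>
#include <string>

// Hypothetical registry: populated by your own server-subclass code at the
// point where a client session is created, consumed by the admin panel.
struct ClientRegistry {
  std::map<unsigned, std::string> addrBySessionId;

  void onSessionCreated(unsigned sessionId, const std::string& clientAddr) {
    addrBySessionId[sessionId] = clientAddr;
  }
  void onSessionDestroyed(unsigned sessionId) {
    addrBySessionId.erase(sessionId);
  }
  // Returns an empty string for unknown sessions.
  std::string lookup(unsigned sessionId) const {
    std::map<unsigned, std::string>::const_iterator it = addrBySessionId.find(sessionId);
    return it == addrBySessionId.end() ? std::string() : it->second;
  }
};
```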

Thanks in advance!
