Jannik Schacht | 29 Jun 16:38 2016

http://live555.com/liveMedia/#config-windows with Visual Studio 2015 Pro.

Dear Live555 Support,

 

I am interested in using the openRTSP program for some scheduled video streaming tasks. My problem is that I have no idea how to open it in Visual Studio 2015 Professional in order to build a .exe.

First I opened “win32config” and changed TOOLS32 to “C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC”.  Then I created the makefiles as described in http://live555.com/liveMedia/#config-windows .

Then I searched for an open/import function in Visual Studio 2015 to open these, but didn’t find one. Then I read about nmake and tried “nmake <projdir>\UsageEnvironment\UsageEnvironment.mak”.

After reading that “ntwin32.mak” and ”win32.mak” were missing and adding them (from Microsoft’s Windows 7.1 SDK), I hit my current problem, which is “NMAKE : fatal error U1073: don't know how to make 'include/UsageEnvironment_version.hh'”

I didn’t change anything in the makefile (I am using live555-latest.tar.gz, last modified 26.06.2016), and the include/UsageEnvironment_version.hh file is present.

So I don’t know why I get that error.

Could you please explain how I can import the necessary files for openRTSP into Visual Studio 2015, or make the makefiles work with nmake, or send me a compiled .exe version of the openRTSP program?
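For reference, the sequence I understand I should be running from a “VS2015 x86 Native Tools” command prompt (after editing win32config) is roughly the following; please correct me if the order or file names are wrong. I also wonder whether my U1073 error simply comes from running nmake from the wrong working directory, since include/UsageEnvironment_version.hh is a relative path:

cd live                            (the unpacked live555 source directory)
genWindowsMakefiles.cmd
cd UsageEnvironment
nmake /f UsageEnvironment.mak
cd ..\groupsock
nmake /f groupsock.mak
cd ..\BasicUsageEnvironment
nmake /f BasicUsageEnvironment.mak
cd ..\liveMedia
nmake /f liveMedia.mak
cd ..\testProgs
nmake /f testProgs.mak             (this should build openRTSP.exe, among the other test programs)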

 

 

Best Regards,

 

Jannik Schacht

Software Engineer KM Embient

Subsea Monitoring

 

Jannik.Schacht <at> km.kongsberg.com

www.km.kongsberg.com

 

 

 


_______________________________________________
live-devel mailing list
live-devel <at> lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
Maxim P. DEMENTIEV | 29 Jun 17:54 2016

ServerMediaSession::numSubsessions() and fNumStreamStates in RTSPServer::RTSPClientSession::handleCmd_SETUP()

Hi,

A loop with a ServerMediaSubsessionIterator is used in
RTSPServer::RTSPClientSession::handleCmd_SETUP() to compute the value
of fNumStreamStates.
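For reference, the loop I mean looks roughly like this (paraphrased from RTSPServer.cpp, not a verbatim copy):

if (fStreamStates == NULL) {
  // First "SETUP" for this client session: count the subsessions (tracks)
  // by walking them with an iterator, then allocate one stream state per track.
  ServerMediaSubsessionIterator iter(*fOurServerMediaSession);
  for (fNumStreamStates = 0; iter.next() != NULL; ++fNumStreamStates) {}
  fStreamStates = new struct streamState[fNumStreamStates];
  // ...
}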

I wonder why ServerMediaSession::numSubsessions() isn't used for this instead.
Could the two approaches give different results?

Regards,
Max
Ben Rush | 27 Jun 23:20 2016

Is it valid to call startPlaying on a MediaSink AFTER TaskScheduler->doEventLoop()?

I hope I'm not asking too many questions on this group. I *am* searching through the archives, but I'm not always finding the answers I need. 

Anyway, standard setup: 

_taskScheduler = BasicTaskScheduler::createNew();
_usageEnvironment = BasicUsageEnvironment::createNew(*_taskScheduler);
.....
_taskScheduler->doEventLoop(&_volatile);

I have walked through the server code some, and I think the answer is yes, but is it valid to "attach" an RTPSink (SimpleRTPSink, for example) by calling 

_sink->startPlaying(...)

AFTER doEventLoop() has been called? As long as it has been created with the same BasicUsageEnvironment that wraps my BasicTaskScheduler, it should then be hooked up and start executing. Correct? 

And if that's okay to do, are there certain points in the pipeline where doing so would result in the sink NOT playing correctly? I'd like to start sending RTP packets back to the source AFTER I start receiving packets from said source. So in my MediaSink, in afterGettingFrame, I check that I haven't already done so, and then I spin up an instance of a SimpleRTPSink and call ->startPlaying (to stream back to the source).

Am I doing this at the right points in the pipeline? 
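Concretely, what I have in mind is something like this (just a sketch; the _return* member names are my own):

void MySink::afterGettingFrame(void* clientData, unsigned frameSize,
                               unsigned numTruncatedBytes,
                               struct timeval presentationTime,
                               unsigned durationInMicroseconds) {
  MySink* sink = (MySink*)clientData;
  if (sink->_returnSink == NULL) {
    // First frame received; we're now inside the event loop, so create the
    // return-path RTP sink and start it playing back toward the source:
    sink->_returnSink = SimpleRTPSink::createNew(sink->envir(), sink->_returnGroupsock,
                                                 11 /*payload format*/, 44100, "audio", "L16", 1);
    sink->_returnSink->startPlaying(*sink->_returnSource, NULL, NULL);
  }
  // ... handle the received frame, then ask for the next one ...
}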
_______________________________________________
live-devel mailing list
live-devel <at> lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
Ben Rush | 27 Jun 18:35 2016

Re: Strange crashing "bug" (maybe?)

Resolved. This is NOT a bug in Live555 but simply a misunderstanding on my part of how to use the Groupsocks. This is MY fault, NOT a fault in the Live555 code. 

I need to change the InitializeRoomClientAudioChannel method to use heap-allocated Groupsock2 pointers instead of passing the addresses of the stack-based objects. Those addresses are held onto internally, and obviously when the stack unwinds they are blown away. In other words, SimpleRTPSource::createNew does not copy the Groupsock by value; it holds onto the address, which is perfectly fine. 
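In other words, the groupsocks have to outlive the setup call, e.g. like this (a sketch using the stock Groupsock class; Groupsock2 is my own wrapper, and _rtpGroupSock/_rtcpGroupSock would be new pointer members of the class):

// Heap-allocate the groupsocks so their addresses stay valid after
// InitializeRoomClientAudioChannel() returns:
_rtpGroupSock  = new Groupsock(*_usageEnvironment, address, rtpPort, 1 /*ttl*/);
_rtcpGroupSock = new Groupsock(*_usageEnvironment, address, rtcpPort, 1 /*ttl*/);

_audioRTPSource = SimpleRTPSource::createNew(*_usageEnvironment, _rtpGroupSock,
                                             payloadFormatCode, fSamplingFrequency,
                                             "audio/L16", 0, False /*no 'M' bit*/);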

Sorry for the unnecessary post. 

On Mon, Jun 27, 2016 at 10:57 AM Ben Rush <ben <at> ben-rush.net> wrote:
This has happened to me a couple times while coding up things and I'd like to see if it's something stupid I'm doing, or whether I've uncovered a bug in Live555. 

I've got a class called "RoomClientStreamingServer" which does RTSP + RTP audio. Here is the header file for it: 

class RoomClientStreamingServer : public IRoomClient
{
public:
    RoomClientStreamingServer();
    ~RoomClientStreamingServer();
    void InitializeServer(); 
    bool InitializeRoomClientAudioChannel(int roomClientAudioListenPort); 
    bool InitializeRoomClientRTSPChannel(int rtspPort, std::string streamName, bool doAudio); 
    void StartServer(); 
    virtual void StopServer() override; 
private:
    TaskScheduler* _taskScheduler; 
    BasicUsageEnvironment* _usageEnvironment; 

private://audio channel
    SpeakerSink* _speakerSink; 
    RTPSource* _audioRTPSource; 
    RTCPInstance* _audioRTCPSource;

private://RTSP channel
    RTSPServer* _rtspServer; 
    ServerMediaSession* _sms; 
    H264LiveServerMediaSession *_videoSubSession; 
    WindowsAudioMediaSession* _audioSubSession; 
    char _volatile; 
};

I use it like this: 

    RoomClientStreamingServer* roomClient = new RoomClientStreamingServer(); 
    roomClient->InitializeServer(); 
    roomClient->InitializeRoomClientAudioChannel(roomClientAudioListenPort); 
    roomClient->InitializeRoomClientRTSPChannel(rtspPort, streamName, doAudio); 

    roomClient->StartServer(); 

The weirdness happens when calling StartServer. StartServer has a very simple implementation: 

void RoomClientStreamingServer::StartServer()
{
    _taskScheduler->doEventLoop(&_volatile);
}

This bombs with the following error: 

"BasicTaskScheduler::SingleStep(): select() fails: No error
socket numbers used in the select() call: 812(r) 816(r)"

It appears to be a problem with the InitializeRoomClientAudioChannel() call because I can duplicate this error if I comment out the InitializeRoomClientRTSPChannel(). 

Here is the InitializeRoomClientAudioChannel() implementation: 

    _speakerSink = SpeakerSink::createNew(*_usageEnvironment, true, nullptr, false);

    unsigned int rtpPortNum = roomClientAudioListenPort; // StreamingOptions::RoomClientListenPort;
    unsigned int rtcpPortNum = rtpPortNum + 1;
    char* ipAddress = "0.0.0.0";
    //char* ipAddress = "239.255.42.42"; 

    struct in_addr address;
    address.S_un.S_addr = our_inet_addr(ipAddress);
    const Port rtpPort(rtpPortNum);
    const Port rtcpPort(rtcpPortNum);

    Groupsock2 rtpGroupSock(*_usageEnvironment, address, rtpPort, 1);
    Groupsock2 rtcpGroupSock(*_usageEnvironment, address, rtcpPort, 1);

    _speakerSink->SetGroupSocks(&rtpGroupSock, &rtcpGroupSock);

    //RTPSource* rtpSource = WaveFormDataStreamer::createNew(*environment, &rtpGroupSock);
    int payloadFormatCode = 11;
    const char* mimeType = "L16";
    int fSamplingFrequency = 44100;
    int fNumChannels = 1;
    _audioRTPSource = SimpleRTPSource::createNew(
        *_usageEnvironment, (Groupsock*)&rtpGroupSock, payloadFormatCode,
        fSamplingFrequency, "audio/L16", 0, False /*no 'M' bit*/);

    const unsigned maxCNAMElen = 100;
    unsigned char CNAME[maxCNAMElen + 1];
    gethostname((char*)CNAME, maxCNAMElen);
    CNAME[maxCNAMElen] = '\0'; // just in case

    _audioRTCPSource =
        RTCPInstance::createNew(*_usageEnvironment, (Groupsock*)&rtcpGroupSock, 5000, CNAME, NULL, _audioRTPSource);
    _audioRTCPSource->setByeHandler(subsessionByeHandler, _speakerSink);
    _speakerSink->startPlaying(*_audioRTPSource, afterAudioListening, NULL);
    //_taskScheduler->doEventLoop(&_volatile);
    return true; 

If you'll notice I have _taskScheduler->doEventLoop(&_volatile); commented out because, if I uncomment this line, then I no longer get the crash. 

It's almost as if _taskScheduler->doEventLoop() needs to be called on the same stack as the rest of the setup code. Again, I've encountered situations like this before when trying to separate the _taskScheduler->doEventLoop() call into another method. 

One final bit of information: if I comment out 
InitializeRoomClientAudioChannel and just let the RTSP stuff go (the webcam), with the _taskScheduler->doEventLoop() in its own method, then everything works fine. 

I recognize this might be a bit tough to follow over email, so if you need source that's fine. However, I also recognize people on this group are busy and so fundamentally I'd like to at least understand what might be the cause of the 

"BasicTaskScheduler::SingleStep(): select() fails: No error
socket numbers used in the select() call: 812(r) 816(r)" 

error if nothing else.

If I figure out what I'm doing wrong before I get a response here, then I'll send out another email to the group for posterity's sake.  
_______________________________________________
live-devel mailing list
live-devel <at> lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
Gilles Chanteperdrix | 26 Jun 09:12 2016

"low latency" mpeg transport stream streaming

Hi,

I have another problem with streaming an MPEG transport stream. When
chaining an MPEG2TransportStreamFromESSource (or in fact any
FramedSource object producing MPEG TS packets) with a SimpleRTPSink,
the SimpleRTPSink will wait until it has a full packet before sending
data (with my settings, 6 MPEG TS packets). This is a problem for
"low latency" video applications where we want a video frame to be
sent as soon as possible: if a frame generates a number of TS packets
that is not a multiple of 6, the last remaining TS packets stay stuck
in the SimpleRTPSink buffer until SimpleRTPSink gets enough data to
send another packet, so at minimum until the next frame (and it can
in fact be much worse if the frames are, say, 8-byte H264 black frames).

I understand that this usage is a bit specific, so I have tried to
solve it by deriving from the live555 classes. I derived from
SimpleRTPSink an RTP sink class that overrides the
frameCanAppearAfterPacketStart method so that, if its source object is
a MPEG2TransportStreamMultiplexor, it returns True only if that object
has data ready, namely if the next call to doGetNextFrame() would not
call awaitNewBuffer(), that is, if:
fInputBufferBytesUsed < fInputBufferSize.
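Roughly, the derived sink looks like this (only a sketch; hasDataReady() stands for the accessor I would need, which MPEG2TransportStreamMultiplexor does not currently expose):

class LowLatencySimpleRTPSink: public SimpleRTPSink {
  // createNew()/constructor omitted; same parameters as SimpleRTPSink
protected:
  // Allow another frame to be appended to the current outgoing packet only while
  // the upstream multiplexor still has buffered TS data; otherwise the packet is
  // sent out immediately with whatever it already contains.
  virtual Boolean frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
                                                 unsigned /*numBytesInFrame*/) const {
    MPEG2TransportStreamMultiplexor* source
      = (MPEG2TransportStreamMultiplexor*)fSource; // our source is known to be a multiplexor
    return source != NULL && source->hasDataReady(); // i.e. fInputBufferBytesUsed < fInputBufferSize
  }
};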

The reason why I am contacting this list is that I did not find a
way to do this without modifying the MPEG2TransportStreamMultiplexor
class. Do you see any?

Thanks in advance.
Regards.

--

-- 
					    Gilles.
https://click-hack.org
Ben Rush | 24 Jun 21:25 2016

Best way to "signal" RTPSink to stop playing

I'm using SimpleRTPSink to send audio to a client, and it works great. I have it set up like this: 

        SimpleRTPSink* sink = SimpleRTPSink::createNew(this->envir(), rtpGroupsock,
            payloadFormatCode, fSamplingFrequency,
            "audio", mimeType, fNumChannels);
.....
        RTCPInstance::createNew(this->envir(), _rtcpSock,
            estimatedSessionBandwidth, CNAME,
            sink, NULL /* we're a server */, False);

.....
        AudioInputDevice *audioSource = AudioInputDevice::createNew(this->envir(), 0, bitsPerSample,
            numChannels, samplingFrequency);
        FramedSource* swappedSource = EndianSwap16::createNew(this->envir(), audioSource);

        Boolean started = sink->startPlaying(*swappedSource, nullptr, sink);

I'm using the RTPSink/Source so I can implement two-way audio (full duplex). What I'd like to do is properly shut things down when one or both ends of the two-way audio disconnect or decide to no longer participate in the call. What is the best way to do this using the Live555 libraries? 

I know that to actually STOP playing I just call sink->stopPlaying(); but I need to know when to call it. 

I'm still pretty green when it comes to RTP/RTCP, so I'm not sure if this is something supported, or if I'm going to need to implement my own communication channel (my own socket connection just for control).

I'm assuming that's what RTCPInstance affords me, though. 

        RTCPInstance::createNew(this->envir(), _rtcpSock,
            estimatedSessionBandwidth, CNAME,
            sink, NULL /* we're a server */, False);

I see I have the ability to call sendBYE() via the RTCPInstance. Do I do this from the client to initiate a shutdown with the sender? 
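In other words, something like this (a sketch; the handler name and client data are mine):

// Handler invoked when the other end sends an RTCP "BYE":
void onByeReceived(void* clientData) {
  MediaSink* sink = (MediaSink*)clientData;
  sink->stopPlaying(); // stop our half of the call, then do any other teardown needed
}

// On the receiving side, register the handler on the RTCPInstance:
rtcpInstance->setByeHandler(onByeReceived, sink);

// On the side that wants to leave the call:
rtcpInstance->sendBYE();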

Thanks in advance. 
_______________________________________________
live-devel mailing list
live-devel <at> lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
Saskia van Emst | 23 Jun 09:40 2016

RTSP Pause command not implemented

Hi Ross,

So I contacted the camera manufacturer (Hikvision) and asked them about the PAUSE command, and they said:

" If you directly send Pause command to our device, it will not active, because it is real time decoding."

Then I looked up the method definitions in the RFC (https://tools.ietf.org/html/rfc2326#page-29)
which says:

" Notes on Table 2: PAUSE is recommended, but not required in that a
   fully functional server can be built that does not support this
   method, for example, for live feeds."

Would it be hard to change the proxy behavior to disconnect instead of sending PAUSE when the last client
disconnects? Perhaps via a command-line switch to indicate that the back-end server does not support PAUSE.

What I did try was simply replacing the sendPauseCommand with sendTeardownCommand, and then the data
stream is stopped, but that creates issues because the proxy afterwards tries to resume the session.

Best regards,

Saskia

---------------------------------------------------------------

Message: 5
Date: Thu, 16 Jun 2016 02:26:42 -0700
From: Ross Finlayson <finlayson <at> live555.com>
To: LIVE555 Streaming Media - development & use
	<live-devel <at> ns.live555.com>
Subject: Re: [Live-devel] Proxy server - stop stream from back-end if
	there	are no viewers
Message-ID: <F504B75F-480F-4D4A-AE38-9F0D7C0E819C <at> live555.com>
Content-Type: text/plain; charset=utf-8

> Any ideas what could cause the PAUSE command not to work while it is listed?

No.  You'll have to ask the manufacturer of the IP camera about this.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Gilles Chanteperdrix | 22 Jun 17:36 2016

Buffer size in MPEG2TransportStreamFromESSource

Hi Ross,

I use the MPEG2TransportStreamFromESSource class to stream H264
frames using an MPEG transport stream. It happens that the encoder
I use has some frames larger than the 100 KB buffer size.

I know you have repeated many times on this list that we should
configure the encoder to use slices instead of large frames, but
since you provide OutPacketBuffer::maxSize to increase the default
buffer size of RTPSink objects, I thought it would make sense to have
a way to change the size of the MPEG2TransportStreamFromESSource
objects' buffer too.

So, here is a patch adding a parameter to the class
addNewVideoSource and addNewAudioSource methods, to do just that.
The parameter has a default value of 0, which is converted
internally to INPUT_BUFFER_SIZE, so as to avoid breaking existing
users of this class.

Thanks in advance.
Regards.

diff -Naurdp live.2016.05.20.orig/liveMedia/MPEG2TransportStreamFromESSource.cpp live.2016.05.20/liveMedia/MPEG2TransportStreamFromESSource.cpp
--- live.2016.05.20.orig/liveMedia/MPEG2TransportStreamFromESSource.cpp	2016-05-20 02:11:05.000000000 +0200
+++ live.2016.05.20/liveMedia/MPEG2TransportStreamFromESSource.cpp	2016-06-09 15:21:58.000000000 +0200
@@ -32,6 +32,7 @@ class InputESSourceRecord {
 public:
   InputESSourceRecord(MPEG2TransportStreamFromESSource& parent,
 		      FramedSource* inputSource,
+		      unsigned bufferSize,
 		      u_int8_t streamId, int mpegVersion,
 		      InputESSourceRecord* next, int16_t PID = -1);
   virtual ~InputESSourceRecord();
@@ -80,16 +81,16 @@ MPEG2TransportStreamFromESSource* MPEG2T
 }

 void MPEG2TransportStreamFromESSource
-::addNewVideoSource(FramedSource* inputSource, int mpegVersion, int16_t PID) {
+::addNewVideoSource(FramedSource* inputSource, int mpegVersion, int16_t PID, unsigned bufferSize) {
   u_int8_t streamId = 0xE0 | (fVideoSourceCounter++&0x0F);
-  addNewInputSource(inputSource, streamId, mpegVersion, PID);
+  addNewInputSource(inputSource, streamId, mpegVersion, PID, bufferSize);
   fHaveVideoStreams = True;
 }

 void MPEG2TransportStreamFromESSource
-::addNewAudioSource(FramedSource* inputSource, int mpegVersion, int16_t PID) {
+::addNewAudioSource(FramedSource* inputSource, int mpegVersion, int16_t PID, unsigned bufferSize) {
   u_int8_t streamId = 0xC0 | (fAudioSourceCounter++&0x0F);
-  addNewInputSource(inputSource, streamId, mpegVersion, PID);
+  addNewInputSource(inputSource, streamId, mpegVersion, PID, bufferSize);
 }

 MPEG2TransportStreamFromESSource
@@ -146,10 +147,10 @@ void MPEG2TransportStreamFromESSource

 void MPEG2TransportStreamFromESSource
 ::addNewInputSource(FramedSource* inputSource,
-		    u_int8_t streamId, int mpegVersion, int16_t PID) {
+		    u_int8_t streamId, int mpegVersion, int16_t PID, unsigned bufferSize) {
   if (inputSource == NULL) return;
-  fInputSources = new InputESSourceRecord(*this, inputSource, streamId,
-					  mpegVersion, fInputSources, PID);
+  fInputSources = new InputESSourceRecord(*this, inputSource, bufferSize,
+					  streamId, mpegVersion, fInputSources, PID);
 }

@@ -158,11 +159,12 @@ void MPEG2TransportStreamFromESSource
 InputESSourceRecord
 ::InputESSourceRecord(MPEG2TransportStreamFromESSource& parent,
 		      FramedSource* inputSource,
+		      unsigned bufferSize,
 		      u_int8_t streamId, int mpegVersion,
 		      InputESSourceRecord* next, int16_t PID)
   : fNext(next), fParent(parent), fInputSource(inputSource),
     fStreamId(streamId), fMPEGVersion(mpegVersion), fPID(PID) {
-  fInputBuffer = new unsigned char[INPUT_BUFFER_SIZE];
+  fInputBuffer = new unsigned char[bufferSize ? bufferSize : INPUT_BUFFER_SIZE];
   reset();
 }

diff -Naurdp live.2016.05.20.orig/liveMedia/include/MPEG2TransportStreamFromESSource.hh live.2016.05.20/liveMedia/include/MPEG2TransportStreamFromESSource.hh
--- live.2016.05.20.orig/liveMedia/include/MPEG2TransportStreamFromESSource.hh	2016-05-20 02:11:05.000000000 +0200
+++ live.2016.05.20/liveMedia/include/MPEG2TransportStreamFromESSource.hh	2016-06-09 15:15:03.000000000 +0200
@@ -30,9 +30,9 @@ class MPEG2TransportStreamFromESSource:
 public:
   static MPEG2TransportStreamFromESSource* createNew(UsageEnvironment& env);

-  void addNewVideoSource(FramedSource* inputSource, int mpegVersion, int16_t PID = -1);
+  void addNewVideoSource(FramedSource* inputSource, int mpegVersion, int16_t PID = -1, unsigned bufferSize = 0);
       // Note: For MPEG-4 video, set "mpegVersion" to 4; for H.264 video, set "mpegVersion" to 5.
-  void addNewAudioSource(FramedSource* inputSource, int mpegVersion, int16_t PID = -1);
+  void addNewAudioSource(FramedSource* inputSource, int mpegVersion, int16_t PID = -1, unsigned bufferSize = 0);
       // Note: In these functions, if "PID" is not -1, then it (currently, just the low 8 bits)
       // is used as the stream's PID.  Otherwise (if "PID" is -1) the 'stream_id' is used as
       // the PID.
@@ -43,7 +43,7 @@ protected:
   virtual ~MPEG2TransportStreamFromESSource();

   void addNewInputSource(FramedSource* inputSource,
-			 u_int8_t streamId, int mpegVersion, int16_t PID = -1);
+			 u_int8_t streamId, int mpegVersion, int16_t PID = -1, unsigned bufferSize = 0);
   // used to implement addNew*Source() above

 private:

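With this patch applied, a caller whose encoder can produce large frames can then ask for a bigger ES buffer, e.g. (300000 is just an example value):

  MPEG2TransportStreamFromESSource* tsSource
    = MPEG2TransportStreamFromESSource::createNew(*env);
  tsSource->addNewVideoSource(videoESSource, 5 /*mpegVersion: H.264*/, -1 /*PID*/,
                              300000 /*bufferSize, in bytes*/);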
--

-- 
					    Gilles.
https://click-hack.org
leonardo.citraro | 16 Jun 21:03 2016

RTSPServer authentication is secure?

Dear Ross,

I am trying to understand whether the RTSPServer authentication (i.e. as in 
testOnDemandRTSPServer) is secure enough for my application. If I start 
a stream from the server to a client using digest authentication (so no 
password is sent over the internet), is it still possible to "sniff" and play 
the stream (e.g. with Wireshark)? If yes, how could I overcome this issue?

Thanks in advance
Best regards
Leonardo Citraro
Saskia van Emst | 15 Jun 16:33 2016

Proxy server - stop stream from back-end if there are no viewers

Hi,

Is it possible to have the proxy server close the stream from the source (IP camera) to the proxy if nobody is
accessing the proxy stream for that source?

Right now, when I start the proxy server, the IP camera stream is only accessed after I connect to the proxy.
However, when I close my connection, the proxy seems to keep streaming from the camera indefinitely. To
save bandwidth, it would be nice if the stream from the back-end could be closed automatically after some time.

Best regards,
Saskia van Emst
PROMONET Michel | 15 Jun 11:17 2016

TaskScheduler based on poll

Hi Ross,

 

I read this quite old thread http://lists.live555.com/pipermail/live-devel/2014-January/018017.html

 

This could be interesting for us, to overcome the limit of select() when file descriptor numbers go above 1024.

The method “void BasicTaskScheduler::SingleStep()” does two things:

-       calls select() and manages the fd_set / the file descriptors that are set

-       checks the event triggers and calls the related handlers

 

Basically, I would like to keep the trigger feature in another scheduler, based on poll(), that will extend BasicTaskScheduler0 (as BasicTaskScheduler does).

 

Do you think it would be helpful to move the trigger handling to the BasicTaskScheduler0 level and leave only the fd_set handling and its management at the BasicTaskScheduler level?
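Roughly what I have in mind is a skeleton like this (declarations only, assuming the trigger handling does move down to BasicTaskScheduler0):

#include <poll.h>
#include <vector>
#include "BasicUsageEnvironment0.hh"

class PollTaskScheduler: public BasicTaskScheduler0 {
public:
  static PollTaskScheduler* createNew();
protected:
  // Wait for I/O with poll() instead of select(), then run the delayed tasks and
  // (if they stay at the BasicTaskScheduler0 level) the triggered event handlers:
  virtual void SingleStep(unsigned maxDelayTime);

  // Keep a growable table of struct pollfd instead of the three fixed-size fd_sets:
  virtual void setBackgroundHandling(int socketNum, int conditionSet,
                                     BackgroundHandlerProc* handlerProc, void* clientData);
  virtual void moveSocketHandling(int oldSocketNum, int newSocketNum);
private:
  std::vector<struct pollfd> fPollFds;
};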

 

Thanks for your support.

 

Best Regards,

 

Michel.

 

 

 

 

_______________________________________________
live-devel mailing list
live-devel <at> lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
