kingaceck | 20 Oct 17:06 2014

use same UDP port to send all data?

I have a firewall between the server and the client. The firewall can only open a few ports, so the server must use the same UDP port.
I want to modify the "LIVE555 Streaming Media" source code as shown below. I do not know whether it is correct to do it this way; please help me.
All of the modifications are in OnDemandServerMediaSubsession::getStreamParameters():
1) To allow reuse of socket numbers, change line 140 from "NoReuse dummy(envir());" to "//NoReuse dummy(envir());" (i.e. comment it out).
2) To allow RTP and RTCP to use the same port, change line 157 from "serverRTCPPort = ++serverPortNum;" to "serverRTCPPort = serverPortNum;" (i.e. remove the ++ operation).
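
For context, below is a minimal standalone sketch (plain POSIX sockets, not live555 code) of the socket behavior that edit 1) relies on: if I recall correctly, the NoReuse object is what normally keeps live555 from setting SO_REUSEADDR on the newly created socket, and once that option is set, two UDP sockets may bind the same local port (whether the second bind succeeds is OS-dependent; on Linux it generally works for UDP). Port 6970 is just an illustrative value, not anything from my setup.

// Standalone sketch, not live555 code: check whether two UDP sockets can
// bind the same local port once SO_REUSEADDR is set (the option that the
// "NoReuse dummy(envir());" line normally suppresses).
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

static int makeReusableUdpSocket(uint16_t port) {
  int fd = socket(AF_INET, SOCK_DGRAM, 0);
  int on = 1;
  setsockopt(fd, SOL_SOCKET, SO_REUSEADDR, &on, sizeof on);
  sockaddr_in addr{};
  addr.sin_family = AF_INET;
  addr.sin_addr.s_addr = htonl(INADDR_ANY);
  addr.sin_port = htons(port);
  if (bind(fd, (sockaddr*)&addr, sizeof addr) < 0) {
    close(fd);
    return -1;
  }
  return fd;
}

int main() {
  int a = makeReusableUdpSocket(6970);  // e.g. the "RTP" socket
  int b = makeReusableUdpSocket(6970);  // e.g. the "RTCP" socket on the same port
  std::printf("first bind: %s, second bind: %s\n",
              a >= 0 ? "ok" : "failed", b >= 0 ? "ok" : "failed");
  if (a >= 0) close(a);
  if (b >= 0) close(b);
  return 0;
}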
 
I must use UDP to transport the data, and I can use only a few ports across all streams.
Again, I do not know whether this is the correct way to do it; please help me. Thank you very much.
 
kingaceck
Ross Finlayson | 20 Oct 16:43 2014

Re: Problem with MJPEG stream via proxy server

> When I test H.264 via live555ProxyServer, it works, but the MJPEG stream is broken (https://dl.dropboxusercontent.com/u/4188226/mjpeg_broken.png).
>
> The RTSP client is VLC and the back-end RTSP server is an IP camera.
>
> I am using the latest version.

Including the latest version of VLC: 2.1.5?

> Please help me.

Sorry, but VLC is not our software.  You should use our “openRTSP” client application for your initial testing: http://live555.com/openRTSP/

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

Alan Martinovic | 17 Oct 15:44 2014

Re: video/JPEG streaming

Hi Michael,

Do you have a sample file “test.mjpeg” that works with your example?


Alan Martinovic | 17 Oct 11:37 2014

Mechanism of passing data from source to sink

Hi,

I’ve read how the control flow mechanism works (http://www.live555.com/liveMedia/faq.html#control-flow)

and I’m trying to get the idea of how it works in practice by following the Elphel example (http://www.live555.com/Elphel/).

 

I’ve traced the actual payload to the fTo variable:

# ElphelJPEGDeviceSource.cpp, lines 95-96
// Then, the JPEG payload:
fFrameSize = fread(fTo, 1, fMaxSize, fFid);

 

 

But when grepping through all the *Sink* and *Session* files, I found no trace of fTo. Plus, fTo is protected.
At first I thought that, when triggered, the JPEGVideoRTPSink was getting the raw data by accessing FramedSource::fTo and then adding the RTP headers to that data (which represents the raw video).

 

By which method (variable?) do sinks and sources actually share the raw video data?
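
Or is it perhaps the other way around: the sink owns the output buffer and passes a pointer to it into FramedSource::getNextFrame(), the base class stores that pointer in fTo (and the available size in fMaxSize), the source's doGetNextFrame() fills it and records fFrameSize, and afterGetting() then invokes the sink's callback? A toy sketch of that guess (made-up class and names, not the real library code):

// Toy model of the source/sink handshake -- NOT the real live555 classes,
// just an illustration of how fTo could end up pointing at the sink's buffer.
#include <algorithm>
#include <cstdio>
#include <cstring>
#include <functional>
#include <vector>

class ToyFramedSource {
public:
  using AfterGetting = std::function<void(unsigned frameSize)>;

  // Mirrors the idea of FramedSource::getNextFrame(): the CALLER supplies the buffer.
  void getNextFrame(unsigned char* to, unsigned maxSize, AfterGetting after) {
    fTo = to;                  // the sink never needs to read fTo; it owns this memory
    fMaxSize = maxSize;
    fAfterGetting = std::move(after);
    doGetNextFrame();          // the subclass fills fTo and reports the size
  }

  virtual ~ToyFramedSource() = default;

protected:
  virtual void doGetNextFrame() {
    // A real device source (e.g. the Elphel example) would fread() or capture into fTo here.
    const char frame[] = "fake JPEG payload";
    fFrameSize = std::min<unsigned>(sizeof frame, fMaxSize);
    std::memcpy(fTo, frame, fFrameSize);
    fAfterGetting(fFrameSize); // plays the role of FramedSource::afterGetting()
  }

  unsigned char* fTo = nullptr;
  unsigned fMaxSize = 0;
  unsigned fFrameSize = 0;
  AfterGetting fAfterGetting;
};

int main() {
  ToyFramedSource source;
  std::vector<unsigned char> sinkBuffer(1500); // stand-in for the RTP sink's packet buffer
  source.getNextFrame(sinkBuffer.data(), (unsigned)sinkBuffer.size(),
                      [&](unsigned n) {
                        std::printf("sink got %u bytes: %.*s\n",
                                    n, (int)n, (const char*)sinkBuffer.data());
                      });
  return 0;
}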

 


Jan Ekholm | 16 Oct 21:19 2014

Proper use of StreamReplicator


Hi,

I've used Live555 for some prototypes for a while now and it's working quite well so far. My use case
is to act as a central video hub for a number of remote surveillance cameras as well as locally connected
USB cameras, and to serve H.264/MJPEG streams using unicast and multicast. Those scenarios more or
less work OK. The code isn't too pretty, but Live555 is quite hard to use.

Now I need to save the streams to disk too. There is the handy StreamReplicator class that should allow
me to save the streams to disk as well as stream them to clients, but I have not really understood how to use
it correctly. From what I've understood, I need to create one StreamReplicator for the source stream and
then call replicator->createStreamReplica() for the streaming as well as for the saving. Well, this does not work
at all, so I'm doing something wrong.

The class that handles a local USB camera and unicasts MJPEG is basically:

class LocalMJpegUnicastServerMediaSubsession : public OnDemandServerMediaSubsession {
...
protected:

    virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* inputSource);

private:
    CameraParameters m_cameraParameters;
    StreamReplicator * m_replicator;
    FileSink * m_saveSink;
};

FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
    // create and initialize a source for the camera. This is a JPEGVideoSource subclass that
    // captures, encodes and delivers JPEG frames.
    // It works fine as long as I do not try to use StreamReplicator.
    MJpegFramedSource *source = MJpegFramedSource::createNew( envir() );
    source->initialize( m_cameraParameters );

    m_replicator = StreamReplicator::createNew( envir(), source, False );

    return m_replicator->createStreamReplica();
}

RTPSink* LocalMJpegUnicastServerMediaSubsession::createNewRTPSink (Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
    return JPEGVideoRTPSink::createNew( envir(), rtpGroupsock );
}

When I use this ServerMediaSubsession and connect a client, the call sequence I see is:

LocalMJpegUnicastServerMediaSubsession::createNewStreamSource() 
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()
LocalMJpegUnicastServerMediaSubsession::createNewStreamSource() 
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()

However, nothing is delivered to the network; it's as if the stream doesn't start. I never see a call to
MJpegFramedSource::doGetNextFrame(), which is the overridden method for, well, getting the next
frame. No errors, no crashes and no data. If I now try to save the stream I do get something
saved, but I have not analyzed the file yet. The amount of data looks correct, though (a lot of
data very fast). To start saving I use code like this (simplified):

void LocalMJpegUnicastServerMediaSubsession::startSavingStream (const std::string & filename) {
    FramedSource * source = m_replicator->createStreamReplica();
    m_saveSink = FileSink::createNew( envir(), filename.c_str(), bufferSize );
    m_saveSink->startPlaying( *source, 0, 0 );
}

So I get no stream but a saved file.

If I change my createNewStreamSource() back to the following, it works fine for streaming:

FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
    MJpegFramedSource *source = MJpegFramedSource::createNew( envir() );
    source->initialize( m_cameraParameters );
    return source;
}

In this case there is no replicator at all and I cannot save the stream. Trying to later add a replicator to the
source and then use that with the FileSink leads to infinite recursion in doGetNextFrame(). But since, as I
understand it, I cannot use a replicator like this, that behavior is perhaps to be expected. In this case I get a
stream but no saved file.

So, how would one properly use StreamReplicator here so that I get both a stream and a saved file? Later I will also
need to be able to save streams from local cameras that are multicast, as well as from remote proxied cameras.
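
For reference, the arrangement I had in mind when reading testReplicator is that the replicator is created exactly once per capture source (rather than inside every createNewStreamSource() call, as above), and every consumer - each RTP stream and the FileSink - gets its own replica. This is an untested sketch based on my classes above, not something I have verified to work:

// Untested sketch only: create the replicator once (lazily), hand out replicas.
// MJpegFramedSource, CameraParameters and the subsession are my classes from above;
// m_replicator is assumed to be initialized to NULL in the constructor.
FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
    estBitRate = 500; // kbps, placeholder estimate
    if (m_replicator == NULL) {
        // create the capture source and its replicator only once, on first use
        MJpegFramedSource *source = MJpegFramedSource::createNew( envir() );
        source->initialize( m_cameraParameters );
        m_replicator = StreamReplicator::createNew( envir(), source, False );
    }
    // every RTP client gets its own replica of the single underlying source
    return m_replicator->createStreamReplica();
}

void LocalMJpegUnicastServerMediaSubsession::startSavingStream (const std::string & filename) {
    // the FileSink gets another replica of the same replicator
    FramedSource * replica = m_replicator->createStreamReplica();
    m_saveSink = FileSink::createNew( envir(), filename.c_str(), bufferSize ); // same bufferSize as before
    m_saveSink->startPlaying( *replica, 0, 0 );
}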

-- 
Jan Ekholm
jan.ekholm <at> d-pointer.com
PROMONET Michel | 16 Oct 15:30 2014

live555MediaServer segmentation violation when several RTSP clients access the same stream

            Hi Ross,

 

For the last couple of months, live555MediaServer has been exiting from time to time with a segmentation violation when it is used by several clients that ask for the same file.

One of the backtraces is:

#0  ServerMediaSession::duration (this=0x131a500) at ServerMediaSession.cpp:177
#1  0x0000000000412615 in ServerMediaSession::generateSDPDescription (this=0x131a500) at ServerMediaSession.cpp:243
#2  0x00000000004074f6 in RTSPServer::RTSPClientConnection::handleCmd_DESCRIBE (this=0x130c460,
    urlPreSuffix=<value optimized out>, urlSuffix=<value optimized out>,
    fullRequestStr=0x130c48c "DESCRIBE rtsp://127.0.0.1:8554/video.264 RTSP/1.0\r\nCSeq: 3\r\nUser-Agent: RTSP client (LIVE555 Streaming Media v2013.02.11)\r\nAccept: application/sdp\r\n\r\n") at RTSPServer.cpp:526
#3  0x0000000000405a30 in RTSPServer::RTSPClientConnection::handleRequestBytes (this=0x130c460, newBytesRead=150)
    at RTSPServer.cpp:990
#4  0x00000000004063c0 in RTSPServer::RTSPClientConnection::incomingRequestHandler1 (this=<value optimized out>)
    at RTSPServer.cpp:790
#5  0x00000000004063cf in RTSPServer::RTSPClientConnection::incomingRequestHandler (instance=<value optimized out>)
    at RTSPServer.cpp:783
#6  0x000000000044a762 in BasicTaskScheduler::SingleStep (this=0x1199010, maxDelayTime=<value optimized out>)
    at BasicTaskScheduler.cpp:171
#7  0x000000000044b8d5 in BasicTaskScheduler0::doEventLoop (this=0x1199010, watchVariable=0x0) at BasicTaskScheduler0.cpp:80
#8  0x00000000004020bb in main (argc=<value optimized out>, argv=<value optimized out>) at live555MediaServer.cpp:89

The others are similar: they are accessing a ServerMediaSession, sometimes while processing SETUP.

 

I tried commenting out, in DynamicRTSPServer.cpp, the removal of existing sessions (because I suspect it destroys a ServerMediaSession that is still being used by other RTSP connections):

  } else {
    if (smsExists && isFirstLookupInSession) {
      // Remove the existing "ServerMediaSession" and create a new one, in case the underlying
      // file has changed in some way:
//      removeServerMediaSession(sms);
//      sms = NULL;
    }

    if (sms == NULL) {
      sms = createNewSMS(envir(), streamName, fid);
      addServerMediaSession(sms);
    }

    fclose(fid);
    return sms;
  }

 

Obviously this changes the behavior, because the file is no longer re-read, but with this modification it does not seem to crash anymore.

 

Are there any restrictions on accessing the same stream served by live555MediaServer several times?

 

Thanks again for your support.

 

            Michel.

 


 

Sven Grossmann | 16 Oct 11:41 2014

RTSP stream of a webm file on Windows machines

Hello folks,

I'm trying to stream a webm file on a Windows machine via 
live555MediaServer, but unfortunately I can't get the RTSP stream to play 
in either VLC or Media Player Classic.
VLC logs that the audio and video codecs are undefined.

I am able to stream MPG files and play them via VLC; I am also able to 
stream and play the same webm files on an Ubuntu machine.

Are there any known issues with streaming webm files on Windows 
machines?

OS: Windows 7 SP1
latest Live555MediaServer
latest VLC Player
File: http://www.webmfiles.org/demo-files/
kingaceck | 15 Oct 08:19 2014

RTP over TCP

When I transport RTP packets over TCP using live555, I get many log messages like the ones below after playing for many minutes.
sendRTPorRTCPPacketOverTCP: failed! (errno 32)
sendRTPorRTCPPacketOverTCP: failed! (errno 32)
...
 
or
 
sendRTPorRTCPPacketOverTCP: failed! (errno 9)
sendRTPorRTCPPacketOverTCP: failed! (errno 9)
...
 
After that, the VLC client can't receive any RTP packets.
 
When I reconnect to the live555 server, the problem appears again.
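
For what it's worth, these errno values can be translated into readable messages with a trivial check like the one below; on Linux, 32 is usually EPIPE ("Broken pipe") and 9 is EBADF ("Bad file descriptor"), but the numbers are platform-specific, so it is worth checking on the machine that produced the log.

// Translate the numeric errno values from the live555 log into text.
#include <cstdio>
#include <cstring>

int main() {
  std::printf("errno 32: %s\n", std::strerror(32));
  std::printf("errno  9: %s\n", std::strerror(9));
  return 0;
}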
 
kingaceck
Nadir Raimondo | 14 Oct 15:22 2014

UDP packet loss

Dear all,

I'm trying to adapt testReplicator to restream a transport stream from a 
unicast address to a multicast one.
It works correctly if the UDP packet size in the input stream is fixed 
to 1316 bytes. The problem occurs if the packet size is allowed to 
change freely, taking values which are not multiples of 188 bytes (e.g. 
using ffmpeg as the input encoder/streamer). In such a case I notice 
random losses of entire bursts of UDP packets, and live555 does not log 
this in any way. As a consequence, artifacts are introduced in the output 
stream, which loses the regular spacing of the TS sync byte.

This loss does not seem to be directly related to the packet size, since 
packets of the same size can be either received or dropped.
It cannot be due to fragmentation, because the packet size is in any case 
lower than the UDP maximum of about 1480 bytes, and it happens even when 
working on localhost.
On the other hand, if I receive the stream directly from a socket without 
using live555, all the data arrive correctly (no packet loss).
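
For reference, a minimal sketch of that kind of direct-socket check - it just reads datagrams and flags any whose length is not a multiple of the 188-byte TS packet size (the port and buffer size here are arbitrary placeholders, not the values from my actual setup):

// Plain-socket UDP receiver that flags datagrams whose length is not a
// multiple of the 188-byte TS packet size. Port and buffer size are placeholders.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

int main() {
  int fd = socket(AF_INET, SOCK_DGRAM, 0);
  sockaddr_in addr{};
  addr.sin_family = AF_INET;
  addr.sin_addr.s_addr = htonl(INADDR_ANY);
  addr.sin_port = htons(1234); // placeholder port
  if (bind(fd, (sockaddr*)&addr, sizeof addr) < 0) { std::perror("bind"); return 1; }

  unsigned char buf[65536]; // larger than any unfragmented UDP payload
  for (unsigned long count = 0;; ++count) {
    ssize_t n = recv(fd, buf, sizeof buf, 0);
    if (n < 0) { std::perror("recv"); break; }
    if (n % 188 != 0)
      std::printf("datagram %lu: %zd bytes (not a multiple of 188)\n", count, n);
  }
  close(fd);
  return 0;
}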

So some questions arise:
1) Should live555 be able to handle an input UDP stream with variable 
packet lengths, or should the length always be a multiple of 188?
2) Any advice on which live555 class could be responsible for this loss?

Thank you in advance,

Yours,

Nadir
Muhammad Ali | 13 Oct 18:29 2014

Audio+Video streams OpenRTSP to FFMPEG

My objective is to use openRTSP to receive the audio + video streams from an IP camera and pass them on to FFmpeg, which can then stream them to an RTMP server.

I've been using a pipe to send stdout to ffmpeg (as a pipe input source), but that was only a video stream (openRTSP's -v flag). Now the requirement is to stream both the audio and the video streams. So of course I tried to replace -v with -4, and obviously it failed, as they are two separate streams and not a single elementary stream. Am I correct?

So now, what would be the correct way to achieve my objective? I am a developer myself and not shy about writing code, but I prefer to use something that already exists (if there is such a thing).

--
Muhammad Ali
And Or Logic
