DM81xx GStreamer Pipelines - SDK 2011Q2

From RidgeRun Developer Connection

<span style="color:red ">'''Page Under Construction'''</span> <br>
The following examples show the usage of GStreamer with the RR DM81xx SDK 2011Q2.

== Video streaming==

These instructions show how to do video streaming over the network: a video is played on the board and viewed on the host. These pipelines use the default port (4951) to send the packets; if you want to use a different port, add the port property to the udpsink (e.g. udpsink port=$PORT host=$CLIENT_IP).
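:For example, here is a minimal sketch of the same idea with an explicit, non-default port (5000 and the IP address below are arbitrary placeholders; the elements are the same ones used in the examples that follow):
<pre>
# Server (DM81xx): send the RTP packets to an explicit port
CLIENT_IP=10.251.101.58
PORT=5000
gst-launch videotestsrc ! omx_h264enc ! queue ! h264parse ! rtph264pay ! udpsink port=$PORT host=$CLIENT_IP

# Client (Ubuntu PC): listen on the same port ($CAPS is obtained as described below)
gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false
</pre>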

'''Stream H.264 video test pattern over RTP'''
*''Server: DM81xx''
<pre style="background:#d6e4f1">
CLIENT_IP=10.251.101.58

gst-launch videotestsrc ! omx_h264enc ! queue ! h264parse ! gstperf ! rtph264pay ! udpsink host=$CLIENT_IP -v
</pre>
:This pipeline prints the capabilities of each element's pads thanks to the -v option. The output should be similar to this:
<pre>
.
.
.
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)H264, sprop-parameter-sets=(string)J0KAKouVAoPy, payload=(int)96, ssrc=(uint)951364645,
clock-base=(uint)2084568768, seqnum-base=(uint)10992
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)H264, sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\", payload=(int)96, ssrc=(uint)951364645,
clock-base=(uint)2084568768, seqnum-base=(uint)10992
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)H264, sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\", payload=(int)96, ssrc=(uint)951364645,
clock-base=(uint)2084568768, seqnum-base=(uint)10992
frames: 68      current: 67.90   average: 67.90 arm-load: 0
.
.
.
</pre>
:You need the udpsink:sink capabilities for the client pipeline.

*''Client: Ubuntu PC''
:Copy the udpsink caps given by the server pipeline, erase the spaces and the (uint) casts (a scripted alternative is sketched after this block).
<pre style="background:#ffffd0">
CAPS=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,\
sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\",payload=(int)96,ssrc=951364645,\
clock-base=2084568768,seqnum-base=10992

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v
</pre>
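:The caps clean-up can also be scripted. This is only a sketch, assuming the server's -v output was redirected to a hypothetical file server.log and that each caps message is printed on a single line:
<pre>
# Take the last udpsink:sink caps message, strip everything up to "caps = ",
# then remove the spaces and the (uint) casts as described above.
CAPS=$(grep 'udpsink0.GstPad:sink: caps' server.log | tail -n 1 | \
       sed -e 's/.*caps = //' -e 's/ //g' -e 's/(uint)//g')
</pre>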

'''Stream H.264 encoded video file over RTP'''
:This pipeline takes a video file and sends it over the network. You can use any file encoded in H.264.

* ''Server: DM81xx''
<pre style="background:#d6e4f1">
CLIENT_IP=10.251.101.58
FILE=sintel_trailer-1080p.mp4

gst-launch filesrc location=$FILE ! qtdemux ! queue ! h264parse ! gstperf ! rtph264pay ! udpsink host=$CLIENT_IP -v
</pre>
 
:As before, you need the udpsink:sink capabilities for the client pipeline.

* ''Client: Ubuntu PC''
:Copy the udpsink caps given by the server pipeline, erase the spaces and the (uint) casts.
<pre style="background:#ffffd0">
CAPS=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,\
sprop-parameter-sets=(string)\"Z2QAMqw05gHgCJ+WEAAAAwAQAAADAwDxgxmg\\,aOl4TLIs\",payload=(int)96,\
ssrc=2152503956,clock-base=4043051310,seqnum-base=10306

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v
</pre>
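:If the client shows nothing, it can help to first confirm that the file decodes correctly. Here is a minimal local-playback sketch for the Ubuntu PC, reusing the decoder elements from the client pipeline above (the file name is the same placeholder used on the server):
<pre>
FILE=sintel_trailer-1080p.mp4

# Demux the MP4 container, decode the H.264 stream in software and display it locally
gst-launch filesrc location=$FILE ! qtdemux ! queue ! ffdec_h264 ! xvimagesink sync=false
</pre>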

'''Stream H.264 video test pattern over RTSP'''
[[Category:DM8168]] [[Category:GStreamer]]
