DM81xx GStreamer Pipelines - SDK Turrialba

From RidgeRun Developer Connection


Revision as of 14:48, 13 February 2013

The following examples show the usage of GStreamer with the RR DM81xx SDK Turrialba.

Video Preview

Display videotest pattern

gst-launch-0.10 -v videotestsrc ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60  ! gstperf ! omx_videosink sync=false

Scale the QVGA video test pattern to VGA

gst-launch-0.10 -v videotestsrc ! 'video/x-raw-yuv,width=320,height=240' ! omx_scaler ! \
'video/x-raw-yuv,width=640,height=480' ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! \
gstperf ! omx_videosink sync=false -v

Video Recording

Encode videotest pattern in H.264 (without container)

gst-launch-0.10 -v videotestsrc num-buffers=1000 ! omx_h264enc ! gstperf ! filesink location=sample.264

Video Playback

Decode H.264 file (without container)

gst-launch filesrc location=sample.264 ! h264parse access-unit=true ! omx_h264dec ! omx_scaler ! \
omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf ! omx_videosink sync=false -v

Video Streaming

These instructions show how to stream video over the network: the video is encoded and sent from the board, then received and displayed on the host. The pipelines below use the default port (4951) to send the packets; to use a different port, add the port property to the udpsink (e.g. udpsink port=$PORT host=$CLIENT_IP).
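As a sketch (the port and IP values here are hypothetical), both ends can be parameterized with shell variables; the only requirement is that udpsink's port on the board matches udpsrc's port on the host:

```shell
# Hypothetical addresses; adjust to your network.
PORT=5000
CLIENT_IP=10.251.101.58

# Server side (board): udpsink sends to the host on the chosen port.
SERVER_PIPE="videotestsrc ! omx_h264enc ! rtph264pay ! udpsink port=$PORT host=$CLIENT_IP"

# Client side (host): udpsrc must listen on the same port.
CLIENT_PIPE="udpsrc port=$PORT ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false"

echo "gst-launch $SERVER_PIPE"
echo "gst-launch $CLIENT_PIPE"
```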

RTP

Stream H.264 video test pattern over RTP

CLIENT_IP=10.251.101.58

gst-launch videotestsrc ! omx_h264enc ! queue ! h264parse !  gstperf ! rtph264pay ! udpsink host=$CLIENT_IP -v

Thanks to the -v option, this pipeline prints the capabilities negotiated on each element's pads. The output should look similar to this:

.
.
.
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, 
encoding-name=(string)H264, sprop-parameter-sets=(string)J0KAKouVAoPy, payload=(int)96, ssrc=(uint)951364645, 
clock-base=(uint)2084568768, seqnum-base=(uint)10992                                              
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)H264, sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\", payload=(int)96, ssrc=(uint)951364645, 
clock-base=(uint)2084568768, seqnum-base=(uint)10992                        
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, 
encoding-name=(string)H264, sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\", payload=(int)96, ssrc=(uint)951364645, 
clock-base=(uint)2084568768, seqnum-base=(uint)10992                             
frames: 68      current: 67.90   average: 67.90 arm-load: 0   
.
.
.                                                                       

You need the udpsink:sink capabilities for the client pipeline.

Copy the udpsink caps printed by the server pipeline, then remove the spaces and the (uint) casts.

CAPS=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,\
sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\",payload=(int)96,ssrc=951364645,\
clock-base=2084568768,seqnum-base=10992

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v
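The manual cleanup described above (removing the spaces and the (uint) casts from the printed caps) can also be automated with sed. This is just a convenience sketch using the non-quoted fields from the sample output; the sprop-parameter-sets field is omitted here because its embedded quotes and backslashes need the extra shell escaping shown in the CAPS example:

```shell
# Caps line as printed by the server's -v output, single-quoted verbatim.
RAW='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96, ssrc=(uint)951364645, clock-base=(uint)2084568768, seqnum-base=(uint)10992'

# Strip the spaces and drop the (uint) casts to obtain a string gst-launch accepts.
CAPS=$(echo "$RAW" | sed -e 's/ //g' -e 's/(uint)//g')

echo "$CAPS"
```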

Stream H.264 encoded video file over RTP

These pipelines send a video file over the network. Any file encoded in H.264 can be used.

CLIENT_IP=10.251.101.58

FILE=sintel_trailer-1080p.mp4

gst-launch filesrc location=$FILE  ! qtdemux  ! queue ! h264parse !  gstperf ! rtph264pay ! udpsink host=$CLIENT_IP -v

As before, you need the udpsink:sink capabilities for the client pipeline.

Copy the udpsink caps printed by the server pipeline, then remove the spaces and the (uint) casts.

CAPS=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,\
sprop-parameter-sets=(string)\"Z2QAMqw05gHgCJ+WEAAAAwAQAAADAwDxgxmg\\,aOl4TLIs\",payload=(int)96,\
ssrc=2152503956,clock-base=4043051310,seqnum-base=10306

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v

RTSP

The target device runs an RTSP server, based on the gst-rtsp-server library, to send video over the network. The server is similar to the test-launch.c example included with the gst-rtsp-server source code.

The professional version of the RidgeRun SDK includes integration of Wim Taymans' GStreamer RTSP Server library along with an RTSP server that uses the library. The RR RTSP server is called rr_rtsp_server.
Stream H.264 video test pattern over RTSP

rr_rtsp_server "( videotestsrc ! omx_h264enc ! queue ! h264parse ! rtph264pay pt=96 name=pay0 )" &
SERVER_IP=10.251.101.240

gst-launch rtspsrc location=rtsp://$SERVER_IP/test ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false

Stream H.264 encoded video file over RTSP

These pipelines send a video file over the network. Any file encoded in H.264 can be used.

FILE=sintel_trailer-1080p.mp4
 
rr_rtsp_server "( filesrc location=$FILE  ! qtdemux  ! queue ! h264parse ! rtph264pay pt=96 name=pay0 )" &
SERVER_IP=10.251.101.240

gst-launch rtspsrc location=rtsp://$SERVER_IP/test ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false