GStreamer pipelines for DM816x and DM814x


Problems running the pipelines shown on this page? Please see our GStreamer Debugging guide for help.


Introduction

This page collects a set of pipelines used on the DM8168 (EVM and Z3 boards) and the DM8148 (EVM). Note that some pipelines in the first section are documented twice: the first version is meant to be run on the Z3 board and the second on the DM8168 EVM. The main difference between them is the omx_tvp element, which is present only in the EVM pipelines; it initializes the TVP7002 hardware so that video can be captured from the component input on the EVM board. This element is not needed on the Z3 board, since Z3 provides an initialization script called init_7611_edid.sh that initializes both the TVP7002 and the ADV7611 to capture from the HDMI inputs. The second section of this page contains pipelines for the DM8148 with the RidgeRun SDK; there you will find pipelines to display video on the HDMI port as well as on the LCD built into the DM8148 EVM. Several of the DM8168 pipelines, such as the RTP streaming pipelines, can also be used on the DM8148.

If you are using the Z3 board, be sure to run the init_7611_edid.sh script before running pipelines that capture video. You also need to disable the graphics plane before running pipelines that display video:

./init_7611_edid.sh
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
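
If you later need the graphics plane again (for example, to display the UI on top of the video), it can presumably be re-enabled by writing 1 to the same sysfs node:

echo 1 > /sys/devices/platform/vpss/graphics0/enabled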

Some of these pipelines have to be run on the target board and others on your host computer. They are distinguished by the color used to document them: blue pipelines are for the Z3 board, yellow for the EVM board, light pink for the DM385, and green pipelines have to be run on your PC.

DM81xx

Display videotest pattern

gst-launch videotestsrc ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! omx_scaler ! gstperf  ! v4l2sink sync=false device=/dev/video1

gst-launch videotestsrc ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! omx_scaler ! gstperf  ! v4l2sink sync=false device=/dev/video2

Scale the QVGA video test pattern to VGA

gst-launch -v videotestsrc ! 'video/x-raw-yuv,width=320,height=240' ! omx_scaler ! 'video/x-raw-yuv,width=640,height=480' ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf ! omx_videosink sync=false -v

Encode videotest pattern in H.264 (without container)

gst-launch -v videotestsrc num-buffers=1000 ! omx_h264enc ! gstperf ! filesink location=sample.264

Single Video test source RTP streaming

These instructions show how to stream video over the network: the video is generated on the board and viewed on the host. The pipelines send the packets to the default port (4951); if you want to change the port number, add the port property to the udpsink (e.g. udpsink port=$PORT host=$CLIENT_IP).
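
As a sketch of the port override mentioned above, here is a hedged variant of the test-pattern server from the next subsection with an explicit (arbitrary) port value; the client then needs to use the same value in udpsrc port=$PORT:

CLIENT_IP=<Your IP address>
PORT=5000

gst-launch videotestsrc ! omx_h264enc ! queue ! h264parse ! gstperf ! rtph264pay ! udpsink host=$CLIENT_IP port=$PORT -v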

Stream H.264 video test pattern over RTP

  • Server: DM81xx
CLIENT_IP=<Your IP address>

gst-launch videotestsrc ! omx_h264enc ! queue ! h264parse !  gstperf ! rtph264pay ! udpsink host=$CLIENT_IP -v

Thanks to the -v option, this pipeline prints the capabilities of each element's pads. The video streaming pipeline should print something similar to this output:

.
.
.
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, 
encoding-name=(string)H264, sprop-parameter-sets=(string)J0KAKouVAoPy, payload=(int)96, ssrc=(uint)951364645, 
clock-base=(uint)2084568768, seqnum-base=(uint)10992                                              
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)H264, sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\", payload=(int)96, ssrc=(uint)951364645, 
clock-base=(uint)2084568768, seqnum-base=(uint)10992                        
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, 
encoding-name=(string)H264, sprop-parameter-sets=(string)\"J0KAKouVAoPy\\,KN4BriA\\=\", payload=(int)96, ssrc=(uint)951364645, 
clock-base=(uint)2084568768, seqnum-base=(uint)10992                             
frames: 68      current: 67.90   average: 67.90 arm-load: 0   
.
.
.                                                                       

You need the udpsink:sink capabilities for the client pipeline.

  • Client: Ubuntu PC

Copy the udpsink caps printed by the server pipeline, removing the spaces and the (uint) casts.

CAPS=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,\
sprop-parameter-sets=(string)"J0KAKouVAoPy\,KN4BriA\=",payload=(int)96,ssrc=951364645,\
clock-base=2084568768,seqnum-base=10992

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v

Stream H.264 encoded video file over RTP
These pipelines use a video file and send it over the network. Here you can use any file encoded in H.264.

  • Server: DM81xx
CLIENT_IP=<Your IP address>

FILE=sintel_trailer-1080p.mp4

gst-launch filesrc location=$FILE  ! qtdemux  ! queue ! h264parse !  gstperf ! rtph264pay ! udpsink host=$CLIENT_IP -v

As before, you need the udpsink:sink capabilities for the client pipeline.

  • Client: Ubuntu PC

Copy the udpsink caps printed by the server pipeline, removing the spaces and the (uint) casts.

CAPS=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,\
sprop-parameter-sets=(string)\"Z2QAMqw05gHgCJ+WEAAAAwAQAAADAwDxgxmg\\,aOl4TLIs\",payload=(int)96,\
ssrc=2152503956,clock-base=4043051310,seqnum-base=10306

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v

Single RTP streaming capturing from camera 1080p@60fps

DM81xx EVM (server):

IP=<IP address of your host machine - client>

gst-launch -e v4l2src device=/dev/video0 always-copy=false queue-size=8  ! \
  'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! \
  omxbufferalloc numBuffers=8 ! gstperf ! omx_h264enc output-buffers=6 input-buffers=4 force-idr-period=16 \
i-period=16 bitrate=16000000 ! rtph264pay ! udpsink host=$IP port=3002 -v

Host side (client):

gst-launch-0.10 udpsrc port=3002 ! \
  'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"J0KAKouVAPAET8qAAA\=\=\,KN4BriAA",payload=(int)96, \
ssrc=(uint)1438616454, clock-base=(uint)1676303462, seqnum-base=(uint)64771' ! rtph264depay ! queue ! ffdec_h264 ! fpsdisplaysink sync=false 

DM81xx EVM (client):

gst-launch udpsrc port=3002 ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"J0KAKouVAPAET8qAAA\=\=\,KN4BriAA",payload=(int)96, ssrc=(uint)1438616454, clock-base=(uint)1676303462, \
seqnum-base=(uint)64771' ! rtph264depay ! queue  ! h264parse ! "video/x-h264, stream-format=byte-stream,alignment=au"  ! omx_h264dec ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf print-fps=true print-arm-load=true  ! omx_videosink enable-last-buffer=false

DM8168

Live Preview 1080p@60

Z3 board:

gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! omx_videosink sync=false

Without the omx_scaler element we observed a lower framerate:

gst-launch omx_camera ! "video/x-raw-yuv, format=(fourcc)YUY2, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! omx_videosink sync=false

EVM board:

gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_scaler ! omx_tvp ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! omx_videosink sync=false

Without the omx_scaler element we observed a lower framerate:

gst-launch omx_camera ! "video/x-raw-yuv, format=(fourcc)YUY2, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_tvp ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! omx_videosink sync=false

v4l2:

gst-launch v4l2src device=/dev/video0 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=8 ! gstperf ! omx_scaler  ! v4l2sink device=/dev/video1

1920x1080 JPEG snapshots (on M3 coprocessor)

EVM board (ARM load: 3%):

gst-launch  v4l2src device=/dev/video0  num-buffers=1 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! omxbufferalloc numBuffers=12 ! queue  ! gstperf ! omx_jpegenc num-buffers=1 ! filesink location= test.jpeg

Video Encoding MJPEG 1080@30fps (on M3 coprocessor)

EVM board:

gst-launch omx_camera num-buffers=1000  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=10" ! omx_jpegenc ! qtmux dts-method=0 ! gstperf  ! filesink location=mjpegVideo.mov

Video Encoding H264 1080@30fps

Z3 board:

gst-launch -e omx_camera output-buffers=10 skip-frames=1 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10 ! gstperf print-arm-load=true print-fps=true ! \ 
omx_h264enc force-idr-period=23 i-period=23 bitrate=10000000 profile=1 ! queue ! rr_h264parser singleNalu=true ! mp4mux dts-method=0 ! filesink location=test30fps.mp4

EVM board:

gst-launch -e omx_camera output-buffers=10 skip-frames=1 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10  ! omx_tvp ! gstperf print-arm-load=true print-fps=true ! \
omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 profile=1 ! queue ! h264parse output-format=0 ! mp4mux dts-method=0 ! filesink location=test.mp4

Video Encoding H264 1080@60fps

Z3 board:

gst-launch -e omx_camera output-buffers=10 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)60/1, buffer-count-requested=10  ! gstperf print-arm-load=true print-fps=true ! \
omx_h264enc force-idr-period=46 i-period=46 bitrate=10000000 profile=1 ! queue ! rr_h264parser singleNalu=true ! mp4mux dts-method=0 ! filesink location=test60fps.mp4

EVM board:

gst-launch -e omx_camera output-buffers=10 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)60/1, buffer-count-requested=10  ! omx_tvp ! gstperf print-arm-load=true print-fps=true ! \
omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 profile=1 ! queue ! h264parse output-format=0 ! mp4mux dts-method=0 ! filesink location= test.mp4

Encoding: Video (H264) + Audio (AAC) 1080@30fps

EVM board:

gst-launch -e omx_camera skip-frames=1 output-buffers=10 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10  ! omx_tvp \
 ! gstperf print-arm-load=true print-fps=true ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 profile=1 ! queue  ! rr_h264parser singleNalu=true ! mux.video_00 alsasrc  latency-time=20000 \
buffer-time=800000 ! "audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)44100, channels=(int)2" !  omx_aacenc output-format=4 ! queue ! aacparse ! \
mux.audio_00 mp4mux dts-method=0 name=mux ! filesink location=audioVideo.mp4

Encoding: Video (H264) + Audio (AAC) 1080@60fps

EVM board:

gst-launch -e omx_camera output-buffers=10 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)60/1, buffer-count-requested=10  ! omx_tvp ! \ 
gstperf print-arm-load=true print-fps=true ! omx_h264enc force-idr-period=10 i-period=10 bitrate=10000000 profile=2 ! queue  ! rr_h264parser singleNalu=true ! mux.video_00 alsasrc \
latency-time=20000 buffer-time=800000  ! "audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)48000, channels=(int)2" ! omx_aacenc \
output-format=4 ! queue ! aacparse ! mux.audio_00 mp4mux dts-method=0 name=mux ! filesink location=audioVideo.mp4

Decoding: Video (H264) + Audio (AAC) 1080@30fps

EVM board:

amixer sset PCM 127
amixer sset 'Line DAC' 118
gst-launch filesrc location= audioVideo.mp4 ! qtdemux name=mux mux.video_00 ! queue  ! h264parse  ! "video/x-h264, stream-format=byte-stream"  ! omx_h264dec ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_30 ! gstperf print-fps=true print-arm-load=true  ! \
omx_videosink enable-last-buffer=false mux.audio_00 ! queue ! omx_aacdec ! alsasink

Decoding: Video (H264) + Audio (AAC) 1080@60fps

EVM board:

amixer sset PCM 127
amixer sset 'Line DAC' 118
gst-launch filesrc location= audioVideo.mp4 ! qtdemux name=mux mux.video_00 ! queue  ! h264parse  ! "video/x-h264, stream-format=byte-stream"  ! omx_h264dec ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf print-fps=true print-arm-load=true  ! \
omx_videosink enable-last-buffer=false mux.audio_00 ! queue ! omx_aacdec ! alsasink

RTSP - Video H264 1080@30fps

Z3 board:

rr_rtsp_server " ( omx_camera skip-frames=1 output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)30/1, buffer-count-requested=10 ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 profile=1 ! \
gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 )"

EVM board:

rr_rtsp_server " ( omx_camera skip-frames=1 output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)30/1, buffer-count-requested=10 ! omx_tvp ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 \
profile=1 ! gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 )"

Host side

gst-launch-0.10 rtspsrc location=rtsp://<IPADDRESS>:/test ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink -v

Using the Z3 board to receive the streaming:

gst-launch rtspsrc location=rtsp://<SERVERIPADDRESS>:/test ! rtph264depay ! queue ! h264parse access-unit=true ! queue ! omx_h264dec ! omx_scaler ! queue ! omx_ctrl display-mode=OMX_DC_MODE_1080P_30 ! gstperf print-fps=true print-arm-load=true \
! omx_videosink sync=false enable-last-buffer=false

RTSP - Video H264 1080@60fps

Z3 board:

rr_rtsp_server " ( omx_camera output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)60/1, buffer-count-requested=10 ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 profile=1 ! \
gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 )"

EVM board:

rr_rtsp_server " ( omx_camera output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)60/1, buffer-count-requested=10 ! omx_tvp ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 \
profile=1 ! gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 )"

Host side

gst-launch-0.10 rtspsrc location=rtsp://<IPADDRESS>:/test ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink -v

Using the Z3 board to receive the streaming:

gst-launch rtspsrc location=rtsp://<SERVERIPADDRESS>:/test ! rtph264depay ! queue ! h264parse access-unit=true ! queue ! omx_h264dec ! omx_scaler ! queue ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf print-fps=true print-arm-load=true \
! omx_videosink sync=false enable-last-buffer=false


RTSP - Video (H264) + Audio (AAC) 1080@30fps

Z3 board:

rr_rtsp_server " ( omx_camera skip-frames=1 output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)30/1, buffer-count-requested=10 ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 \
profile=1 ! gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 pt=96 alsasrc latency-time=20000 buffer-time=800000 \
! audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)48000, channels=(int)2 ! omx_aacenc output-format=4 ! queue ! aacparse ! rtpmp4apay name=pay1 pt=97 ) "

EVM board:

rr_rtsp_server " ( omx_camera skip-frames=1 output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)30/1, buffer-count-requested=10 ! omx_tvp ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 \
profile=1 ! gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 pt=96 alsasrc latency-time=20000 buffer-time=800000 \
! audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)48000, channels=(int)2 ! omx_aacenc output-format=4 ! queue ! aacparse ! rtpmp4apay name=pay1 pt=97 ) "

Host side

Please use VLC
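
For example, assuming the default /test mount point used by the other RTSP pipelines on this page:

vlc rtsp://<BOARD_IP_ADDRESS>/test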

RTSP - Video (H264) + Audio (AAC) 1080@60fps

Z3 board:

rr_rtsp_server " ( omx_camera output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)60/1, buffer-count-requested=10 ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 profile=1 \
! gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 pt=96 alsasrc latency-time=20000 buffer-time=800000 \
! audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)48000, channels=(int)2 ! omx_aacenc output-format=4 ! queue ! aacparse ! rtpmp4apay name=pay1 pt=97 ) "

EVM board:

rr_rtsp_server " ( omx_camera output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)60/1, buffer-count-requested=10 ! omx_tvp ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 \
profile=1 ! gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 pt=96 alsasrc latency-time=20000 buffer-time=800000 \
! audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)48000, channels=(int)2 ! omx_aacenc output-format=4 ! queue ! aacparse ! rtpmp4apay name=pay1 pt=97 ) "

Host side

Please use VLC (see the example command above).


Dual capture: Video recording (H264) 1080@30fps | JPEG snapshots - EVM & Z3 boards

Recording

gst-launch -e v4l2src device=/dev/video0 num-buffers=1500 always-copy=false queue-size=8  ! 'video/x-raw-yuv-strided,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' !  \
omxbufferalloc numBuffers=12 ! omx_h264enc output-buffers=16 input-buffers=10 force-idr-period=30 i-period=30 bitrate=16000000  ! queue ! rr_h264parser ! mp4mux dts-method=0 ! filesink location= video.mp4  

Snapshots

gst-launch -e v4l2src device=/dev/video5 num-buffers=1 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! \
rrbufferalloc numBuffers=10 ! queue  ! ffmpegcolorspace ! jpegenc ! jifmux ! filesink location=snapshot.jpeg


Dual capture: Video recording (H264) 1080@30fps & Video recording (MJPEG on M3) 1080@30fps

EVM:

 gst-launch -e omx_camera num-buffers=2000 output-buffers=10 skip-frames=1 ! video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10 ! tee name=t ! queue ! gstperf ! omx_h264enc \
output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! rr_h264parser singleNalu=true  ! qtmux dts-method=0  ! filesink location=videoA.mp4 t. ! queue ! gstperf ! omx_jpegenc ! qtmux dts-method=0  ! filesink location=videoB.mov

Dual Capture and dual display

EVM:

gst-launch v4l2src device=/dev/video0 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=10 ! gstperf ! v4l2sink device=/dev/video2 v4l2src device=/dev/video5 \
always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=10 ! gstperf ! v4l2sink device=/dev/video1

Dual H264 encoding 1080p@30fps - single Capture

Z3 board:

gst-launch -e omx_camera num-buffers=2000 output-buffers=10 skip-frames=1 ! video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10 ! tee \
name=t ! queue ! gstperf ! omx_h264enc output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! rr_h264parser singleNalu=true  ! qtmux dts-method=0  ! filesink \
location=videoA.mp4 t. ! queue ! gstperf ! omx_h264enc output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! rr_h264parser singleNalu=true  ! \
qtmux dts-method=0  ! filesink location=videoB.mp4

Dual Capture - Dual H264 encoding 1080p@30fps/60fps

Z3 board:

gst-launch -e omx_camera num-buffers=2000 output-buffers=10 skip-frames=1 ! video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10 \
 ! gstperf ! omx_h264enc output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! rr_h264parser singleNalu=true  ! qtmux dts-method=0  ! filesink location=videoA.mp4 v4l2src \ 
device=/dev/video5 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=8 ! gstperf ! omx_h264enc \
output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! rr_h264parser singleNalu=true  ! qtmux dts-method=0  ! filesink location=videoB.mp4


EVM:


gst-launch -e v4l2src device=/dev/video5 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=8 ! gstperf ! omx_h264enc output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! \
 rr_h264parser singleNalu=true  ! qtmux dts-method=0  ! filesink location=videoA.mp4 v4l2src device=/dev/video5 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=8 ! gstperf ! \
omx_h264enc output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! rr_h264parser singleNalu=true  ! qtmux dts-method=0  ! filesink location=videoB.mp4

Dual Capture - Dual H264 encoding 1080p@30fps/60fps - Dual RTP streaming

Below is an example. When streaming your own pipeline you will likely need to follow the steps [ here ] to obtain the correct caps; after that, run a pipeline similar to this one:

Z3 board:

PORTA=3001
PORTB=3002
IP=<IP addess of your host machine>

gst-launch -e omx_camera output-buffers=10 skip-frames=1 ! video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10 ! gstperf ! \
omx_h264enc output-buffers=6 input-buffers=4 force-idr-period=16 i-period=16 bitrate=16000000 ! rtph264pay ! udpsink host=$IP port=3001 v4l2src device=/dev/video5 always-copy=false\
 queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=8 ! gstperf ! omx_h264enc output-buffers=6 input-buffers=4 \
force-idr-period=16 i-period=16 bitrate=16000000 ! rtph264pay ! udpsink host=$IP port=3002 -v

Host side:

gst-launch-0.10 udpsrc port=$PORTB ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"J0KAKouVAPAET8qAAA\=\=\,KN4BriAA", \
payload=(int)96, ssrc=(uint)1438616454, clock-base=(uint)1676303462, seqnum-base=(uint)64771' ! rtph264depay ! queue ! ffdec_h264 ! fpsdisplaysink sync=false udpsrc port=$PORTA ! 'application/x-rtp, \
media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"J0KAKouVAPAET8qAAA\=\=\,KN4BriAA", payload=(int)96, ssrc=(uint)3331867104, clock-base=(uint)3123811109, \
seqnum-base=(uint)15982' ! rtph264depay ! queue ! ffdec_h264 ! fpsdisplaysink sync=false

Dual Capture - Dual H264 encoding 1080p@30fps/60fps - Dual RTSP Streaming

Here you will find an example pipeline that demonstrates dual capture, encoding, and RTSP streaming. The dual streaming is based on RidgeRun's proprietary RTSP Sink solution. You can use it as a base for your own pipeline design.

Z3 board:

PORT=554

gst-launch omx_camera input-interface=VIP1_PORTA output-buffers=10 ! 'video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)30/1, buffer-count-requested=10' ! \
gstperf ! omx_h264enc output-buffers=10 input-buffers=10 i-period=30 force-idr-period=90 bitrate=5000000 ! rr_h264parser singleNalu=true  ! video/x-h264, mapping=/stream1 ! rtspsink name=sink service=$PORT \
v4l2src device=/dev/video5 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! omxbufferalloc numBuffers=12 ! \
gstperf ! omx_h264enc output-buffers=10 input-buffers=10 force-idr-period=90 i-period=30 bitrate=5000000 ! rr_h264parser singleNalu=true ! video/x-h264,mapping=/stream2 ! sink.

Host PC

vlc rtsp://<z3.ip.address>/stream1

vlc rtsp://<z3.ip.address>/stream2



Capture 1080p30 and Dual resize

Capture 1080p30, resize it to 720p, and display it; at the same time, resize the same 1080p30 capture to VGA resolution and display it.

EVM board:

gst-launch v4l2src decimate=2 device=/dev/video0 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' !  omxbufferalloc numBuffers=10 ! omx_mdeiscaler name=d d.src_01 ! gstperf name=scaler1a ! \
omx_mdeiscaler name=e e.src_01 ! gstperf name=scaler2a ! fakesink e.src_00 ! 'video/x-raw-yuv, width=(int)640, height=(int)480' ! gstperf name=scaler2b ! v4l2sink device=/dev/video2 d.src_00 ! \
gstperf name=scaler1b ! 'video/x-raw-yuv, width=(int)1280, height=(int)720' ! v4l2sink device=/dev/video1

V4L2SRC video playback / encode pipelines (Tested on Z3 board)

V4L2src capture + V4L2 sink display:

gst-launch v4l2src always-copy=false queue-size=12 ! 'video/x-raw-yuv-strided,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! omxbufferalloc numBuffers=12 ! omx_scaler ! gstperf ! v4l2sink sync=false

V4L2src capture + V4L2 sink display (1% average ARM load when the omx_scaler element is removed):

gst-launch v4l2src always-copy=false queue-size=12 ! 'video/x-raw-yuv-strided,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! omxbufferalloc numBuffers=12 ! gstperf ! v4l2sink sync=false

V4L2src capture + OMX sink display:

gst-launch v4l2src always-copy=false queue-size=12 ! 'video/x-raw-yuv-strided,format=(fourcc)NV12,width=1920,height=1080,framerate=(fraction)60/1' ! omxbufferalloc numBuffers=12 ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf ! omx_videosink sync=false

OMX camera capture + OMX sink display:

gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_scaler  !  omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf print-fps=true print-arm-load=true  ! omx_videosink sync=false

V4L2src capture + H264 encoding:

gst-launch -e v4l2src always-copy=false queue-size=12 ! video/x-raw-yuv-strided, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)60/1, buffer-count-requested=12 ! omxbufferalloc numBuffers=12 ! gstperf print-arm-load=true print-fps=true \
! queue ! omx_h264enc i-period=5 force-idr-period=120 ! h264parse output-format=0 ! mp4mux dts-method=0 ! filesink location= test.mp4

Performance of the MJPEG encoder

Performance on a Z3 board

Running at 60 fps, saving the file to the SD card:

/ # echo 3 > /proc/sys/vm/dirty_ratio
/ # gst-launch omx_camera num-buffers=1000 ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=10" ! omx_jpegenc quality=70 ! qtmux dts-method=0 ! gstperf ! filesink location=mjpegVideo.mov
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
perf0: frames: 63 	current: 62.83	 average: 62.83	arm-load: 33
perf0: frames: 124 	current: 60.21	 average: 61.51	arm-load: 34
perf0: frames: 143 	current: 18.91	 average: 47.35	arm-load: 86
perf0: frames: 204 	current: 60.24	 average: 50.58	arm-load: 40
perf0: frames: 264 	current: 59.71	 average: 52.40	arm-load: 42
perf0: frames: 325 	current: 60.25	 average: 53.72	arm-load: 34
perf0: frames: 344 	current: 6.88	 average: 39.04	arm-load: 92
perf0: frames: 407 	current: 62.12	 average: 41.42	arm-load: 40
perf0: frames: 440 	current: 32.58	 average: 40.59	arm-load: 73
perf0: frames: 500 	current: 59.76	 average: 42.22	arm-load: 37
perf0: frames: 514 	current: 11.04	 average: 39.20	arm-load: 89
perf0: frames: 556 	current: 28.22	 average: 38.08	arm-load: 73
perf0: frames: 590 	current: 33.62	 average: 37.79	arm-load: 74
perf0: frames: 599 	current: 4.20	 average: 33.74	arm-load: 94
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 18026384777 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Running at 60 fps using a fakesink:

/ # gst-launch omx_camera num-buffers=1000 ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=10" ! omx_jpegenc quality=70 ! gstperf ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
perf0: frames: 64 	current: 63.27	 average: 63.27	arm-load: 28
perf0: frames: 124 	current: 59.72	 average: 61.50	arm-load: 7
perf0: frames: 185 	current: 59.75	 average: 60.91	arm-load: 8
perf0: frames: 246 	current: 60.23	 average: 60.74	arm-load: 53
perf0: frames: 306 	current: 59.73	 average: 60.54	arm-load: 9
perf0: frames: 367 	current: 60.23	 average: 60.49	arm-load: 5
perf0: frames: 427 	current: 59.73	 average: 60.38	arm-load: 9
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 7261175111 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Running at 30 fps, saving the file to the SD card:

/ # gst-launch omx_camera num-buffers=1000 skip-frames=1 ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=10" ! omx_jpegenc quality=70 ! qtmux dts-method=0 ! gstperf ! filesink location=mjpegVideo.mov
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
perf0: frames: 24 	current: 23.90	 average: 23.90	arm-load: 32
perf0: frames: 55 	current: 30.12	 average: 27.05	arm-load: 18
perf0: frames: 85 	current: 29.87	 average: 27.98	arm-load: 17
perf0: frames: 116 	current: 30.12	 average: 28.52	arm-load: 25
perf0: frames: 146 	current: 29.87	 average: 28.79	arm-load: 17
perf0: frames: 166 	current: 19.90	 average: 27.32	arm-load: 66
perf0: frames: 197 	current: 30.13	 average: 27.73	arm-load: 25
perf0: frames: 227 	current: 29.86	 average: 27.99	arm-load: 22
perf0: frames: 258 	current: 30.13	 average: 28.23	arm-load: 16
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 10265632555 ns.
Setting pipeline to PAUSED ...
perf0: frames: 288 	current: 29.85	 average: 28.39	arm-load: 17
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Audio Encoding: WAV

EVM board:

gst-launch -e alsasrc ! 'audio/x-raw-int,rate=(int)44100,channels=(int)2,endianness=(int)1234,signed=(boolean)true, width=(int)16, depth=(int)16' ! wavenc ! \
gstperf ! filesink location=audio_aac.wav sync=false

RTSP 4 channels A/V - Performance measurements

Z3 board:

The following link points to a wiki page with ARM load measurements for several RTSP streaming pipelines; some of them use up to four instances of the H264 encoder plus one AAC encoder to send four audio/video streams over four RTSP channels: Performance Tests - Video(H264)/Audio(AAC) RTSP streaming pipelines

DM8148

Live Preview 1080p@60

LCD:

./etc/init.d/change_display lcd
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_mdeiscaler name=d d.src_00 !  'video/x-raw-yuv, width=(int)800, height=(int)480' ! omx_tvp ! \
 omx_ctrl display-mode=OMX_DC_MODE_1080P_60 display-device=LCD ! gstperf print-fps=true print-arm-load=true  ! omx_videosink display-device=LCD sync=false d.src_01  ! fakesink silent=true -v


HDMI:

./etc/init.d/change_display hdmi
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_scaler  ! omx_tvp  ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf print-fps=true \
print-arm-load=true  ! omx_videosink sync=false


SD:

  • NTSC
./etc/init.d/change_display sd
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_mdeiscaler name=d d.src_01 !  'video/x-raw-yuv, width=(int)720, height=(int)480' ! omx_tvp ! \
 omx_ctrl display-mode=OMX_DC_MODE_NTSC display-device=SD ! gstperf print-fps=true print-arm-load=true  ! omx_videosink display-mode=OMX_DC_MODE_NTSC display-device=SD sync=false d.src_00  ! fakesink silent=true -v
  • PAL
./etc/init.d/change_display sd pal
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_mdeiscaler name=d d.src_01 !  'video/x-raw-yuv, width=(int)720, height=(int)576' ! omx_tvp ! \
 omx_ctrl display-mode=OMX_DC_MODE_PAL display-device=SD ! gstperf print-fps=true print-arm-load=true  ! omx_videosink display-mode=OMX_DC_MODE_PAL display-device=SD sync=false d.src_00  ! fakesink silent=true -v

Live Preview 1080p@30

LCD:

./etc/init.d/change_display lcd
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera skip-frames=1   ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=30/1, buffer-count-requested=4" ! omx_mdeiscaler name=d d.src_00 !  'video/x-raw-yuv, width=(int)800, height=(int)480'\
 ! omx_tvp ! omx_ctrl display-mode=OMX_DC_MODE_1080P_30 display-device=LCD ! gstperf print-fps=true print-arm-load=true  ! omx_videosink display-device=LCD sync=false d.src_01  ! fakesink silent=true -v


HDMI:

./etc/init.d/change_display hdmi
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera skip-frames=1  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=60/1, buffer-count-requested=4" ! omx_scaler  ! omx_tvp  ! omx_ctrl display-mode=OMX_DC_MODE_1080P_30 ! gstperf \
print-fps=true print-arm-load=true  ! omx_videosink sync=false


SD:

  • NTSC
./etc/init.d/change_display sd
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=30/1, buffer-count-requested=4" ! omx_mdeiscaler name=d d.src_01 !  'video/x-raw-yuv, width=(int)720, height=(int)480' ! omx_tvp ! \
 omx_ctrl display-mode=OMX_DC_MODE_NTSC display-device=SD ! gstperf print-fps=true print-arm-load=true  ! omx_videosink display-mode=OMX_DC_MODE_NTSC display-device=SD sync=false d.src_00  ! fakesink silent=true -v
  • PAL
./etc/init.d/change_display sd pal
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch omx_camera  ! "video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=30/1, buffer-count-requested=4" ! omx_mdeiscaler name=d d.src_01 !  'video/x-raw-yuv, width=(int)720, height=(int)576' ! omx_tvp ! \
 omx_ctrl display-mode=OMX_DC_MODE_PAL display-device=SD ! gstperf print-fps=true print-arm-load=true  ! omx_videosink display-mode=OMX_DC_MODE_PAL display-device=SD sync=false d.src_00  ! fakesink silent=true -v

Video Encoding H264 1080@30fps

gst-launch -e omx_camera output-buffers=10 skip-frames=1 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10 ! omx_tvp ! gstperf print-arm-load=true print-fps=true ! omx_h264enc \
force-idr-period=23 i-period=23 bitrate=10000000 profile=1 ! queue ! rr_h264parser singleNalu=true ! mp4mux dts-method=0 ! filesink location=test_1080_30fps.mp4
sync

Video Encoding H264 720@30fps

gst-launch -e omx_camera output-buffers=10 skip-frames=1 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1280, height=720, framerate=\(fraction\)30/1, buffer-count-requested=10 ! omx_tvp std=720 ! gstperf print-arm-load=true \ 
print-fps=true ! omx_h264enc force-idr-period=23 i-period=23 bitrate=10000000 profile=1 ! queue ! rr_h264parser singleNalu=true ! mp4mux dts-method=0 ! filesink location=test_720_30fps.mp4
sync

Encoding: Video (H264) + Audio (AAC) 1080@30fps

EVM board:

echo 100 > /proc/sys/vm/dirty_expire_centisecs
gst-launch -e omx_camera skip-frames=1 output-buffers=10 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)30/1, buffer-count-requested=10  ! omx_tvp ! gstperf print-arm-load=true print-fps=true ! \
omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 profile=1 ! queue  ! rr_h264parser singleNalu=true ! mux.video_00 alsasrc  latency-time=20000 buffer-time=800000 ! "audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, \
width=(int)16, depth=(int)16, rate=(int)44100, channels=(int)2" !  omx_aacenc output-format=4 ! queue ! aacparse ! mux.audio_00 mp4mux dts-method=0 name=mux ! filesink location=audioVideo.mp4

Encoding: Video (H264) + Audio (AAC) 1080@60fps

EVM board:

echo 100 > /proc/sys/vm/dirty_expire_centisecs
gst-launch -e omx_camera output-buffers=10 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1920, height=1080, framerate=\(fraction\)60/1, buffer-count-requested=10  ! omx_tvp ! gstperf print-arm-load=true print-fps=true ! omx_h264enc \
force-idr-period=4 i-period=4 bitrate=10000000 profile=1 ! queue  ! h264parse output-format=0 ! mux.video_00 alsasrc  latency-time=20000 buffer-time=800000 ! "audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, \
depth=(int)16, rate=(int)44100, channels=(int)2" !  omx_aacenc output-format=4 ! queue ! aacparse ! mux.audio_00 qtmux name=mux ! filesink location=audioVideo.mp4

Encoding: Video (H264) + Audio (AAC) 720@30fps

EVM board:

gst-launch -e omx_camera skip-frames=1 output-buffers=10 !  video/x-raw-yuv, format=\(fourcc\)NV12, width=1280, height=720, framerate=\(fraction\)30/1, buffer-count-requested=10  ! omx_tvp std=720 ! gstperf print-arm-load=true print-fps=true ! omx_h264enc force-idr-period=23 \
i-period=23 bitrate=10000000 profile=1 ! queue  ! rr_h264parser singleNalu=true ! mux.video_00 alsasrc  latency-time=20000 buffer-time=800000 ! "audio/x-raw-int, endianness=(int)1234, signed=(boolean)true, width=(int)16, depth=(int)16, rate=(int)44100, channels=(int)2" ! \
 omx_aacenc output-format=4 ! queue ! aacparse ! gstperf ! mux.audio_00 mp4mux dts-method=0 name=mux ! filesink location=audioVideo.mp4

Decoding: Video (H264) + Audio (AAC) 1080@30fps

LCD:

./etc/init.d/change_display lcd
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
amixer sset PCM 127
amixer sset 'Line DAC' 118
gst-launch filesrc location= audioVideo.mp4 ! qtdemux name=mux mux.video_00 ! queue  ! h264parse output-format=1 ! omx_h264dec ! omx_mdeiscaler name=d d.src_00 !  'video/x-raw-yuv, width=(int)800, height=(int)480'  ! omx_ctrl \
display-mode=OMX_DC_MODE_1080P_30 display-device=LCD !  omx_videosink display-device=LCD enable-last-buffer=false mux.audio_00 ! queue ! omx_aacdec ! alsasink d.src_01  ! fakesink silent=true -v

HDMI:

./etc/init.d/change_display hdmi
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
amixer sset PCM 127
amixer sset 'Line DAC' 118
gst-launch filesrc location= audioVideo.mp4 ! qtdemux name=mux mux.video_00 ! queue  ! h264parse output-format=1 ! omx_h264dec ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_30 ! omx_videosink enable-last-buffer=false \
mux.audio_00 ! queue ! omx_aacdec ! alsasink

Decoding: Video (H264) + Audio (AAC) 1080@60fps

LCD:

./etc/init.d/change_display lcd
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
amixer sset PCM 127
amixer sset 'Line DAC' 118
gst-launch filesrc location= audioVideo.mp4 ! qtdemux name=mux mux.video_00 ! queue  ! h264parse output-format=1 ! omx_h264dec ! omx_mdeiscaler name=d d.src_00 !  'video/x-raw-yuv, width=(int)800, height=(int)480'  ! omx_ctrl \
display-mode=OMX_DC_MODE_1080P_60 display-device=LCD !  omx_videosink display-device=LCD enable-last-buffer=false mux.audio_00 ! queue ! omx_aacdec ! alsasink d.src_01  ! fakesink silent=true -v

HDMI:

./etc/init.d/change_display hdmi
echo 0 > /sys/devices/platform/vpss/graphics0/enabled
amixer sset PCM 127
amixer sset 'Line DAC' 118
gst-launch filesrc location= audioVideo.mp4 ! qtdemux name=mux mux.video_00 ! queue  ! h264parse output-format=1 ! omx_h264dec ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf print-fps=true print-arm-load=true  ! \
omx_videosink enable-last-buffer=false mux.audio_00 ! queue ! omx_aacdec ! alsasink

Encoding: Audio AAC-LC

gst-launch -e alsasrc num-buffers=500 latency-time=20000 buffer-time=800000  ! omx_aacenc output-format=4 ! aacparse  ! qtmux ! filesink location= audio.mp4

Decoding: Audio AAC-LC

 gst-launch filesrc location= audio.mp4 ! qtdemux  ! omx_aacdec framemode=true input-buffers=6 ! alsasink sync=false -v

RTSP - Video H264 1080@30fps

EVM board:

rr_rtsp_server " ( omx_camera skip-frames=1 output-buffers=10  ! video/x-raw-yuv, format=(fourcc)NV12, width=1920, height=1080, framerate=(fraction)30/1, buffer-count-requested=10 ! omx_tvp ! omx_h264enc force-idr-period=4 i-period=4 bitrate=10000000 \
profile=1 ! gstperf print-arm-load=true print-fps=true ! queue ! video/x-h264, width=(int)1920, height=(int)1080, stream-format=(string)byte-stream, alignment=(string)au ! rtph264pay name=pay0 )"

Host side

gst-launch-0.10 rtspsrc location=rtsp://<IPADDRESS>:/test ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink -v

Using another DM8148 board to receive the streaming and display it on LCD:

gst-launch rtspsrc location=rtsp://<IPADDRESS>:/test ! rtph264depay ! queue ! h264parse access-unit=true ! queue ! omx_h264dec ! omx_mdeiscaler name=d d.src_00 ! 'video/x-raw-yuv, width=(int)800, height=(int)480' ! queue ! omx_ctrl \
display-mode=OMX_DC_MODE_1080P_30  display-device=LCD ! gstperf print-fps=true print-arm-load=true ! omx_videosink sync=false enable-last-buffer=false display-device=LCD d.src_01 ! fakesink silent=true 


Encoding: Video (H264) + Audio (AAC) 1080@60fps from the SD card prebuilt Image

In the root directory you should find a script called recording.sh. To start the demo, just type:

 ./recording.sh

The demo should start capturing from the HDTV input and the default ALSA audio input. To stop the demo, simply press Ctrl+C.


DM385

Live Preview 1080p@30

HDMI:

echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch v4l2src device=/dev/video7 always-copy=false queue-size=8 ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=1920,height=1080,framerate=(fraction)30/1' ! \
omxbufferalloc numBuffers=10 ! queue ! gstperf ! v4l2sink device=/dev/video1 sync=false -v


Video Encoding H264 1080@30fps

gst-launch -e v4l2src device=/dev/video7 always-copy=false queue-size=8  ! 'video/x-raw-yuv,format=(fourcc)YUY2,width=1920,height=1080,framerate=(fraction)30/1'  ! omxbufferalloc numBuffers=10  ! \
omx_noisefilter  ! gstperf  ! queue  ! omx_h264enc force-idr-period=23 i-period=23 bitrate=10000000 profile=1  ! queue  ! rr_h264parser singleNalu=true  ! mp4mux dts-method=0  ! filesink location=test_1080_30fps.mp4


Decoding: Video (H264) + Audio (AAC) 1080@30fps

HDMI:

echo 0 > /sys/devices/platform/vpss/graphics0/enabled
gst-launch filesrc location=<FILE_NAME>.mp4  ! qtdemux name=mux mux.video_00  ! queue  ! h264parse  ! 'video/x-h264, stream-format=(string)byte-stream, alignment=(string)au'  ! omx_h264dec  ! \
omx_scaler  ! 'video/x-raw-yuv, width=(int)1920, height=(int)1080'  ! gstperf  ! v4l2sink device=/dev/video1 sync=false

Video Crop

You can find an overview of the crop and scaling features available for the DM816x and DM814x platforms, and their use through GStreamer, on the Video crop and scaling with DM816x and DM814x page.
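
As a minimal software-only sketch (untested on this hardware, and not the hardware-accelerated method described on that page), the generic videocrop element from gst-plugins-good can crop a 1080p test pattern down to a centered 1280x720 region before JPEG encoding; whether these ARM-side plugins are available depends on your filesystem configuration:

gst-launch videotestsrc num-buffers=1 ! 'video/x-raw-yuv,width=1920,height=1080' ! videocrop left=320 right=320 top=180 bottom=180 ! ffmpegcolorspace ! jpegenc ! filesink location=cropped.jpeg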


FAQ

FAQ 1. I already enabled AAC audio encoding following the instructions in the getting started guide, but I am still getting a segfault. What is wrong?

Setting pipeline to PAUSED ...
Caught SIGSEGV accessing address 0x4
Spinning.  Please run 'gdb gst-launch 487' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.

Answer :

The problem is that your DSP firmware does not include the audio codec yet. This can happen when you first build the SDK without audio encoding support in the DSP: the SDK creates an EZSDK tarball with all the codecs it needed at the time, which does not include the AAC audio codec. When you later enable audio codec support, the SDK keeps using the same tarball without audio, so it never builds new DSP firmware with audio support because it does not find the codec. You need to remove your old EZSDK tarball and compile again to create a new tarball that includes new DSP firmware with audio support.


Step 1:

cd $DEVDIR
`make env`
# Note: this removes the EZSDK tarballs for both DM816x and DM814x; if you want to remove just one, specify its name
rm -rf /opt/ridgerun/downloads/ti-ezsdk_dm81*
rm -rf downloads/ti-ezsdk_dm81*
cd $DEVDIR/proprietary/ezsdk-5_05_02_00/
make distclean
make ; make install


If you find any errors while building at $DEVDIR/proprietary/ezsdk-5_05_02_00/ in Step 1 above, please follow the steps below.


Step 2 (if Step 1 fails again):


a) Remove the EZSDK from your home directory or from /usr/local/:

rm -rf <<path where the EZSDK is installed>>


b) Install EZSDK and codecs


c) Repeat Step 1 above.


Finally, if you still run into the same error, please do a fresh 'git pull origin' in a new $DEVDIR.

Related articles

RidgeRun gst-omx plugins and pipelines
Spectrum Digital DM8168 EVM
DM8168 Z3 RPS
Mistral DM8148 EVM