LeopardBoard GStreamer Pipelines

From RidgeRun Developer Connection

Revision as of 18:09, 6 November 2012

The following examples show the usage of GStreamer with the RidgeRun DM365 LeopardBoard SDK 2011Q2. All the commands in these examples are intended to be executed on the target, NOT the host.


Modifying audio levels

The sound should be heard on the headphones/speakers. If the audio volume is too low, adjust it by typing:

alsamixer
Then increase the PCM and Line DAC levels.
Also, if the sound is noisy, activate the AGC option using alsamixer (the M key toggles between mute and active).
(alsamixer screen, AlsaMixer v1.0.24.2 on the DaVinci DM365 EVM card, showing playback level bars for the PCM, Line DAC, LineL and LineR controls; example levels: PCM 35, Line DAC 41)

Audio Input

Routing Line In to Line Out

Connect an audio player device to the MIC jack and speakers/headphones to the OUT jack (we did not test with an actual microphone). Play some audio on the device and execute the next pipeline:

gst-launch alsasrc ! alsasink sync=false

Capturing Audio to AAC File

gst-launch -e alsasrc ! 'audio/x-raw-int,rate=(int)44100,channels=(int)2' ! queue ! dmaienc_aac bitrate=128000 ! qtmux ! filesink location=audio_aac.mp4

Capturing Audio to MP3 File

For this pipeline the MP3 codec and plugin need to be installed, further instructions can be found here
https://www.ridgerun.com/developer/wiki/index.php/Add_MP3_support_to_20011q2_SDK

gst-launch -e alsasrc ! 'audio/x-raw-int,rate=(int)44100,channels=(int)2' ! queue ! dmaienc_mp3 bitrate=128000 ! filesink location=audio.mp3

Audio output

Testing amplification

Use alsamixer to set the sound level and test with the following pipeline.

gst-launch audiotestsrc ! alsasink sync=false

AAC playback

gst-launch filesrc location=audio_aac.mp4 ! qtdemux ! dmaidec_aac ! alsasink

MP3 playback

gst-launch filesrc location=audio.mp3 ! mad ! alsasink

Enabling video window

When you boot, the OSD (on-screen display), called the frame buffer, has priority for the video output. To make the video window visible, you need to run:

fbset -disable

Video and audio playback

Get a demo clip

wget -c http://downloads.dvdloc8.com/trailers/divxdigest/simpsons_movie_trailer.zip

Then unzip it

 unzip simpsons_movie_trailer.zip

And play it

 
gst-launch filesrc location=The\ Simpsons\ Movie\ -\ Trailer.mp4 ! qtdemux name=demux ! queue ! \
 dmaidec_h264 numOutputBufs=16 ! priority nice=-10 ! queue ! priority nice=-10 ! dmaiperf ! TIDmaiVideoSink \
 accelFrameCopy=true videoOutput=component videoStd=720P_60 demux.audio_00 ! queue ! priority nice=-5 ! dmaidec_aac ! alsasink

Video playback

The previewer and resizer for video capture must be initialized by the user. You can do that in two ways:

1. Compile and install the ipiped application located in the File System Configuration->Select target's file system software section of the configuration screen (make config). This application lets you control some video capture properties (for more information see Auto exposure and auto white balance library). You can initialize the previewer and resizer by running the following command:

ipipe-client run-config-script dm365_mt9p031_config

or you can run this command instead:

ipipe-client set-previewer-mode cont

NOTE: The initialization must be done before any video capture session.


2. The V4L2 source element (v4l2src) has a built-in property called chain-ipipe which automatically initializes the previewer when it is set (by default it is set to true).

NOTE: when using one of the methods described above, the other one must not be used, since this can create a conflict when the code tries to re-initialize the previewer.

Video test source

 
gst-launch videotestsrc ! TIDmaiVideoSink sync=false enable-last-buffer=false

Output:DVI, mode: 720P_60

Output:composite, mode: NTSC

 
gst-launch -e v4l2src chain-ipipe=true always-copy=false ! dmaiaccel ! video/x-raw-yuv,format=\(fourcc\)NV12, width=640, height=480, framerate=\(fraction\)30/1 ! dmaiperf \
 ! TIDmaiVideoSink sync=false accelFrameCopy=false videoOutput=composite videoStd=D1_NTSC enable-last-buffer=false &

Output:composite, mode: NTSC with Auto exposure white balance demo library

ipipe-client run-config-script dm365_mt9p031_config
ipipe-client init-aew G EC S C 150000 15 50 640 480 50
gst-launch -e v4l2src chain-ipipe=false always-copy=false ! dmaiaccel ! video/x-raw-yuv,format=\(fourcc\)NV12, width=640, height=480, framerate=\(fraction\)30/1 ! dmaiperf \
 ! TIDmaiVideoSink sync=false accelFrameCopy=false videoOutput=composite videoStd=D1_NTSC enable-last-buffer=false &

Video Resizing

The DM36x processor includes a hardware resizer module to perform image-scaling operations and color space conversion. There are three different ways you can use the hardware resizer with GStreamer:


1. Chain ipipe with v4l2src element: The v4l2src element provides the property chain-ipipe, which indicates whether the element must chain (use) the ipipe preview and resizer hardware modules. When chained, you capture from the image sensor (e.g. MT9P031) or YUV decoder (e.g. TVP7002) with the ipipe preview and resizer modules configured for continuous mode of operation. This means the previewer and resizer are included in the video frame data path, processing each captured frame to do image tuning and resizing. Due to vpfe driver limitations, image scaling is only available with video frames from a YUV decoder. If you use a raw image sensor, the driver only allows you to change the color space. The following pipeline resizes the input video frames to a 640x480 resolution. You simply set the capabilities filter to the desired width and height.

gst-launch -e v4l2src input-src=component chain-ipipe=true always-copy=false !  capsfilter caps="video/x-raw-yuv,format=(fourcc)NV12,width=640,height=480" ! dmaiaccel ! TIDmaiVideoSink enable-last-buffer=false videoStd=720P_60

2. Use dmairesizer element: The dmairesizer element configures the DM36x hardware resizer and uses it to scale each input video frame to the desired resolution. In this case, the DM36x resizer operates in single-shot mode, meaning the ARM processor needs to pass each buffer containing a video frame to the resizer. If you use this mode with the v4l2src element, you must set the property chain-ipipe to false, since the resizer can't be used simultaneously in continuous and single-shot modes. The following is an example pipeline using dmairesizer.

gst-launch -e v4l2src input-src=component chain-ipipe=false always-copy=false ! dmaiaccel ! dmairesizer target-width=640 target-height=480 ! TIDmaiVideoSink enable-last-buffer=false videoStd=720P_60

3. Use dual resizer: DM36x dual resizer

Video Recording

Recording h264 NTSC

 
gst-launch -e v4l2src always-copy=false chain-ipipe=true num-buffers=300 ! 'video/x-raw-yuv,format=(fourcc)NV12,width=640,height=480' ! dmaiaccel ! \
 dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=3000000 ! dmaiperf ! qtmux ! filesink location=test8003.mp4 \
 sync=false enable-last-buffer=false &
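Since the pipeline stops after num-buffers=300 frames, the approximate clip length follows from the capture rate. A quick sketch with shell arithmetic, assuming the NTSC rate of 30000/1001 fps (the framerate is not stated in this pipeline):

```shell
NUM_BUFFERS=300
FPS_NUM=30000
FPS_DEN=1001
# Approximate duration in whole seconds: buffers * den / num
DURATION=$(( NUM_BUFFERS * FPS_DEN / FPS_NUM ))
echo "${DURATION}s"   # prints 10s
```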

Recording h264 NTSC with Auto exposure white balance demo library

ipipe-client run-config-script dm365_mt9p031_config
ipipe-client init-aew G EC S C 150000 15 50 640 480 50
gst-launch -e v4l2src always-copy=false chain-ipipe=false num-buffers=300 ! 'video/x-raw-yuv,format=(fourcc)NV12,width=640,height=480' ! dmaiaccel ! dmaienc_h264\
 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=3000000 ! dmaiperf ! qtmux ! filesink location=test8003.mp4 sync=false\
 enable-last-buffer=false&

Recording h264 1280x720

 
gst-launch -e v4l2src always-copy=FALSE chain-ipipe=true num-buffers=300 ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1280,height=720' ! dmaiaccel ! dmaienc_h264 \
 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=3000000 ! dmaiperf ! qtmux ! filesink location=test.mp4 sync=false enable-last-buffer=false &

Recording h264 1280x720 with Auto exposure white balance demo library

ipipe-client run-config-script dm365_mt9p031_config
ipipe-client init-aew G EC S C 150000 15 50 1280 720 50
gst-launch -e v4l2src always-copy=FALSE chain-ipipe=false num-buffers=300 ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1280,height=720' ! dmaiaccel ! dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=3000000 ! dmaiperf \
 ! qtmux ! filesink location=test.mp4 sync=false enable-last-buffer=false&

Video recording and decoding (NOT TESTED)

This section captures video from an NTSC source and audio from the line-in connector. For every encoding pipeline, its complementary decoding pipeline is shown below it.

H264+AAC

gst-launch -e v4l2src always-copy=FALSE input-src=composite ! "video/x-raw-yuv,format=(fourcc)NV12,width=720,height=480,pitch=736" ! \
dmaiaccel ! dmaienc_h264 encodingpreset=2 targetbitrate=3600000 outputBufferSize=2000000 ! dmaiperf print-arm-load=true ! queue ! \
qtmux name=m alsasrc buffer-time=800000 latency-time=30000 ! 'audio/x-raw-int,rate=22050' ! dmaienc_aac outputBufferSize=131072 bitrate=128000 ! \
m. m. ! filesink location=av_h264.mp4
gst-launch filesrc location=av_h264.mp4 ! qtdemux name=demux ! queue ! dmaidec_h264 numOutputBufs=18 ! queue ! dmaiperf ! \
TIDmaiVideoSink accelFrameCopy=true videoOutput=composite videoStd=D1_NTSC  demux.audio_00 ! queue ! dmaidec_aac ! alsasink

MPEG2+MP3

For this pipeline the MP3 codec and plugin need to be installed, further instructions can be found here
https://www.ridgerun.com/developer/wiki/index.php/Add_MP3_support_to_20011q2_SDK

gst-launch -e alsasrc latency-time=30000 buffer-time=800000 ! "audio/x-raw-int, rate=(int)22050" ! dmaienc_mp3 bitrate=128000 outputBufferSize=131072 ! \
queue ! mpegpsmux name=m v4l2src always-copy=false input-src=composite ! "video/x-raw-yuv,format=(fourcc)NV12,width=720,height=480,pitch=736" ! \
dmaiaccel ! dmaienc_mpeg2 encodingpreset=2 targetbitrate=3600000 ! dmaiperf print-arm-load=true ! queue ! m. m. ! \
filesink location=av_mpeg2.mpg
gst-launch filesrc location=av_mpeg2.mpg ! mpegdemux name=demux .audio_00 ! queue ! mad ! alsasink demux.video_00 ! queue ! dmaidec_mpeg2 ! \
TIDmaiVideoSink accelFrameCopy=true videoOutput=composite videoStd=D1_NTSC

MJPEG

gst-launch -e v4l2src always-copy=false input-src=composite ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=720,height=480,pitch=736' ! dmaiaccel ! dmaienc_mjpeg qValue=30 outputBufferSize=2500000 ! dmaiperf print-arm-load=true ! queue ! qtmux ! filesink location=test_v_mjpeg.mp4

MJPEG+AAC

gst-launch -e alsasrc provide-clock=false latency-time=30000 buffer-time=800000 ! 'audio/x-raw-int,rate=(int)16000,channels=(int)2' ! \
dmaienc_aac bitrate=128000 ! queue ! qtmux name=m v4l2src always-copy=false input-src=composite ! \
'video/x-raw-yuv,format=(fourcc)UYVY,width=720,height=480,pitch=736' ! dmaiaccel ! dmaienc_mjpeg qValue=30 outputBufferSize=2500000 ! \
dmaiperf print-arm-load=true !  queue ! m. m. ! filesink location=test_av_mjpeg.mp4
gst-launch filesrc location=/mnt/MSD_mmcblk0p1/test_av_mjpeg.mp4 ! qtdemux name=demux .video_00 ! queue ! dmaidec_mjpeg ! \
'video/x-raw-yuv, format=(fourcc)NV12' ! TIDmaiVideoSink videoOutput=composite videoStd=D1_NTSC demux.audio_00 ! queue ! \
dmaidec_aac ! alsasink

Camera streaming over the network

These instructions show how to stream video over the network: video is captured on the board and viewed on the host.

H264 video streaming pipelines LeopardBoard DM365

HOST_ADDR=<Client's IP address>
PORT=3000


gst-launch -e v4l2src always-copy=FALSE input-src=composite chain-ipipe=true ! video/x-raw-yuv,format=\(fourcc\)NV12, width=1280, height=720, framerate=\(fraction\)23/1 ! queue ! dmaiaccel ! dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=3000000 ! rtph264pay ! udpsink port=$PORT host=$HOST_ADDR sync=false enable-last-buffer=false &
On the client (Ubuntu):

PORT=3000

gst-launch -v udpsrc port=3000 ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQCgC3I\,aO48sA\=\=", payload=(int)96, ssrc=(guint)1335677188, clock-base=(guint)2580247201, seqnum-base=(guint)5999' ! rtph264depay ! 'video/x-h264' ! ffdec_h264 ! 'video/x-raw-yuv, width=(int)1280, height=(int)720, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false' ! xvimagesink
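The long application/x-rtp caps string above (including sprop-parameter-sets) is stream-specific: it comes from the verbose output of the sender pipeline and changes per run. A small sketch of recovering it from a saved sender log (the "caps = ..." line format is what gst-launch -v prints; server.log is a hypothetical file name, simulated here with a sample line):

```shell
# Simulate one caps line as printed by 'gst-launch -v' (hypothetical log)
cat > server.log <<'EOF'
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96
EOF

# Pull the first caps value after 'caps = ' for use in the client pipeline
CAPS=$(sed -n 's/.*caps = //p' server.log | head -n 1)
echo "$CAPS"
```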

Camera streaming over the network with Auto exposure white balance demo library

ipipe-client run-config-script dm365_mt9p031_config
ipipe-client init-aew G EC S C 150000 15 50 1280 720 50
HOST_ADDR=<Client's IP address>
PORT=3000
gst-launch v4l2src always-copy=FALSE input-src=composite chain-ipipe=false ! video/x-raw-yuv,format=\(fourcc\)NV12, width=1280, height=720, framerate=\(fraction\)23/1 ! queue \
 ! dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=3000000 ! rtph264pay ! udpsink port=$PORT host=$HOST_ADDR sync=false enable-last-buffer=false &
On the client (Ubuntu):

PORT=3000

gst-launch -v udpsrc port=3000 ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQCgC3I\,aO48sA\=\=", payload=(int)96, ssrc=(guint)1335677188, clock-base=(guint)2580247201, seqnum-base=(guint)5999' ! rtph264depay ! 'video/x-h264' ! ffdec_h264 ! 'video/x-raw-yuv, width=(int)1280, height=(int)720, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false' ! xvimagesink


NTSC camera streaming MJPEG over the network

Streaming from the LeopardBoard DM36x:

HOST=10.111.0.4

gst-launch -v v4l2src chain-ipipe=true always-copy=FALSE input-src=composite ! 'video/x-raw-yuv, format=(fourcc)UYVY, width=736, height=480, framerate=(fraction)30000/1001' ! dmaienc_mjpeg qValue=50 ! udpsink port=5000 host=$HOST

Streaming from Ubuntu (using a webcam):

HOST=10.111.0.4

gst-launch v4l2src ! ffenc_mjpeg ! rtpjpegpay pt=96 ! udpsink host=$HOST port=5000

Streaming to Ubuntu (receiver):

gst-launch udpsrc port=5000 ! 'image/jpeg, width=(int)736, height=(int)480, framerate=(fraction)30000/1001, pixel-aspect-ratio=(fraction)1/1' ! jpegdec ! xvimagesink
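Note that the Ubuntu webcam sender wraps the JPEG frames in RTP (rtpjpegpay), while the receiver shown above expects the raw JPEG datagrams sent by the board. A sketch of a matching RTP receiver for the webcam stream (the caps values are assumptions; rtpjpegdepay is from gst-plugins-good 0.10, so this needs a camera and display and is untested here):

```shell
# Hypothetical RTP/JPEG receiver matching the rtpjpegpay sender above
gst-launch udpsrc port=5000 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96' ! rtpjpegdepay ! jpegdec ! xvimagesink
```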

Image capture

You can also capture JPEG images from the camera module. The following pipelines capture a JPEG image at VGA and 720P resolutions using the MT9P031 sensor.

FILE_NAME=VGA_jpeg_file.jpg

gst-launch -e v4l2src always-copy=false num-buffers=1 chain-ipipe=true ! video/x-raw-yuv,format=\(fourcc\)UYVY, width=640, height=480 ! dmaienc_jpeg ! filesink location=$FILE_NAME


FILE_NAME=720P_jpeg_file.jpg

gst-launch -e v4l2src always-copy=false num-buffers=1 chain-ipipe=true ! video/x-raw-yuv,format=\(fourcc\)UYVY, width=1280, height=720 ! dmaienc_jpeg ! filesink location=$FILE_NAME
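To grab a numbered series of stills, the single-shot pipeline above can be run once per frame with a generated file name. A minimal naming sketch (the capture_%03d.jpg pattern is a hypothetical choice; each name would be passed as filesink location=... in its own pipeline run):

```shell
# Generate zero-padded still-image names: capture_001.jpg, capture_002.jpg, ...
for i in 1 2 3; do
  printf 'capture_%03d.jpg\n' "$i"
done
```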

Additional Examples

Capturing video from external modules

Please see Hardware setup - LeopardBoard DM365 to see more examples of capturing video from external modules.

Component out 720P_60

Video test source

gst-launch -e videotestsrc ! TIDmaiVideoSink sync=false enable-last-buffer=false videoOutput=component videoStd=720P_60

NTSC TVP5146 input

NTSC framerate is not 30 fps, but 30000 / 1001. Also, the width must be divisible by 32 on DM36x hardware.

gst-launch  v4l2src chain-ipipe=true always-copy=false input-src=composite ! 'video/x-raw-yuv,format=(fourcc)NV12, width=736, height=480, framerate=(fraction)30000/1001' ! dmaiaccel !  dmaiperf ! TIDmaiVideoSink videoOutput=component videoStd=720P_60
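The divisible-by-32 width constraint mentioned above is easy to check with shell arithmetic before building a caps string; a minimal sketch:

```shell
WIDTH=736
if [ $(( WIDTH % 32 )) -eq 0 ]; then
  echo "width $WIDTH is a multiple of 32"   # 736 = 32 * 23, so this branch runs
else
  echo "round width $WIDTH to a multiple of 32"
fi
```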