Using UDP Multicast with GStreamer

From RidgeRun Developer Wiki

This document shows how to create a network connection using multicast in order to transmit audio and/or video streams.

== Audio Multicast Streaming ==

This section shows how to build a GStreamer pipeline to transmit audio over a multicast network.

The pipelines used are the following:

Server:

<pre>
gst-launch-0.10 filesrc location=<file.mp3> ! mad ! audioconvert ! audioresample ! mulawenc ! rtppcmupay ! udpsink 
host=<multicast IP address> auto-multicast=true port=<port number>
</pre>

Client:

<pre>
gst-launch-0.10 udpsrc multicast-group=<multicast IP address> auto-multicast=true port=<port number> caps="application/x-rtp, 
media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMU, payload=(int)0, ssrc=(guint)1350777638, 
clock-base=(guint)2942119800, seqnum-base=(guint)47141" ! rtppcmudepay ! mulawdec ! pulsesink
</pre>
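
The long caps string on the client side does not have to be written by hand. One convenient way to obtain it is to run the server pipeline with the ''-v'' (verbose) option, which makes gst-launch print the negotiated caps, including the ''application/x-rtp'' caps on the udpsink sink pad:

<pre>
# Same server pipeline, run verbosely to print the negotiated RTP caps
gst-launch-0.10 -v filesrc location=<file.mp3> ! mad ! audioconvert ! audioresample ! mulawenc ! rtppcmupay ! udpsink host=<multicast IP address> auto-multicast=true port=<port number>
</pre>

Note that fields such as ssrc, clock-base and seqnum-base are generated by the payloader and change every time the server is restarted.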

On the server side we first use a ''filesrc'' element to select the audio file to play (this pipeline only works with MP3 files); the file content is then passed through the ''mad'' decoder to obtain raw audio. Once the audio has been decoded, it goes through ''audioconvert'' and ''audioresample'', which convert it to raw audio at the 8 kHz sample rate required to encode it to mu-law with the ''mulawenc'' element.
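
As a quick sanity check, the decoding front end can be replaced with a test-tone source so the encoding and payloading chain can be tried without an MP3 file. The sketch below assumes the ''audiotestsrc'' element (from gst-plugins-base) is available; 224.1.1.1 and port 5000 are arbitrary example values:

<pre>
# Test server: audiotestsrc generates a tone instead of decoding an MP3 file
gst-launch-0.10 audiotestsrc ! audioconvert ! audioresample ! mulawenc ! rtppcmupay ! udpsink host=224.1.1.1 auto-multicast=true port=5000
</pre>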

Before the audio is sent over the network it must be packed into RTP packets with the correct payload type; this is done with the ''rtppcmupay'' element.
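
The payload-related fields that ''rtppcmupay'' advertises (media, clock-rate, encoding-name, payload) can be checked with the standard gst-inspect tool:

<pre>
# Lists rtppcmupay's properties and pad templates, including its application/x-rtp output caps
gst-inspect-0.10 rtppcmupay
</pre>

These are the same fields that appear in the caps property of the client pipeline.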

Finally the audio packets are sent to the network using the ''udpsink'' element. In order to configure the connection as multicast it is necessary to enable udpsink's multicast support (auto-multicast=true) and set the multicast IP address (from 224.x.x.x to 239.x.x.x) and port (from 1024 to 65535).
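
For example, assuming the multicast group 224.1.1.1, port 5000 and a file called music.mp3 (all three are arbitrary example values), the server command becomes:

<pre>
# Example server with concrete multicast address, port and file name
gst-launch-0.10 filesrc location=music.mp3 ! mad ! audioconvert ! audioresample ! mulawenc ! rtppcmupay ! udpsink host=224.1.1.1 auto-multicast=true port=5000
</pre>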

Once the server starts sending the audio packets, clients can access the stream by listening on the multicast IP address to which it is sent. This is done with the ''udpsrc'' element configured to work in multicast mode with the IP address and port number set above. The RTP packets are extracted from the received data using the ''rtppcmudepay'' element, then the mu-law audio is decoded with ''mulawdec'' and sent to the speakers through ''pulsesink'' (if your system does not use PulseAudio, you can use ''alsasink'' instead).
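
A matching client for the example values above could look as follows. The ssrc, clock-base and seqnum-base fields change on every server run, so this sketch only keeps the fields that stay constant for PCMU; if your setup needs the full caps string, copy it from the server's verbose output as described earlier:

<pre>
# Example client matching the 224.1.1.1:5000 server above
gst-launch-0.10 udpsrc multicast-group=224.1.1.1 auto-multicast=true port=5000 caps="application/x-rtp, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMU, payload=(int)0" ! rtppcmudepay ! mulawdec ! pulsesink
</pre>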

== Video Multicast Streaming ==