
Using GStreamer with Python Part 3

November 21, 2014

For this final GStreamer guide we will be learning how to split data so we can do different things with it. This project will show you a few new elements which allow us to split incoming data, and we will also be working with Request pads along with everything we've learned in the previous two guides.

Splitting Multimedia Project

This project is a pretty hefty one because we will be splitting the audio into three different element chains. Playbin2 will stream the multimedia and automatically play the video, while we pass the audio through a series of elements until it reaches our splitting element, called a tee. The tee will send the stream to three queue elements, each of which leads a different chain of elements. One chain will play the audio normally, one will create a wavescope and output it to the monitor, and one will save the audio as an mp3 file.

The tee element does not do anything aside from splitting the data across multiple pads. The pads it uses are Request pads. In order to use these Request pads we need to request them manually through a method call and manually link each pad to the element it will interact with. We also need a separate queue element in each branch to give each branch its own thread; otherwise a blocked dataflow in one branch would stall the other branches. Luckily, implementing queue elements is rather simple. If you have a good understanding of our previous projects, this upcoming project will be simple.

Let's take a look at the elements we will be using in this project. It's a long list, but we will only examine a few of them more closely.

  • playbin2 which will get our multimedia from the internet, play the video automatically, and feed the audio to the rest of our elements.
  • decodebin which will decode the audio to raw media.
  • audioconvert which we use mostly to connect decodebin to tee.
  • tee which will split the audio data to our three branches.
  • queue which we will use to accept that data on each of our branches.
  • autoaudiosink which will play the audio over our speakers.
  • wavescope which will create a waveform representation of our audio.
  • ffmpegcolorspace which we use to connect wavescope to autovideosink.
  • autovideosink which we use to display the wavescope video on our monitor.
  • lamemp3enc which we use to encode the audio into an mp3 format.
  • filesink which we use to save the mp3 data to a file.

In your terminal, inspect the tee and queue elements (gst-inspect-0.10 tee and gst-inspect-0.10 queue). Under the tee element's Pad Templates you will see that the source pad is "On Request", and it shows you the method it has available. The queue element has Always pads with ANY capabilities.

Let's start this project. Import gst and create your Bin. Again, this bin will hold all of the elements except for the playbin2 element. Now create all those elements we discussed. Remember, we will create three queues, one leading each of our three branches. All elements in each branch are prefixed with their purpose, which helps reduce confusion when we are working with so many elements. Use an if statement to make sure all elements were created successfully. And last, set the properties for the elements: we will use the Sintel trailer again, for the wavescope we will set some of the available properties to increase the visual appeal, and we tell our file_sink object where to save the mp3 file.
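
Something along these lines will do the job. This is only a minimal sketch assuming the pygst 0.10 bindings; the Sintel URI, the wavescope style value, and the mp3 output path are placeholders, so substitute your own.

    #!/usr/bin/env python
    import sys
    import pygst
    pygst.require("0.10")
    import gst

    # The bin holds everything except playbin2.
    bin = gst.Bin("audio_bin")

    # Create the elements; each branch's elements are prefixed with their purpose.
    playbin = gst.element_factory_make("playbin2", "playbin")
    decode = gst.element_factory_make("decodebin", "decode")
    convert = gst.element_factory_make("audioconvert", "convert")
    tee = gst.element_factory_make("tee", "tee")
    audio_queue = gst.element_factory_make("queue", "audio_queue")
    audio_sink = gst.element_factory_make("autoaudiosink", "audio_sink")
    wavescope_queue = gst.element_factory_make("queue", "wavescope_queue")
    wavescope = gst.element_factory_make("wavescope", "wavescope")
    wavescope_convert = gst.element_factory_make("ffmpegcolorspace", "wavescope_convert")
    wavescope_sink = gst.element_factory_make("autovideosink", "wavescope_sink")
    file_queue = gst.element_factory_make("queue", "file_queue")
    file_encode = gst.element_factory_make("lamemp3enc", "file_encode")
    file_sink = gst.element_factory_make("filesink", "file_sink")

    # Bail out if any element failed to be created.
    if not all([playbin, decode, convert, tee,
                audio_queue, audio_sink,
                wavescope_queue, wavescope, wavescope_convert, wavescope_sink,
                file_queue, file_encode, file_sink]):
        print "Could not create all elements"
        sys.exit(1)

    # Properties: media URI, wavescope styling, and mp3 output path.
    playbin.set_property("uri",
        "http://docs.gstreamer.com/media/sintel_trailer-480p.webm")  # placeholder URI
    wavescope.set_property("style", 3)  # e.g. 3 = color-lines; check gst-inspect-0.10 wavescope
    file_sink.set_property("location", "sintel_audio.mp3")  # placeholder path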

From here we add all the elements into the bin except for our playbin2 element. Now we start linking most of our elements, skipping the links between the tee element and the queue elements for now. Let's start by linking decodebin with audioconvert: we tell the decodebin object to use a callback function when it emits the 'new-decoded-pad' signal, just like in the previous guide. In the callback function we use the pad argument that was passed in behind the scenes: we get the decode element from the pad, get the bin that contains it, use that bin to look up the element we want to connect decodebin to, and link them.
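
In code, that step looks roughly like this ("convert" is simply the name we gave our audioconvert element above):

    # Everything except playbin2 goes into the bin.
    bin.add(decode, convert, tee,
            audio_queue, audio_sink,
            wavescope_queue, wavescope, wavescope_convert, wavescope_sink,
            file_queue, file_encode, file_sink)

    # decodebin only creates its source pad once it knows the stream type,
    # so we link it to audioconvert from the 'new-decoded-pad' callback.
    def on_new_decoded_pad(dbin, pad, is_last):
        decode = pad.get_parent()
        parent_bin = decode.get_parent()
        convert = parent_bin.get_by_name("convert")
        decode.link(convert)

    decode.connect("new-decoded-pad", on_new_decoded_pad)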

Now we need to link all the other elements. We do this with several separate calls to gst.element_link_many, because we can't simply link the tee element with the queue elements and because we have three different branches. First we link audioconvert with tee. Next we link the branch which outputs audio to the speakers: our queue element named audio_queue with our autoaudiosink element. Then we link the branch which creates the wavescope: our queue element named wavescope_queue, the wavescope element, the ffmpegcolorspace element named wavescope_convert, and the autovideosink element. Finally we link the branch which creates the mp3 file: our queue element named file_queue, the lamemp3enc element named file_encode, and the filesink.
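
Those calls look like this:

    # Link audioconvert into the tee, then link each branch separately.
    # The tee -> queue links are handled with request pads below.
    gst.element_link_many(convert, tee)
    gst.element_link_many(audio_queue, audio_sink)
    gst.element_link_many(wavescope_queue, wavescope,
                          wavescope_convert, wavescope_sink)
    gst.element_link_many(file_queue, file_encode, file_sink)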

Now we have everything linked except the tee element with the queue elements, and one final link still needs to be made between playbin2 and our bin. We will work on the tee element for now. Request a source pad for each branch with the get_request_pad method; we also add a print statement that can help with troubleshooting if something goes wrong. From there we request a static sink pad from each of our three queue elements, which we will use for linking in a moment. Now we can link the request pads with the static pads. This can be done in an if statement, comparing the result to gst.PAD_LINK_OK.
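
A sketch of that step, using the tee's 0.10 pad template name src%d:

    # Request one source pad from the tee for each branch.
    tee_audio_pad = tee.get_request_pad("src%d")
    tee_wavescope_pad = tee.get_request_pad("src%d")
    tee_file_pad = tee.get_request_pad("src%d")
    print "Tee request pads:", tee_audio_pad, tee_wavescope_pad, tee_file_pad

    # Grab the static sink pad of each queue.
    audio_queue_pad = audio_queue.get_static_pad("sink")
    wavescope_queue_pad = wavescope_queue.get_static_pad("sink")
    file_queue_pad = file_queue.get_static_pad("sink")

    # Link each request pad to its queue and check the result.
    if (tee_audio_pad.link(audio_queue_pad) != gst.PAD_LINK_OK or
            tee_wavescope_pad.link(wavescope_queue_pad) != gst.PAD_LINK_OK or
            tee_file_pad.link(file_queue_pad) != gst.PAD_LINK_OK):
        print "Could not link the tee to the queues"
        sys.exit(1)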

Finally, we link playbin2 to our bin. We get a static sink pad from the first element in line within our bin, create a ghost pad out of it, activate it, and add that ghost pad to our bin, just like in our previous guide. With everything connected, we set playbin2's state to gst.STATE_PLAYING, check the bus for an error or EOS, output that message for troubleshooting purposes, and free the resources.
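
The final stretch looks roughly like this; timed_pop_filtered blocks until an error or end-of-stream message arrives on the bus:

    # Expose decodebin's sink pad as the bin's own sink through a ghost pad,
    # then hand the whole bin to playbin2 as its audio sink.
    decode_pad = decode.get_static_pad("sink")
    ghost_pad = gst.GhostPad("sink", decode_pad)
    ghost_pad.set_active(True)
    bin.add_pad(ghost_pad)
    playbin.set_property("audio-sink", bin)

    # Play, wait for an error or EOS, print it, and free the resources.
    playbin.set_state(gst.STATE_PLAYING)
    bus = playbin.get_bus()
    message = bus.timed_pop_filtered(gst.CLOCK_TIME_NONE,
                                     gst.MESSAGE_ERROR | gst.MESSAGE_EOS)
    print message
    playbin.set_state(gst.STATE_NULL)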

When you run your project you should see the video playing, hear the audio, see the audio through a wavescope, and see the mp3 file being written to your computer. When the program ends, check the mp3 to ensure it saved properly. If it didn't save properly, your speakers may screech loudly when you play it back, so be ready to turn them off. This happened to me when I didn't have the lamemp3enc element in the branch. If you want to see the project as a whole, go to https://gist.github.com/markwingerd/712c270355b6c2642600
