(*** The following is an excerpt from the book PyQt5 101 available here ***)
Chapter 6: Radio App
For this chapter we’ll explore a radio app that uses PyQt5. I spent about 2 days struggling to set up the environment for this on my Windows system, before I finally gave up and switched to Linux. I highly recommend you follow along on a Unix-like system.
For Windows users, you don’t need to create a separate partition and install Linux. You can set up an Ubuntu system on a virtual machine. For those following along on Unix-like systems other than Ubuntu, I trust that you’ll be able to adapt the install instructions to suit your particular OS.
You first need to set up a Python module called gi, which is usually found in the PyGObject package. To install it on Ubuntu, switch to your CLI and run the following
sudo apt-get install python-gi python-gi-cairo python3-gi python3-gi-cairo gir1.2-gtk-3.0
This will install the gi module on your system if you didn't already have it. To confirm that it works, run the script titled hello.py in the source folder for this chapter. If you've just switched from Windows, note that you must invoke python3 explicitly, since your system will probably also have python2
python3 hello.py
If everything goes well your output should look similar to the following.
If you can’t run the script, please visit the following URL and follow the instructions there.
People who’ve just switched from Windows will also need to run
sudo apt-get install python3-pyqt5
to install PyQt5.
Everyone will also need version 1.x of GStreamer
sudo apt-get install gstreamer1.0-tools
sudo apt-get install gstreamer1.0-plugins-ugly
Now you’re all set up for the next part.
Back when I was in grade school, classes on Friday used to end at 3:20 pm. I’d usually end up at my friend Mo’s house. His dad was a mechanic and had all sorts of cool hardware that I liked going through while he was still at work. My folks had dull office jobs and their work things weren’t worth going through.
One day we discovered an electronics puzzle kit among his dad's things. We could make simple circuits by snapping puzzle pieces together to match patterns shown in a book that accompanied the kit. I'd always loved radio, and we followed the instructions to assemble a simple crystal radio. I was cynical about the claim that one could make a radio without any electricity and wasn't surprised when it didn't work. Mo read the instructions through again, and we spent half an hour fiddling with a water tap before we got it to work.
It turned out that we'd assembled it correctly, but we hadn't grounded it. The signal from a local AM station was faint on the earphones, but it worked. Each puzzle piece had an electric circuit component soldered underneath it, as well as connectors on either side that joined to other circuit components. It was quite complex for 10-year-old kids without any electronics training, but the pictures on top of the puzzle pieces made it look and feel like a game. Search Amazon for "Snap Circuits". What we used was similar, but wasn't as colorful as today's models.
GStreamer is similar to those snap circuits. It comes with various components that you can assemble into many different multimedia tools: a DVD player, a streaming Internet radio player and recorder, and so on. The LIGO team, whose observation of gravitational waves earned the 2017 Nobel Prize in Physics, used GStreamer-based software in their detection pipeline. If you have a Samsung flat-screen TV or have watched a movie on an airplane, you were probably using GStreamer too.
I will not give a comprehensive tutorial on GStreamer (consult their website for that; links are in the references section). I'll only give you an overview that will let you follow the code for this chapter. If you're not technically inclined, you can think of this as building a puzzle (like Mo and I did) and not worry too much about what's beneath the puzzle pieces you're assembling.
The substrate or puzzle board on which all puzzle pieces (elements) are mounted is called the pipeline. Elements process the data (for example, output from your favorite Internet radio site) as it moves downstream from the source elements to sink elements (like your speaker), passing through filter elements. Playing media straight from the Internet without storing it locally is known as streaming.
The ports through which GStreamer elements communicate with each other are called pads. There exist sink pads, through which data enters an element, and source pads, through which data exits an element. It follows naturally that source elements only contain source pads, sink elements only contain sink pads, and filter elements contain both. Source pads produce data; sink pads consume data.
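If you like to see things as code, the element/pad taxonomy can be sketched in a few lines of plain Python. This is a toy model for illustration only; none of these names are part of the real GStreamer API.

```python
class Pad:
    """A pad is an element's connection point: data exits through
    source ("src") pads and enters through sink pads."""
    def __init__(self, direction):
        assert direction in ("src", "sink")
        self.direction = direction

def make_source():
    # e.g. audiotestsrc: produces data, so it has only source pads
    return [Pad("src")]

def make_sink():
    # e.g. autoaudiosink: consumes data, so it has only sink pads
    return [Pad("sink")]

def make_filter():
    # e.g. vorbisenc: consumes data on a sink pad, produces on a src pad
    return [Pad("sink"), Pad("src")]
```

Linking elements into a pipeline then amounts to connecting one element's source pad to the next element's sink pad, exactly as the `!` operator does in the gst-launch commands later in this chapter.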
A media demultiplexer, or demuxer (sometimes called a file splitter by consumer software providers), is software that separates the individual elementary streams of a media file, e.g. audio, video and subtitles, and sends each to its respective decoder. A media file, for example an mp4 file, is a container with several streams of data packed together. An mp4 player reads and unpacks these audio, video and subtitle streams in a process called decoding, then presents them to you in a coherent manner, directing output to your speakers and video screen.
Media demultiplexers are not decoders themselves; they are container-format handlers that separate the program streams in a file and supply them to the appropriate audio, video or subtitle decoders. A demuxer has one sink pad, through which the muxed data arrives, and multiple source pads, one for each stream found in the container.
A pipeline for a basic ogg player is illustrated below.
Ogg is a free, open container format. To confirm that you have a working version of the GStreamer tools installed on your system, run the following command
gst-launch-1.0 videotestsrc ! autovideosink
This will open a window with a video test pattern on your system.
If the above command ran successfully, try the following command
gst-launch-1.0 audiotestsrc ! autoaudiosink
This will generate a 440 Hz audio test tone. Switch on your speakers to confirm that it works. Next we’re going to combine the audio and video in one pipeline and output the result as an ogg file. Type in
gst-launch-1.0 audiotestsrc ! vorbisenc ! oggmux name=mux ! filesink location=file.ogg videotestsrc ! theoraenc ! mux.
and let it run for about 30 seconds. You can terminate it by pressing Ctrl+C. This will write the generated data into a file called file.ogg. Inspect your current working directory and confirm that this file exists.
Most of the above command should be familiar to you. Specifying the “name” property of an element lets you use it more than once. We take the output of the audio test source, feed it into an ogg muxer and name it mux. We then use mux again, by feeding the output of a video test source into it. Finally, we take the combined result and direct it into a file named file.ogg.
Remember, muxing is packing several streams of data, for example audio, video and subtitles, into a single container file. In effect, what we've done is take Illustration 20 above and run it from right to left.
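As a rough analogy (pure Python, nothing to do with the real Ogg format), you can picture muxing as interleaving timestamped buffers from several streams into one stream ordered by time. The `mux` function below is invented purely for illustration.

```python
import heapq

def mux(*streams):
    """Interleave several sorted (timestamp, payload) streams into one
    timestamp-ordered sequence -- a toy stand-in for what oggmux does
    with real audio and video buffers."""
    return list(heapq.merge(*streams))

# Fake audio buffers every 20 ms and video buffers every 33 ms
audio = [(0, "a0"), (20, "a1"), (40, "a2")]
video = [(0, "v0"), (33, "v1")]
container = mux(audio, video)
# container now holds both streams interleaved in timestamp order
```

A demuxer is simply the inverse operation: it reads such a combined sequence back out of the container and splits the buffers by stream, handing each stream to its own decoder.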
Pads are an element's interface to the outside world. Data streams from one element's source pad to another element's sink pad. In addition to the direction of data flow, pads have another important property: availability.
A pad can have one of three availabilities: always, sometimes and on request. The meaning of these is exactly what it says: always pads always exist, sometimes pads exist only in certain cases (and can disappear randomly), and on-request pads appear only if explicitly requested by applications. For this basic introduction, we will ignore request pads.
Some elements might not have all of their pads when the element is created. This can happen, for example, with an Ogg demuxer element. The element will read the Ogg stream and create dynamic pads for each contained elementary stream (vorbis, theora) when it detects such a stream in the Ogg stream. Likewise, it will delete the pad when the stream ends.
On your CLI, run the following
gst-inspect-1.0 oggdemux
and inspect the output. The element has only one pad: a sink pad called “sink”. The other pads are “dormant”. You can see this in the pad template because there is an “Exists: Sometimes” property. Depending on the type of Ogg file you play, the pads will be created. We will see that this is very important when you are going to create dynamic pipelines. You can attach a signal handler to an element to inform you when the element has created a new pad from one of its “Sometimes” pad templates.
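The pattern of dynamic pads can be simulated in plain Python. The sketch below is a made-up toy, not the GStreamer API, but the shape is the same one the chapter's player uses: register a callback, then get notified each time a "Sometimes" pad appears.

```python
class ToyDemuxer:
    """Simulates an element whose source pads have 'Sometimes'
    availability: they only come into existence once a stream is
    actually detected in the container."""
    def __init__(self):
        self.pads = ["sink"]      # the one always-available pad
        self._handlers = []

    def connect_pad_added(self, handler):
        # Analogous to connecting to GStreamer's "pad-added" signal.
        self._handlers.append(handler)

    def read_container(self, streams):
        # When a stream (e.g. "vorbis", "theora") is found, create a
        # source pad for it and notify every registered handler.
        for stream in streams:
            pad = "src_" + stream
            self.pads.append(pad)
            for handler in self._handlers:
                handler(pad)

demux = ToyDemuxer()
linked = []
demux.connect_pad_added(linked.append)   # "link" each pad as it appears
demux.read_container(["vorbis", "theora"])
```

After `read_container` runs, the demuxer has grown one source pad per detected stream, and the callback has seen each of them, which is exactly when a real application would link the new pad to a decoder.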
After being created, an element will not actually perform any actions yet. You need to change an element's state to make it do something. GStreamer knows four element states, each with a very specific meaning. These four states are:
GST_STATE_NULL: this is the default state. No resources are allocated in this state, so, transitioning to it will free all resources. The element must be in this state when its reference count reaches 0 and it is freed.
GST_STATE_READY: in the ready state, an element has allocated all of its global resources, that is, resources that can be kept within streams. You can think about opening devices, allocating buffers and so on. However, the stream is not opened in this state, so the stream position is automatically zero. If a stream was previously opened, it should be closed in this state, and position, properties and such should be reset.
GST_STATE_PAUSED: in this state, an element has opened the stream, but is not actively processing it. An element is allowed to modify a stream’s position, read and process data and such to prepare for playback as soon as state is changed to PLAYING, but it is not allowed to play the data which would make the clock run. In summary, PAUSED is the same as PLAYING but without a running clock.
Elements going into the PAUSED state should prepare themselves for moving over to the PLAYING state as soon as possible. Video or audio outputs would, for example, wait for data to arrive and queue it so they can play it right after the state change. Also, video sinks can already play the first frame (since this does not affect the clock yet). Autopluggers could use this same state transition to already plug together a pipeline. Most other elements, such as codecs or filters, do not need to explicitly do anything in this state, however.
GST_STATE_PLAYING: in the PLAYING state, an element does exactly the same as in the PAUSED state, except that the clock now runs.
You can change the state of an element using the function set_state(). If you set an element to another state, GStreamer will internally traverse all intermediate states. So if you set an element from NULL to PLAYING, GStreamer will internally set the element to READY and PAUSED in between.
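The intermediate-state traversal can be sketched in a few lines of plain Python. The state names mirror GStreamer's, but the `states_traversed` function itself is invented for illustration; it is not part of any real API.

```python
# GStreamer's four element states, in order from "most idle" to "playing".
STATES = ["NULL", "READY", "PAUSED", "PLAYING"]

def states_traversed(current, target):
    """Return the sequence of states (excluding the current one) that
    GStreamer would internally step through to reach the target."""
    i, j = STATES.index(current), STATES.index(target)
    if i < j:                         # moving "up" towards PLAYING
        return STATES[i + 1:j + 1]
    return STATES[j:i][::-1]          # moving "down" towards NULL
```

So `states_traversed("NULL", "PLAYING")` yields READY, then PAUSED, then PLAYING, matching the internal traversal described above. In the Python bindings, the actual call on a pipeline looks roughly like `pipeline.set_state(Gst.State.PLAYING)`.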
Bringing it all together
Is your head spinning after that whirlwind tour through GStreamer? It's time to bring it all together with a simple PyQt example that will reinforce what you've learnt so far. In the folder for this chapter, open the file named baby.py. We will programmatically implement Illustration 20, and you will get to see an example of dynamic pipelines in action.
Run the following
python3 baby.py
Click the Start button; this will open a file selection dialog. Select the ogg file we created in the previous section. It will automatically start playing, and you can stop it by clicking "Stop" in the main window. For this simple player, only the start and stop states are possible. The next section will explain how it works.
Explanation of the code