In this tutorial, you will make a program which plays tones that you can use to tune a guitar. You will learn how to:
Set up a basic project in Anjuta
Create a simple GUI with Anjuta's UI designer
Use GStreamer to play sounds
You'll need the following to be able to follow this tutorial:
An installed copy of the Anjuta IDE
Basic knowledge of the Python programming language
Before you start coding, you'll need to set up a new project in Anjuta. This will create all of the files you need to build and run the code later on. It's also useful for keeping everything together.
Start Anjuta and click
Choose
Click
Most of the code in the file is template code. It loads an (empty) window from the user interface description file and shows it. More details are given below; skip this list if you understand the basics:
The import
lines at the top tell Python to load the user interface and system
libraries needed.
A class is declared that will be the main class for our application. In the __init__
method
the main window is loaded from the GtkBuilder file and displayed.
Connecting signals is how you define what happens when you push a button, or when some other event happens. Here, the destroy
method is called (and quits the app) when you close the window.
The main
function is run by default when you start a Python application. It just creates
an instance of the main class and starts the main loop to bring up the window.
This code is ready to be used, so you can run it by clicking
A description of the user interface (UI) is contained in the GtkBuilder file. To edit the user interface, open
The layout of every UI in GTK+ is organized using boxes and tables. Let's use a vertical
Select a
Now, choose a
While the button is still selected, change the
Switch to the clicked
signal of the button. You can use this to connect a signal handler that will be called when the button is clicked by the user. To do this, click on the signal and type on_button_clicked
in the
Repeat the above steps for the other buttons, adding the remaining 5 strings with the names A, D, G, B, and e.
Save the UI design (by clicking
In the UI designer, you made it so that all of the buttons will call the same function,
To do this, open
This signal handler has two arguments: the usual Python instance (self), and the Gtk.Button
that called the function.
For now, we'll leave the signal handler empty while we work on writing the code to produce sounds.
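As a sketch, the empty handler might look like this (the class name GUI follows the template; only the method's signature matters at this stage):

```python
class GUI:
    # ... the rest of the class as generated by Anjuta ...

    def on_button_clicked(self, button):
        # 'button' is the Gtk.Button whose "clicked" signal fired.
        # Left empty for now; we will fill it in once the
        # sound-playing code exists.
        pass
```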
GStreamer is GNOME's multimedia framework: you can use it to play, record, and process video, audio, webcam streams and the like. Here, we'll be using it to produce single-frequency tones.
Conceptually, GStreamer works as follows: You create a pipeline containing several processing elements going from the source to the sink (output). The source can be an image file, a video, or a music file, for example, and the output could be a widget or the soundcard.
Between source and sink, you can apply various filters and converters to handle effects, format conversions and so on. Each element of the pipeline has properties which can be used to change its behaviour.
An example GStreamer pipeline.
In this simple example we will use a tone generator source called audiotestsrc
and send the output to the default system sound device, autoaudiosink.
We only need to configure the frequency of the tone generator; this is accessible through the freq
property of audiotestsrc.
Change the import line in
The Gst
import includes the GStreamer library. You also need to initialise GStreamer properly; this
is done in the main()
method with this call, added above the app = GUI()
line:
Then, copy the following function into the class in
The first three lines create source and sink GStreamer elements and a pipeline element (which will be used as a container for the other two elements). The pipeline is given the name "note"; the source is named "source" and is set to the audiotestsrc
source; and the sink is named "output" and set to the autoaudiosink
sink (default sound card output).
The call to source.set_property
sets the freq
property of the source element to frequency
, which was passed as an argument to the play_sound
function. This is just the frequency of the note in Hertz; some useful frequencies will be defined later on.
The next two lines call pipeline.add
, putting the source and sink into the pipeline. The pipeline can contain multiple other GStreamer elements. In general, you can add as many elements as you like to the pipeline by calling its add
method repeatedly.
The next call, pipeline.set_state,
is used to start playback, by setting the state of the pipeline to playing (Gst.State.PLAYING
).
We don't want to play an annoying tone forever, so the last thing play_sound
does is to call GObject.timeout_add
. This sets a timeout for stopping the sound; it waits for LENGTH
milliseconds before calling the function pipeline_stop
, and will keep calling it until pipeline_stop
returns False
.
Now, write the code for the pipeline_stop
function, which is called by GObject.timeout_add
. Insert the following code above the play_sound
function:
You need to define the LENGTH
constant inside the class, so add this code at the beginning of the
main class:
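For example (500 ms is an illustrative choice; any short duration works):

```python
class GUI:
    # Length of the tone, in milliseconds.
    LENGTH = 500
```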
The call to pipeline.set_state
stops the playback of the pipeline.
We want to play the correct sound when the user clicks a button. First of all, we need to know the frequencies for the six guitar strings, which are defined (at the beginning of the main class) inside a dictionary so we can easily map them to the names of the strings:
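A sketch of that dictionary, using the standard-tuning pitches of the six open strings; the tutorial's own listing may use values an octave higher so the tones are easier to hear through small speakers:

```python
class GUI:
    # Frequencies (in Hz) of the open strings, keyed by the button
    # labels. The lowercase 'e' is the high E string.
    frequencies = {
        'E': 82.41,
        'A': 110.00,
        'D': 146.83,
        'G': 196.00,
        'B': 246.94,
        'e': 329.63,
    }
```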
Now to flesh out the signal handler that we defined earlier, on_button_clicked
. We could have connected every button to a different signal handler, but that would lead to a lot of code duplication. Instead, we can use the label of the button to figure out which button was clicked:
The button that was clicked is passed as an argument (button
) to on_button_clicked
. We can get the label of that button by using button.get_child
, and then get the text from that label using label.get_label
.
The label text is then used as a key for the dictionary and play_sound
is called with the frequency appropriate for that note. This plays the tone; we have a working guitar tuner!
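The finished handler can be sketched as follows (the frequencies values here are the standard-tuning pitches, an assumption on our part; the lookup logic is what the text describes):

```python
class GUI:
    frequencies = {'E': 82.41, 'A': 110.00, 'D': 146.83,
                   'G': 196.00, 'B': 246.94, 'e': 329.63}

    def on_button_clicked(self, button):
        # The button's child widget is its Gtk.Label; its text
        # names the guitar string.
        label = button.get_child()
        text = label.get_label()
        # Look up the note's frequency and play it.
        self.play_sound(self.frequencies[text])
```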
All of the code should now be ready to go. Click
If you run into problems with the tutorial, compare your code with this reference code.
Here are some ideas for how you could extend this simple demonstration:
Make the program cycle through the notes.
Make the program play recordings of real guitar strings being tuned.
To do this, you would need to set up a more complicated GStreamer pipeline which allows you to load and play back music files. You'll have to choose decoder and demuxer GStreamer elements based on the file format of your recorded sounds — MP3s use different elements to Ogg Vorbis files, for example.
You might need to connect the elements in more complicated ways too. This could involve using GStreamer concepts that we didn't cover in this tutorial, such as pads. You may also find the
Automatically analyze the notes that the musician is playing.
You could connect a microphone and record sounds from it using an input source. Perhaps some form of spectrum analysis would allow you to figure out what notes are being played?