ADSR envelope (Attack, Decay, Sustain, Release): an envelope that fires off a little bit of automation every time a key is pressed on the keyboard. In the ES1 it automates both the cutoff frequency of the low-pass filter and the amplitude/volume, so as the envelope raises the cutoff frequency during the Attack stage, the volume increases at the same time. Attack is the amount of time it takes to raise the cutoff frequency to its highest point. Decay is how long it takes to drop the cutoff frequency down to some level. Sustain is that level; the cutoff frequency stays there until the key is released. Once the key is released, the final Release stage begins: Release is the amount of time it takes for the cutoff frequency to return to 0 Hz and, in the case of the ES1, for the sound to die away to silence. See last Thursday's lab for more information.
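The four stages can be sketched as a simple Python function; the stage lengths and sustain level below are hypothetical example values, not the ES1's settings:

```python
def adsr(t, attack=0.05, decay=0.2, sustain=0.6, release=0.3, note_off=1.0):
    """Envelope level (0 to 1) at time t seconds, for a key released at note_off."""
    if t < attack:                        # Attack: ramp from 0 up to 1
        return t / attack
    if t < attack + decay:                # Decay: fall from 1 to the sustain level
        frac = (t - attack) / decay
        return 1.0 - frac * (1.0 - sustain)
    if t < note_off:                      # Sustain: hold until the key is released
        return sustain
    if t < note_off + release:            # Release: fall from sustain back to 0
        frac = (t - note_off) / release
        return sustain * (1.0 - frac)
    return 0.0                            # Silence after the release ends
```

In the ES1 the same envelope shape would drive both the filter cutoff and the volume.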
analog-to-digital conversion (sampling): the process of converting a continuous analog signal into numbers by measuring its amplitude at regular intervals (the sample rate).
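A minimal Python sketch of the idea, measuring a sine wave's amplitude at evenly spaced instants (the function name and defaults are illustrative):

```python
import math

def sample_sine(freq_hz, sample_rate=44100, n_samples=8):
    """Measure a sine wave's amplitude at regular intervals, 1/sample_rate apart."""
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n_samples)]
```

The resulting list of numbers is what actually gets stored in a digital audio file.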
attenuation: turning down the amplitude of a signal.
decibel (dB): a logarithmic unit used to measure intensity. A 3 dB increase approximately doubles the intensity of a sound.
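For intensity (power) the underlying formula is dB = 10 · log10(ratio); a quick Python check shows why a doubling comes out near 3 dB:

```python
import math

def intensity_ratio_to_db(ratio):
    """Convert an intensity (power) ratio to decibels: 10 * log10 of the ratio."""
    return 10 * math.log10(ratio)

# A doubling of intensity is about 3.01 dB; a tenfold increase is exactly 10 dB.
```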
echo: a delayed signal that comes late enough after the original to create a separately-perceived sound.
equalization (EQ): a tone control used to balance the frequencies in a spectrum.
filter: a filter attenuates the amplitude of certain frequencies in a signal that passes through it. A low-pass filter lets frequencies below a cutoff frequency pass at full strength and reduces (or "cuts") the frequencies above it. A band-pass filter lets frequencies in a certain range pass unattenuated and cuts those above and below the range. A high-pass filter lets frequencies above the cutoff pass at full strength and attenuates the amplitude of the frequencies below it.
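As a rough illustration, a one-pole low-pass filter can be sketched in a few lines of Python; this is a textbook simplification, not the design of any particular synthesizer's filter:

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100):
    """Attenuate frequencies above cutoff_hz with a simple one-pole smoother."""
    # Smoothing coefficient derived from the cutoff frequency (standard one-pole form)
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y += a * (x - y)    # each output moves a fraction of the way toward the input
        out.append(y)
    return out
```

Slow changes (low frequencies) get through almost untouched, while rapid wiggles (high frequencies) are smoothed away.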
harmonic: when partials in a spectrum are integer multiples of the fundamental frequency the spectrum is said to be harmonic and the partials fuse together and give a sense of pitch.
Hertz (Hz): a unit used to measure the number of cycles per second in a signal. Humans can hear from about 20 Hz to about 20 kHz.
inharmonic: when partials in a spectrum are not integer multiples of the fundamental frequency the spectrum is said to be inharmonic and the partials do not fuse to give a sense of pitch. Instead they create various degrees of noise.
low frequency oscillator (LFO): a low frequency oscillator gets its name from sending out a slowly changing voltage, usually below the audio range (remember, we can only hear down to about 20 Hz). The LFO is often set around 6 Hz. Vibrations in that range are too low to hear, but they can be used to control the frequency of the oscillator that is creating a sound. In a saxophone patch, for example, the LFO alternately raises and lowers that oscillator's frequency six times a second, which sounds like vibrato to us. The amount that you raise the mod wheel controls how deep the vibrato is; the LFO rate controls how many times a second the sound changes from higher to lower pitch.
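The vibrato idea can be sketched as a base pitch nudged up and down by a slow sine wave; all the numbers here are illustrative, and depth_hz stands in for the mod wheel amount:

```python
import math

def vibrato_frequency(t, base_hz=440.0, lfo_rate_hz=6.0, depth_hz=5.0):
    """Instantaneous pitch at time t: the base frequency nudged up and down
    by a 6 Hz LFO. depth_hz plays the role of the mod wheel setting."""
    return base_hz + depth_hz * math.sin(2 * math.pi * lfo_rate_hz * t)
```

With depth_hz at 0 (mod wheel down) the pitch stays fixed; raising it makes the wobble wider without changing how fast it wobbles.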
MIDI (Musical Instrument Digital Interface): a standard released in 1983 by a group of synthesizer manufacturers, led by Sequential Circuits and Roland, to make it possible for their instruments to communicate with each other. The software and hardware standards set how information is communicated and interpreted. The MIDI Manufacturers Association continues to add new features.
MIDI controller: The word "controller" in MIDI means many things, so it depends on the context. We started out looking at the M-Audio Axiom MIDI keyboards in the computer lab. They are called MIDI controllers because they don't have a synthesizer in them to make sound, they just send MIDI control information through the USB cable to the computer. The synthesis algorithms that receive the control information have the option to change the sound in response to them (or not). Some programmers choose not to respond to pitch bend messages on a piano patch, since that's not something the piano can do.
MIDI file (.mid): A MIDI file represents all the information necessary to recreate a sequence that has been recorded by a software program like Logic or Pro Tools. If you save a Pro Tools session in Digidesign format you won't be able to open it in Logic, and vice versa. If you save a song as a MIDI file you can then open the MIDI data in another program and then work on it there. For example, you could export your performance from Pro Tools as a MIDI file and then open it in Logic in order to use its synthesizer sounds. One of the things you want to learn how to do is to move between different programs and use what each does best. MIDI files allow you to transfer MIDI commands like note on and note off messages, control change messages, pitch bend messages, etc. They don't include audio files.
modulation wheel: some MIDI keyboard controllers can send modulation wheel messages (controller #1) that can control the sound in a number of different ways. It depends on what the programmer of the voice set up. For example, the saxophone patch in Logic's EXS24 has been set up to respond to incoming modulation wheel messages to increase the amount that the low frequency oscillator affects the pitch, which we would interpret as vibrato.
MusicXML: a MusicXML file serves much the same purpose as a MIDI file. MusicXML was designed to transfer information from one software program to another, and it keeps track of all the MIDI notes. In addition, it does its best to represent slurs, accents, rehearsal marks, and other marks and text that show up in a piece of printed music; MIDI files don't include that information. So if you've put a lot of editorial marks into Noteflight and want to transfer your score to Sibelius or Finale, a MusicXML file would allow you to transfer more of your work. We didn't do that kind of marking up of the scores in Noteflight, so you can export your work as either a MusicXML or a MIDI file to move it to Logic. In fact, we found out that the version of Logic we have in the studio doesn't support MusicXML files, so use a MIDI file.
noise: a mixture of all frequencies. White noise has equal energy at every frequency, while pink noise has equal energy per octave, so white noise has more high frequency energy than pink noise.
pitch bend: the pitch bend wheel in a MIDI controller sends out pitch bend messages that can cause the receiving oscillator to change its pitch.
quantization: in sequencing, quantization refers to the process of reducing timing inaccuracy. If you are playing quarter notes on your track, select the notes, and quantize them to quarter-note subdivisions, the notes will move to the closest quarter-note grid line. You have the option to specify a strength, which is the percentage of the way the note is moved. If you don't change the strength, the note will be moved 100% of the way to the nearest quarter-note beat; if you set it at 50% instead, the note you played will move halfway toward the beat where it was supposed to go. It's important to record with the metronome on if you plan on quantizing, so that you and the computer are in agreement as to where the beats are supposed to be. Quantization has a second meaning in the analog-to-digital conversion process, where it refers to rounding each measured amplitude to the nearest value the digital system can represent.
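The snap-with-strength behavior can be sketched in a few lines of Python; this is a simplified model of the idea, not Logic's actual algorithm:

```python
def quantize(time_sec, grid_sec, strength=1.0):
    """Move a note's start time toward the nearest grid line.
    strength is the fraction of the distance to move: 1.0 snaps all the way,
    0.5 moves the note halfway toward the beat."""
    nearest = round(time_sec / grid_sec) * grid_sec   # closest grid line
    return time_sec + strength * (nearest - time_sec)
```

For example, a note played at 0.52 s against a 0.5 s grid snaps to 0.50 s at full strength, but only to 0.51 s at 50% strength.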
reverberation: a mass of delayed reflected signals that are not individually perceived but rather give a feeling of ambience.
serendipity: recognizing and taking advantage of happy accidents or surprises.
spectrum: a graph that shows the tone of a signal by reporting the energy of its various components, with frequency on the horizontal axis and amplitude on the vertical axis.
subtractive synthesis: a synthesis technique that subtracts some of the frequencies of a signal to change the tone. For our third project we are using the ES1 synthesizer in Logic for its ability to pass the oscillator's signal through a low-pass filter. The sawtooth waveform (one of the choices for the waveform in the ES1's oscillator) has many harmonics in its spectrum. The cutoff knob sets the cutoff frequency for the low-pass filter. The lower you set it, the more of the high frequency components get filtered out, giving the sound an increasingly "dark" tone. See last Tuesday's lab for more information.
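A crude way to see what the filter removes is to build the sawtooth from its harmonics (each at amplitude 1/n) and simply drop the ones above the cutoff. This is only a sketch of the idea, with an ideal "brick wall" cutoff rather than the gradual roll-off of a real filter like the ES1's:

```python
import math

def filtered_sawtooth(t, freq_hz=110.0, cutoff_hz=2000.0):
    """Sawtooth value at time t, built by summing harmonics at amplitude 1/n
    and keeping only the harmonics at or below the cutoff frequency."""
    value, n = 0.0, 1
    while n * freq_hz <= cutoff_hz:
        value += math.sin(2 * math.pi * n * freq_hz * t) / n
        n += 1
    return (2 / math.pi) * value    # scale so the full sawtooth peaks near 1
```

Lowering cutoff_hz leaves fewer harmonics in the sum, which is exactly the "darker" tone the cutoff knob produces.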
timbre: French for tone.
tone: see timbre.
waveform: a graph of the changes in a signal's amplitude over time.