Photo of a woman editing audio and listening on headphones (Photo by Kelly Sikkema on Unsplash)

Have you ever wondered what makes up a sound?

Sound is all around us all the time. For those of us who are not deaf or hard-of-hearing (please see the note at the bottom of this post about accessibility), this constant sensory input is an important part of how we experience and process the world around us. How we experience sound is shaped by its two major elements: frequency and amplitude. Not only do these play a significant role in our daily experiences, they are also what we capture when we record sound and what we manipulate when we edit audio. 

To put it simply (for now, scroll down for a more technical breakdown of sound):

  • Amplitude is loudness (or "volume"). High amplitude is loud, low amplitude is quiet. We measure loudness in decibels (dB).
  • Frequency is pitch. High frequency is a high-pitched sound, low frequency is, well, low. We measure frequency in hertz (Hz) and kilohertz (kHz), thousands of hertz.
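If you like to think in code, both elements are easy to see in a synthetic tone. This is a minimal Python/NumPy sketch (not tied to any editing program; the 48kHz sample rate is just a common recording choice): amplitude scales the height of the wave, and frequency sets how many times it oscillates per second.

```python
import numpy as np

SAMPLE_RATE = 48_000  # samples per second, a common recording rate

def sine_wave(frequency_hz, amplitude, duration_s=1.0):
    """Generate a pure tone: amplitude is loudness, frequency is pitch."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * frequency_hz * t)

quiet_low = sine_wave(100, amplitude=0.1)   # low pitch, quiet
loud_high = sine_wave(1000, amplitude=0.8)  # higher pitch, louder

# The peak height of each wave is its amplitude (about 0.1 and 0.8 here);
# the number of complete oscillations per second is its frequency.
```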

Understanding these parts of sound is very important to understanding how we work with sound when producing videos and podcasts.

The importance of amplitude/loudness is pretty obvious. If the sound isn't loud enough, we can't hear it. If the sound is too loud, it can be annoying, overwhelming, and even painful.

But what is the importance of pitch? The pitch of a sound actually tells us a lot about our physical relationship to its source. Lower pitches travel further and pass through objects more easily than higher-frequency sounds. Every sound we hear, including each other's voices, is a complex combination of different high and low frequencies, and the environment affects which of those frequencies reach us. All the pitches that make up a sound contribute to our experience of it, and removing or quieting any of them changes that experience. For example, when you hear someone talking on the other side of a door, their voice sounds different because the higher frequencies in their voice do not carry through the door as well as the lower-pitched parts.

When you combine the effects of amplitude and frequency, things get a little more complicated. When high-pitched sounds are quieter, they sound further away, because high frequencies don't travel as far, or through objects as well, as low frequencies. (It's why the kid in your elementary school class with the deep voice always got caught talking in class: their voice carried further and more powerfully than those of the kids with higher-pitched voices.)

That is how we can manipulate frequencies in our editing: by controlling the amplitude of individual frequencies. The tool for this, found in almost every editing program, is the equalizer. An equalizer allows you to raise or lower the amplitude of different frequencies, which can be extremely helpful in your editing process.
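One simple way to illustrate that idea in code is to transform the audio into its frequencies, scale the band you care about, and transform back. This Python/NumPy sketch demonstrates the concept of a one-band equalizer; it is not how any particular editor's equalizer is actually implemented, and the 60Hz hum and 700Hz tone are made-up test signals.

```python
import numpy as np

def adjust_band(signal, sample_rate, low_hz, high_hz, gain):
    """Scale the amplitude of all frequencies between low_hz and high_hz.

    gain > 1 boosts the band, gain < 1 cuts it, gain = 0 removes it:
    a crude one-band equalizer.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    spectrum[band] *= gain
    return np.fft.irfft(spectrum, n=len(signal))

# Demo: a 60 Hz hum mixed with a 700 Hz tone; cut everything below 100 Hz.
sr = 48_000
t = np.arange(sr) / sr
audio = 0.5 * np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 700 * t)
cleaned = adjust_band(audio, sr, low_hz=0, high_hz=100, gain=0.0)
# The hum is gone; the 700 Hz tone is untouched.
```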

Some facts about sound to help you avoid potential audio issues and improve your audio in editing:

  1. When our hearing is at its best, human beings can generally hear a range of frequencies from 20Hz to 20,000Hz (20kHz), but as we age we lose the high frequencies. (Remember those viral videos of "The Sound Only a Teenager Can Hear"? More on this below in "Getting Technical.")
  2. A human speaking voice tends to fall between about 100Hz at the lowest and 15kHz at the highest. If you look closely at the frequencies in the dialogue of any of your productions, you'll find everyone's voice occupies a significant portion of that range in every recording.
  3. Because the human speaking voice occupies such a significant portion of the frequencies we can hear, nearly every other sound will interfere with a voice recording. While you can reduce the volume of specific frequencies to try to minimize a disruptive sound, be careful or you may negatively affect the sound of your dialogue.
  4. Always be very careful when using an equalizer to modify the volume of frequencies in a dialogue recording. If you make high frequencies too quiet, a person can sound muffled. If you make low frequencies too quiet, a person can sound robotic. 

How does this affect your recording process?

  1. When recording people, whether it's dialogue for a narrative or an interview for a documentary, find a quiet space as isolated from disruptive sounds as possible. Almost every sound is disruptive, and it's not easy to isolate and remove those sounds without affecting the voice recording. 
  2. Get the microphone as close to people as possible when recording voices. Being close avoids the loss of high frequencies over distance, and the closer you are to the source of a sound, the greater its amplitude. The greater the amplitude, the less sensitive your microphone has to be, and the less sensitive your microphone is, the fewer unwanted sounds it picks up. 

What can you do in editing to manipulate frequency to improve your recordings?

The equalizer can be a powerful tool for helping make sure voice recordings are clear for listeners: 

  1. Lower the volume of frequencies below 100Hz, which will help to minimize the impact of sounds from fans, air conditioners, or even distant passing traffic or airplanes. 
  2. Raise the volume of frequencies between approximately 500Hz and 1000Hz (1kHz). This range tends to be where the human voice is most clear. Raising the volume of these frequencies can improve clarity for your listeners. 
  3. Lower the volume of frequencies over 15kHz, which will help to minimize the impact of high frequency sounds. 
  4. Did the lavalier microphone you used to record with end up buried under heavy clothes, muffling the voice of your subject? Raising the higher frequencies should make the voice sound clearer and less muffled.
  5. Do you need someone to sound like they're far away or, perhaps, behind a door or wall? Lower the high frequencies above 1kHz, lower the overall volume of the audio clip, and with a little adjustment you should get the desired result. 
  6. Do you need to make someone sound like they're on the telephone or in a video meeting like Skype or Zoom? Lower both the high and the low frequencies. To deliver audio in real time over a distance, phones and video conferencing services limit the range of frequencies they transmit; that's why voices always sound a little different on those services. 
  7. Other effects, like hum removers, similarly adjust the volume of particular frequencies to improve audio quality. Hum removers control the humming sound that can sometimes be picked up from electrical devices. Electricity, like sound, oscillates at a frequency, and that oscillation sometimes manifests as sound at the same frequency as the electrical current: 60Hz in the United States and some other countries, 50Hz in most of the rest of the world. The hum remover minimizes that sound.
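As an illustration of tip 6, here's a sketch using SciPy's Butterworth band-pass filter to keep roughly the classic landline bandwidth of about 300Hz to 3.4kHz. The exact cutoffs and filter order are illustrative choices, and a 48kHz recording with synthetic test tones is assumed.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def telephone_effect(signal, sample_rate):
    """Keep only roughly 300 Hz - 3.4 kHz, the classic landline bandwidth.

    Cutting both the low and the high frequencies gives the familiar
    "on the phone" sound.
    """
    sos = butter(4, [300, 3400], btype="bandpass", fs=sample_rate,
                 output="sos")
    return sosfilt(sos, signal)

# Demo: mix a low rumble (80 Hz), a voice-range tone (1 kHz), and hiss (10 kHz).
sr = 48_000
t = np.arange(sr) / sr
mix = sum(np.sin(2 * np.pi * f * t) for f in (80, 1000, 10_000))
phoned = telephone_effect(mix, sr)
# The rumble and hiss are strongly attenuated; the 1 kHz tone passes through.
```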

Getting Technical

Those are some practical ways to work with sound and how you can manipulate amplitude and frequency to improve your recorded audio. But let's get a little more technical here for those of you who are curious. 

It's important to understand that sound is a wave; technically, a pressure wave in a medium. We can visualize this wave with our sound equipment as a sound wave. When you look at your audio in Adobe Audition, Adobe Premiere Pro, Avid Pro Tools, Avid Media Composer, Final Cut Pro X, Logic Pro X, DaVinci Resolve, or any other program with the ability to edit audio, you're looking at a visual representation of a sound wave, and both amplitude and frequency are on display in it. Amplitude describes the power of the wave, which is represented by the height of the sound wave. Frequency describes the speed of the wave: one complete back-and-forth movement is an oscillation, and the number of oscillations per second is the frequency.

Screenshot of an audio waveform on DaVinci Resolve's Fusion page.
Screenshot of the same audio waveform in DaVinci Resolve zoomed in.

We experience the height and speed of the wave as volume and pitch, respectively. The greater the amplitude, the larger the wave, and the louder the sound. We measure amplitude in decibels, and this is typically what you manipulate the most: the loudness of the sound. You can change loudness without affecting the other element of sound, frequency.
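The decibel scale is logarithmic: comparing two amplitudes takes 20 times the base-10 logarithm of their ratio, so doubling the amplitude adds about 6dB. A small Python sketch of that arithmetic:

```python
import math

def gain_in_db(output_amplitude, input_amplitude):
    """Decibels compare two amplitudes on a logarithmic scale."""
    return 20 * math.log10(output_amplitude / input_amplitude)

print(round(gain_in_db(2.0, 1.0), 1))  # doubling amplitude: prints 6.0
print(round(gain_in_db(0.5, 1.0), 1))  # halving amplitude: prints -6.0
```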

Diagram of an audio waveform indicating that the height of the wave is the amplitude and the beginning and end of an oscillation are measured as frequency.

Pitch corresponds to the frequency, or speed, of the wave. We measure frequency in hertz (Hz) or kilohertz (kHz, thousands of hertz). The higher the hertz, the more oscillations per second and the shorter the distance between the waves; the shorter that distance, the higher the pitch. A high-frequency sound wave, let's say something around 15,000Hz (also known as 15kHz), is experienced as a high-pitched sound, while a low-frequency sound wave of, for example, 100Hz is experienced as a low-pitched sound. You can raise or lower a frequency without affecting the loudness.
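You can count those oscillations per second in code, too. This Python/NumPy sketch finds the pitch of a signal as the tallest peak in its spectrum (the 48kHz sample rate and the pure test tones are assumptions for the demo):

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency with the strongest presence in the signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)
    return freqs[np.argmax(spectrum)]

sr = 48_000
t = np.arange(sr) / sr
low_tone = np.sin(2 * np.pi * 100 * t)      # 100 oscillations per second
high_tone = np.sin(2 * np.pi * 15_000 * t)  # 15,000 oscillations per second

# dominant_frequency recovers ~100 Hz and ~15,000 Hz respectively.
```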

Understanding these basic elements of sound can significantly improve your audio in both recording and editing. Hopefully, the facts and tips we shared above will help to improve all your future productions. Before we wrap up, though, here are a few fun facts about amplitude and frequency:

  1. Humans can hear frequencies between approximately 20Hz and 20kHz, but most dog breeds can typically hear from 67Hz to 45kHz, and cats from approximately 48Hz to 85kHz. So, if your pet seems to react to things that aren't there, maybe they're just hearing something you can't.
  2. Some research has shown that while human beings and many other animals can't hear low-frequency sounds below 20Hz, we can feel them. Sounds below 20Hz are referred to as infrasound. While undetected by the ears, infrasound has been linked to feelings of uneasiness or even nausea in some people. These low-frequency vibrations have been associated with the earth's movements before earthquakes and volcanic eruptions. There has even been research at locations often reported as haunted showing the presence of these low-frequency sounds in the areas where people feel most uneasy; when the source of the sound was removed, the feelings of uneasiness went away. Is it a ghost in your house, or just a bad pipe vibrating and causing a low-frequency sound? You may want to call the plumber before you call the exorcist.
  3. Back in 2006, after learning that there are high-frequency sounds they can hear that adults no longer can, teens in Britain started using those sounds as ringtones so they could hear their cellphones ring but parents and teachers couldn't. You can read about how they adapted this from a "teenager repellent" device meant to prevent loitering at WashingtonPost.com.

Special note: To make your project accessible to those who are deaf or hard-of-hearing, you'll want to learn about Subtitles for the Deaf or Hard-of-Hearing (SDH). Here's a good place to start: https://blog.ai-media.tv/blog/what-is-sdh


For more information, tips, and tricks, visit www.mnn.org/learn for free workshops, professional courses, filmmaking bootcamps, and more resources to make your productions successful.