
“He’s just going through a phase right now.”

“The moon is in a waxing gibbous phase right now.”

“Fire phasers and photon torpedoes!”

Taylor

Okay, that last one isn’t going to be relevant for audio, but the other two have something in common we can talk about.

A phase is a point or period along either a linear timeline or a cycle. “Going through a phase” often indicates an expected future return to normalcy. And when we describe a phase of the moon, we talk about where it is on its orbital cycle.

In recording, phase is less understood than it should be. When more than one microphone records a single source of sound and the signals are mixed together, phase describes whether their sonic features line up perfectly or arrive with a timing mismatch.
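To put a number on that timing mismatch: sound travels at roughly 343 meters per second in air, so a distance difference between two mics becomes a delay, and a delay becomes a phase shift at any given frequency. Here's a quick Python sketch of that arithmetic (the 10 cm figure is just an illustration, not from any particular setup):

```python
# Illustrative sketch: how a distance mismatch between two mics
# becomes a timing (and phase) mismatch.
# Assumes ~343 m/s for the speed of sound (dry air, ~20 degrees C).

SPEED_OF_SOUND = 343.0  # meters per second (assumed)

def delay_seconds(distance_diff_m):
    """Extra travel time to the farther microphone."""
    return distance_diff_m / SPEED_OF_SOUND

def phase_shift_degrees(distance_diff_m, freq_hz):
    """Phase offset that delay causes at a single frequency."""
    return (delay_seconds(distance_diff_m) * freq_hz * 360.0) % 360.0

# A 10 cm mismatch: about 0.29 ms of delay,
# which is about 105 degrees of phase shift at 1 kHz.
d = 0.10
print(f"delay: {delay_seconds(d) * 1000:.3f} ms")
print(f"phase at 1 kHz: {phase_shift_degrees(d, 1000):.1f} degrees")
```

A tenth of a second this is not, but as we'll see, even fractions of a millisecond are audible when two copies of the same sound get summed.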

The human hearing system is particularly adept at detecting phase and timing abnormalities in sound, and a characteristic of the best recordings and production in modern music is that the main sounds are in perfect phase with each other.

Lots of recording engineers and musicians know a little about audio phase. Just to get a picture for yourself, imagine a microphone inside a drum, pointed at the skin you’re hitting. One crisp strike will send a shock wave to the microphone, causing its sensitive element to go inward to mimic the movement of the drum skin.

The drum skin moves toward the microphone, and when the sound is eventually played back over speakers, the speaker cones push out toward the listener to recreate the event convincingly.

It doesn’t take a master’s degree in engineering to figure out that the sound of an instrument can quickly get changed or “distorted” the more microphones you add to the mix, if you’re not paying attention to their relative distances from the sound source.
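Why does the mismatch “distort” the sound? Mixing one source through two mics at different distances is like adding a signal to a delayed copy of itself: some frequencies reinforce, others cancel, an effect engineers call comb filtering. A small sketch of the math, with an assumed 1 ms delay (roughly a 34 cm path difference):

```python
import math

# Illustrative sketch: summing a sine wave with a delayed copy of itself.
# The combined peak amplitude, relative to one signal alone,
# works out to |2 * cos(pi * freq * delay)| via trig identities.

def combined_gain(freq_hz, delay_s):
    """Relative level of sin(2*pi*f*t) + sin(2*pi*f*(t - delay))."""
    return abs(2.0 * math.cos(math.pi * freq_hz * delay_s))

delay = 0.001  # 1 ms (assumed) -- roughly a 34 cm path difference
for f in (250, 500, 1000):
    print(f"{f} Hz: gain {combined_gain(f, delay):.2f}x")
```

With that 1 ms offset, 1 kHz reinforces to double the level while 500 Hz cancels entirely, and every frequency in between lands somewhere on that comb. That uneven boost-and-cut across the spectrum is the “distortion” you hear.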

Consider that many acoustic guitars offer internal microphones for amplification and recording. Many people like the “direct” quality of the internal mic sound and want to mix that in to support what the microphone(s) outside the guitar capture.

This gets tricky, because ears and brains on the listening end can detect that subtle difference in phase between the two, and the result often won’t come off as crisp or realistic as it could.

Another place of phase confusion is when overhead drum mics are used. In the 1950s, people used to record a whole drum kit with one microphone, and this would result in a very accurate, if sort of narrow-sounding reproduction.

Using two overhead mics is fine, and done well, this can capture the drums in a more modern stereo presentation. For a proper setup, the snare drum must be exactly equidistant from the two microphones; otherwise, the subtle distance differences between the mics will begin to muddy up a sound that really needs to be sharp in the mix.

Some folks will throw a close mic on the snare to fix the way it sounds in the overheads, but take it from me — the whole kit will sound better if you bring your tape measure along. Get the phase right first, and then decide on the close mic.

Running out of space (the final frontier)… we may or may not pick this up from a different angle next week. Exciting intrigue! May you live long, and prosper.

Read more Taylor: coloradodaily.com/columnists. Stalk him: instagram.com/duncanxmusic.
