What are the indicators of poor quality audio digitisation practices?

It’s an unfortunate reality that not all audio tape digitisation projects result in a high quality outcome. In some cases, the quality you sign up for simply isn’t delivered. While many audio tape digitisation agencies promise the world, only a handful can actually deliver the highest level of quality service. Being able to trust your digitisation partner is essential as you embark on your journey to transfer your content, but no matter how much you trust them, quality checking your new digital media when you receive it is still advisable.

So, what should you look for when assessing quality?

For some organisations, identifying quality issues isn’t a problem: the capacity and expertise to interrogate the newly created files exists in-house, and staff know what constitutes a quality result. Unfortunately, though, those resources and expertise aren’t available to everybody. Having looked at some of the causes of poor quality audio tape digitisation in a previous post, we now want to highlight the types of artefacts which can be introduced as a result of poor audio tape digitisation practices.

Artefacts, whether analogue or digital, can be introduced at any point in the “signal path” – the chain of technology configured to convert the audio on tape into a digital file. There are numerous potential causes, such as improper playback machine calibration or alignment, physical problems with the tape transport, improper or inconsistent line level and gain staging, and improper encoding parameters.

Common types of artefacts include:

Hiss, Noise, Buzz or Distortion

This can be introduced by low quality replay machines used in the audio tape digitisation process, bad gain staging, poor electromagnetic shielding, bad cabling, or distortion from overloading analogue amplifiers or digital clipping. These artefacts are detectable upon careful listening as a hiss, noise, or buzz sitting constantly in the background of the recording, or as audio which sounds fuzzy and unnatural, like when somebody yells on the phone. Where possible, headphones should be used to listen to the signal coming directly from the playback deck (the best deck you have!), and used as a reference to compare against the sound of the encoded file – if they don’t match, something is being introduced.
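Careful listening is the primary check, but a rough, automated estimate of the noise floor can help flag suspect files for closer attention. The sketch below is only an illustration – the file name, the 100 ms frame size, and the -60 dBFS “suspicion” threshold are our own assumptions, and it relies on Python with the numpy and soundfile libraries rather than any tool mandated by a standard.

```python
# Rough noise-floor estimate for a digitised file.
# Assumes a mono or stereo WAV readable by the soundfile library;
# the -60 dBFS "suspicion" threshold is an illustrative value only.
import numpy as np
import soundfile as sf

def estimate_noise_floor(path, frame_ms=100):
    data, rate = sf.read(path)
    if data.ndim > 1:                      # mix to mono for a quick check
        data = data.mean(axis=1)
    frame = int(rate * frame_ms / 1000)
    n_frames = len(data) // frame
    frames = data[:n_frames * frame].reshape(n_frames, frame)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    rms = rms[rms > 0]                     # ignore pure digital silence
    quietest = np.percentile(rms, 5)       # the quietest 5% of frames
    return 20 * np.log10(quietest)         # in dBFS

floor_db = estimate_noise_floor("transfer_side_a.wav")  # illustrative file name
print(f"Estimated noise floor: {floor_db:.1f} dBFS")
if floor_db > -60:                         # illustrative threshold
    print("Noise floor seems high - listen for hiss, buzz or hum.")
```

A figure like this is most useful as a comparison between transfers of the same tape (or against the direct output of the deck), rather than as an absolute pass/fail number.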

Clicks and Pops

These can be introduced by bad connections, debris on the media, or poorly grounded equipment (and static electricity – e.g. from spinning reels!), but most often occur because of an unstable digital clock system. Correctly distributing and synchronising the clock between digital audio devices can be tricky, and if done incorrectly will certainly introduce clicks and pops. When you listen to the digitised file you will hear “clicks” and “pops” in the audio if this artefact exists. If you do note this issue, you can verify whether it was introduced by listening to the original at the place where the click or pop occurred – if it’s not precisely repeatable, it is being introduced somewhere. This is an artefact that is easily identified visually if you have access to a DAW where you can “see” the waveform, e.g. Audacity (which is a free application); clicks and pops appear as tall, thin lines in the waveform.
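If you’d rather not scroll through a long waveform by eye, a simple script can list candidate clicks by flagging unusually large sample-to-sample jumps. Again, this is just a sketch under our own assumptions – the file name is illustrative and the “8 standard deviations” threshold is a starting point to tune by ear, not a calibrated figure.

```python
# Flag candidate clicks/pops as unusually large sample-to-sample jumps.
# Assumes a WAV file readable by the soundfile library; the sigma
# threshold is an illustrative starting point, not a calibrated value.
import numpy as np
import soundfile as sf

def find_click_candidates(path, sigma=8.0):
    data, rate = sf.read(path)
    if data.ndim > 1:                      # mix to mono for a quick check
        data = data.mean(axis=1)
    jumps = np.abs(np.diff(data))          # sample-to-sample change
    threshold = jumps.mean() + sigma * jumps.std()
    hits = np.where(jumps > threshold)[0]
    return [idx / rate for idx in hits]    # timestamps in seconds

for t in find_click_candidates("transfer_side_a.wav")[:20]:  # illustrative file name
    print(f"Possible click near {t:.3f} s - audition this point in a DAW")
```

Each reported timestamp is only a candidate – audition it in the file and against the original tape to decide whether it was introduced in the transfer.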

Speed Artefacts

These are often the result of poor or dirty tape transports, or dirty media. Anything that may cause the tape to catch, stick, or slip unpredictably will result in pitch variations in the digital file. Some examples include mould, dust, or dirt, a degrading tape binder, a warped or damaged tape reel/flange, or a worn capstan or pinch roller. When you are quality checking your audio, you should listen for warble, flutter, or “glitches” (like a DJ scratching a record!), which are the characteristic sounds of this artefact. Again, if this artefact is identified, the best way to check if it was introduced is to listen to the encoded file against another pass of the original, comparing any suspect areas – if it’s not repeatable, you know it wasn’t present on the source media, so was introduced on playback.
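Where the tape carries a steady reference tone (or a sustained musical note), you can also put a rough number on speed instability by tracking the dominant frequency over time. The sketch below reflects our own assumptions – the file name, the 500 ms analysis frame and the 0.5% drift threshold are illustrative only, and the approach is meaningless on ordinary speech or complex music.

```python
# Track the dominant frequency over time to reveal wow/flutter-style drift.
# Only meaningful on a steady reference tone or sustained note; the file
# name and the 0.5% drift threshold are illustrative assumptions.
import numpy as np
import soundfile as sf

def dominant_freq_track(path, frame_ms=500):
    data, rate = sf.read(path)
    if data.ndim > 1:                      # mix to mono for a quick check
        data = data.mean(axis=1)
    frame = int(rate * frame_ms / 1000)
    freqs = np.fft.rfftfreq(frame, d=1.0 / rate)
    track = []
    for start in range(0, len(data) - frame, frame):
        spectrum = np.abs(np.fft.rfft(data[start:start + frame]))
        track.append(freqs[np.argmax(spectrum)])   # strongest frequency bin
    return np.array(track)

track = dominant_freq_track("calibration_tone.wav")   # illustrative file name
drift = (track.max() - track.min()) / np.median(track)
print(f"Peak-to-peak frequency drift: {drift * 100:.2f}%")
if drift > 0.005:                          # illustrative threshold
    print("Drift above 0.5% - listen for wow/flutter and check the transport.")
```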

Accidental Application of Compression or Limiting

This is an example of a more subtle artefact that can sometimes be introduced in the audio tape digitisation workflow without the operator’s knowledge. It is becoming more common with the rise of “prosumer” grade equipment and software, but can also occur with professional equipment and an inexperienced operator. To save users the disappointment of a clipped recording, manufacturers are now building compressors and/or limiters into digital audio hardware. While this is a great idea for avoiding embarrassment when recording a one-off live concert, it has severe consequences in the preservation environment. If digital clipping is occurring, the problem should be addressed by adjusting the gain scheme – either the reference level for the converters or, depending on the situation, the replay level at the playback deck. Of course, having correctly calibrated equipment in the first place is essential.

If you can hear the level of background sounds changing unnaturally, quiet sounds like breathing seeming abnormally prominent, or a lack of dynamic range between loud and soft content, there is a chance your content has been compressed or limited. A good way to spot this visually is to use a peak level meter – usually included with any DAW, even free solutions like Audacity. On a peak meter, the levels should rise (and immediately fall) naturally to different peak levels between words or sounds. If the level seems to stop at the same point every time, and particularly if it seems to pause there before falling again, limiting has almost certainly been applied. Be particularly cautious if this is happening just below 0 dBFS, the digital clipping point – some equipment does this automatically, and it is often difficult to bypass.
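As a complement to watching a peak meter, the following sketch counts how many short-term peaks pile up at the same ceiling and how many samples sit at full scale. The file name, the 50 ms frame size and the 0.5 dB window are our own illustrative choices; treat the numbers as a prompt to listen more closely, not as proof either way.

```python
# Look for signs of limiting or clipping: many short-term peaks piling up
# at (or just below) the same ceiling. File name and thresholds are
# illustrative assumptions only.
import numpy as np
import soundfile as sf

def peak_ceiling_report(path, frame_ms=50):
    data, rate = sf.read(path)
    if data.ndim > 1:                      # mix to mono for a quick check
        data = data.mean(axis=1)
    frame = int(rate * frame_ms / 1000)
    n_frames = len(data) // frame
    peaks = np.abs(data[:n_frames * frame]).reshape(n_frames, frame).max(axis=1)
    peaks_db = 20 * np.log10(np.maximum(peaks, 1e-9))
    ceiling = peaks_db.max()
    near_ceiling = np.mean(peaks_db > ceiling - 0.5)   # within 0.5 dB of the loudest peak
    clipped = np.mean(np.abs(data) >= 0.999)           # samples at (or near) full scale
    print(f"Loudest peak: {ceiling:.2f} dBFS")
    print(f"Frames within 0.5 dB of that peak: {near_ceiling * 100:.1f}%")
    print(f"Samples at/near full scale: {clipped * 100:.3f}%")

peak_ceiling_report("transfer_side_a.wav")   # illustrative file name
# A large share of frames stuck at one ceiling suggests limiting;
# runs of full-scale samples suggest digital clipping.
```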

It is important to understand that sometimes artefacts are introduced in the original recording process, and as such are present on the original medium. None of the above-mentioned artefacts should be confused with noisy originals, where as much information as possible should be extracted from the carrier in the audio tape digitisation process – not just for preservation “as is” principles, but also for restoration work now and into the future.

With audio tape media, high frequency loss is common without constant attention to the condition and calibration of the machine, as well as suitable preparation of the media. Anything which could cause an increased gap or a misalignment between head and tape will reduce the high frequency response. Often, these issues may not even be perceptible without high quality monitoring equipment – but that content is still very important, particularly if restoration is to be performed in the future.
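One rough way to keep an eye on this is to compare the proportion of energy above a chosen cutoff between passes or machines on the same section of tape. The sketch below assumes a 10 kHz cutoff and an illustrative file name; the absolute figure depends heavily on the programme material, so it only makes sense as a relative comparison.

```python
# Compare energy above a cutoff frequency with total energy as a rough
# indicator of high-frequency loss. The 10 kHz cutoff and file name are
# illustrative assumptions; results depend heavily on the programme material.
import numpy as np
import soundfile as sf

def hf_energy_ratio(path, cutoff_hz=10000):
    data, rate = sf.read(path)
    if data.ndim > 1:                      # mix to mono for a quick check
        data = data.mean(axis=1)
    spectrum = np.abs(np.fft.rfft(data)) ** 2
    freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)
    return spectrum[freqs >= cutoff_hz].sum() / spectrum.sum()

ratio = hf_energy_ratio("transfer_side_a.wav")   # illustrative file name
print(f"Share of energy above 10 kHz: {ratio * 100:.2f}%")
# Comparing this figure across passes or machines (same tape, same section)
# is more telling than the absolute number.
```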

Once the preservation master – the like-for-like replica of the content – has been created as part of the audio tape digitisation project, restoration tools can be used to “clean up” the audio. There are a multitude of options available for post-processing in order to obtain a useful “access” master, and there is great potential for these options to develop further in the future – highlighting the importance of having an accurate preservation master to work with!

When selecting an audio tape digitisation partner, it is recommended that, at a minimum, they use professional quality audio and electrical tools such as headphones, speakers, phase scopes and spectrum analysers to monitor the digitisation process. This is one practice which will help ensure the absolute best playback from the original tape for migration to the digital domain. To really ensure you receive the best result, you should select a vendor that provides their audio tape digitisation services in line with the IASA TC-04 standard.

Remember, you need to trust that your audio tape digitisation partner will deliver the best possible results. That said, you should always quality check your new digital audio collection to make sure that you are satisfied with the outcome. Using the information above, you should have a good framework to start your QA checks with confidence.