A difference of 2 to 3 dB in MOL is already a big deal in
the cassette tape recording world, but are there any benefits to using a higher
reference / test frequency when selecting the right recording bias level?
By: Ringo Bones
Even well into the 21st Century, the
reference frequency used by leading cassette tape and cassette deck
manufacturers is still either 315 Hz or 1,000 Hz for setting the MOL, or
maximum output level, when choosing the proper bias current level for recording.
But are there any benefits to choosing a much higher reference frequency when
selecting the "optimum" recording bias level to obtain the chosen cassette
tape's MOL?
In a study conducted by its R&D engineers during
the latter half of the 1970s, Hitachi, one of the leading Japanese cassette
tape deck manufacturers of the time, determined that the bias level giving a
peak output at 1,000 Hz is roughly 33 percent higher than that which gives a
5,000 Hz MOL for "Normal" ferric-oxide tapes, roughly 25 percent higher for
chromium-dioxide tapes and 11 percent higher for ferrichrome tapes. Armed with
this statistical information, Hitachi was able to create a system, later used
in its self-adjusting cassette tape decks, that adjusted for MOL at
1,000 Hz while actually testing the tape at 5,000 Hz, although a tape-type
switch (i.e. Normal/Type-I, Chrome/Type-II, Ferrichrome/Type-III and
Metal/Type-IV) must still be used to make sure that the right correction
factors are applied.
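Expressed as a quick code sketch (Python here, with purely hypothetical function and variable names), the kind of correction Hitachi's decks performed might look something like the following, assuming the study's quoted percentages translate directly into multiplicative correction factors on the bias value found with a 5,000 Hz test tone:

    # Sketch of a Hitachi-style bias correction, assuming the 1970s study's
    # quoted percentages act as simple multiplicative correction factors.
    # All names here are hypothetical; the study gives no figure for
    # metal/Type-IV tape, so it is omitted.
    CORRECTION_FACTOR = {
        "normal":      1.33,  # Type I, ferric oxide
        "chrome":      1.25,  # Type II, chromium dioxide
        "ferrichrome": 1.11,  # Type III
    }

    def estimate_1khz_bias(tape_type: str, bias_at_5khz_peak: float) -> float:
        """Scale the bias found with a 5,000 Hz test tone up to the bias
        expected to give maximum output level at 1,000 Hz."""
        return bias_at_5khz_peak * CORRECTION_FACTOR[tape_type]

    # Example: the deck sweeps bias with a 5 kHz tone, finds the output
    # peaks at a relative bias of 1.0, then corrects for a Type I tape.
    print(estimate_1khz_bias("normal", 1.0))  # -> 1.33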
This cassette tape recording "phenomenon" discovered by
Hitachi's R&D engineers became the brilliant engineering solution of its day
for designing self-adjusting cassette tape decks that most enthusiasts could
still afford. Competing designers of self-adjusting cassette tape decks near
the end of the 1970s had decided against incorporating distortion analyzers in
their designs, even in their premium models, because the complexity and
expense were deemed a bit too much for the market at that time. Instead, they
all relied on various schemes to find the maximum output level (MOL) at some
specific frequency, usually around 1,000 Hz. Against that backdrop, Hitachi's
method of testing at a higher frequency stood out as a brilliantly
cost-effective engineering solution.