Updated: Oct 16, 2019
Well, ok: I lied.
DAWs do not destroy recordings.
I love DAWs, and we can make wonderful recordings and mixes using any of them. This post is emphatically not about how analog is better than digital (it isn't).
There are, however, two super annoying things about most DAWs.
These are the default waveform zoom level and the default meter color coding (where the signal goes from green to yellow to red) in their GUI - i.e. the way they present tracks out of the box, right after you've installed the program.
These two aspects are annoying because they seem designed to trick engineers into doing the wrong thing: namely, turning the gain up too much and overcooking a track's levels, either while recording or while mixing.
Even worse, they reinforce each other in giving the engineer a completely wrong idea of what's going on in terms of recording quality.
And worst of all, it doesn't need to be this way, because in the end it's only a GUI choice.
Many DAWs get their user-interface cues from analog consoles.
That's nice and good - consoles ended up being the way they are for a reason: they pack a lot of information and controls into a relatively small physical footprint. But there is a fundamental difference between analog and digital recording - especially the 24-bit recordings which are common nowadays.
Digital recordings offer an incredibly wide dynamic range, greater than anything that ever came before - from gramophone wax to the professional studio and master tape used until the 90s.
Specifically, 24 bits allow for 144 dB of dynamic range. 144 dB is an immense number in linear terms. Really, really gigantic - so huge that analog/digital converters aren't yet able to handle it all, i.e. they can't fill all 24 bits in a meaningful way. The best manage around 128-129 dB, and your run-of-the-mill interface will easily do 110 - both still very, very large numbers.
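The 144 dB figure follows directly from the bit depth: each bit adds roughly 6.02 dB of dynamic range. A minimal sketch of the arithmetic (plain Python, no audio library assumed):

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an n-bit linear PCM signal:
    20 * log10(2^n), i.e. about 6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 16-bit (CD): ~96.3 dB
print(round(dynamic_range_db(24), 1))  # 24-bit: ~144.5 dB
```

Note how even the best converters, at 128-129 dB, leave a couple of bits' worth of the theoretical range unused.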
In practice this means that, recording at 24 bit, an audio signal sitting at about the middle of the range will be of terrific quality. Much, much, much farther away from the noise floor than even a "pushed" "professional" studio master tape recording could ever dream of being.
A signal like that will be joyously pristine.
And here's the rub. Both the DAW's default waveform display and its default meters attempt to show the full scale - the whole zero to 144 de-ci-bels shebang. In a couple of centimeters of screen space, at best.
That is insane.
Given the signal above, which is far better than anything Michael Jackson ever got down to tape, the DAW will show the associated waveform as pathetically small and the meter levels as pathetically low.
Intuitively, it will look like the signal is terrible.
It's like looking at New York City while zoomed out to cover the Solar System from the Sun to Jupiter. The whole Big Apple will look like... a dot. A small part of a dot, actually - you won't be able to see it at all.
It's only if you manually zoom the waveform to a sensible range (from well above silence to mid-scale, for example) that it will appear in all its glorious detail.
And only if the meters are set to show a similar range - with the yellow appearing, say, at around 12 dB below maximum on the full scale - will they give the right intuitive perception of how good the recording quality we're seeing actually is.
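The deception is easy to quantify: on a linear full-scale waveform display, a perfectly healthy peak level occupies only a small fraction of the available height. A quick sketch (the example levels are illustrative, not prescriptive):

```python
def dbfs_to_linear(dbfs: float) -> float:
    """Convert a dBFS level to a linear fraction of full scale."""
    return 10 ** (dbfs / 20)

# A signal peaking at -18 dBFS - excellent practice at 24 bit -
# fills only about 13% of a linear full-scale waveform display.
print(round(dbfs_to_linear(-18), 3))  # ~0.126
print(round(dbfs_to_linear(-12), 3))  # ~0.251
```

So a recording an engineer should be proud of is drawn as a thin sliver, which is exactly the wrong intuition to hand them.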
Sure, for the happy weekend occasions when you are close-miking a jet engine at full throttle, followed by a close-miked butterfly in flight, you may need to see a little more of the scale. So it's fine to have the option to see the full scale - but it should be a choice, not the default.
The default for both waveform and meters should be to display a much more sensible range - the range where most of the (high quality) recording of real music exists.
Unfortunately, it isn't.
Following old analog-inspired conventions, which are totally out of place in a digital recording system, a DAW's meters and waveform almost invariably default to showing the full scale, and together they deceive the engineer into thinking that a great signal sucks, when it's actually very good.
Even the metronome is often set, by default, to unity gain, with a click so loud that it immediately pushes you to turn down the volume...
Said engineer, of course, will then proceed to crank up the gain.
And that means demanding more of the converters (unnecessarily), which will often make them sound worse even when they don't clip - and possibly killing the recording altogether if that marimba player suddenly decides to hit a note really hard.
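Headroom is the point here: every 6 dB of margin roughly doubles the linear amplitude a surprise transient can reach before clipping. A small sketch of that arithmetic (the peak levels are hypothetical examples):

```python
def headroom_ratio(peak_dbfs: float) -> float:
    """Linear amplitude multiplier still available above a given
    peak level before the signal hits 0 dBFS (digital clipping)."""
    return 10 ** (-peak_dbfs / 20)

# Tracking with peaks at -18 dBFS leaves room for transients
# roughly 8x louder in linear amplitude; at -3 dBFS, barely 1.4x.
print(round(headroom_ratio(-18), 2))  # ~7.94
print(round(headroom_ratio(-3), 2))   # ~1.41
```

With 24 bits of range there is no fidelity penalty for keeping that margin, so cranking the gain buys nothing and risks everything.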
So there you have it. I love DAWs and would never dream of having to cut tape (or even record to tape). But the fact is that, by simply presenting things the way they do, DAWs encourage overcooking levels and getting worse recordings, or no usable recordings at all.
The lesson is easy: next time you fire up your DAW, the first thing to do is zoom in on the waveform display of the armed tracks, and set the meters to show a smaller range, turning yellow above -18 dBFS if you can. Most DAWs allow both customizations.
Your digital recordings will sound much better right away, and no trash cans will be involved!