And now for the dramatic conclusion to The Great MP3 Bitrate Experiment you’ve all been waiting for!

A few years ago I converted my CD collection to compressed digital audio.  I settled on 192 kbps VBR AAC, which was the bitrate at which I could no longer easily differentiate the quality of the test songs.  Looks like I may have been able to settle for a slightly lower bitrate and save some disk space:

Even without busting out hard-core statistics, I think it’s clear from the basic summary statistics graph that only one audio sample here was discernibly different from the rest – the 128kbps CBR. And by different I mean “audibly worse”. I’ve maintained for a long, long time that typical 128kbps MP3s are not acceptable quality. Even for the worst song ever. So I guess we can consider this yet another blind listening test proving that point. Give us VBR at an average bitrate higher than 128kbps, or give us death!

Anyway, between the anomalous 160kbps result and the blink-and-you’ll-miss-it statistical difference between the 192kbps result and the raw CD audio, I’m comfortable calling this one as I originally saw it. The data from this experiment confirms what I thought all along: for pure listening, the LAME defaults of 192kbps variable bit rate encoding do indeed provide a safe, optimal aural bang for the byte – even dogs won’t be able to hear the difference between 192kbps VBR MP3 tracks and the original CD.
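If you want to reproduce that encoding setup yourself, something like the following should do it. This is a sketch, not a prescription: the `-V 2` VBR preset is LAME's well-known "standard" quality setting, which in my understanding averages out to roughly the 190kbps neighborhood discussed above, and the filenames are placeholders.

```shell
# Encode a CD-ripped WAV to variable-bitrate MP3 using LAME.
# -V 2 selects VBR quality level 2, which typically averages
# around 190 kbps -- the "transparent for most listeners" zone.
# (Lower -V numbers mean higher quality and larger files.)
lame -V 2 input.wav output.mp3
```

If you ever need a hard floor rather than an average, CBR is still available (`lame -b 192`), but as the results above suggest, VBR gives you better quality per byte.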