mp3 decoder tests

  16-bit Least Significant Bit test  

In this test, the accuracy of various 16-bit mp3 decoders is compared, down to the LSB.

[Image: Least significant bit accuracy of mp3 decoders (GIF, 26 kB)]

What are the important results of this test?

Do we care?!

The purpose of this test is to shed light on the discrepancies in the least significant bit as decoded by the programs on test. The 1-bit differences revealed in the objective sound quality test probably aren't audible - i.e. a decoder scoring 6 sounds no worse than a decoder scoring 7. However, the differences revealed in this test might be audible. Though the reason for encoding a 1-bit signal was to test each decoder's numerical accuracy in reconstructing this last bit, that accuracy does have a bearing on how the decoder will sound.

Many CDs are mastered to 20-bit accuracy. Though CDs only hold 16 bits of information, that only limits the noise level - within the noise, signals that would only excite the last bit of a 20-bit system can be stored, and heard. mp3 encoding doesn't claim to store a bit-perfect representation of the original signal, but when the signal level is VERY low (so only the last bit contains information) you CAN hear signals within the dither (noise). mp3 encoding should store these signals, and decoders should correctly reconstruct them. This is what we test here.
If a decoder destroys this last bit, some listeners may perceive that the ambience has been removed from the recording.
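
To illustrate the point, here is a minimal sketch (not from the original article) of what dither does to a tone at roughly the level of the 16-bit LSB, written in Python with NumPy; the sample rate, length and random seed are illustrative assumptions:

  import numpy as np

  fs = 44100                              # sample rate in Hz
  n = 1 << 16                             # FFT-friendly number of samples
  t = np.arange(n) / fs
  amp = 10 ** (-96 / 20)                  # -96 dB FS, about half a 16-bit LSB in amplitude
  x = amp * np.sin(2 * np.pi * 1000 * t)  # 1 kHz tone in a high-resolution domain

  lsb = 1.0 / 32768                       # one 16-bit least significant bit
  rng = np.random.default_rng(0)
  tpdf = rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)  # triangular (TPDF) dither

  undithered = np.round(x / lsb) * lsb            # plain requantisation to 16 bits
  dithered = np.round(x / lsb + tpdf) * lsb       # the same, with TPDF dither added first

  def peak_level_db(y, freq):
      # Approximate level (dB FS) of the spectral peak nearest `freq`.
      w = np.hanning(len(y))
      spec = np.abs(np.fft.rfft(y * w)) / (w.sum() / 2)
      k = int(round(freq * len(y) / fs))
      return 20 * np.log10(spec[k - 2:k + 3].max() + 1e-16)

  for name, y in (("no dither", undithered), ("TPDF dither", dithered)):
      # With dither the 1 kHz tone survives at roughly -96 dB FS above a noise floor;
      # without it, much of the energy turns up as harmonic distortion (e.g. at 3 kHz).
      print(name, "1 kHz:", round(peak_level_db(y, 1000), 1),
            "3 kHz:", round(peak_level_db(y, 3000), 1))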

Note that it is possible to decode mp3s to greater than 16-bit accuracy - see the 24-bit accuracy test for further details.

How was this test carried out?

Three 1 kHz sine waves were generated in a 32-bit processing domain, at levels of -96, -96 and -97 dB FS respectively. (The amplitude of these tones is equivalent to about 1 bit of a 16-bit system.) These tones were transferred from the 32-bit domain to a 16-bit domain using triangular dither, no dither, and rectangular dither respectively, and saved to a standard wave file. The resulting signal has a 1 kHz tone with dither noise (+/- 1 bit), a harmonically distorted 1 kHz tone (+/- 1 bit), and a 1 kHz tone with dither noise (0 / -1 bit).
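
For reference, a similar test file can be generated along these lines. This is a sketch, not the author's original tool: the exact dither implementation, tone duration and output filename (lsb_test.wav) are assumptions.

  import wave
  import numpy as np

  fs = 44100
  t = np.arange(fs * 5) / fs                  # 5 seconds per tone (assumed duration)
  rng = np.random.default_rng(0)

  def tone(level_db):
      return 10 ** (level_db / 20) * np.sin(2 * np.pi * 1000 * t)

  def to_16bit(x, dither):
      # Requantise a full-scale-relative signal to 16-bit samples, with optional dither.
      samples = x * 32767
      if dither == "triangular":              # TPDF dither, +/- 1 LSB
          samples += rng.uniform(-0.5, 0.5, len(x)) + rng.uniform(-0.5, 0.5, len(x))
      elif dither == "rectangular":           # RPDF dither, +/- 0.5 LSB
          samples += rng.uniform(-0.5, 0.5, len(x))
      return np.round(samples).astype("<i2")

  parts = [to_16bit(tone(-96), "triangular"),   # tone within triangular dither noise
           to_16bit(tone(-96), None),           # no dither: harmonically distorted tone
           to_16bit(tone(-97), "rectangular")]  # tone within rectangular dither noise
  pcm = np.concatenate(parts)

  with wave.open("lsb_test.wav", "wb") as w:
      w.setnchannels(1)                         # mono
      w.setsampwidth(2)                         # 16-bit samples
      w.setframerate(fs)
      w.writeframes(pcm.tobytes())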

This file was encoded and decoded using the software under test. The resulting files were listened to and analysed. Some had nothing, or a distorted mess where the sine waves had been (shown in red). Some sounded OK, though there was some harmonic distortion added (shown in yellow). Some sounded exactly the same as the original, and measured as having no significant harmonic distortion (shown in blue).
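
The article doesn't publish its exact measurement procedure, but the classification could be reproduced roughly as follows; the decoded filename and the segment chosen for analysis are assumptions.

  import wave
  import numpy as np

  def read_wav(path):
      # Assumes a mono, 16-bit decoded file.
      with wave.open(path, "rb") as w:
          fs = w.getframerate()
          data = np.frombuffer(w.readframes(w.getnframes()), dtype="<i2")
      return fs, data / 32768.0

  def harmonic_levels(x, fs, fundamental=1000, count=5):
      # Level (dB FS) of the fundamental and its first few harmonics.
      w = np.hanning(len(x))
      spec = np.abs(np.fft.rfft(x * w)) / (w.sum() / 2)
      levels = []
      for h in range(1, count + 1):
          k = int(round(h * fundamental * len(x) / fs))
          levels.append(20 * np.log10(spec[max(k - 2, 0):k + 3].max() + 1e-16))
      return levels

  fs, decoded = read_wav("decoded.wav")              # hypothetical decoder output
  levels = harmonic_levels(decoded[fs:3 * fs], fs)   # analyse ~2 s of the first tone
  print("1 kHz fundamental: %.1f dB FS" % levels[0])
  print("harmonics 2-5:", ", ".join("%.1f" % l for l in levels[1:]))
  # Roughly: fundamental intact with no harmonics above the dither noise = "blue",
  # added harmonic distortion = "yellow", fundamental missing or mangled = "red".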

Is this test reliable?

Yes, it's totally repeatable, if you understand how to generate the test tones.

FAQs

  1. Did this test show which decoder is most accurate?
    The decoders disagree as to what the least significant bit of a given signal should be. Surely one is right and the rest are wrong? Maybe, but this test doesn't tell us which! With a test signal containing only 1 significant bit of information, most decoders decode it correctly, including all the ones we wish to compare.
  2. I don't understand dither, least significant bit etc - please explain?
    This page gives a good introduction to dither. This article explains the audible effect of dither in digital audio workstations.
  3. Where to now?
    Go back to the list of tests or go forward to the next test.
