I saw a thread from 2013 here discussing 24-bit versus 16-bit sound, but I think it lacked one aspect of sound.
Most people are pretty certain that a 24-bit color photo is more true to life than 16-bit color. Between two similar shades of blue, there can be an infinite number of gradations of shade.
This seems, to me, to be the same in music. Between the force of one pluck and another similar one, there is an infinite number of gradations. Between an A played on one French horn and an A played on another, there can also be an infinite number of gradations and variations.
24-bit should be able to capture far more of the possible gradations than 16-bit.
The improvement from 16-bit to 24-bit is not dynamic range at either extreme or higher frequencies, but the sound IN BETWEEN one similar thing and another, giving a more true-to-life reproduction of the instrument as it was actually played.
Am I off??
Is it possible that music is the ONLY area of art or computer work which does not benefit from higher bit depths??
I think the bit depth determines how accurately audio can be captured within a volume range: 16-bit covers 0 to -96 dBFS and 24-bit covers 0 to -144 dBFS, so the difference is in signal-to-noise ratio. But for every 6 dB below full scale you lose one bit of resolution, so a sound at -60 dBFS is effectively only 6-bit. You'd notice this if you recorded and played back a single guitar note: as the note fades toward silence, the low resolution may become audible. This is where 24-bit helps, because the 8 extra bits leave you with 14 bits of resolution at -60 dBFS, so the quiet tail of the signal is captured more accurately.
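A quick back-of-envelope sketch of the arithmetic in that paragraph, using the common 6 dB-per-bit shorthand (the exact figure is about 6.02 dB per bit):

```python
def effective_bits(bit_depth, level_dbfs):
    """Approximate bits of resolution left for a signal peaking at level_dbfs.

    Each bit buys roughly 6 dB of dynamic range, so every 6 dB of unused
    headroom above the signal effectively wastes one bit.
    """
    bits_above_signal = -level_dbfs / 6.0
    return max(0.0, bit_depth - bits_above_signal)

for depth in (16, 24):
    print(f"{depth}-bit at -60 dBFS ~ {effective_bits(depth, -60):.0f} effective bits")
# prints: 16-bit at -60 dBFS ~ 6 effective bits
#         24-bit at -60 dBFS ~ 14 effective bits
```

This is only the simple rule of thumb from the post; real converters complicate the picture with dither and noise shaping.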
But most music is captured well in 16-bit; only music with very wide dynamic range, like classical and some jazz, needs 24-bit. And sadly, most material, including most film scores, is compressed to be louder, reducing the dynamic range to the point where 16-bit works just fine.
If you really hear a difference between 16-bit and 24-bit all the time... then you might be hearing the performance differences of your DAC and not the music itself. Some audio equipment performs better at higher sample rates and bit depths.
If you really want to hear what 44.1 kHz / 16-bit can do, I recommend David Chesky's Ultimate Demonstration Disc: Chesky Records' Guide to Critical Listening. This disc not only teaches you what to listen for when judging sound quality, but also challenges the performance of your audio system. And it really demonstrates that it's not so much the sample rates and bit depths that matter as the recording and mastering.
In 2001, on a film I was editing, the lush orchestral score was delivered to the final mix stage in 24-bit on a custom hard-drive array built specifically for the purpose. At the time, the standard for film mix deliveries was 16-bit.
I had supervised production of the score, so I knew it intimately. We would preview the cues running through the system in 24-bit. It was immersive and powerful. But once we switched over and mixed the score into the film proper in the 16-bit realm, its depth was diminished.
So yes, you can most certainly hear the difference – in the right environment.