More Objectivist Hogwash on High Resolution Audio

Not so long ago, I stumbled across an article titled "24/192 Music Downloads... and why they make no sense." My first thought was, "Okay, here we go again."

"Monty," the only name given for the author, did, in fact, present what appears to be a well-researched and well-written academic paper. His article claims that not only is a 24/192 download not superior to 16/44; it is actually inferior. Monty begins this supposed treatise by discussing, in particular detail, precisely how the ear functions. Actually, I found that part interesting, despite my sense that it didn't really make any difference to the central question.

He next discusses "Sampling rate and the audible spectrum," noting that the range of human hearing is 20Hz to 20kHz. Well, nothing particularly new there. He also includes in this section a graph of the Fletcher-Munson loudness curves, which, again, I have every confidence is exactly correct.

It is at this point that the attack begins. 

Do "Golden Ears" really exist? According to Monty, no, they do not. I suppose in the strictly scientific realm that might be true. But is not the practice of the audiophile hobby in some measure about listening? I have always held that, to whatever degree the listener chooses, the audiophile hobby is about serious, critical listening to music. I also believe listening to be a learned skill. While my own listening skills can certainly be improved upon, I have little doubt I am a better listener today than I was five years ago, ten years ago, or at any time in the past. The question of listening skills is exactly where science's definition of "Golden Ears" and the audiophile's definition part ways.

A little further on, Monty claims that 192kHz music files offer no benefit. He notes that "Neither audio transducers or power amplifiers are free of distortion..." Okay, distortion, I can buy that. He includes a graph captioned as an "Illustration of distortion products resulting from intermodulation of a 30Hz and a 33kHz tone in a theoretical amplifier with a non linear total harmonic distortion (THD) of about .09%." A theoretical amp? Test tones? And what of that THD? My amp, which is actually real, tangible, and weighs over 100 pounds, has a THD of .009%. Besides, THD is only one measurement in a gaggle of measurements critical to amp design. And my amp is also the last stage in the chain of events that produces real, actual, listenable music. And Monty is using a theoretical amp?
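For readers curious what "intermodulation products" means in that quoted caption, here is a minimal sketch of the general idea - not Monty's actual model, and not a measurement of any real amplifier. It passes the article's two example tones (30Hz and 33kHz) through a made-up, weakly nonlinear transfer function whose coefficients are purely illustrative, then shows that energy appears at the difference frequency (33,000 - 30 = 32,970Hz), where neither input tone has any.

```python
# Sketch of intermodulation: two tones through a hypothetical
# nonlinear stage. Tone frequencies come from the article's example;
# the nonlinearity coefficients a2, a3 are illustrative assumptions.
import numpy as np

fs = 192_000                 # sample rate, Hz (one second of audio)
t = np.arange(fs) / fs
f1, f2 = 30, 33_000          # the two test tones from the quoted graph

x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Weakly nonlinear transfer function: y = x + a2*x^2 + a3*x^3.
# The squared term creates sum/difference products at f2 ± f1;
# the cubed term creates products at f2 ± 2*f1, 2*f2 ± f1, etc.
a2, a3 = 0.001, 0.0005
y = x + a2 * x**2 + a3 * x**3

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)

# Energy shows up at f2 - f1 = 32,970 Hz, a frequency present in
# neither input tone - that is the intermodulation product.
idx = np.argmin(np.abs(freqs - (f2 - f1)))
print(f"level at {freqs[idx]:.0f} Hz: {spectrum[idx]:.2e}")
```

A perfectly linear stage (a2 = a3 = 0) would show nothing at 32,970Hz; whether such products are audible at realistic distortion levels is, of course, exactly the point under dispute.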

In a different part of the paper, Monty has a section on clipping. So, wait now, are his claims based on a theoretical amp with questionable distortion (he never identifies what type his theoretical amp is - Class A, AB, SET, and so on) driven to clipping levels? I don't know about Monty's amp, but I can drive my system at 100dB for hours, days, and probably weeks and never even remotely approach clipping. Why? Because my amp has sufficient headroom to play music at those volume levels. Why? Because it is a highly engineered audiophile product and not some theoretical equivalent used in a laboratory.

Not surprisingly, and quite predictably, after several sections with ever more graphs, Monty finally arrives at the most beloved part of scientific theory - the double-blind test. Would it surprise ANYONE that Monty concludes that in a properly conducted DBT no measurable, perceptible, repeatable difference can possibly be noticed between digital music files of varying resolutions? Here's a question - would, or could, any scientific paper on musical resolution be complete without a reference to a DBT?

My DAC is set to upsample everything to DSD, so much of the time I don't notice a huge difference in sonics between standard and high resolution. So, for fun, and to test my supposed "Golden Ears" (which I'm not supposed to have), I changed the DAC's setting to no upsampling. At various times, and for various reasons, I have purchased both standard-resolution and high-resolution versions of the same album and copied them all to my server. I decided to conduct my own little test to see if I could notice a difference.

Without using a clipboard, lab coat, or an oscilloscope, I can say that yes, I heard a difference. With no upsampling performed by my DAC, there is an easily noticeable, obvious improvement in playback quality with a high-resolution file over a standard one - whether comparing the same artist's work or just listening in general. While my findings might not be a validation of the high-end industry as a whole, they certainly are for me on my system.

So what does Monty offer as an alternative, and what are HIS recommendations? It's real simple: use the FLAC file format rather than MP3. Maybe I'm over-paraphrasing a six-paragraph section of his article, but MP3? He also mentions using better headphones. Why not try an actual, live-and-in-person, reference-level audio system?

This is one paper among many that all support the same claim - higher resolution is no better than a standard, off-the-shelf CD. I keep asking myself whether this paper, and the others like it, were written by academic types with clipboards and white lab coats, peering hour by hour into oscilloscopes. I cannot help but wonder if they ever, just once, visited the home of an audiophile with a really nice reference system and did what audiophiles the world over do - listen to the music.
