I bet no audio writer’s ever used this LEDE before – Webster’s Unabridged Dictionary defines “Reviewer” as one that reviews; especially : a writer of critical reviews. The word “reviewer” first appeared in 1651, which was also the first year “daydreamer” and “acrimonious” appeared.
But does Webster’s cover what an audio reviewer’s job is? Nope. And as I read through the current crop of reviews that crossed my viewscreens during the past week, I’m inclined to believe that some reviewers don’t know either…
I have always operated under the assumption that the principal role and goal of an audio review is to explain what a device does, how it does it, how well it does it, and how it compares with other similar products. And while I think making an audio review interesting, as opposed to a set of bullet points, is part of the process, I don’t care what a reviewer likes for breakfast or lunch, or who they’ve had dinner with. I do care about what their level of experience happens to be.
I also need some context, especially if it’s a novice’s post, to know how critical listening comparisons (if any) were conducted and with how much “scientific” rigor they were carried out.
For me as a reader of reviews (as opposed to a writer of them), I need enough information to figure out how much credibility to attach to a review. Frankly, I don’t care how impressed a reviewer was with a piece of gear, but I do care about the WHYS of that favorable impression. In other words, I’m far more concerned with the reasons for an opinion than with the opinion itself.
And while it’s really not that hard for a first-time reviewer to get the basic specifications and features right, doing a listening comparison is far more complex and much easier to get wrong. How many reviewers actually bother to check whether two Digital Audio Converters being compared in a real-time A/B comparison have the same output levels? Some do (I hope), but many, as far as I can tell from the reader’s position, never gave leveling the playing field a single thought before their listening tests.
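For what it’s worth, leveling the playing field isn’t difficult: play the same test tone through both units, measure each one’s output, and trim the louder unit down until the two match within roughly 0.1 dB, because even a fraction of a decibel of extra loudness tends to read as “better sound.” The arithmetic is a one-liner; here’s a rough sketch in Python, with made-up voltage readings standing in for real measurements.

    import math

    def level_difference_db(v_ref: float, v_test: float) -> float:
        # Level of v_test relative to v_ref, in decibels (20 * log10 of the voltage ratio)
        return 20.0 * math.log10(v_test / v_ref)

    # Hypothetical RMS readings from each DAC playing the same 1 kHz test tone
    dac_a_volts = 2.10
    dac_b_volts = 1.95

    diff_db = level_difference_db(dac_a_volts, dac_b_volts)
    print(f"DAC B is {diff_db:+.2f} dB relative to DAC A")

    # Trim the louder unit by the difference; aim to match within ~0.1 dB
    if abs(diff_db) > 0.1:
        louder = "DAC A" if diff_db < 0 else "DAC B"
        print(f"Attenuate {louder} by {abs(diff_db):.2f} dB before A/B listening")
    else:
        print("Levels already match closely enough for an A/B comparison")

With the example numbers above, the louder unit would need to come down by about 0.64 dB before any listening comparison means much.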
Some amateur reviewers are extremely thorough, so thorough that their reviews run for multiple screens. That’s a good thing, since there’s more space for them to explain how they conducted their listening sessions. But there is a point of diminishing returns where a review is simply too long… On some I wonder how many segments the harmonic spectrum needs to be divided into… with several paragraphs devoted to each individual segment… zzzz… yes, in my humble opinion a review can be too long…
And while on the subject of too long – I’m not a fan of video reviews of audio equipment, period. My reason is simple: I can read and digest the information I need much more quickly in written form than from a video, and I can find that information a second and third time far more easily in a written piece than in a video. For me, vids are slow and inefficient. I can read in fifteen seconds what it takes even a fast presenter more than a minute to deliver. My time matters to me, and vids are more time-wasters than value-adders in my world…
Looping back to the original question – what is a reviewer’s job? For me the answer is relatively simple: tell me what I need to know in order to make an educated buying (or not-buying) decision. Everything else is just noise…
LEDE, I get it!
“I’m not a fan of video reviews of audio equipment, period.”
Amen to that! You hit the nail right on the head.
For me it’s mostly moot, as I don’t have the same gear, room, or ears as the reviewer. That said, I do value the specs, measurements, and general character of a component, along with ease of use, service, price, and reliability.
I agree about video reviews. They are real time-wasters.
I have learned the hard way that reviews of new technology can be perfunctory and miss important things, through the reviewer’s lack of experience or equipment. Several well-known streaming products received rave reviews from major publications without any warning that they didn’t do gapless network playback. That is not rocket science, and it’s needed for music! And this while competing products could do it just fine.
Reviewers of analog equipment change associated components, try several cables, and so on. When it comes to DSP and network gear, the ball often gets dropped. I’ve seen network reviews, maybe the majority, in which the reviewer didn’t verify compatibility with both Android and iOS, or didn’t try other important features of a product. Really, you can afford $2,000 cables but can’t afford a $300 Android tablet? Most reviews of the Classe CP-800 preamp, notable for its DSP features, didn’t use the DSP. Why review something if you can’t do a good job?
So what I want are reviews that are thorough, knowledgeable, and compare the unit to others while taking the needed steps to ensure those comparisons are fair.
YES on video reviews: they take longer = tedious = scroll to the end to see the final-minute wrap-up.
The other factor is that some of us are wired to gain context better from reading text than from listening. Re-reading a sentence whose meaning I’m not sure of is much easier than trying to rewind and re-listen, rinse, repeat.
Video reviewers would do a service by making a transcript available to accompany their on-camera blatherings.