Whenever I look through reviews for laptops or smartphones, I usually come across at least one person saying something along the lines of “this product’s design is boring,” or that the device as a whole isn’t “exciting.”

There’s nothing “boring” about these products. These kinds of tech reviewers really frustrate me, because this framing of “exciting product” versus “the boring usual” distracts from otherwise useful information. I don’t need a reviewer to tell me what they think of the design; I can see it clearly in all of the photos they tend to load their reviews up with (or the footage in a video). At the very least I’d like to know what materials were used, since that can be hard to judge from pictures alone. There’s just too much of the “tech critic” attitude out there and not enough of the actual “tech reviewer” attitude.

The majority of reviewers prefer to talk mostly about the externals of their units, and that’s totally fine. What I don’t see is why they can’t go past that further into their reviews. If people don’t want to hear about performance numbers, benchmarks, heat, and so on, they’ve probably already skipped ahead to the verdict or zoned out completely. At the same time, there are plenty of people who skip right past a reviewer’s opinions on how the device looks (which, honestly, is the least valuable part of any review unless something about it is notably different in person). Those people want performance numbers. They want to know what the experience of using the device is like, not how “epic” it looks to carry around. They want to be sure they’re getting decent performance per dollar, and maybe they’re on a budget, trying to spend the least amount possible for the most computer possible.

Don’t get me wrong, there are some sources that do cover performance numbers alongside opinions on the looks of the device. NotebookCheck, for example, is a good source, although their track record for consistent, valid performance testing is a bit shaky; their numbers are generally solid, but take them with a grain of salt. The bigger issue is that a lot of reviewers don’t even bother with stress tests, or try to work the machine as hard as it will go. They’ll fire up Overwatch, play for a few minutes, and declare the gaming experience “amazing,” when in reality the device starts overheating about 25 minutes in. Sometimes reviewers won’t even use the laptop for more than a day or two before writing up their review, which isn’t enough time to make it their own and REALLY test it.

There’s something to be said for quality review practices when it comes to consumer tech, and unfortunately they’re not very common among the more popular reviewers. At the same time, I’m not sure who’s to blame for that: is it the audience influencing the reviewers, or lazy reviewers influencing the audience? Or maybe it’s the brands. Conspiracy theories aside, it’s just unfortunate that decent reviews are few and far between. I’m trying to solve that issue with my site (and potentially my own company at some point? Optimistic, I know), but only time will tell.
