Golf Digest is a pretty good publication that tries its damnedest to educate golfers on a variety of issues, from the swing to golf trips to the professional tours, and one of its hallmark institutions is the annual Hot List. The Hot List can be found in its entirety here. Basically, it’s a panel of golfers of varying skill levels testing the most recent equipment releases and putting it all in one handy resource, which sounds great on paper, right?
However, here’s the rub…can you trust the reviewers or the Hot List’s proctor? Hell, can we even trust that this exercise happened in the first place and isn’t just a whole bunch of marketing fluff? Sure, there are photos on GD’s website that sure look like a bunch of people testing golf clubs, but for the sake of transparency and accuracy, I think it’s important to understand the metrics used to evaluate these clubs…and a simple 5-star aggregate rating isn’t cutting the mustard.
Here’s my biggest concern with the Hot List…it’s run by the same people that receive millions of dollars in revenue from these equipment companies, and since a Hot List review can impact potential sales every bit as much as an advertisement, there’s a conflict of interest. Say you’re TaylorMade and you pay GD a ton of money to put your huge ads in their magazines and then GD’s Hot List, as objective as it could be, completely rips your new product…would you really want to give that magazine your money the next year? Now imagine you’re Golf Digest and you’re a print publication in a dying industry that needs every bit of money it can get…do you really want to piss off the guys lining your pockets?
Admittedly, this is a very cynical view of something that’s designed to be as democratic as possible, but considering our current political climate, where conglomerates have the ability to shape their respective industrial landscapes, it’s hard to take these types of things at face value. So fine…we’ll play nice, for the time being, assume that these evaluations are completely removed from any bias, and take the testers at their words. Which brings us to the next problem…the evaluations themselves.
After the jump, we’ll set our expectations from the beginning and then get into some of the problems.
Equipment, in any realm, is highly, HIGHLY subjective, and golf clubs are no different.
Personally, I’m not attached to any particular brand, and I don’t have any real biases one way or another aside from my individual preferences. Feel matters as much as any performance or equipment spec: if a club doesn’t feel comfortable to the user, it’s going to have to perform unbelievably well to convince that player to purchase it. As a prospective buyer, you have your own set of guidelines…maybe you like more forgiving irons or a certain type of material used in the clubface. Maybe you like bigger, lighter drivers or smaller fairway woods. Perhaps you prefer long irons over hybrids…whatever it is, at this point, pretty much every manufacturer has something to fit your eye, and it’s usually of at least decent quality.
But really, if everyone on the panel tested the submitted clubs like this, it’d be pretty hard to separate the results from the individual testers’ preferences, and even harder to get testers to ignore brand bias. Everyone knows that one guy…the kind of guy who’ll show up to a course with a Tour bag full of the same brand, undyingly loyal. You find these guys everywhere, from the dude who only wears Nike gear to pickup basketball games, to the softball player who will only use DeMarini bats, to the guy who’s driven the same brand of car for the past three decades…once someone gets comfortable with how a product performs and looks, they tend to view comparable products through their own, unique lens.
There are also the types of people who go against the grain simply to go against the grain (and, for the sake of honesty, I’d consider one of my feet firmly planted in this camp), who recoil in horror whenever they see a major manufacturer’s name on a product, no matter the quality. Does it matter that your TaylorMade driver is the best-performing club? No, because these guys don’t like how TaylorMade releases thirty products a year and hate their hit-or-miss quality control, so they’ll look elsewhere even if the product in question is undoubtedly the best choice for them. The best example of this type of reaction can be found in the people who will NEVER play a Nike golf club because they think the only reason Nikes sell is Tiger Woods, plus the countless conspiracy theories about his gear that, ultimately, have nothing to do with the quality of the product itself.
Personally, while I harbor supreme doubts about overly marketed stuff that promises to ADD 25 YARDS TO YOUR DRIVES, I’m very open-minded when it comes to clubs because the ultimate goal is to tailor your equipment around your game, and that’s what we’re getting into next.
Much as there are widely varying preferences, there are wildly varying mindsets on what a golf club should be and how its design fits into the industry landscape. In today’s era of technological advancement, empty marketing buzzwords pepper these types of tests and really confuse the marketplace as to how significant these advancements actually are. For example, the recent trend in drivers is “longer club length leads to more distance,” something that sounds 100% right on paper but is only tangentially rooted in actual performance, and if anything creates more of a deficiency in the other aspect of ballstriking: accuracy.
Another misconception is that adjustability in clubs is a huge benefit. Sure, TaylorMade adjustable drivers have their place in the market, and it sure as hell makes it easier for clubfitters to fit golfers and optimize their sticks, something that can be marketed along the lines of “you’ll see an increase in distance because the club fits your swing better.” Again, better on paper than in reality. Any mass you remove from the clubhead and place on the hosel has a negative impact on driver performance, even if it’s something as minute as a couple of grams. When I recently tested TaylorMade’s r11 (adjustable) and Burner Superfast 2.0 (non-adjustable) drivers, my swing speeds were consistently grouped, yet I saw an increase in ball speed, carry distance and overall distance with the non-adjustable driver, and I’m sure that difference would be a little more drastic once I got the non-adjustable Burner fit.
Even ignoring pure physics, the Hot List, instead of taking every manufacturer’s product and evaluating it on its own merit, mentions things like this in its section on Ping’s new g20 drivers…
Not: For a company that invented modern fitting, it’s time the G-series went adjustable.
…which ignores that Ping has an innovative fitting system, one they’ve used for decades, in which they build every club, from driver through putter, to fit the individual at the factory. Why would Ping need to adopt an adjustable club when they BUILD the club to fit the player and don’t have to take any mass away from the clubhead?
That’s part of the problem with the Hot List…any golfer who’s serious enough to spend $400 on a driver or $900 on a set of irons SHOULD spend the time, effort and money to get them set up exactly the way they want. ******, it costs maybe $30-40 to fit a driver…so why skimp on chump change and settle for a lesser driver? Even in the regular model’s testing, it got 5 stars for performance…and yet it’s time to go adjustable? THEY’RE ALREADY ADJUSTED BY THE MANUFACTURER.
Integrity of the Reviews
Red flags get raised when you read a review like this…
Middle-Handicapper: “I could hit the low spinner and the high flop. It glides through the rough as if it were on rails.” … “The protruded sole makes it easy to hit out of the sand. That bulge also helps with chipping.”
Middle handicappers don’t hit low spinners and high flops unless…there is no unless. If you have the technique to pull these shots off, you’re probably not a mid-handicapper. Either that, or this guy’s review is an emotional reaction to hitting a couple of shots like this while ignoring the flubs and bladed shots you see all the time from a 15 handicap.
Here’s another one from the previously linked g20 driver…
Middle-Handicapper: “All shots flew within a 10-yard window. Even when I hit it all over the face.”
If I were playing a match for money against this “middle handicapper,” chances are good the match would be over real soon. Either this guy’s sandbagging and is a single-digit player (which “all shots flew within a 10-yard window” would seem to suggest), he’s lying (because shots hit “all over the face” would be all over the course as well), or the force is strong with him. Or maybe he’s the best driver in the world and horrible in every other aspect, but…yea…rare.
First, we have no idea what the testers are like aside from handicap. We don’t know how tall they are, how long their arms are, whether they have a late or early release, their clubhead speed, angle of attack, etc. All of this comes into play when evaluating clubs. I’m 6′ with gangly, long arms, I have a late release and a 110 mph average clubhead speed (as of 12/23/2011) with my driver, and I hit into the ball at a steep angle; what works for me is not going to work for someone taller with a sweep release and a shallow angle of attack. I prefer heavier clubs, you might not. I like flatter lies, you might like them more upright.
There is no standard among the major club and shaft manufacturers for anything. One company might use legitimate, premium aftermarket shafts (like Adams) while the bigger companies use “made for” shafts that are different from the aftermarket models, even if the shaft uses the same model name and/or has the same graphics. What one shaft company might consider “stiff” might actually test out more flexible than another company’s “regular.” Each manufacturer’s clubs come in different swingweights with different lofts and lie angles, so while you might set a Titleist 7-iron next to a TaylorMade 7-iron, the only thing similar between the two clubs would be the number on the bottom.
I know I’ve already used Ping’s g20 as an example twice, but the text for the Ping reviews is just mindboggling.
Like we said above, there are different peacemakers for different cowboys. Typically, the more forgiving “game improvement” irons have MUCH stronger lofts because of the way they’re designed; it’s not uncommon to see the pitching wedge in a set of GI irons measure out at 45* of loft, 3* (and sometimes up to 5*) stronger than the PW loft in traditional irons. The reasoning: the player who buys these irons (or falls for the marketing hype) typically has an early-release slicer’s swing that adds loft at impact (instead of properly taking a descending blow and de-lofting the club), so coupled with the perimeter weighting helping get the ball airborne, he needs stronger lofts. Again, sounds great in theory, but what good is “hitting an 8-iron” 160 yards when, in actuality, that club is more like a 6.5-iron in design with an 8 stamped on the bottom? Good for the ego, not necessarily good for the golfer.
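The loft math here can be sketched in a few lines. This is purely illustrative: the loft values and the assumed “traditional” scale (5-iron at 28 degrees, 4 degrees per club) are mine for the sake of the example, not any manufacturer’s published specs.

```python
def effective_iron(loft_deg):
    # Assumed traditional scale: 5-iron = 28 degrees, 4 degrees per club.
    # These numbers are illustrative, not official specs.
    return 5 + (loft_deg - 28) / 4

# A traditional 8-iron at 40 degrees maps back to itself...
print(effective_iron(40))  # 8.0
# ...while a jacked-loft GI "8-iron" at 34 degrees plays like a 6.5-iron.
print(effective_iron(34))  # 6.5
```

In other words, strengthen an 8-iron by a club and a half’s worth of loft and you’ve really just renamed a 6.5-iron.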
So when GD’s review says this about Ping’s s56 irons…
Not: The lofts are among the weakest in the category. Players seeking distance will have to bring their own.
…it’s confusing. Of course the lofts are among the weakest in the category…the category is entirely comprised of irons designed for players who DO properly de-loft the club at impact with a descending blow, and the clubs are designed for a lower ball flight. And even then, when you compare the specs of the s56 to the other “player’s clubs,” such as the Titleist 712 CB, Mizuno mp-59, Nike VR Pro CB and TaylorMade Tour Preferred MC…IT’S NOT EVEN TRUE. The Pings are right there with all of those models throughout, and are actually stronger than some of those listed once you get to the 6-iron.
Old Man’s Club
Right off the bat, and despite GD’s claims, smaller manufacturers and niche brands are not well represented. Sure, there’s the Tour Edge Exotics line and a Yonex wedge, but why isn’t Miura featured? Oh, right…because every other iron and wedge maker would be fighting for second place, and it’s probably not a good idea to tell people to go spend close to $2,000 on irons and wedges, no matter how much sense it makes financially (getting off-the-rack clubs to the exacting specs Miuras come in would add a considerable cost to the base price of the irons, considering the variance in lofts and lies in clubs from the factory, not to mention custom shafts and grips).
Inconsistencies aside, if you take a look at some of the ratings given out, it’s pretty obvious how politically motivated the Hot List is, and how little it actually tells you about the tested clubs. I mean, seriously? DEMAND is one of the four categories listed? I fail to see how demand has a single ******* thing to do with the performance of a club, and if it accounts for only 5% of the overall rating, it shouldn’t even be considered to begin with. But even ignoring that, some of the ratings simply don’t make any sense at all.
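To see how little a 5% weight even matters, here’s a back-of-the-envelope calculation. The other category weights here are made up for illustration; the only figure from the Hot List itself is Demand counting for 5%.

```python
def overall(performance, innovation, look_sound_feel, demand,
            weights=(0.45, 0.30, 0.20, 0.05)):
    # Hypothetical weighted average; only the 5% Demand weight is from
    # the Hot List, the other three weights are assumptions.
    scores = (performance, innovation, look_sound_feel, demand)
    return sum(s * w for s, w in zip(scores, weights))

# Identical club, with Demand swung from worst (1 star) to best (5 stars):
low = overall(4.5, 4.5, 4.5, 1.0)
high = overall(4.5, 4.5, 4.5, 5.0)
print(round(high - low, 2))  # 0.2 -- a fifth of a star on a 5-star scale
```

Even swinging Demand from rock bottom to perfect only moves the aggregate a fifth of a star, so either the category is pointless or it’s there for reasons that have nothing to do with the clubs.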
Take two fairway woods…the Adams Speedline Fast 10 and TaylorMade Rocketballz. Both models feature a similar design characteristic…a speed slot that increases the trampoline effect of the clubface, allowing for more distance. In the Innovation category, the Adams got 4.5 stars while the TaylorMade model got 5 stars as well as the award for leading the category. The only problem with this is that the TaylorMade model has only one small speed slot on the bottom while the Adams has two…oh, and the fact that Adams has been using this technology to greater effect for the past two ******* years.
Now, let’s look at some of the negatives…
First the Adams…
Not: Not all players want a 7-wood that draws.
Simple solution…don’t buy the 7-wood then. I’m really failing to see how this is a negative.
Now the TaylorMade…
Not: Some of our testers found this 3-wood launching lower compared to the other fairway woods.
As opposed to a simple purchasing decision that’s pretty moot to begin with (considering pretty much every golfer out there uses a hybrid these days instead of a ******* 21* fairway wood), a low-launching 3-wood is a serious ******* problem. Hell, ONE OF THE GUYS RUNNING THE ******* TEST even said in one of his articles…
The results of a 2009 launch-monitor test by Golf Digest found everyday golfers who swing less than 85 miles per hour actually carried the ball five yards longer with a 4-wood as opposed to a 3-wood, while those swinging more than 85 mph hit the 4-wood eight yards longer. (Elite players still hit 3-woods farther than 4-woods). In both instances the ball went higher too, providing a softer landing for those using the club on approaches into greens.
Now, let’s go back to the player reviews for the Adams model…
Low-Handicapper: “The slot really looks unconventional, but the performance outweighs how it looks.” . . . “When you hit the ball thin, it still elevates.”
…yup, these are my testers.
For the record, both of these models got identical scores in the Performance category, even though the biggest criticism of the TaylorMade model HAS ONLY TO DO WITH PERFORMANCE, while the Adams criticism was about a nearly obsolete model you simply don’t have to purchase. The TaylorMade got a gold star overall while the Adams got silver. I seriously just don’t understand it.
Oh wait, yes I do…I picked up a random, relatively recent issue of Golf Digest I had lying around my office and, lo, TaylorMade had five entire pages worth of ad space as well as the back cover. Adams? One.
Adams doesn’t care about not getting a return on their investment: one page probably didn’t cost them much, they really only make drivers, hybrids and irons (so they’re not losing money on putters and wedges), and they’re specifically targeting their market with their own niche in the industry. But TaylorMade? Five full pages and a back cover for one issue had to cost a lot of money…and considering they release clubs FAR more frequently than any other manufacturer, and have had entire releases fail economically, they want that ******* gold star, damnit.
When you’re looking to buy golf clubs, please don’t fall for marketing hype and “objective” tests like these.
The golf club industry has gotten out of control, and frankly, if you look in the bags of PGA pros who aren’t contractually required to put new models in play, you’ll see a lot of old stuff that’s supposedly “outdated*.” Golf Digest even did a story on clubs like these, and if Rory McIlroy can still hit 280-yard 3-woods using a 5-year-old Titleist model and Mark Calcavecchia is using two Callaway fairway woods from 1998 (!), which can be purchased for $40 or less these days, why would you drop $200+ on a new model?
*a quick aside…7-woods are outdated no matter what.
I’m not blaming the companies for behaving like this, since it’s their job to turn a profit, and I don’t blame Golf Digest either for letting their standards slip. But what I am blaming them for is not properly educating golfers on equipment, and instead forcing the same old **** down our throats year in and year out.
Technology HAS increased club performance, but you have to ask yourself: to what degree? Does it matter how well a club performs when you stripe it if you, as a player, rarely find the club’s sweetspot? Probably not. Sure, new technology can help you get more out of a *******, but it was still a *******. Sure, a model might have weaker lofts than another, but is there a reason for that, or is static loft the be-all and end-all (hint: it’s not)?
Prices have gotten out of hand throughout the golf industry, and from a personal point of view, outside of wedges (which you really don’t have much choice other than to buy new), I simply cannot afford to buy a new golf club and still play the game. I’m going to buy a new driver shortly, and there’s no way in hell I’m about to shell out $300-500 for a brand new model when I can buy a previous model at a much steeper discount, purchase a quality aftermarket shaft, and have it all set up exactly the way I want for $200-250, all said and done.
Every year these manufacturers come out with new releases, and sometimes, hell, oftentimes they live up to the hype. I’ve criticized TaylorMade a lot throughout this post, but I’ll probably be buying a previous-model TaylorMade driver because it’s a really ******* good product. I’ve had a love/hate relationship with Nike Golf, but their new dual-sole wedges look like the best mass-produced lob wedges on the market, in my opinion, and yet the sound of a Nike driver still makes my ears bleed. The point is, the marketplace is flush with quality products that are very, very similar to each other, and the only way you’re going to find out what works best for you is by trying them out yourself, not by blindly following some marketer’s non-speak.
Now, please excuse me…I’m going to take a dump and re-read the issue again.