It’s finally March, which means a few things: 1) The Players is just around the corner, 2) The Cubs are looking GREAT, and 3) Golf Magazine releases its annual wishy-washy “review” of this year’s new golf equipment.
I’ll admit it – I used to read this issue cover to cover and dream about splurging on the latest stuff (see my earlier post for how often I actually pulled the trigger). It’s a great way to get a complete overview of each manufacturer’s offerings that year, but what it definitely does NOT do is give you any way to decide between all of the equipment presented. The usual approach was to have golfers across a range of abilities try the equipment, and the magazine would publish one or two brief comments about each club or set tested. In an era where data is everything, the lack of objective, hard facts really stood out. And almost every piece of equipment got a positive review, so it was easy to conclude that the magazine had wimped out to avoid pissing off advertisers.
But this year promised to be different. Golf Magazine paid a laboratory to test all of the equipment the way you and I would use it: slow swing speeds, fast swing speeds, sweet-spot contact and not-so-sweet-spot contact, etc. And the results were…well, not really presented. What? They went to all the trouble and expense of generating that wonderful data and then sat on it? It sure seems like it.
As a service to our readers, I’ve summarized the data Golf gave us to help you choose between the 10 most popular drivers. This includes both the comments from testers and the findings from robotic testing; any info not provided is shown as “???”:
You can barely tell the difference between the comments from golfers and the results of the robotic testing! Is the “Impressive” rating for forgiveness for the Mavrik Max better than saying that it “Goes” for the Mavrik Sub-Zero? Is it a good thing or a bad thing that the Cobra Speed Zone Extreme looks “like a hammer”? WTF? Golfers who select their equipment based on actual club performance should be offended by the red herring of robotic testing, which adds nothing to the comparisons.
Given that the list prices for the clubs in the table above run between $450 and $550, nobody should be making buying decisions on the scant info provided. The lack of real data only confirms the true purpose of this issue: to give advertisers one more ad placement.
I reached out to Jonathan Wall, Golf’s Managing Editor for Equipment, for a comment and I’ll update this post if I hear back from him. Not holding my breath…