Why I Don't Trust GolfLink's Golf Course Ratings
I’ve always felt that there is a huge bias in the golf course rating world. Admittedly, great courses like Bethpage, Pebble Beach, Doral and Torrey Pines all deserve their lofty scores. But how those scores are derived is what I find suspect.
As far as I know, other than Top 50 or Top 100 lists, there are only two national guides that attempt to rate golf courses across the United States: Golf Digest & Zagat’s.
Just recently I ran across an article from the Orlando Sentinel that explores the ratings found at the website Golflink.com and its Top 100 list. The reporter mentions that Stoneybrook West, a $32-a-round course in Central Florida, is ranked #27, higher than Sawgrass (#42) and Pebble Beach (#81). Even in Minnesota, where I’m located, the only course rated in the Top 100 is Somerset Country Club (#64), while the two courses considered the best in the Twin Cities, Hazeltine (site of the 2009 PGA Championship) and TPC Blaine (site of the Senior Tour 3M Championship), don’t even show up. How can that be?
According to GolfLink’s own website: “the GolfLink Top 100 United States Golf Courses lists the best golf courses out of more than 20,000 public and private golf courses across the country (Note: according to the National Golf Foundation, there are approximately 16,000 courses in the USA). Where most golf magazines rate golf courses based solely on the subjective views and limited experience of a handful of editors, our list is calculated from a more objective range of factors, including the preferences of up to a million or more visitors to our web site every month. This makes our Top 100 United States Golf Courses list the definitive online guide to the best U.S. golf courses, and a great resource for figuring out where to play.”
The “scoring” is based on a 1–5 star rating system in five categories: Amenities, Difficulty, Maintenance, Scenery and Value. As with Golf Digest and Zagat’s, all you need to do to become a rater is sign up online. My big problem with this approach is that scores obtained this way can’t be trusted.
There are no qualifications, no training, no guidelines, just opinions. The raters for a course might be the club pro, the general manager, the cart girl and the clubhouse attendant, all signing up and all giving it 5 stars. And the scoring isn’t based on a tested set of factors like the USGA’s course and slope ratings, just the opinions of a bunch of anonymous golfers.
I think that the Orlando Sentinel reporter said it better than I would: “You’ve got to love that through the power of user-driven golf sites, golfers can generate a voice in an otherwise smug world of golf course design. No offense to Stoneybrook West, but it probably wouldn’t sniff the top 100 on a Golf Digest ranking.”
So much for accurate golf course rating.
Thanks for the critique of our GolfLink Top Courses list.
As you noted, we take a different approach with our rankings and we are fully aware that there are some "interesting" selections in our Top List. It is true that our rankings are 100% determined by our site visitors and we do look at things beyond just how "good" a course is.
For example, "value" is one of the criteria. Obviously, Pebble Beach is a great course. But we consistently hear back from people that they don't think it's a great value and that they would probably not play it again because of the high cost. It still made the top 100, but it came in at #83, partly because it only scored 3 stars (out of 5) for "value."
Proximity is another factor. So even though Pebble Beach is a great course, what good is it if someone lives in Seattle? If there is a really good muni course near where they live that only costs $40 to play, doesn't that have a LOT more value to them? We think so, and we like that our Top List is different from the same ole, same ole lists that always have courses like Pebble Beach at the very top.
One final note about our user-based course reviews. GolfLink receives more traffic to golf course pages than any other site, and it captures more course ratings than any other source. We currently have more than 25,000 course reviews, more than 4X as many as the Zagat golf course guide. So if there is any entity that can make sense of the collective knowledge of the entire golf community, it is GolfLink.com. It won't always be perfect, but we feel that it is much more relevant to the average golfer than rankings subjectively determined by a very short list of self-proclaimed "experts."
I'm a huge fan of looking at golf courses in a different way, and 25,000 reviews is a testament to golfers' willingness to do that. Where I diverge is this: all of the leading "review" sites, yours included, rely on whichever golfers choose to fill out the online form, so what gets captured reflects their willingness to fill out the form as much as their experience. The courses aren't being scored against each other on a common scale; it's just each rater's opinion. Unlike the USGA's method of rating a golf course, in which all courses are judged by the same factors, the ratings in these guides, both in print and online, lend themselves to bias. A course could have the club pro, the GM, the club president and his mother rate it, and there is no way to distinguish one review from another or to gauge the bias inherent in those reviews.
I may be repeating myself, but regardless, I like the fact that your reviewers don't necessarily follow in lockstep with the rest of the world on marquee courses like Pebble, Torrey Pines and Bethpage.