Car Buying Articles
Rate My Ratings
Seeing Stars When Buying a Car
You're shopping for a new car and you don't have the time to research every vehicle on the market. So, like a lot of people, you turn to the vehicle ratings. But then you notice a strange thing: While editors are giving certain cars low numbers, owners of the same cars are scoring them much higher.
Curiosity aroused, you probe deeper. It appears that Publication A is giving all foreign cars high marks while harshly evaluating domestic vehicles. Publication B, meanwhile, seems to favor American-made cars and foreign cars equally.
How do you make sense of it all? And how do you know which vehicle rating system to trust?
Editor vs. Consumer Ratings
If you look at the Edmunds ratings for the 2007 Saturn Vue, you'll see that road test editors gave it a 6.5 ("adequate"), while the people who own and drive this car gave it a Consumer Rating of 8.5. What's going on here?
At first glance this seems to reveal either an Edmunds.com bias against the Saturn or a sinister conspiracy against American-made cars. However, it's important to understand the difference between a road test editor's rating and an assessment made by a person who purchased the car.
"Editors have a better sense of where the 'you are here' pin is on the automotive map," said Director of Vehicle Testing Dan Edmunds, in charge of vehicle ratings for Edmunds.com. "It comes down to comparative knowledge. What else have you driven? And of the cars you've driven, did you do more than drive it around the block at the local Chevy dealer?"
It's a Tough Job, but...
Whether at Edmunds.com or one of the many "buff books" or automotive Web sites, road test editors are always driving new cars. Their job is to drive every new car available; their goal is to know the market.
"While the normal car buyer might drive one or two cars in a class, most of our editors have driven everything available," said another Edmunds.com editor. "In addition, our editors drove last year's model too, and the year before that. It really brings home the need for extensive test-drives and comparison shopping."
This means that an editor driving the Saturn Aura, for example, has also driven the Nissan Altima, the Toyota Camry, the Honda Accord and all the other competing models. That editor has also driven cars in classes below and above the Aura and can speak to shoppers who are still undecided about what class of car they want.
Consumer Car Reviews
Contrast the road test editor's base of knowledge to the average car shopper who may drive two or three vehicles on short test-drives. To be fair, car salesmen make it difficult to "cross-shop." Once a customer appears on the lot, the salesman's job is clear: Stop them from leaving before they buy a car!
Without a broad base of knowledge, consumers often base their opinions on perception rather than hands-on experience. An interesting example of this is a survey done in the mid-1990s of the Geo Prizm and the Toyota Corolla.
"They were both built in Fremont, California, on the same production line, same union, same coffee breaks — same car!" Dan Edmunds said. "One car was sold through Chevy dealers and the other sold through Toyota. The Prizm scored 30 points better than the Corolla."
The survey indicated that American buyers wanted to like what they thought was a U.S.-built car. Conversely, when they thought it was a Japanese car, it was judged harshly, since Toyota has cultivated a group of ultra-picky owners.
The Value of Vehicle Ratings
Many car shoppers have serious time constraints. Their old faithful has finally given up and they need new wheels now. The danger is that they will test-drive until they find a car that is "good enough" or they come across an aggressive car salesman who quickly turns a shopper into a buyer.
Rather than going from car lot to car lot and listening to a parade of salesmen, shoppers would be better served by reading reviews and looking at ratings from multiple sources. But first, these shoppers need a better understanding of how to interpret the information they are reading.
Limitations of Consumer Reviews
Consumer car reviews have built-in limitations. As such, they need to be understood before they can become useful.
"There is emotion in the ratings which just can't be tickled out," Edmunds said. "Even nationalism enters into it."
Marketing experts have pointed out that consumers reviewing cars they just bought are actually "validating their purchase" — they don't want to admit they are wrong. Not only that, but consumer reviews tend to be written in the first 90 days of ownership, the so-called "honeymoon period" before there are maintenance costs or reliability problems.
A Common Misunderstanding
One major online automotive site uses a 1-5 rating system. "It's easy to look at their results and think they are delivered from a team of researchers in lab coats carrying clipboards," a marketing expert said. "In fact, what you are looking at is merely the results of a mail-in survey filled out by owners of that vehicle."
While mail-in surveys of reliability issues are important, surveys that evaluate driving dynamics and usability are wide open to interpretation and effectively have no "checks and balances" to validate the findings. In this area, a road test editor's opinion is likely to be more useful.
"The reason ratings are misunderstood is that there is a wide variety in how they are done," the marketing expert said. "And often they don't disclose how they reach their findings. But once people learn the raters' point of view, they can use it as a reference point."
J.D. Power and Associates
Reviews of cars, boats, electronics and many other things are compiled by J.D. Power and Associates in what it calls the "Circle Awards." The basis of its evaluations is the mail-in survey which, according to the company Web site, is "based on the opinions of consumers who have actually used or owned the product or service being rated. Since the ratings are based on J.D. Power and Associates research studies that survey a representative sample of owners, they are indicative of what typical buyers may experience."
Another section of the J.D. Power site explains that the information "is based on independent and unbiased feedback from verified product and service owners — meaning that the consumer actually owns or has owned or used the product or service being rated."
Is this valuable information? Absolutely. Is it the final word for consumers in picking the right car? Not necessarily.
Edmunds.com Vehicle Rating System
The Edmunds.com ratings team reviews 25-30 evaluation items (depending on the type of vehicle being reviewed) under the major areas of Dynamics, Comfort, Function and Design/Build Quality. This results in an overall score of 1-10 and a one-word description such as "adequate" or "good." A detailed review of each category showing the comments of the road test editor is also available.
While six editors make up the ratings team, they draw on input from a much larger editorial staff that is constantly driving the cars being reviewed. Furthermore, these ratings are generated under controlled conditions, with multiple "test runs" performed to ensure the vehicle's performance is verifiable.
The editors' reviews are displayed side by side with the consumer reviews. In nearly every case, the scores from consumer reviews are higher than editor reviews (for reasons already discussed in this article). It's also worth checking how many consumers have contributed their opinions; the higher the number of contributors, the more representative the rating.
What It All Means
While vehicle rating systems aren't perfect, they provide a useful tool for car buyers — if they are understood and used properly. The best advice for car shoppers who are reviewing the rating results is contained in the old saying: "Consider the source."