#11
> If you go to JD Powers' web site, they have reviewed many different models and products ( http://www.jdpower.com/cc ). If the survey were controlled by the manufacturer, I would not have expected Sea Ray to come in number 1, and Bayliner and Maxum to be next to last, since they are all owned by the same company.

Ok. From the top. JD Powers does not directly compare products. They collect and compare "Customer Satisfaction Surveys". Can we agree on that? If so, on to point two. If not, let me know.

Point Two: None of the people they are surveying are directly comparing two or more products either, in most cases. They own a new brand X car, boat, motorcycle, Frisbee, or whatnot, and they get a survey. These people have no idea, based on actual usage experience, how their product compares to the competition. (They probably think their product is either the best, or the best deal, at the time they buy it.) Can we agree that the survey respondents are, in the vast majority of cases, not comparing two products? If so, on to point three. If not, let me know.

Point Three: Powers assembles rankings based not on how the products actually compare, but on how many bubbling, glowing, happy-owner responses it gets on the various products. Can we agree on this? If so, on to point four.

Point Four: The nature of the questions asked in the survey will influence the type of responses that come in. Take a product with a known defect in, say, the "on" switch. The failure rate is 50%, and the factory is hustling to do recalls as fast as possible. You want that product to do poorly? You ask, "How would you rate the reliability of the On switch?" You want that product to show pretty well? Don't ask about the reliability of the On switch; ask whether the factory and dealer have been quick to respond when repairs are needed.

Throughout the entire process, JD Powers is creating a product and selling it.