Spam Me Please
 
Evinrude FICHT beats out Yamaha in JD Powers survey

I agree with every point you made, except the claim that JD Powers is skewing
the questions (which, I agree, can be done on a survey) to get a
predetermined response. JD Powers is selling a product that shows a
manufacturer how owners perceive the quality of its product. It does not
matter whether the answer is good or bad; the companies are interested in
"customer perception" either way. While a good response makes for a great
advertising plus, a bad response is even more important to them. If the
company is making a better mousetrap but the customer does not perceive it
that way, then it has a problem that can be easily solved. This is why auto
dealers are now so interested in how your service work was performed by the
dealer. Dealers get paid based on customer perception of the service call.

If customer perception of a quality problem is real, the manufacturer would
rather hear about it from JD Powers than find out when sales decrease. JD
Powers is creating a product and selling it, but the product is not a biased
survey that lets a manufacturer claim to be #1 in the JD Powers survey. The
product JD Powers is selling is an unbiased survey of customers' perception
of the product and of the dealer network that services it. The minute the
companies or the consumers believe the survey is biased, JD Powers has
nothing to sell.

I believe you have seen other companies that will give you a survey designed
to highlight a company's benefits, but JD Powers is not one of them.



"Gould 0738" wrote in message
...
If you go to JD Powers web site they have reviewed many different models

and
products ( http://www.jdpower.com/cc ). If the survey was controlled

by
the manufacturer, I would not have expected Sea Ray to come in number 1,

and
Bayliner and Maxum to be next to the last, since they are all owned by


Ok.

From the top.

JD Powers does not directly compare products. They collect and compare
"Customer Satisfaction Surveys". Can we agree on that? If so, on to point
two. If not, let me know.

Point Two: None of the people they are surveying are directly comparing two
or more products either (in most cases). They own a new brand X car, boat,
motorcycle, Frisbee, or what not, and they get a survey. These people have
no idea, based on actual usage experience, how their product compares to the
competition. (They probably think their product is either the best, or the
best deal, at the time they buy it.) Can we agree that the survey
respondents are, in the vast majority of cases, not comparing two products?
If so, on to point three. If not, let me know.

Point Three: Powers assembles rankings based not on how the products
actually compare, but on how many bubbling, glowing, happy-owner responses
it gets on the various products. Can we agree on this? If so, on to point
four.

Point Four: The nature of the questions that are asked in the survey will
influence the type of responses that come in. Take a product with a known
defect in, say, the "on" switch. The failure rate is 50%, and the factory is
hustling to do recalls as fast as possible. You want that product to do
poorly? You ask, "How would you rate the reliability of the On switch?" You
want that product to show pretty well? Don't ask about the reliability of
the On switch; ask whether the factory and dealer have been quick to respond
when repairs are needed.

Throughout the entire process, JD Powers is creating a product and selling it.