See my comparison above, which is probably the most we can assume...
To all those asking: the math works out just fine. Someone below noted that the stars aren't the actual scores, but even assuming they were, if Samsung scored near the top of its bands while Apple scored near the bottom of its own, Samsung could easily come out on top.
How can they arrive at the same 20.54 for three categories with weightings of .19, .17, and .16 respectively, yet get final scores of 15.34 for the categories with weightings of .26 and .22 respectively?
I don't think the weightings were calculated properly when arriving at the final scores.
Something fishy is going on.
Yes - looks like Prime81 did not apply the weightings properly. Even with the extremely contrived example above, with the weightings applied the score is Apple 70.4, Samsung 69.4.
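A minimal sketch of the weighted-average math being argued about here. The weights (.26, .22, .19, .17, .16) are the ones quoted in this thread; the factor names and index scores below are invented purely to illustrate how a big gap on a low-weight factor can flip the overall ranking:

```python
# Hypothetical illustration of J.D. Power-style factor weighting.
# Weights come from the thread; all factor scores are made up,
# on an assumed 0-1000 index scale.

WEIGHTS = {
    "performance": 0.26,
    "ease_of_use": 0.22,
    "features": 0.19,
    "styling": 0.17,
    "cost": 0.16,
}

def weighted_index(scores):
    """Overall index = sum of each factor score times its weight."""
    return sum(scores[f] * w for f, w in WEIGHTS.items())

# Invented scores: Apple marginally ahead on four factors, but far
# behind on cost (the "more than 100 points" gap JDP cites).
apple   = {"performance": 860, "ease_of_use": 850, "features": 855,
           "styling": 845, "cost": 700}
samsung = {"performance": 840, "ease_of_use": 835, "features": 830,
           "styling": 825, "cost": 830}

print("Apple:  ", round(weighted_index(apple), 1))
print("Samsung:", round(weighted_index(samsung), 1))
```

With these made-up numbers, Samsung's 130-point lead on the lowest-weighted factor outweighs Apple's small leads on the other four, so the ranking can flip without any arithmetic error.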
No doubt JDP has contrived some unseen additional methodology for its "Power Circle Ratings" that can produce such an apparently logically absurd outcome (aka "cooking the books"). But the outcome is still absurd on its face.
Had JDP presented its findings as "satisfaction" = "value" (aka "bang for the buck"), that would have made more sense. If Samsung's overall score was 70% of Apple's but its price was only 50% as much, then one could say it had a higher ratio of satisfaction per dollar spent. Put another way: if a Samsung tablet is "good enough" for what you want to do, why spend more on an iPad? You're "satisfied."
What this demonstrates beyond question, once again, is that all these "rankings" of products from outfits like JDP and Consumer Reports et al. are inherently subjective, despite their loudly claimed objectivity, because the bias of the evaluators is fundamentally baked into the methodologies and data sets they choose to use.
And when they don't even disclose the full details of the calcs (CR never does), they invite speculation that they "stacked the deck" for ulterior motives. That serves them right, even if untrue, for their lack of transparency.
Here's what I got.
Yeah, that's what I got.
Apple's Number 1 (middle finger extended)
Here is J.D. Power's email back to me.
For what it's worth, here's the response I got from their media people:
Mr. MacLachlan,

Thank you for your interest in the 2013 U.S. Tablet Satisfaction Study--Volume 2. It's important to note that the award is given to the brand that has the highest overall index score, not the company with the most Power Circles. In this study, the index score is comprised of customers' ratings of five key dimensions or factors.

The Power Circle Rankings are something we provide to consumers to understand the relative rank of brands within each of these five dimensions. The Power Circle Rankings denote the brand that has the highest score within each factor, regardless of how much higher its score is. The Power Circles denote ranges, not the actual index score upon which the factors and overall rankings are based.

In the case of Apple in the tablet study, although it did score higher on four out of the five factors measured, its score was only marginally better than Samsung's. At the same time, however, Apple's score on cost was significantly lower than that of all other brands. In comparison, Apple's rating on cost was more than 100 points lower than Samsung's. As such, even though its ratings on the other factors were slightly higher than Samsung's, Apple's performance on cost resulted in an overall lower score than Samsung's.

In this cost-conscious environment, cost is a key factor in many of the products people purchase and the services they use. Tablets are no exception; cost is a key driver of the overall customer experience with the device. Although "cost" has the lowest weight among the five factors that drive satisfaction, the notable difference between Samsung's and Apple's scores in the cost factor was enough for Samsung to rank highest in the study.

Here is a brief explanation of how the Power Circles are calculated. Power Circle Ratings are based on the range between the product or service with the highest score and the product or service with the lowest score. J.D. Power generates a Power Circle Rating of five, four, three, or two, as outlined below:

5 - Among the best: The highest-ranking company or brand in each segment receives five Power Circles. In highly competitive segments with many companies or brands, multiple companies or models scoring in the top 10 percent of the range from the highest score can also receive five Power Circles, indicating that consumers rate them "among the best" of all companies or models in the survey. However, J.D. Power awards are based on the product or service with the highest overall index score; therefore, while more than one company may be classified as "among the best," as both Samsung and Apple are in this case, only the brand with the highest overall index score receives a J.D. Power award.

4 - Better than most: Brands or models scoring 10 percent of the range above the industry or segment average, but below the scores for 5 Power Circles, receive a rating of 4 Power Circles, indicating a classification of "better than most" among brands or models in the survey.

3 - About average: Brands or models scoring between the industry or segment average and the scores for 4 Power Circles receive a rating of 3 Power Circles, indicating a classification of "about average" among all brands or models in the survey.

2 - The rest: Brands or models scoring 20 percent of the range below the industry or segment average receive a rating of 2 Power Circles, indicating a classification of "the rest" among all brands or models in the survey. J.D. Power does not publish a rating lower than two Power Circles.

It is also important to note that the JDPower.com Ratings may not include all information used to determine the overall rankings and J.D. Power awards.

-----Original Message-----
From: DAVID MACLACHLAN [mailto:[email protected]]
Sent: Thursday, October 31, 2013 7:34 PM
To: J.D. Power Media Relations
Subject: Customer satisfaction ratings

Hello,

I am hoping that you can help me understand how Samsung managed to come out ahead of Apple in customer satisfaction ratings, when the ONLY category, according to your chart, in which Samsung bested Apple was cost. Apple's ratings were all 5 stars. What kind of weighting did you use to come up with your results?

Dave
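The range-based banding the email describes can be sketched in a few lines. Note that the 3- and 4-circle boundaries overlap as the email words them, so this uses one plausible reading of the thresholds; all index scores below are invented for illustration:

```python
# Sketch of Power Circle banding as described in the email above.
# The 3-/4-circle boundaries are ambiguous in the email's wording,
# so the thresholds here are one plausible interpretation.

def power_circles(score, all_scores):
    hi, lo = max(all_scores), min(all_scores)
    rng = hi - lo                   # range between best and worst scores
    avg = sum(all_scores) / len(all_scores)
    if score >= hi - 0.10 * rng:    # top 10% of the range: "among the best"
        return 5
    if score >= avg + 0.10 * rng:   # well above average: "better than most"
        return 4
    if score >= avg - 0.20 * rng:   # near average: "about average"
        return 3
    return 2                        # "the rest"; JDP publishes nothing lower

# Two brands can land in the same 5-circle band while one still has the
# higher underlying index score -- and only that score decides the award.
index_scores = [835, 833, 800, 740, 700, 640]
for s in index_scores:
    print(s, power_circles(s, index_scores))
```

Under this reading, 835 and 833 both show five circles even though only 835 would win the award, which is exactly the Samsung/Apple situation the email describes.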
In the case of Apple in the tablet study, although it did score higher on four out of five factors measured, its score was only marginally better than Samsung's.
Let’s see… five vs. five, five vs. THREE, five vs. THREE, five vs. FOUR, and five vs. FOUR. So no, J.D. Power is lying through their teeth. If it was actually “marginally better”, Samsung’s score would be higher.
Except, by J.D. Power’s own claimed math, the weighting of cost shows that this doesn’t matter.
Demand to know their justification for stating this. On what grounds do they consider this “environment” more “cost-conscious” than the past six years (recession) where Apple beat the snot out of everyone else?
I’m glad they replied to you. Now demand from them further explanation on these points.
iPad mini with Retina display?? Not available publicly yet.
The world isn't always cost conscious. Some people, and some companies, will look at a LOT more factors than what is listed in this survey. Here is a laundry list of the POSSIBLE attributes a person or company might weigh before buying a platform/product, and there might be others that I didn't list. Each person or company puts them in order from most important to least important and weights them differently. Some of this is done in an evaluation survey that some companies WILL DO, or it's done as they go through the decision-making process.
1. Number of apps.
2. Quality of apps.
3. Third-party hardware.
4. Quality of third-party hardware.
5. Ease of custom-developing apps.
6. Compatibility with desktops.
7. Battery life.
8. Resale value.
9. Support (quality).
10. Extended service contracts (t's and c's).
11. Security from malware.
12. Security at the enterprise level.
13. Compatibility with existing software and hardware.
14. Performance options.
15. OS features.
16. Standard apps.
17. Is it discounted/on sale?
18. Brand name.
19. Can I buy it from a place that will give me payment terms?
20. What are my friends/family/school/work using?
21. What does some article say about the product?
22. Who has the biggest market share?
23. What colors does it come in?
24. Hardware specs.
25. Hardware features.
26. Ease of use.
27. Updating policy of the OS. (Do I get it the same day the manufacturer releases it, or do I have to wait for the OEM to release it?)
28. How many revs the OS gets updated. (For example, some 2.x products won't get upgraded past 2.x; same with 3.x, 4.0, 4.1 to 4.3, etc. Apple has its own policy.)
NONE OF THESE ARE IN ANY SPECIFIC ORDER.
These are just a PORTION of the potential things that go through people's heads at some point during the evaluation period, and I'm sure there are more criteria than this.
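The point above, that each buyer orders and weights the criteria differently, can be sketched as a simple weighted decision matrix. All criteria names, weights, and scores here are made up for illustration:

```python
# Toy weighted decision matrix: the "winner" depends entirely on
# which weights the buyer assigns. All numbers below are invented.

def rank(weights, candidates):
    """Return (name, score) pairs sorted by weighted score, best first."""
    totals = {
        name: sum(scores[c] * w for c, w in weights.items())
        for name, scores in candidates.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

candidates = {
    "Tablet A": {"apps": 9, "hardware": 8, "cost": 5, "support": 9},
    "Tablet B": {"apps": 7, "hardware": 7, "cost": 9, "support": 6},
}

# A budget-driven buyer versus an ecosystem-driven buyer.
budget_buyer    = {"apps": 0.1, "hardware": 0.2, "cost": 0.6, "support": 0.1}
ecosystem_buyer = {"apps": 0.4, "hardware": 0.2, "cost": 0.1, "support": 0.3}

print(rank(budget_buyer, candidates)[0][0])     # different weights...
print(rank(ecosystem_buyer, candidates)[0][0])  # ...different winner
```

Same products, same scores; only the weights change, and the ranking flips, which is why any single published "satisfaction" ranking bakes in the evaluator's priorities.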
Did J.D. Power or Consumer Reports evaluate based on all of the criteria mentioned? NOPE. They only did the top 5 most OBVIOUS. But I would evaluate other criteria that they wouldn't, and cost wouldn't be that big of a deal, since we aren't talking about expensive products to begin with. Cars have a higher spread in terms of cost; tablets with the same basic specs don't. Apple isn't 100% more expensive for the same configuration. Some are the same or VERY close in terms of retail list price.
According to their published methodology, cost was the lowest weighted item. Looks like, as you say, someone doesn't know how to count.
In the final scoring it was actually rated at the same level, due to the calculations they used. I think someone at J.D. Power screwed up.
The people that are most cost conscious are the people that don't have any money in the first place. But that's not the majority of people that own a tablet.
Some people aren't interested in a tablet at this point in time or they are waiting a little while before jumping in because they don't NEED one yet, or they don't see the benefit of it.
But stating that the world is cost conscious? That's a fallacy. A certain percentage of the world's population might be, but of the people who are buying tablets, I don't think most are. Just a certain percentage.
I've seen people buy computers and tablets and end up not even using them. They just bought them because their neighbors had them and they wanted to keep up with the Joneses, but in reality they don't even use them.
I'm sorry, but where is this myth that Samsung's tablets are cheap coming from? The Galaxy Note 10.1 sells for $549, $50 more than a comparable iPad. The Note 8.0 sells for $399, $70 more than the iPad mini. If anything Samsung should get a lower rating on price than Apple. Nexus tablets I understand, but Samsung? Yet, that survey makes it seem as if there's such a massive price disparity that it outweighed everything else- combined. Makes no fucking sense. At all.
Well, the Tab tablets are complete shit, in all ways. Most of the reviews are horrible to mediocre. But hey, they're cheaper, so no doubt they provide a fantastic user experience all round. By this metric, I'm sure some $50 Chinese tablet is the best thing in the world.
I don't think the $50 tablets are going to give a much worse user experience than a Galaxy Tab. They just suck, mainly due to the Android OS.
I'm just surprised they aren't asking the customers about different types of metrics. I think that cost was STILL weighted too high; it should have had an even lower effect on the final score. The cost of the unit is only what you paid for it, and people can get discounts depending on when they bought the item and from whom.
I think other factors they didn't even ask about are important.
The problem is that most people aren't trained well in the art/science of how to evaluate a product or software platform.
It’s almost as though it’s a site about one company in particular or something.