Originally Posted by jragosta
Originally Posted by rcfa
While what you write here is all true, it's worth pointing out that I was talking about reliability, not availability. Lack of availability is just an occasional delay in data being pushed and working temporarily with an outdated copy.
If your computer needs to be rebooted due to an OS update, it stops being available, but it better not have lost its data and wiped out the backup when it's done being rebooted.
The problem described by developers however is data corruption and data loss. Quite different from working with good data that's a bit outdated or takes a bit longer to get refreshed.
Look at bit error rates for hard drives, and you realize that these devices are, e.g., 99.99999999999999% or thereabouts reliable in reading/writing bits. And that's not good enough, which is why we invented journaling file systems, RAID-5/6, drive mirroring, backups, and ZFS. And that's still not good enough, which is why there is off-site, geographically diverse online backup. That's why servers don't just have parity RAM but ECC RAM, why there is fail-over and clustering, etc.
The problem is, this is a multiplicative relationship of all components involved.
99.9% or 99.99% reliability in not corrupting or losing data is utterly unacceptable in any environment.
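The multiplicative point above can be made concrete with a tiny sketch. The component names and reliability figures below are purely illustrative assumptions, not measurements of any real system; the point is only that per-component reliabilities multiply, so the whole chain is worse than its weakest link.

```python
# Illustrative only: end-to-end reliability is the product of the
# per-component reliabilities in the chain. Even with a disk at
# "sixteen nines," one mediocre component drags the total down.
components = {
    "disk":         0.9999999999999999,  # ~1 bit error in 10^16 (illustrative)
    "network":      0.99999,
    "sync_service": 0.9999,              # the weak link in this example
    "client_db":    0.99999,
}

total = 1.0
for name, reliability in components.items():
    total *= reliability

print(f"end-to-end reliability: {total:.6f}")
# The chain ends up close to the weakest component's figure,
# no matter how good the other parts are.
```

Running this shows the chain landing near 0.9999, essentially the sync service's figure, which is why a 99.99%-reliable sync layer dominates everything else.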
Still waiting for that evidence on what percentage of data that is lost because of CoreData sync problems.
Note that most of the problems involve loss of data on the client device. It's still possible to wipe the device and restore the data from the server. But even that has never been quantified. Somehow, you Apple-bashers are keen to make the jump from "someone has complained about something" to "Apple is a failure because there's this massive problem out there." Where's the evidence about how often it occurs or how much data is lost?
Your answer displays a ridiculous Wagenburg (circle-the-wagons) mentality. Apple bashers? Me? Are you kidding? I have about six Macs, two iPhones, three iPads, oh, and a few old NeXTs around. PCs? Zero, except for one that's a Hackintosh.
So I'm hardly an Apple basher. But Apple isn't a religion to me. As a programmer, I've been through the exercise often enough: Apple releases APIs that are simply not ready for consumption. It's been a Jobsian problem to throw the baby out with the bathwater and get rid of functioning things because something new and superior is on the horizon, even if that new thing isn't ready for prime time.
Just look at Final Cut Pro X. Do I think it's a better paradigm than old-style video editing? Yes. Do I think it was ready for release when it was released? No.
The result is that people who need to be productive with video/film editing left the platform in droves, and now, after a few years of incremental upgrades, Apple is finally at the point of trying to relaunch the product with a major marketing campaign to win back the customers they chased away.
Apple has many strengths, but its corporate culture also has several MAJOR flaws, flaws that Apple gets away with because they are hidden in the bottom line by big successes elsewhere. The problem, however, is that eventually these flaws will hit a user community in Apple's ecosystem that isn't relatively marginal the way iOS app developers or video editing professionals are. At some point these corporate character flaws may well hit home with Facebook-using teenage girls, and then you might see a major drop in iPhone popularity.
Criticizing Apple for what they are actually doing wrong has nothing to do with being an Apple basher. Apple is not a religion; there is no infallible Apple the way there is an infallible pope (if you're a true Catholic).
iCloud is a prime example of something that was rolled out before it was ready, and where the "old" was discarded before a suitable replacement existed, stranding customers. Just because Apple can cover up such mistakes by selling millions of iPads doesn't mean it's not a mistake. And just because Apple poaches users from an ecosystem where nobody expects a decent user experience doesn't mean that those of us who have been using the products for decades, and know how the user experience should and could be, are wrong to point out what constantly gets screwed up.
Apple is too much driven by an "always new" mentality, instead of first fixing the old and introducing something new only when it's ready.
http://rms2.tumblr.com/post/46505165521/the-gathering-storm-our-travails-with-icloud-sync describes the technical issues quite well, and what's described there are fundamental problems with how the whole thing works.
As to the percentage and gravity of these incidents, that's something only Apple can say; who else has access to their server logs?
The only way to get a grip on that would be to create a test app that exercises these APIs and is specifically designed to report back to the developer the number and kind of errors encountered, by comparing a reference data set to what is actually in the app.
The way that could work is, e.g., that updates to the data set are sent in an e-mail attachment a few times per day; the app opens the attachment and tries to apply the changes. A Mac app, iPad app, etc. would all be active, and the various linked apps would regularly check what the app developer's web site says should be in their data set, then compare that against what's actually in their local data set. If enough users downloaded and used such a test app, one could start getting some real statistics.
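The comparison step of such a test app could be sketched roughly as follows. This is a hypothetical illustration, not an existing tool: the record IDs, values, and the `diff_datasets` function are all made up for the example; a real app would fetch the reference set from the developer's server and POST the resulting report back for aggregation.

```python
# Hypothetical sketch of the comparison step described above: diff the
# reference data set (what the developer's server says should exist)
# against the app's local copy, and classify the discrepancies.

def diff_datasets(reference: dict, local: dict) -> dict:
    """Compare two {record_id: value} data sets and classify errors."""
    missing   = [k for k in reference if k not in local]          # lost by sync
    extra     = [k for k in local if k not in reference]          # spurious records
    corrupted = [k for k in reference
                 if k in local and local[k] != reference[k]]      # altered values
    return {"missing": missing, "extra": extra, "corrupted": corrupted}

# Example: the server says three records should exist; the local copy
# lost one and silently altered another.
reference = {"r1": "alpha", "r2": "beta", "r3": "gamma"}
local     = {"r1": "alpha", "r3": "GAMMA-mangled"}

report = diff_datasets(reference, local)
print(report)
# {'missing': ['r2'], 'extra': [], 'corrupted': ['r3']}
```

Aggregating such reports across many devices and users is what would turn scattered anecdotes into an actual error rate.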