Rayz2016
About
- Banned
- Username: Rayz2016
- Joined:
- Visits: 457
- Last Active:
- Roles: member
- Points: 18,422
- Badges: 2
- Posts: 6,957
Reactions
Apple says it wasn't told about UK contact tracing app issues, plans
bonobob said:
mr lizard said:
Surprise surprise, once again the UK government tries to go it alone and messes up. Then it claims it’s cooperating with others and those others claim otherwise. Just like how they claimed they’ve been consulting with the devolved nations during Brexit (they haven’t) and how they consulted with healthcare providers before suddenly announcing dental surgeries were to reopen (they hadn’t) and how they’d agreed the timetable for schools to reopen in England with local education boards (they hadn’t). This app has been a disaster from the start. Thankfully we have no use for it in Scotland. This is England’s mess.
They’re using a website accessible by NHS staff, rather than an app. Don’t know much about it, but it looks like they’re not going to rely on the public to download an app. I don’t know if it’ll work, but England’s problem is not a mess they’re going to stick their hand in.
Apple pressures email app 'Hey' to integrate in-app purchase option [u]
elijahg said:
pujones1 said:
elijahg said:
And this kind of thing is why the EU is investigating Apple for antitrust violations.
Apple is the big bad monopoly nowadays. They built this thing from the ground up. It’s like some agency forcing Walmart to put Kroger stuff in their food department for free. Walmart should have the right to set the terms of its store, and if Kroger doesn’t like it they should kick rocks.
It’s the sense of entitlement that kills me with this guy.
Er … no, because no one bought the iPad because they could get it from Walmart; they chose the iPad and then decided where to buy it. However, if buying a product on the iPad several months later happened only because of the costly infrastructure that Walmart put in place, AND that purchase also incurred a cost to Walmart, then yes, Apple should pay Walmart.
But it wasn't, it doesn't, so they don't.
The reason developers make money on the Apple platform is partly due to the fact that Apple has spent, and still spends, billions (yes, billions) to make the environment secure, private and easy for its customers to use. And the reason Apple can afford to do that is because it has a simple rule: if you make money off the platform, you pay for that.
If this company really believes that every penny of the subscription earned is down to them (and I am amazed that anyone would pay $99 a year for something they can do with built-in spam handling and email filters), then they should prove it: remove their app from the store and host it only on Google Play and the Microsoft Store. If what they say is true, they won't lose a penny in subscriptions, because the money they're making has absolutely nothing to do with the work and money Apple puts in to make the platform popular, safe and attractive to the sort of customers who will actually buy a subscription.
And it's odd that their argument is only directed at Apple, even though Google has the exact same fees: 30% of the subscription revenue in the first year, dropping to 15% in the second and subsequent years. I imagine this is because Google is happy to let them circumvent the rules, because it's no real loss for them: they make money by sifting personal data. Apple does not.
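As a rough illustration of that fee schedule (30% of subscription revenue in year one, 15% in every year after), here's a minimal Python sketch of what a developer keeps on the $99-a-year subscription mentioned above. The function and figures are illustrative only; this is not Apple's or Google's actual billing logic.

```python
def net_proceeds(price: float, years: int) -> float:
    """Cumulative developer take on a recurring subscription,
    assuming a 30% store commission in year one and 15% in
    every subsequent year (hypothetical helper, for illustration)."""
    total = 0.0
    for year in range(1, years + 1):
        commission = 0.30 if year == 1 else 0.15  # 30% year one, 15% after
        total += price * (1 - commission)
    return total

# On a $99/year subscription:
#   1 year  -> $69.30 kept
#   2 years -> $153.45 kept
#   3 years -> $237.60 kept
for years in (1, 2, 3):
    print(f"{years} year(s): ${net_proceeds(99.0, years):.2f}")
```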
Apple's shift to ARM Mac at WWDC will define a decade of computing
jharner said:
Apple has already made it difficult for data scientists by not providing Nvidia drivers supporting CUDA. The real problem is not virtualization for Windows; rather it is virtualization for Docker and, to a lesser extent, for VirtualBox. Docker containers herald the wave of the future for STEM, including data science and statistics. It also will be critical for reproducible research. Now the Mac is dominant in STEM---go to any related conference to see what attendees are using. Two factors may change this---ARM Macs and Windows Subsystem for Linux. STEM increasingly is running on Linux-based open-source software using multiple software ecosystems that must work together---R and PostgreSQL, R or Python with Spark and Hadoop, etc. These integrated environments are best containerized, and reproducibility of research and collaboration of developers and scientists, along with bash and git, will make Docker and related infrastructure essential. Currently, the Mac excels in this area, but an ARM Mac may not support Docker, Singularity, Kubernetes, etc., even if it does continue to support open-source UNIX software. But now Windows runs containerized Linux. What faculty and researchers use will filter down to students. We are not talking about a 2% loss---it will be much larger for STEM-related disciplines in academia and industry. In principle, Apple could support Docker and related technologies, but will they? They could support CUDA, but they don't. Even if computing moves to the cloud, what technologies are behind this? Docker, Kubernetes, open-source software, etc. Everyone will need to develop, test and deploy their code from their local machine. This cannot be done using iPadOS, and if it cannot be done on macOS, then the Mac is done in STEM.
The first thing that needs to be remembered is that all this came from a Bloomberg report: the same people behind the "Chinese spy chips are everywhere! EVERYWHERE!" story. So before we all panic and start posting the same message to every story, it's best to wait until next week and see what happens (as Canuckstorm has already said).
Now, let's assume that this is true.
Apple has never shied away from making decisions that they know will lose them users if they believe it will expand the user base in the long run. Every time this happens there are several months of crying into the forum well, followed by adapting to the situation, or leaving the platform. It's a shame that this has to happen, but so far, Apple's decisions have kept the MacOS user base growing, even though folk keep claiming that the end is nigh. Having said that, a lot of the panic/wishful-thinking posts around here don't seem to make a lot of sense.
It 'may' or 'may not' do a lot of things; posting multiple panicked posts before anything has been announced doesn't accomplish anything, aside from making it look like only one person actually needs it.
but an ARM Mac may not support Docker, Singularity, Kubernetes, etc., even if it does continue to support open-source UNIX software.
It's the vendor's job to make sure their stuff runs on Apple's kit. The switch, if it happens, won't happen today, so they'll have time to work with Apple to iron out any problems. The UNIX underpinnings of MacOS will remain, so I don't see why stuff like Docker and Kubernetes should be much of a problem. Apple has been beefing up its cloud expertise for the past year, covering areas such as containerisation and Java virtual machine support, so I don't really see them making life hard for container software. Linux is already on ARM and so is Docker, so I'm not sure why you're panicking just yet.
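On that last point (Linux and Docker already run on ARM), here's a minimal sketch of the kind of architecture check a build or deployment script might do before choosing an image variant. The image tag scheme is hypothetical; only the standard-library platform calls are real.

```python
import platform

# Apple Silicon Macs report "arm64"; most ARM64 Linux hosts report "aarch64".
ARM_MACHINES = {"arm64", "aarch64"}

def preferred_image_tag(base: str = "myapp") -> str:
    """Pick a hypothetical container image tag matching the host architecture."""
    machine = platform.machine().lower()
    arch = "arm64" if machine in ARM_MACHINES else "amd64"
    return f"{base}:latest-{arch}"

print(f"Host reports {platform.machine()}; would pull {preferred_image_tag()}")
```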
A few other unrelated bits:
Why not AMD? The whole point of this switch exercise is to give Apple control of the hardware. They can then optimise the chipset for MacOS, which will allow them to do more Mac-specific stuff at the hardware level. This is not just about Intel hitting roadblocks, because this has been on the cards since the Intel switch. They're not going to switch to AMD because that's just moving to the same set of disadvantages: lack of control, lack of optimisations. Speed has never been the only consideration here.
What about virtual machines? No reason why they won't be supported, but without x86, Parallels and VMware are going to have to work very much harder, which brings us to:
Apple will add special instructions/special translation layers to make x86 run at native speeds. I don't think this is going to happen, no. If there's one thing we know about Apple it's that they like to hack away the cruft as fast as possible. I'd be very surprised if they started adding it back into their shiny new chipset.
Apple is very mindful of the OS/2 principles:
- Don't give developers an excuse to stop writing natively for your platform. (OS/2's near-native support for Windows.)
- Don't add old code that folk then start to rely on, so that you can't remove it later when it starts to hamper the platform. (Read up on the OS/2 Single Input Queue.)
- Don't add code that a disgruntled or cash-strapped vendor can use to beat you down or milk you dry later on. (OS/2's near-native support for Windows.)
I think the last one is the main reason why this is not going to happen. Apple is tied to Qualcomm for a couple of years under unfavourable terms until they can come up with their own comms stack. If they're going to add support for x86 then I doubt it's going into the chip. They might come up with a co-processor box that cables in so that any future problems don't impact the architecture strategy.
But the best thing to do is just wait and see. Bloomberg isn't very good at tech journalism, so meditate for one more week … and then you can panic.
Apple transition to own ARM chips in Macs rumored to start at WWDC
elijahg said:
It’ll be interesting to see whether Apple reduces the price of the Macs to correspond to the much cheaper ARM CPUs. I bet they won’t.
Wouldn't have thought so.
With Intel, Apple didn't have the R&D costs associated with building its own processor. Now that they're doing the processor themselves, this won't necessarily make things cheaper. Bear in mind that these ARM chips aren't actually ARM designs: they're custom silicon, built from the ground up, that just happens to run the ARM instruction set and has been crafted to bleed the last iota of performance out of MacOS/iOS. Apple will use every trick in the book to surpass what they had with Intel.
Apparent iOS Family Sharing bug causes apps to crash