davidw
About
- Username: davidw
- Joined
- Visits: 187
- Last Active
- Roles: member
- Points: 4,776
- Badges: 1
- Posts: 2,204
Reactions
-
Apple's iMessage gets a reprieve from EU digital gatekeeper law
Alex1N said: How come Google Play gets a free kick? Conspicuous by its absence from the list in the article.
Lots of alternatives to iMessage - Signal, Telegram, Discord, LINE…
Because Android allows third-party app stores, and there are already over a dozen of them on Android. Android also allows sideloading. About the only issue the EU has with the Apple App Store is that it is the only way to get apps installed on iOS. The EU doesn't count jailbreaking, or paying for a developer account (and learning Xcode), as other ways to install apps. Even the 15/30% commission is not an issue for the EU.
-
Three Apple executives to serve as key witnesses at Google's antitrust bench trial
From what I've read, and it's still only rumored since it's not officially verified by either Google or Apple, Google does not "pay" Apple any negotiated lump sum to be the default search engine in Safari. Rather, in exchange for making Google the default, Apple gets a commission on the ad revenue Google makes from iOS users surfing the web with Chrome and Safari (and maybe from other sources, like Google Maps on iOS). This commission is what amounts to the rumored $15B Apple gets annually for having Google as the default search engine in Safari.
This is why Google is not losing any money by "paying" Apple $15B to be the default search engine: it's just a percentage of the money Google is already making on iOS, which accounts for 50% of its ad revenue every year. This makes more sense than Apple charging Google to be the default search engine and Google being willing to pay it, or Google "paying" Apple to keep Apple from choosing another search engine as the default. It also explains why the rumored amount Apple gets every year has increased significantly over time: Google has been making a ton more money on iOS over the years.
https://searchengineland.com/report-google-sharing-chrome-ios-search-revenue-with-apple-393296
https://www.seroundtable.com/google-apple-business-search-34933.html
-
Apple arguing iMessage isn't big enough to be EU gatekeeper service
Alex_V said: A lot of nonsense is being expressed about the EU by those who don't know and don't care. Here is basic info on the EU 'gatekeepers law':
https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-markets-act-ensuring-fair-and-open-digital-markets_en

There is nothing wrong with the concept of reining in the power of "gatekeepers" to ensure fair competition. But the way the EU went about setting the criteria that determine who is a "gatekeeper" is a bunch of BS. The criteria were set to ensure that the big 5 US techs could not escape being "gatekeepers" and that no EU companies got snared.

>"Let's focus first on the biggest problems, on the biggest bottlenecks. Let's go down the line — one, two, three, four, five — and maybe six with Alibaba," he said to the Financial Times. "But let's not start with number 7 to include a European gatekeeper just to please [US president Joe] Biden," he added.<

Someone did a survey of all the criteria the EU came up with to determine who is a "gatekeeper," and they concluded that the numbers were determined backwards. In other words, the EU already knew who it wanted to include as "gatekeepers" and used threshold numbers that would include only the 5 big US techs. The EU did not first gather evidence to determine the thresholds at which a company becomes anti-competitive, with those numbers then just happening, by coincidence, to capture all 5 big US techs and hardly any other companies.

Here's a long but very informative analysis of the DMA and just some of its flaws that the EU overlooked (or simply didn't care about), because the DMA's "gatekeeper" criteria were all about regulating the 5 big US techs. It was done in 2021, so some of what it mentions might not be true today. But the DMA's criteria for a "gatekeeper" haven't really changed much since 2021. If anything, they might have gotten narrower.
-
Child safety advocacy group launches campaign against Apple
foregoneconclusion said: davidw said: foregoneconclusion said: Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was part of the user agreement long before 2021. You have to agree to Apple's terms to use the service, so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just as it isn't true for all the other mainstream cloud services.
However, unlike the other major cloud services, Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.

The privacy violation isn't that Apple is already scanning for CSAM on its iCloud servers (and most likely has been for a while), but that it was planning to scan for CSAM on the owner's device. It doesn't matter that the scans were going to take place right before the images are transferred from the owner's device to iCloud. The software for the scanning process is on the owner's device, the scanning process uses the owner's device resources, and the scanning takes place before the images are on the iCloud server. That is not the same as Apple scanning for CSAM on its servers, the way the other cloud services do it.
Apple got in big trouble in nearly all the countries where it sold iPhones because it installed software that throttled the CPU if it detected a bad battery, or one about to go bad, to prevent the iPhone from crashing without notice. The cynical said Apple did this to purposely slow down iPhones and force customers to buy a new, faster one, but Apple was not found guilty of that. What Apple was actually found guilty of was installing software on customers' iPhones without their permission, with no way for customers to disable it if they didn't want it on their iPhones. Some customers claimed the software throttled their iPhones even though they had a good battery. Even if that was just a bug, customers should have had the choice to disable the software or not have it installed at all. All Apple had to do was give customers the choice to install, or the ability to disable, such software on iPhones that are their private property, and that would have saved Apple hundreds of millions of dollars (if not over $1B by now).
Apple also caught a lot of flak for downloading a free U2 album into all of its customers' iTunes accounts without their consent (or even knowledge). And back then, their iTunes libraries lived on their devices. The album could not be removed from the library until Apple created a special tool to do just that (it would simply download again if deleted the usual way). In the meantime, it sat there taking up drive space for the tens of millions of iTunes customers who didn't want it. Even if one could (eventually) delete the album from the library, that's not the same as Apple giving them the choice not to download it in the first place, even if, either way, the album ends up out of their iTunes library.
Now, you might not have a right to privacy against your images being scanned for CSAM once they're in iCloud, but surely you have the right not to allow Apple to install software on your iPhone to do that scanning before they reach the iCloud servers. Scanning for CSAM is strictly voluntary on Apple's part. Apple should not force its customers to host the scanning software on their privately owned devices; it should have to ask for permission before installing it, just as it should have done with the bad-battery throttling software and the free U2 album.
-
Child safety advocacy group launches campaign against Apple
foregoneconclusion said: Here's the obvious problem for all the people claiming there is some sort of major privacy issue at stake with CSAM scanning in this thread: Apple already scans files uploaded to iCloud for illegal content. That was part of the user agreement long before 2021. You have to agree to Apple's terms to use the service, so you've already agreed to have your files scanned. The idea that your files residing in iCloud are totally private isn't true at all, just as it isn't true for all the other mainstream cloud services.
However, unlike the other major cloud services, Apple has made the bizarre decision to say that CSAM scanning shouldn't be done...despite scanning for other illegal content.

The privacy violation isn't that Apple is already scanning for CSAM on its iCloud servers (and most likely has been for a while), but that it was planning to scan for CSAM on the owner's device. It doesn't matter that the scans were going to take place right before the images are transferred from the owner's device to iCloud. The software for the scanning process is on the owner's device, the scanning process uses the owner's device resources, and the scanning takes place before the images are on the iCloud server. That is not the same as Apple scanning for CSAM on its servers, the way the other cloud services do it. This child safety advocacy group is mainly bitching about Apple not implementing its plan to scan for CSAM on owners' devices.

The SCOTUS has ruled many times that electronic devices like smartphones have the same Fourth Amendment protection as a home. Thus law enforcement must get a search warrant (along with probable cause) to perform a search, unless there's an immediate danger to the public in waiting to obtain one. But just as your landlord cannot enter your rental unit to see whether you have any CSAM lying around, Apple should not be able to enter your device and look for CSAM there. It's a whole different matter when your data is already on a third-party server: the SCOTUS has ruled that there is no expectation of privacy when you allow a third party to have possession of your data.

Plus, Apple was going to install software to scan iMessages on minors' accounts for adult images and content, so that parents could use the scanning software to better monitor their kids' online messaging.
This had nothing to do with images being stored in iCloud, and the software is capable of monitoring all iMessages, not just those of minors.

Let's put it this way: say Apple let you download its CSAM-scanning app for your iPhone/iPad. Would you download and install it? Why would you give up some of your Fourth Amendment privacy rights if you think Apple is going to scan the images you upload to iCloud anyway? What difference would it make in preventing child abuse to have this app installed on your iPhone/iPad, so that Apple can scan your images before they are uploaded and stored on its iCloud servers?