Apple_Bar

About

Username
Apple_Bar
Joined
Visits
77
Last Active
Roles
member
Points
461
Badges
1
Posts
134
  • iOS 16.2 implements 10-minute AirDrop time limit globally

    macxpress said:
    It’s just apple trying to deflect what they did in china,   to get ahead of the Curve to try to avoid criticism going forward ..  Nobody cared about airdrop feature for all of sudden “let’s limited it to 10mins” 
    My guess is you’ve missed all the articles about people being AirDrop bombed with alarming or intentionally disturbing material in small or closed spaces. 
    My guess is you missed the setting to turn AirDrop off, or to allow it for contacts only. 
    My guess is most people don't do this and it shouldn't be an issue in the first place. I love everyone's tinfoil hat theories though. 
    Even when it’s set to “Everyone,” you still have to accept whatever some random person is trying to send you. If you are not expecting anything from a nearby family member or friend, why not reject the request to receive items through AirDrop? Like they say in Spanish, “La curiosidad mató al gato” (“Curiosity killed the cat”). 
  • New Apple TV 4K with A15, HDR10+, more storage debuts

    There shouldn’t be a $129 option. Thread, Matter, and an Ethernet port are well worth the $20 upgrade. You want an Apple TV because it is the portal for your media consumption apps, but most importantly because it is your home hub. 

    As someone said above, if people just want a HomeKit hub, they are better off with the current version when it goes on sale at retailers. 

  • Apple's CSAM detection system may not be perfect, but it is inevitable

    A year has gone by and it seems people still don’t understand the issue here, including the author, who repeats “iCloud Photos” over and over. The issue with CSAM detection is not on the cloud side, because scanning content on clouds (Google, Apple, Amazon, Microsoft) is already being done. That is fine because customers willingly upload content to clouds, and whether they read them or not, the terms have explicitly said the content is being scanned for x, y, z. The issue with Apple’s approach is bringing that scanning tool onto the device itself, without even explaining whether you have the option to opt out. How will they ensure that governments will not force them to modify the scanning beyond CSAM?
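    The distinction this comment draws is about *where* the matching runs, not *what* it matches. A minimal, hypothetical sketch of hash-set matching makes the point: the same lookup code works unchanged whether it runs on a server or on a device, and against whatever hashes the list provider supplies. The blocklist contents and names here are illustrative assumptions; real systems use perceptual hashes (e.g. Apple's NeuralHash) rather than a cryptographic hash, so near-duplicate images also match.

```python
import hashlib

# Hypothetical, opaque blocklist of content hashes. The code that checks
# against it never learns WHY a hash is on the list -- which is the
# slippery-slope concern: the matcher is content-agnostic, so expanding
# the list beyond CSAM requires no change to the scanning software.
BLOCKLIST = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears on the blocklist.

    SHA-256 is a stand-in here; a perceptual hash would be used in
    practice so that resized or re-encoded copies still match.
    """
    return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST

# The same call works identically server-side or on-device:
flagged = scan(b"known-flagged-image-bytes")   # matches the list
clean = scan(b"ordinary-vacation-photo-bytes") # does not match
```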
  • Customer receives M2 MacBook Air a full day before launch

    jfdesigns said:
    That explains why I am now in month two waiting for my Mac Studio Display. 😡
    $1599 configuration? 
  • EU plans to require backdoor to encrypted messages for child protection

    davidw said:
    flydog said:
    Gaby said:
    And people complained about Apple’s system… I have to say that I am against any weakening of encryption or privacy protections, but in terms of which method is the lesser of two evils, Apple’s solution is the less intrusive. The language in this bill truly is terrifying. Considering that the police constantly complain that they don’t have the resources to deal with crimes as it is, I find it farcical that more and more legislation continues to be added. Not to downplay the significance of sexual abuse in any way, but one has to be pragmatic and decide whether the attack on privacy is justified. It seems to me police forces do less and less detection and real crime fighting and are becoming merely administrative in their roles. 
    Apple’s method is not less intrusive because Apple would be scanning all communications. The EU proposal requires a court order, and affords affected individuals the right to challenge the order.  

    The language is only “terrifying” if you rely on headlines and don’t bother to read the actual bill. 
    But Apple was only scanning unencrypted photos that were in (or going to be in) their customers’ iCloud photo libraries. There is no expectation of privacy when storing unencrypted photos on a third-party server. The big stink was that Apple was planning to do the scanning on their customers’ devices, with software installed on those devices, instead of doing it when the photos are on their iCloud servers, the way Google, Amazon, Microsoft, and Facebook do it, and the way Apple is most likely doing it now.

    Apple did have a plan to provide parents of minors a tool to help them monitor their kids’ iMessage activity, by scanning messages for images or wording with adult content, before they were sent or after they were received, then blurring them out and notifying the parents for approval. But nothing was reported to the government or to some “save the children” advocacy organization. Only the parents were notified.

    Apple was not forced to do any of this scanning by the government. It was voluntary, as it should be.

    So do you think the group behind the “Pegasus” spyware is going to get a court order when they find a way (and they will) to break the backdoor encryption key that all encrypted messages must have under this proposal?

    Plus the "court order" is only for the government to obtain the encrypted messages, in order to bring charges against the sender or receiver. The message service providers can not charge the customers for breaking the law. And the government can not, without proof that a crime has been committed. By law (under this proposal)  all message services must scan all their messages for CSAM, regardless if it's suppose to be "encrypted end to end", where there is some expectation of privacy. And then report any violators to the government. This is when  the government must get a court order to obtain those messages, unencrypted. You seem to be implying that the government needs to get a court order, in order for individuals messages to be scanned. If this proposal passes, it would make "end to end encryption" an oxymoron or false advertising, in the EU.
    1. Not “most likely”: THEY ARE scanning their iCloud servers, like any of the other providers. That’s OK because it’s their server and their rules, and you pretty much relinquish any privacy when you voluntarily decide to upload a document to any cloud. 

    2. Except it is not up to Apple to voluntarily decide to install software to scan files on billions of devices just because they PR it as a “Save the Children” movement. Today it’s CSAM (a great cause to get people on board), but tomorrow or next week it will be software to scan for anything: anti- or pro- (political, gender, race, police, religion, etc.). It’s a slippery slope.