Last Active
  • New iOS 15.2 beta includes Messages feature that detects nudity sent to kids

    auxio said:
    JaiOh81 said:
    I’m legitimately curious, how can iMessage “detect” sexually explicit photos being sent to or from a phone.

    Does anyone know how this is being done on device?
    A simplified explanation is that a "fingerprint" of the image (a unique number which is calculated using the pixels in the image) is created and sent to Apple.  This fingerprint is then compared with fingerprints of known sexually explicit images to see how similar it is.  It won't be done purely on device, but only when an image is sent/received via iMessage.  The fingerprint will be calculated on device, but the comparison will be done on Apple's iMessage servers most likely.
    You're describing the CSAM scanning, which is entirely separate from and has nothing to do with this feature. 

    This is content analysis, just like Photos has been doing for years now. When it's analysing your photos for content, so that you can search for "cat", it's also analysing for nudity. 
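To illustrate the "fingerprint" idea described above, here is a minimal sketch of a perceptual image hash (a simple average hash in plain Python). This is not Apple's actual algorithm (NeuralHash or otherwise), just an illustration of the general principle that visually similar images produce similar fingerprints that can be compared by bit distance:

```python
# Hypothetical sketch of perceptual image "fingerprinting": an average hash
# (aHash) over a tiny grayscale image. Illustrative only -- NOT Apple's
# actual hashing algorithm.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(a, b):
    """Number of differing bits; a small distance means similar images."""
    return sum(x != y for x, y in zip(a, b))

# Two nearly identical 4x4 "images" (brightness values 0-255):
img1 = [[200, 200, 10, 10], [200, 200, 10, 10],
        [10, 10, 200, 200], [10, 10, 200, 200]]
img2 = [[198, 201, 12, 9], [199, 202, 11, 10],
        [9, 12, 198, 201], [11, 10, 202, 199]]

h1, h2 = average_hash(img1), average_hash(img2)
print(hamming_distance(h1, h2))  # 0 -- the fingerprints match
```

Note that the nudity-detection feature discussed in this thread works differently (an on-device ML classifier, not hash matching), which is exactly the distinction the reply above draws.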
  • Apple details headphone jack improvements on new MacBook Pro

    rundhvid said:
    elijahg said:
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

    —except ’s own Hi-Res Lossless in 192 kHz ߑట䭦lt;/div>
    Soooo???? You're saying they therefore shouldn't have improved it all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
    The average human can detect sound in the 20 Hz to 20 kHz range.  96kHz is way outside the range of human hearing.
    96kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution, but 192kHz is more than necessary to do the job. That's audiophiles for you.
    Neither of you is right. It's not the maximum frequency that can be produced, nor is it the encoding bit rate. It's the sample rate. Completely different and entirely unrelated to the encoding bit rate. It's the number of times per second that the audio signal is sampled; "sampled" meaning a measurement or snapshot of the signal's amplitude at that exact moment is taken (or generated, in the case of audio out).

    Now, where this does relate to human hearing's maximum frequency is the fact that humans generally can't hear above 20kHz. Sampling at double that rate means there will be no aliasing errors in the audio - where parts of the signal could essentially be "missed", because the samples fall on either side of a waveform peak. The minimum rate needed to avoid this is known as the Nyquist rate. The sound between the samples is effectively interpolated (averaged), and of course higher sample rates mean there's less averaging going on. Audiophiles claim they can hear this, but double-blind tests have shown that almost no one can actually tell the difference. And the Nyquist criterion says 44.1kHz is plenty high enough to accurately reconstruct a 20kHz signal, making higher sample rates pointless for playback.

    The bit rate is inversely related to how much of the original audio is thrown away, and how much the MP3/AAC/whatever decoder has to "guess" to reconstruct the audio.
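The aliasing point above can be shown numerically (plain Python; the frequencies are illustrative): a tone above the Nyquist frequency "folds" back down, producing exactly the same samples as a lower-frequency tone, which is why sampling must run at at least twice the highest frequency you want to capture:

```python
import math

FS = 44_100       # sample rate in Hz (CD standard)
# Nyquist frequency = FS / 2 = 22,050 Hz: highest representable frequency

def sample_tone(freq_hz, n_samples, fs=FS):
    """Sample a pure sine tone of the given frequency at rate fs."""
    return [math.sin(2 * math.pi * freq_hz * n / fs) for n in range(n_samples)]

# A 30 kHz tone is above Nyquist, so it aliases down to FS - 30 kHz = 14.1 kHz
# (with inverted phase): the sampled values are indistinguishable.
above = sample_tone(30_000, 100)
alias = [-s for s in sample_tone(FS - 30_000, 100)]

print(max(abs(a - b) for a, b in zip(above, alias)))  # ~0.0 (floating-point noise)
```

In other words, once the sample rate exceeds twice the signal's highest frequency, higher rates add no information about the signal itself.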
    Excellent written explanation 👍👍👍

    Regarding audio quality and this 192 kHz sample rate: earlier this year,  announced immediate availability of ’s music catalog in 192 kHz/24 bit Hi-Res Lossless format (although limited to a subset of the catalog at first)—at no extra cost!!

    What is mind-boggling is that ’s hardware is limited to 96 kHz—why?
    —my antique 20+ year old Denon AV receiver happily supports uncompressed multi-channel audio in 192 kHz, but neither my  TV 4K 2nd gen., nor my Mac mini M1 is able to take advantage of  Music’s reference-class format! AFAIK, there is no technical reason for this HDMI-output buzz kill 🤒
    As an audio engineer (among other things):

    There is no practical benefit whatsoever from going above 96kHz. That sampling rate can accurately reproduce any signal up to 48 kHz. 

    Only some percussion produces audio up to that range, and virtually no microphones go beyond about 20 kHz. 

    Our hearing tops out at around 16 kHz at birth, dropping dramatically from there over time. 

    In fact, unless you are talking about super high-end converters, going to extremely high sample rates is likely to introduce ADDITIONAL intermodulation distortion that can affect the high frequencies. All double-blind tests where people have been able to distinguish super-high sample rates were cases where people were actually hearing distortion. 

    In the case of a built-in off-the-shelf $1.50 (if that) logic board D/A converter, going to 192kHz is almost *certainly* going to sound worse than staying with the already-overkill 96 kHz. 

    Even ignoring the fact that absolutely NOBODY with playback equipment capable of reproducing that resolution is going to be playing it from a 3.5mm minijack output on his laptop. 

    That’s just ridiculous. 
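For scale, a back-of-envelope calculation of the raw, uncompressed PCM data rates at stake in this thread (stereo, 24-bit; a sketch, not a claim about any particular product's throughput):

```python
def pcm_bitrate(sample_rate_hz, bit_depth, channels=2):
    """Raw uncompressed PCM data rate in bits per second."""
    return sample_rate_hz * bit_depth * channels

# 96 kHz / 24-bit stereo vs 192 kHz / 24-bit stereo
print(pcm_bitrate(96_000, 24) / 1e6)   # 4.608 Mbps
print(pcm_bitrate(192_000, 24) / 1e6)  # 9.216 Mbps -- double the data
```

Doubling the sample rate doubles the data rate, while (per the posts above) the extra octave of representable frequency lies entirely outside human hearing.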
  • Apple execs excited about M1 Max MacBook Pro video editing capabilities

    approx said:
    Yes, that’s great. But it is still not possible to import video from Sony SxS cards. SxS cards are used in Sony and Arri broadcast and cinema cameras.

    We have been waiting for a year for an updated device driver for SxS cards.

    Without this updated driver, the super-fast M1 MacBook Pro is useless for our job. Ingesting footage is impossible.

    So sad!

    I hear you.  An awful lot of professional tools have not been ported to Apple Silicon yet. 

    However, in general, I’ve found that professionally used systems are best kept about a year behind the latest releases, to let the software work out its kinks.

    So, we’re still within limits, I suppose. 

    I have some plugins that didn’t see upgrades for three years before finally adding support for the very latest systems… I guess the team rotates between projects and only gets around to updating when it’s that plug-in’s turn for maintenance.
  • USB-C group hopes new logos will solve customer confusion

    ilarynx said:
    So... USB-Cs (plural) are supposed to reduce the number of charging cables and related waste... how?
    By providing common cables/connectivity that are interchangeable between numerous data protocols as well as charging? Are you serious?
    But the cables aren't common or interchangeable — that's precisely the clusterfuck situation here: does a given cable support power delivery? At what wattage? Does it support USB 2.0 speeds? 3.1? 3.1 Gen 2? Which generation of DisplayPort?

    At least we can (usually) tell if it supports Thunderbolt from the TB logo on the connector, but beyond that… 
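The complaint above is essentially that the connector alone carries no capability information. A toy model makes the point (field names and example cables are illustrative, not an official USB-IF schema; the new certification logos do at least print power and speed ratings on the cable itself):

```python
from dataclasses import dataclass

# Two cables with identical USB-C connectors can have wildly different
# capabilities -- illustrative values only.

@dataclass
class UsbCCable:
    name: str
    data_gbps: float   # maximum data rate
    max_watts: int     # maximum power delivery
    displayport: bool  # DisplayPort alt-mode support

charge_only = UsbCCable("bundled charger cable", data_gbps=0.48,
                        max_watts=60, displayport=False)
full_featured = UsbCCable("certified 40Gbps cable", data_gbps=40,
                          max_watts=240, displayport=True)

def can_drive_display(cable: UsbCCable) -> bool:
    """You cannot tell this from the connector -- only from the spec sheet."""
    return cable.displayport

print(can_drive_display(charge_only), can_drive_display(full_featured))  # False True
```

The new logos put some of these fields (wattage and data rate) on the cable itself, which is the confusion they are meant to address.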
  • iCloud Private Relay flaw leaks users' IP addresses

    scatz said:
    Isn't this service currently in beta, and as such there will be bugs found that need to be ironed out? I don't see any mention of this in the article, so I could be wrong…
    It's explicitly labeled as "beta" in the corresponding setting in the iCloud preferences and disabled by default, so…yup.