  • San Francisco doctor charged with possessing child pornography in iCloud

    DAalseth said:
    This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 
    They can ... because images in iCloud aren't encrypted today. Apple's servers have the ability to see them.

    With this new plan to scan images on end users' devices as they are being uploaded to iCloud, the images themselves can be encrypted in a way that Apple can't break. Each uploaded instance of CSAM includes a fragment of the information needed to derive the decryption key. This is known as threshold secret sharing: the fragments aren't literally pieces of the key, but once Apple has enough of them (apparently 30 in this case), it can combine them to reconstruct its own copy of the key.
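The idea can be sketched with Shamir's classic threshold scheme (a hypothetical stand-in; Apple's actual construction differs in detail). Any single share reveals nothing about the secret, but any 30 of them reconstruct it exactly:

```python
# Sketch of threshold secret sharing (Shamir's scheme) over a prime field.
# Illustrative only; Apple's real system uses a different construction.
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is mod PRIME

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with constant term = secret
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def eval_poly(x):
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, eval_poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=30, count=100)
assert recover(shares[:30]) == 123456789    # 30 shares suffice
assert recover(shares[40:70]) == 123456789  # any 30 work, not a specific 30
```

With only 29 shares, interpolation yields a uniformly random-looking wrong value, which is why the server learns nothing until the threshold is crossed.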

    Today, Apple employees can poke through your photos and share ones they find interesting (presumably policy does not allow this, but they have the capability to). With the announced system in place, they would no longer be able to at a technical level.
  • 'iPhone 14 Pro' may come with titanium alloy frame or enclosure in 2022

    Commercially pure titanium is about as strong as mild steel at a little more than half the weight. The perception of finish issues is mostly because titanium oxide is white, less shiny, and much softer than the unoxidized titanium underneath it. Since the oxide layer isn't protective, titanium machine parts are often coated to prevent oxidation (titanium implants are a different story; they're stored in an oxygen-free environment until implantation to improve integration with bone). For parts which aren't subject to a lot of metal-on-metal wear, a few resins have been popular for a while. I have a nice pen machined out of titanium, flame-anodized (giving it bands of vibrant colors), then resin-coated for protection. It looks really good, and doesn't show fingerprints or wear.

    In contrast, aluminum oxide is far, far harder than the unoxidized aluminum under it (rubies and sapphires are aluminum oxide with various impurities), and it's slightly porous, which lets it accept dyes nicely. In aluminum, anodization is used to accelerate the growth of the oxide layer and optionally dye it.
  • What the M1 and Apple Silicon mean for Mac security

    Does anyone know if Spectre and Meltdown will affect M1? https://meltdownattack.com

    M1 isn't equal to ARM, but the ARM website itself says ARM may be affected: https://developer.arm.com/support/arm-security-updates

    I'm particularly curious about the Rowhammer attack.

    Apple should update this page to discuss M1 Macs: https://support.apple.com/en-us/HT208394
    Ignoring a big chunk of the post which has already been shown to be bogus.

    Spectre (CVE-2017-5753 and CVE-2017-5715) theoretically impacts any processor design which uses speculative execution (a technique used to speed up a single thread on a processor), but the negative security impact of speculative execution had been known for years beforehand. The exact vulnerabilities given the "Spectre" name require the ability to run code on the same core uninterrupted for a while to train the branch prediction, and that gets you a few bytes of target data. Then you have to start again to get a few more bytes. Bad for servers (especially servers where your adversary can buy the ability to run software, like AWS), but mostly a non-issue for personal machines.

    Meltdown (CVE-2017-5754) involves relying on out-of-order execution (again, a technique used on modern processors to improve thread performance). When you try to read a given memory location, some processor designs sometimes copy the memory into cache before they check whether you are allowed to read the data. This is more reliable, and takes much less time than training the branch predictor for Spectre, but it still requires the ability to already run arbitrary code on the system in question. Again, bad for servers, especially multi-tenant systems like AWS, but mostly not a problem for personal machines.

    Rowhammer is a physical property of dynamic RAM: rapidly accessing one row can flip bits in physically adjacent rows. With low-privilege code, you can potentially corrupt (or, in related variants, infer) data in rows adjacent to the ones you are using, but you have no control over where the system places your memory, and no way to see actual physical addresses. And if you have control over the physical memory layout, you already have the ability to execute privileged code. This is almost entirely a non-issue for any machine, because exploiting it effectively requires having a level of access which makes exploiting it unnecessary.

    All of these are substantially overhyped. Don't run programs from sketchy sources, and they're basically non-issues.
  • 'X-ray teardown' of iPad Pro Magic Keyboard illustrates complex engineering

    The big metal blocks to the sides of the trackpad aren't reinforcement, they're weights. Just metal slugs to affect the balance so the much-heavier iPad mounted on the top doesn't make the whole thing tip over.
  • Apple TV+ production of 'Metropolis' has shut down permanently

    zimmie said:
    I was really looking forward to this one, but the strike is more important, without a doubt. Oh well. We can always hope they pick it up again in a few years.
    Streaming services were already a thing in 2008, during the last writers' strike and the contract that followed. Can you give those of us not in the industry a synopsis of what writers think needs to change, and why?
    Ahead of the vote on whether to strike, WGA leadership shared internally a list of the demands and what the AMPTP (Association of Motion Picture and Television Producers, representing the studios) had responded to each. Adam Conover posted the list on Twitter. Basically, the WGA's demands are about fixing flaws in the old contract which the studios have been exploiting (see: Hollywood accounting), and improving the sustainability of writing for video as a profession.

    One of the really big ones is a demand that "AI" can't be used to write or rewrite, and that contractually-covered scripts can't be used to train such systems. The AMPTP rejected this one and countered with an offer for "annual meetings to discuss advancements in technology", which is just insulting.

    The WGA demanded base pay increases roughly in line with inflation, the AMPTP countered with base pay increases lower than inflation (still, any increase is better than none). The WGA demanded minimum employment terms (minimum duration, minimum number of writers for shows based on episode count, etc.) and guarantees for things like rewrite pay (executive producers tend to demand a lot of free rewrites) and health insurance, which the AMPTP mostly rejected and refused to even counter.

    In 2008 the studios successfully argued that streaming was new and unproven (à la Spotify), and they didn't know if they would be able to afford to pay residuals. Now that everyone sees how wildly profitable streaming media is, the WGA demanded increases in streaming residuals. The AMPTP countered with much lower increases. The WGA also demanded more information about view counts for streaming episodes and features (movies) to make sure the residuals were accurate, which the AMPTP rejected and refused to counter.

    I'm 100% in favor of the writers. Studios get up to some deeply unethical nonsense to avoid paying most of the people involved in making a show.
  • Apple Watch ECG detects heart condition in German woman

    digitol said:
    Caution! Apple Watch and its measurements/monitoring are often inaccurate. This should not substitute for a doctor. In fact, it's almost criminal how horrible these devices are. There are many things to account for that aren't taken into consideration for health, which may hurt certain individuals with underlying health issues or other circumstances. Apple Watch is more of a toy. 
    This is a segment of a pathological ECG recorded using an Apple Watch Series 4:
    You can clearly see
    • The arrhythmic QRS complexes.
    • The missing P waves.
    • The overall character of the QRS complex, and how it's the same beat-to-beat (always a qRs).
    • The shape of the T wave, including the absolute and relative refractory periods.
    • The QT interval.
    This is diagnostic-quality, without a doubt. I particularly like how you can tell the patient's heart was expecting to have longer for the T at 13.16 seconds, but it was abruptly shortened for the qRs at 13.32 seconds. The Q to T-peak interval on the subsequent beats is noticeably shorter.

    Rayz2016 said:
    svanstrom said:
    Isn’t that like saying that the iPhone GPS caught a car speeding while the more advanced police radar couldn’t, because by the time it measured, the car was parked…?
    Yes, I think “seemingly missed” might not be how I would’ve phrased it.  Sometimes you have to monitor someone over a period of time before the condition presents itself. The point being made here is that doctors were willing to base a course of action on the info they’re getting from an Apple Watch, when they had no data from their own equipment. That is quite significant. 
    Exactly. A lot of heart issues only present relatively briefly. The situation above disappeared after about an hour, which is not generally enough time to get into an emergency room (without also being grievously injured, anyway), let alone to a cardiologist. The options are either monitoring over a really long period of time (certainly longer than anyone would want to be hooked up to a 12-lead), or using a personal ECG. A lot of doctors definitely realize this, and realize any hard data showing an issue is better than no data.
  • Google launches Pixel 6, Pixel 6 Pro with Tensor processor

    HBCan said:
    Bayer... as in Bayer filter.  An RGB pattern filter over the camera's sensor.  One color filter per pixel... Red, Green, or Blue.  The image colour data is captured and interpolated for the neighbouring pixels to produce a full colour image.  Virtually all commercial colour sensors employ a Bayer filter solution otherwise you would require three sensors to be used... one per colour.  Not easily implemented in such compact environments as a beam splitting prism would be required too.  Creator of the Bayer filter.... Bryce Bayer... who worked with Eastman Kodak.  Died in 2012 I believe. 
    Sure, but the Bayer pattern is naturally a tiled series of squares with two green, one red, one blue photosite per four pixels. Lines up nicely with Pentile display subpixel arrangements. So what in the world is "Quad Bayer"?

    Did a little research, and it turns out it's a Sony variant of the normal Bayer pattern. They turn each photosite into four separate, smaller photosites, then average their values as a way of reducing amplification noise. Thus, the "50 megapixels" is a lie. It has 50 million photosites, but they operate in clusters of four, producing one output pixel value which is still only one channel. It's the equivalent of a 12.5 megapixel sensor.
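That 4-to-1 binning can be sketched in a few lines (a simplified model that ignores the color mosaic and the rest of the sensor pipeline): averaging each 2x2 cluster of photosites cuts the pixel count by 4x and reduces noise.

```python
# Sketch of Quad Bayer binning: each 2x2 cluster of same-color photosites
# is averaged into one output pixel value. Scaled up, a "50 MP" sensor of
# this kind yields 12.5 MP of actual pixel values.
import numpy as np

def quad_bayer_bin(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of photosites into one pixel value."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Simulated raw photosite values with Gaussian read noise
raw = np.random.default_rng(0).normal(100.0, 5.0, size=(800, 1000))
binned = quad_bayer_bin(raw)
print(raw.size, "->", binned.size)  # 800000 -> 200000, i.e. 4x fewer pixels
```

Averaging four noisy samples roughly halves the per-pixel noise, which is the whole point of the cluster layout in low light.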

    Still no idea what Octa PD is supposed to be.
  • Apple working on gaze tracking to make AR & VR faster & more accurate

    Gaze tracking is critical for mass-market headset-based VR and AR. Our eyes only actually see roughly a circle ~2° across (the field covered by the fovea) with any precision. Everywhere else, we have the ability to see large patches of color, or movement, but no detail. Our brains fake all the detail we think we see outside of the fovea.

    Even more interesting, when our eyes move, vision is suppressed for tens of milliseconds. This phenomenon is called saccadic masking. You can verify it experimentally in a mirror: no matter how hard you try, you will never be able to see your own eyes move, because vision shuts down while they are moving.

    Taken together, these allow for something called foveated rendering, where the device tracks where the user's fovea is, and renders only that small patch with high detail. It then renders everything else much more coarsely. As long as the time to render a frame can be kept below the saccadic masking duration, rendering in this way is imperceptible to the user. This means it only needs to render around 3% of the screen, which would allow even a phone GPU to render VR/AR meeting today's standards for angular resolution.

    This has even been done before in a series of experiments in the 70s. Subjects were presented with a block of text, which was blurred outside where their fovea was predicted to land. They had no idea this was happening.
  • Apple Watch ECG could be a good early heart attack detection system

    JP234 said:
    Apple tells you that the Apple Watch ECG cannot detect a heart attack right on the app.
    No, they tell you they can't diagnose a heart attack. You can detect most heart attacks with a single-lead ECG, but you can't tell what specifically is blocked, and other things can cause the same changes in lead I which a heart attack causes. It's enough to tell something is wrong, but not exactly what.

    Detect: Something is wrong.
    Diagnose: This is what is wrong.
    A heart attack or an arrhythmia? My understanding is that a heart attack specifically refers to a blockage, and that an echocardiogram (not electrocardiogram) is specifically needed to diagnose that.
    An electrocardiogram can be used to diagnose most infarctions. Most walls of the heart are fed almost exclusively by one artery each, so when that artery is blocked, you get different results in V1 through V6 which show that wall not contracting properly. That's enough to diagnose a blockage in a particular cardiac artery. Cardiac echos are more often used to diagnose blood issues like physical flaws in the septa or valves, but they are sometimes used to confirm infarction.
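The lead-to-wall reasoning above can be sketched as a lookup table. These are the textbook associations only; coronary anatomy varies patient to patient, and the lead groupings here are the standard simplified ones:

```python
# Illustrative lookup: 12-lead ECG territories and the coronary artery
# that usually supplies each wall. Textbook simplification, not a
# diagnostic tool; real localization weighs reciprocal changes too.
INFARCT_TERRITORIES = {
    ("II", "III", "aVF"): ("inferior wall", "right coronary artery (RCA)"),
    ("V1", "V2", "V3", "V4"): ("anterior wall / septum", "left anterior descending (LAD)"),
    ("I", "aVL", "V5", "V6"): ("lateral wall", "left circumflex (LCx)"),
}

def localize(st_elevation_leads: set) -> list:
    """Return (wall, artery) pairs whose lead group overlaps the finding."""
    return [territory for leads, territory in INFARCT_TERRITORIES.items()
            if set(leads) & st_elevation_leads]

print(localize({"II", "III", "aVF"}))  # inferior MI -> RCA territory
```

A single-lead watch ECG only sees something close to lead I, which is why it can flag that something is wrong without localizing which artery is blocked.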
  • Apple's MR headset to use magnetically-attached tethered battery

    Xed said:
    tht said:
    I'm curious why Apple didn't, well, is rumored not to have, put the computing chip bits in with the battery pack too. It would have made the goggles lighter, slimmer, thermally cooler. The only reason I can think of is that it's pretty difficult to push >5K at 120 Hz through a wire, and this would need two such links.

    It may have to have a fan to cool all the chips and to cool your face.
    Why does it need 5K (14.7M pixels) if the screen is an inch from your eye?
    That's when you need pixel density the most.

    Our field of vision is about 165° per eye horizontally (with ~120° binocular overlap) and 130° vertically. 20/20 (6/6) visual acuity is defined as the ability to see the separation of two lines placed one arcminute apart. There are 60 arcminutes per degree, so that comes out to 9,900 horizontal pixels by 7,800 vertical pixels per eye just to keep up with 20/20 vision. Visual acuities up to 1.5x sharper are pretty common.

    Let's target 20/20 (6/6) vision and a 4320p (7680x4320, erroneously marketed as "8K") screen per eye. That would be able to cover 7680/60 => 128° horizontally by 4320/60 => 72° vertically, which excludes basically all of your peripheral vision. Both of my eyes are sharper than 20/20. My stronger eye is 20/12 (6/3.8), or 1.6x sharper than 20/20. To keep up with that over the same area, I would need a 12288x6912 display.

    Targeting 20/40 (6/12) and a 5K (5120x2880) screen per eye yields 171° by 96°. This would cover peripheral vision passably (though still not well), but the sharpness would be on the level of the pre-Retina iPhone screens.
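The arithmetic in the last few paragraphs reduces to a single pixels-per-degree helper: 20/20 resolves one arcminute, i.e. 60 px/°, and other acuities scale by the Snellen ratio.

```python
# Pixels-per-degree needed for a given Snellen acuity, and the field of
# view a given panel can cover at that density. Matches the figures above.
def px_per_degree(snellen_denominator: int) -> float:
    """20/20 -> 60 px/deg; 20/40 -> 30 px/deg; 20/12 -> 100 px/deg."""
    return 60 * 20 / snellen_denominator

def coverage_deg(px_w: int, px_h: int, snellen_denominator: int):
    """Degrees of field of view a px_w x px_h panel covers at that acuity."""
    ppd = px_per_degree(snellen_denominator)
    return px_w / ppd, px_h / ppd

print(coverage_deg(7680, 4320, 20))  # "8K" panel at 20/20 -> (128.0, 72.0)
print(coverage_deg(5120, 2880, 40))  # 5K panel at 20/40 -> (~170.7, 96.0)
# Full per-eye field (165 x 130 deg) at 20/20:
print(165 * px_per_degree(20), "x", 130 * px_per_degree(20))  # 9900 x 7800
```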