DAalseth said: This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud.
With this new plan to scan images on end users' devices as they are being uploaded to iCloud, the images themselves can be encrypted in a way that Apple can't break. Each uploaded instance of CSAM includes a partial direction on how to find the key to decrypt the images. This is known as threshold secret sharing. They aren't exactly parts of the key, but once Apple has enough (apparently 30 in this case), they can use the directions to generate their own copy of the key. (Edited to clarify that last sentence.)
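Apple hasn't published the exact construction, but the general k-of-n idea can be sketched with Shamir's classic threshold scheme: the key is the constant term of a random polynomial, each "partial direction" is a point on that polynomial, and any 30 points pin the polynomial down. All parameters below are illustrative, not Apple's.

```python
# Minimal sketch of k-of-n threshold secret sharing (Shamir's scheme)
# over a prime field. Any `threshold` shares recover the secret; fewer
# reveal essentially nothing about it.
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus

def make_shares(secret, threshold, count):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, count + 1):
        y = sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        shares.append((x, y))
    return shares

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, threshold=30, count=100)
assert recover(shares[:30]) == 123456789   # 30 shares: key recovered
assert recover(shares[:29]) != 123456789   # 29 shares: still locked out
```

Any 30 of the shares work, in any order, which is the point: Apple doesn't learn which uploads will eventually cross the threshold.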
Today, Apple employees can poke through your photos and share ones they find interesting (presumably policy does not allow this, but they have the capability to). With the announced system in place, they would no longer be able to at a technical level.
Commercially pure titanium is roughly as strong as common structural steel at a little over half the density, which is where its strength-to-weight reputation comes from. The perception of finish issues is mostly because titanium oxide is white, less shiny, and much softer than the unoxidized titanium underneath it. The oxide layer resists corrosion well but not wear, so titanium machine parts are often coated (titanium implants are a different story; they're stored in an oxygen-free environment until implantation to improve integration with bone). For parts which aren't subject to a lot of metal-on-metal wear, a few resins have been popular for a while. I have a nice pen machined out of titanium, flame-anodized (giving it bands of vibrant colors), then resin-coated for protection. It looks really good, and doesn't show fingerprints or wear.
In contrast, aluminum oxide is far, far harder than the unoxidized aluminum under it (rubies and sapphires are aluminum oxide with various impurities), and it's slightly porous, which lets it accept dyes nicely. In aluminum, anodization is used to accelerate the growth of the oxide layer and optionally dye it.
22july2013 said: Does anyone know if Spectre and Meltdown will affect M1? https://meltdownattack.com
M1 isn't equal to ARM, but the ARM website itself says ARM may be affected: https://developer.arm.com/support/arm-security-updates
I'm particularly curious about the Rowhammer attack.
Apple should update this page to discuss M1 Macs: https://support.apple.com/en-us/HT208394
Spectre (CVE-2017-5753 and CVE-2017-5715) theoretically impacts any processor design which uses speculative execution (a technique used to speed up a single thread on a processor), though the negative security impact of speculative execution had been known for years beforehand. The exact vulnerabilities given the "Spectre" name require the ability to run code on the same core, uninterrupted, for a while to train the branch predictor, and that gets you a few bytes of target data. Then you have to start again to get a few more bytes. Bad for servers (especially servers where your adversary can buy the ability to run software, like AWS), but mostly a non-issue for personal machines.
Meltdown (CVE-2017-5754) relies on out-of-order execution (again, a technique used in modern processors to improve thread performance). When you try to read a given memory location, some processor designs copy the memory into cache before they check whether you are allowed to read the data. Exploiting this is more reliable, and takes much less time, than training the branch predictor for Spectre, but it still requires the ability to run arbitrary code on the system in question. Again, bad for servers, especially multi-tenant systems like AWS, but mostly not a problem for personal machines.
Rowhammer is a physical property of dynamic RAM. With low-privilege code, you can potentially retrieve data from adjacent RAM locations to the locations you are using, but you have no control over where the system puts you, and no way to see actual memory addresses. And if you have control over the memory layout, you already have the ability to execute privileged code. This is almost entirely a non-issue for any machine, because exploiting it effectively requires having a level of access which makes exploiting it unnecessary.
All of these are substantially overhyped. Don't run programs from sketchy sources, and they're basically non-issues.
greginprague said: zimmie said: I was really looking forward to this one, but the strike is more important, without a doubt. Oh well. We can always hope they pick it up again in a few years.
One of the really big ones is a demand that "AI" can't be used to write or rewrite, and that contractually-covered scripts can't be used to train such systems. The AMPTP rejected this one and countered with an offer for "annual meetings to discuss advancements in technology", which is just insulting.
The WGA demanded base pay increases roughly in line with inflation, the AMPTP countered with base pay increases lower than inflation (still, any increase is better than none). The WGA demanded minimum employment terms (minimum duration, minimum number of writers for shows based on episode count, etc.) and guarantees for things like rewrite pay (executive producers tend to demand a lot of free rewrites) and health insurance, which the AMPTP mostly rejected and refused to even counter.
In 2008 the studios successfully argued that streaming was new and unproven (à la Spotify), and they didn't know if they would be able to afford to pay residuals. Now that everyone sees how wildly profitable streaming media is, the WGA demanded increases in streaming residuals. The AMPTP countered with much lower increases. The WGA also demanded more information about view counts for streaming episodes and features (movies) to make sure the residuals were accurate, which the AMPTP rejected and refused to counter.
I'm 100% in favor of the writers. Studios get up to some deeply unethical nonsense to avoid paying most of the people involved in making a show.
digitol said: Caution! The Apple Watch and its measurements/monitoring are often inaccurate. This should not substitute for a doctor. In fact, it's almost criminal how horrible these devices are. Many things that matter for health aren't taken into consideration, which may hurt certain individuals with underlying health issues or other circumstances. The Apple Watch is more of a toy.
Rayz2016 said: svanstrom said: Isn't that like saying that the iPhone GPS caught a car speeding while the more advanced police radar couldn't do that when the car later on was parked…?
You can clearly see:
- The arrhythmic QRS complex.
- The missing P wave.
- The overall character of the QRS complex, and how it's the same beat-to-beat (always a qRs).
- The shape of the T wave (absolute and relative refractory periods).
- The QT interval.
HBCan said: Bayer... as in Bayer filter. An RGB pattern filter over the camera's sensor. One color filter per pixel... Red, Green, or Blue. The image colour data is captured and interpolated for the neighbouring pixels to produce a full colour image. Virtually all commercial colour sensors employ a Bayer filter solution otherwise you would require three sensors to be used... one per colour. Not easily implemented in such compact environments as a beam splitting prism would be required too. Creator of the Bayer filter.... Bryce Bayer... who worked with Eastman Kodak. Died in 2012 I believe.
Did a little research, and it turns out it's a Sony variant of the normal Bayer pattern. They turn each photosite into four separate, smaller photosites, then average their values as a way of reducing amplification noise. Thus, the "50 megapixels" is a lie. It has 50 million photosites, but they operate in clusters of four, producing one output pixel value which is still only one channel. It's the equivalent of a 12.5 megapixel sensor.
Still no idea what Octa PD is supposed to be.
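The averaging step can be sketched in a few lines (simplified: real quad-Bayer sensors bin same-color photosites under a shared color filter, and do it during analog readout rather than in software; the dimensions below are illustrative):

```python
# Minimal sketch of 2x2 photosite binning: four photosites are averaged
# into one output pixel, which reduces uncorrelated read noise (by about
# a factor of 2, i.e. sqrt(4)) at the cost of spatial resolution.
def bin_2x2(photosites):
    """photosites: 2D list (H x W) of raw values; returns an (H/2 x W/2) grid."""
    h, w = len(photosites), len(photosites[0])
    return [
        [
            (photosites[y][x] + photosites[y][x + 1]
             + photosites[y + 1][x] + photosites[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A sensor with 8000 x 6250 photosites could be marketed as "50 MP",
# but binned output is 4000 x 3125, i.e. 12.5 MP of actual pixel values.
raw = [[1, 3, 5, 7],
       [1, 3, 5, 7]]
print(bin_2x2(raw))  # [[2.0, 6.0]]
```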
Gaze tracking is critical for mass-market headset-based VR and AR. Our eyes only actually see roughly a circle ~2º across (called the fovea) with any precision. Everywhere else, we have the ability to see large patches of color, or movement, but no detail. Our brains fake all the detail we think we see outside of the fovea.
Even more interesting, when our eyes move, we go effectively blind for tens of milliseconds. This phenomenon is called saccadic masking. You can verify it experimentally in a mirror: no matter how hard you try, you will never be able to see your own eyes move, because vision shuts down while they are moving.
Taken together, these allow for something called foveated rendering, where the device tracks where the user's fovea is, and renders only that small patch with high detail. It then renders everything else much more coarsely. As long as the time to render a frame can be kept below the saccadic masking duration, rendering in this way is imperceptible to the user. This means it only needs to render around 3% of the screen, which would allow even a phone GPU to render VR/AR meeting today's standards for angular resolution.
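As a rough sanity check on that percentage (the foveal patch size and the peripheral downsampling factor below are assumptions for illustration, not numbers from any shipping headset):

```python
# Rough estimate of the pixel savings from foveated rendering.
# Assumed: a 7680x4320 eye buffer at 60 px/degree, a 10-degree square
# patch rendered at full resolution, and the periphery rendered at 1/8
# linear resolution, then upscaled.
full_w, full_h = 7680, 4320
px_per_deg = 60

patch_deg = 10                              # high-detail patch around the fovea
patch_px = (patch_deg * px_per_deg) ** 2    # 600 x 600 full-resolution pixels

periphery_px = (full_w // 8) * (full_h // 8)

rendered = patch_px + periphery_px
total = full_w * full_h
print(f"{rendered / total:.1%} of the full-resolution pixel count")  # 2.6%
```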
This has even been done before in a series of experiments in the 70s. Subjects were presented with a block of text, which was blurred outside where their fovea was predicted to land. They had no idea this was happening.
JP234 said: Apple tells you right in the app that the Apple Watch ECG cannot detect a heart attack.
Detect: Something is wrong.
Diagnose: This is what is wrong.
mikethemartian said: A heart attack or an arrhythmia? My understanding is that a heart attack specifically refers to a blockage and that an echocardiogram (not electrocardiogram) is specifically needed to diagnose that.
Xed said: tht said: I'm curious why Apple didn't (well, is rumored not to have) put the computing chip bits in with the battery pack too. It would have made the goggles lighter, slimmer, and thermally cooler. The only reason I can think of is that it's pretty difficult to push >5K at 120 Hz through a wire, and this would have two.
It may need a fan to cool all the chips, and to cool your face.
Our field of vision is about 165º per eye horizontally (with ~120º binocular overlap) and 130º vertically. 20/20 (6/6) visual acuity is defined as the ability to see the separation of two lines placed one arcminute apart. There are 60 arcminutes per degree, so that comes out to 9,900 horizontal pixels by 7,800 pixels per eye just to keep up with 20/20 vision. Visual acuities up to 1.5x sharper are pretty common.
Let's target 20/20 (6/6) vision and a 4320p (7680x4320, erroneously marketed as "8K") screen per eye. That would be able to cover 7680/60 => 128º horizontally by 4320/60 => 72º vertically, which excludes basically all of your peripheral vision. Both of my eyes are sharper than 20/20. My stronger eye is 20/12 (6/3.8), roughly 1.6x sharper than 20/20. To keep up with that over the same area, I would need a 12288x6912 display.
Targeting 20/40 (6/12) and a 5K (5120x2880) screen per eye yields 171º by 96º. This would cover peripheral vision passably (though still not well), but the sharpness would be on the level of the pre-Retina iPhone screens.
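The arithmetic in the last three paragraphs can be checked in a few lines (acuity is expressed as a multiplier relative to 20/20, the convention used above; the post rounds 20/12 to a 1.6x factor):

```python
# Pixels needed: field (degrees) x 60 arcmin/degree x acuity factor.
# 20/20 vision resolves 1 arcminute per pixel (factor 1.0); 20/40 is half
# as sharp (factor 0.5); 20/12 is treated as 1.6x sharper here.
def pixels_needed(field_deg, acuity=1.0):
    return field_deg * 60 * acuity

def field_covered(pixels, acuity=1.0):
    return pixels / (60 * acuity)

# Full 165 x 130 degree field per eye at 20/20:
print(pixels_needed(165), pixels_needed(130))           # 9900.0 7800.0

# Field covered by one "8K" (7680x4320) panel per eye at 20/20:
print(field_covered(7680), field_covered(4320))         # 128.0 72.0

# Matching 20/12 (~1.6x factor) over that same 128 x 72 degree field:
print(pixels_needed(128, 1.6), pixels_needed(72, 1.6))  # 12288.0 6912.0

# Field covered by a 5K (5120x2880) panel per eye at 20/40 (0.5x factor):
print(round(field_covered(5120, 0.5)), field_covered(2880, 0.5))  # 171 96.0
```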