elijahg

About

Username
elijahg
Joined
Visits
375
Last Active
Roles
member
Points
6,377
Badges
2
Posts
2,828
  • Power button Touch ID on the iPad Air 4 was an 'incredible feat'

    hmlongco said:
    Stretches credibility since it’s not like Apple is the first to have this capability
    I suspect that once more it's a case of Apple taking a feature and actually doing it right....

    https://beebom.com/galaxy-a7-power-button-fingerprint-scanner-face-unlock/
    Well, Sony had it in Q4 2015 (except in the US, because Apple had patented a fingerprint reader in a power button in Q2 2015) and it is as fast as the one on the then-current 6S. So really, Sony beat them to the punch by 4 years, which also means this is 4-year-old tech that Apple is touting as an "incredible feat". That said, I have no idea how secure the Sony one is comparatively. The touch sensors are essentially ultra-high-resolution CMOS-based touchscreens, with enough resolution to capture the ridges of your fingerprint in enough detail to authenticate.

    Also:
    "On the cellular iPads, the top portion of the enclosure is the antenna," Ternus explained, which meant they had to place "this incredibly sensitive Touch ID sensor right inside an incredibly sensitive antenna, and had to figure out how to make them work with each other and not be talking over each other and causing interference."

    Surely a simple solution is to turn off the RF for a few hundred milliseconds while the fingerprint snapshot is taken, so as not to affect the sensor. That wouldn't drop the cellular connection, as the protocols are easily robust enough to cope with losing several seconds of data.
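
    Roughly this kind of sequencing is what I have in mind - just a sketch, and every type and function below is made up, not a real Apple (or Sony) API:

    ```swift
    import Foundation

    // Hypothetical sketch of the time-multiplexing idea above.
    // None of these types or functions are real Apple APIs - they're stand-ins.
    protocol CellularRadio {
        func pauseTransmit()
        func resumeTransmit()
    }

    protocol FingerprintSensor {
        func capture() -> Data
    }

    func readFingerprint(radio: CellularRadio, sensor: FingerprintSensor) -> Data {
        radio.pauseTransmit()         // hold the uplink for a few hundred ms
        let scan = sensor.capture()   // take the snapshot with no RF transmitting nearby
        radio.resumeTransmit()        // modem buffering hides the brief gap
        return scan
    }
    ```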

    This all sounds like a lot of marketing bluster to me. 

  • Power button Touch ID on the iPad Air 4 was an 'incredible feat'

    Xed said:
    elijahg said:
    hmlongco said:
    Stretches credibility since it’s not like Apple is the first to have this capability
    I suspect that once more it's a case of Apple taking a feature and actually doing it right....

    https://beebom.com/galaxy-a7-power-button-fingerprint-scanner-face-unlock/
    Well, Sony had it in Q4 2015 (except in the US, because Apple had patented a fingerprint reader in a power button in Q2 2015) and it is as fast as the one on the then-current 6S. So really, Sony beat them to the punch by 4 years, which also means this is 4-year-old tech that Apple is touting as an "incredible feat". That said, I have no idea how secure the Sony one is comparatively.

    Also:
    "On the cellular iPads, the top portion of the enclosure is the antenna," Ternus explained, which meant they had to place "this incredibly sensitive Touch ID sensor right inside an incredibly sensitive antenna, and had to figure out how to make them work with each other and not be talking over each other and causing interference."

    Surely a simple solution is to turn off the RF for a few hundred milliseconds while the fingerprint snapshot is taken, so as not to affect the sensor. That wouldn't drop the cellular connection, as the protocols are easily robust enough to cope with losing several seconds of data.

    This all sounds like a lot of marketing bluster to me. 

    Which means you have no idea what it took to put a fast and secure version of Touch ID into the Sleep/Wake button.

    If we are talking about technically having fingerprint recognition, then it could've been done decades ago with those thin bars that you swipe your finger across, but note that Apple never once added that, and your comments would also imply that Touch ID was never an impressive or advanced biometric inclusion at any point simply because it came after someone else had some very basic option. Do you not see the fault in your logic? It's like saying the Sistine Chapel is on par with the botched restoration of "Ecce Homo" because they're both religious art.
    Well, I do, as I said and you conveniently edited out: 
    The touch sensors are essentially ultra-high-resolution CMOS-based touchscreens, with enough resolution to capture the ridges of your fingerprint in enough detail to authenticate.

    There is no fault in my logic, because I did not imply that TouchID was not impressive; you are using a strawman argument to disprove something I did not say.

    The original TouchID was impressive, and at the time capacitive fingerprint readers were extremely rare; no one else had them. Squeezing all that into a button for the first time was an incredible feat. An entirely new form of an existing concept (that works very well) is what made the original TouchID impressive. Reshaping the button, using preexisting tech, to match what a competitor had 4 years prior is not an "incredible feat".

    The reason Apple isn't using the bar method is that it required IR LEDs in addition to the sensor and was about 10mm thick. Sony is using capacitive sensing in their button, just like Apple. The security of it is mostly down to the resolution and the software, and there have been no reports of the Sony sensor being fooled any more easily than TouchID, which is good, but isn't infallible. Sony's in-button sensing was a first, and as such was more of an "incredible feat" than Apple's version 4 years later, though it again is just an evolution of a capacitive sensor. Apple seems to imply their sensor is "incredible" because of its apparent immunity to RF noise.

    Yes, your strawman would be like saying the Sistine Chapel is on par with the restoration of Ecce Homo. But my argument is not that. It is that the original TouchID was much more impressive because, whilst people had always built cathedrals (sensors) from mud, Apple came along and built them from stone instead. Sony then reshaped that chapel into something much more sleek, but still out of stone, and 4 years later Apple did the same and called it an "incredible feat" because they did it.

    Perhaps instead of lapping up Apple's marketing, try taking a more neutral stance to see the difference between actual "incredible feats", such as the original TouchID, and this, an evolution of already-existing tech.


  • Tom Hanks disappointed with Apple TV+ 'Greyhound' release

    mtriviso said:
    Sigh. Just open the movie theaters. There's nothing like watching a movie in a massive IMAX 3D theater. If people are frightened they might get the rona, then just stay at home. Please, just let the rest of us who are unafraid enjoy what our acting troupes have to offer in the milieu to which we have become accustomed. 
    Yeah, because watching movies at cinemas is such an essential pastime that you'll die without it. It's not all about you; it's about protecting others. Considering the US's current trajectory, it's unlikely there will be much more easing of lockdown.
  • Apple details headphone jack improvements on new MacBook Pro

    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

    —except ’s own Hi-Res Lossless in 192 kHz ߑట䭦lt;/div>
    Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
    The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
    96kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution, but 192kHz is more than necessary to do the job, but that's audiophiles for you
    Neither of you is right. It's not the maximum frequency that can be produced, nor is it the encoding bit rate. It's the sample rate. Completely different and entirely unrelated to the encoding bit rate. It's the number of times per second that the audio signal is sampled; sampled meaning a measurement or snapshot of the signal level at that exact moment is taken (or generated, in the case of audio out).

    Now where this does relate to human hearing's maximum frequency is the fact that humans generally can't hear above 20kHz. Sampling at at least double the highest frequency you want to capture - the Nyquist rate - means there will be no aliasing errors in the audio, where frequencies above half the sample rate would otherwise fold back into the audible band and parts of the audio would essentially be "missed". The sound between the samples is effectively interpolated, and of course higher sample rates mean there's less interpolation going on. Audiophiles claim they can hear this, but double-blind tests have shown that almost no one can actually tell the difference. And the Nyquist criterion says 44.1kHz is plenty high enough to accurately reconstruct a 20kHz signal, which is why ever-higher sample rates are pointless.

    The bit rate is inversely related to how much of the original audio the encoder throws away, and therefore how much the MP3/AAC/whatever decoder has to "guess" to reconstruct the audio.
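
    If you want to see the Nyquist folding in actual numbers, here's a little Swift sketch (plain arithmetic, nothing Apple-specific): a tone below half the sample rate comes back unchanged, anything above it folds back down into the audible band.

    ```swift
    import Foundation

    // Where a pure tone ends up after sampling at fs Hz: tones below fs/2 (the
    // Nyquist frequency) are preserved; tones above it fold ("alias") back down.
    func aliasedFrequency(_ f: Double, sampleRate fs: Double) -> Double {
        let folded = f.truncatingRemainder(dividingBy: fs)
        return min(folded, fs - folded)
    }

    let fs = 44_100.0
    print(aliasedFrequency(20_000, sampleRate: fs)) // 20000.0 - reproduced exactly
    print(aliasedFrequency(25_000, sampleRate: fs)) // 19100.0 - folds back into the audible band
    ```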
  • Facebook tests Face ID authentication for iOS Messenger app

    I avoid that app for many reasons.   This will just add to that list.

    We know FB has no loyalty, scruples, values or integrity.   We know they already know more about us than we know ourselves.  So, how do they plan to benefit from this?

    The irony of it is:   People are alarmed at governments ID'ing faces --- but they'll be fine with FB doing it!
    I also avoid it for many reasons. But since this is using Apple's FaceID APIs, it can't actually ID your face. Well, not through FaceID at least - it can through normal use of the camera.
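
    For anyone wondering what "using Apple's FaceID APIs" means in practice: the app asks the LocalAuthentication framework to run the check and only ever gets back a success/failure result, never the face data. A minimal sketch (the reason string is my own placeholder):

    ```swift
    import LocalAuthentication

    // Minimal sketch of a Face ID (or Touch ID) check. The app never receives
    // any face data - just a yes/no from the system.
    let context = LAContext()
    var authError: NSError?

    if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &authError) {
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock your chats") { success, error in
            if success {
                // unlock the UI
            } else {
                // fall back to passcode, or stay locked
            }
        }
    }
    ```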
  • Netgear has a new $1,500 Wi-Fi 6e mesh router

    ITGUYINSD said:
    elijahg said:
    Not sure why anyone would pay such an absurd amount for this from what is a pretty poorly regarded networking company - when you could get a prosumer solution from a professional networking company like Cisco (Meraki) or Ubiquiti (Unifi) for 1/4 the price; companies with actual networking expertise who write their own network stack, with excellent UIs, reliability, security and 5+ years of updates. Or you could get an overpriced Netgear box with a generic version of Linux underneath and a crap GUI that doesn't work properly on top, with 6 months of software support, for $1500.
    Because most people don't have the stack of networking certifications needed to properly install and configure Meraki and Unifi equipment.  They are NOT user-friendly and most of the advanced options are only accessible via command line.  No thanks.

    I had Ubiquiti UAP-AC-PROs (multiple) in my house for years.  Junk.  Nothing but trouble (slow, would stop passing traffic, poor range).  Replaced with TP-Link Deco and never looked back.  Deco (in the words of Apple) "just works".
    Wow, if you think you need a "stack of networking certifications" to use Meraki and Unifi equipment, you need to step back and get someone in who knows how to use a simple iOS app. Running the APs in standalone mode only needs the (excellent) Unifi app. It is as simple as setting up an AirPort Base Station. And no, the advanced options are not "only accessible via command line". That's complete FUD.

    I've installed many, many UAP-AC-Pros alongside nanoHDs, flexHDs, AC-Meshes, AC-IWs and AC-Lites, with associated switches and routers. Never a problem with them, apart from the odd FW update causing oddities. But I have come across situations where entirely clueless people have screwed up standalone installations by doing things like having a different SSID or a different WPA key for each base station, which understandably doesn't work too well. Maybe you did that, since many people get on just fine with 1000+ clients in schools with AC-Pros, sometimes upwards of 50 clients per AP.

    I have replaced many TP-Link systems. They're in general trash, and there are botnets that use TP-Link routers. Terrible security, probably intentionally terrible so China can - as others have said - use them as they see fit.
  • iPhone 15 has new battery health controls to prevent charging past 80%

    Actually... most wear to lithium batteries happens in the topping charge (80%+). Preventing the battery from exceeding 80% does genuinely increase battery lifetime quite significantly. If a user doesn't discharge their battery to <20% by the end of the day, there is no point in charging it to 100% and causing more wear; better to cycle in the lower 0-80% band than the upper 20-100% band. This is why Apple limits the charge to 80% for as much of the night as possible - a relatively small proportion of time, but the battery improvement is enough that they deem it worth it.
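
    As a toy illustration of that trade-off - my own sketch of the decision, not Apple's actual logic:

    ```swift
    // Toy model of the trade-off - my own sketch, not Apple's algorithm.
    // If a typical day's drain fits inside the 0-80% band with a reserve above
    // the 20% floor, stop charging at 80% and skip the high-wear topping charge.
    func chargeTarget(typicalDailyDrain: Double, reserve: Double = 0.2) -> Double {
        (typicalDailyDrain + reserve) <= 0.8 ? 0.8 : 1.0
    }

    print(chargeTarget(typicalDailyDrain: 0.45)) // 0.8 - 80% covers the day with margin to spare
    print(chargeTarget(typicalDailyDrain: 0.75)) // 1.0 - this user actually needs the full charge
    ```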

    I hope this comes to older iPhones too.
  • Why AAA games promoted by Apple flop in the App Store

    The cost is certainly a factor, as is the incompatibility with anything else. I can get Stray on the Mac App Store for $30, or I can wait for a Steam sale and get it for $20. The Mac App Store doesn't have sales that I am aware of. And if I get the game on Steam, I can play it on Mac or Windows. Why would I ever buy it on the MAS?
  • Apple & EU slammed for dangerous child abuse imagery scanning plans

    "Confusion" was not the issue. Apple of course would say it is because to do otherwise would be an admission that the feature was toxic and entirely contradictory to their public privacy stance. Privacy organisations and governments weren't "confused", they could foresee the potential privacy consequences. Apple knows full well the pushback was due to their public "what's happens on your phone stays on your phone" stance, the polar opposite to scanning phones for CSAM - and the potential for further encroachment on privacy.
  • Apple's generative AI push includes Xcode tools, auto-summarizing features in apps

    Auto-summary is nothing new and it does not use AI. It has been around since OS X 10.4 or so, afaik. In fact, Malcolm, you did an article on it in 2018: https://appleinsider.com/articles/18/05/07/how-to-shorten-long-text-documents-in-macos-with-the-summarize-service

