Apple and the future of photography in Depth: part 2, iPhone X

When iPhone launched ten years ago, it took basic photos and couldn't shoot video at all. Apple has since rapidly advanced its mobile photography capabilities to the point where iPhone is now globally famous for billboard-sized artwork and has been used to shoot feature-length films for cinematic release. iOS 11 achieves a new level of image capture with Depth. Here's a look at the future of photos, focusing on new features in the upcoming iPhone X.

TrueDepth sensor, iPhone X


The previous segment of this series on depth-based photography focused on the Portrait features of iPhone 7 Plus and the Portrait Lighting features new to iPhone 8 Plus (as well as the upcoming iPhone X). This segment looks at TrueDepth sensing specific to iPhone X, as well as new camera vision and machine learning features common across iOS 11 devices.

iPhone X: selfie depth using TrueDepth sensors

At WWDC17 this summer, Apple presented iOS 11's new Depth API for working with the differential depth data captured by the dual rear cameras of iPhone 7 Plus. It also hinted at an alternative technology that could enable even more accurate depth maps than those dual cameras capture.

By then, we were well aware of the work the company had been doing with 3D structure sensors since its $345 million acquisition of PrimeSense back in 2013. It was obvious that Apple had created its Depth API to handle a range of sophisticated imaging capabilities that treat the camera not just as an aperture used to electronically expose digital film, but as a machine eye.
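
To ground that in code: here's a minimal sketch of how a third-party app opts into depth capture with iOS 11's Depth API, assuming an AVCaptureSession has already been configured elsewhere with a depth-capable (dual or TrueDepth) camera; error handling is omitted.

```swift
import AVFoundation

// A sketch of opting into depth capture with the iOS 11 Depth API.
// Assumes `photoOutput` has been added to a running AVCaptureSession
// backed by a depth-capable camera.
final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func captureWithDepth() {
        guard photoOutput.isDepthDataDeliverySupported else { return }
        photoOutput.isDepthDataDeliveryEnabled = true

        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = true
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // On supported hardware the photo carries a depth (or disparity)
        // map alongside the color image.
        if let depthData = photo.depthData {
            print("Depth map: \(depthData.depthDataMap)")
        }
    }
}
```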

However, one of the first applications of the new TrueDepth sensor is again Portrait Lighting, this time for front-facing "selfie" shots. That effectively offers the millions of iPhone 7 Plus and 8 Plus users a recognizable extension of a feature they already know from the rear dual lens camera.

While the rear dual camera calculates a depth map by comparing corresponding points across the two images its cameras capture, the TrueDepth sensor array reads a pattern of reflected invisible light, giving it a more detailed depth map to pair with the color image taken by the camera.
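
That difference is visible in the API as well: AVDepthData can hold either disparity (what the dual camera natively produces) or metric depth, and converts between them in one call. A small sketch, assuming `depthData` was obtained from a capture like the one above:

```swift
import AVFoundation

// A sketch, assuming `depthData` came from an AVCapturePhoto as above.
// The dual camera natively produces disparity (proportional to 1/distance);
// AVDepthData can convert it to metric depth.
func metricDepthMap(from depthData: AVDepthData) -> CVPixelBuffer {
    let depth = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    return depth.depthDataMap  // each pixel is a distance in meters
}
```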


iPhone X: Animoji using TrueDepth

In addition to handling depth data much like the dual lens 7/8 Plus models do for Portrait and Portrait Lighting features on still images, the TrueDepth sensors on iPhone X can also track that more detailed depth information over time, creating an animated avatar that mimics the user's facial movements. Apple delivered this in the form of new "Animoji," patterned after its existing, familiar emoji and brought to life as animated 3D models.




Apple could have created DIY cartoonish animated avatars, patterned after Snapchat's acquired Bitmoji or Nintendo's Wii Mii avatars. Instead it took a selection of the iconic emojis it created internally and turned them into 3D masks that anyone can "wear" to communicate visually in iMessage.

In iOS 11, Apple is also making the underlying TrueDepth technology available to third parties to create their own custom effects, as demonstrated by Snapchat filters that apply detailed, skin-tight effects to a user in real time. App and game developers can similarly build their own avatars synced to the user's facial expressions.
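
For developers, that tracking surfaces through ARKit's face tracking configuration. A minimal sketch, assuming an ARSCNView has been set up in a view controller; the blendShapes dictionary of expression coefficients is the raw material an Animoji-style avatar is driven by:

```swift
import ARKit

// A minimal sketch of TrueDepth face tracking via ARKit in iOS 11.
// Assumes `sceneView` is an ARSCNView created elsewhere.
final class FaceTracker: NSObject, ARSCNViewDelegate {
    func start(in sceneView: ARSCNView) {
        // Face tracking is only supported on TrueDepth hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }
        // Blend shapes report dozens of facial expressions as 0...1
        // values, updated every frame.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let smileLeft = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        print("jawOpen: \(jawOpen), smileLeft: \(smileLeft)")
    }
}
```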

It will be a little harder for Android licensees to copy Apple's Animoji, because the emoji created by Google, Samsung and others are generally terrible. Google has dumped its oddball emojis in favor of iOS-like designs for Android 8 "Oreo," but it will take a long time for the new software to trickle out.

Additionally, there is no centralized focus in Android for deploying the kind of features Apple can deploy rapidly. Hardware licensees are all trying out their own hardware and software ideas, and even Google's "do it like this" Pixel project lags far behind Apple.

Google's latest Pixel 2 is currently only attempting to copy Portrait mode from last year and Live Photos from two years ago. It doesn't even attempt Portrait Lighting yet, let alone Animoji, and lacks a real depth camera to support such capabilities.

iPhone X: Face ID using TrueDepth

In parallel, Apple is also building a facial profile of the enrolled user to support Face ID, the new alternative to Touch ID. Critics and skeptics began complaining about Face ID before even trying it, but the new system evaluates a far richer set of unique biometric data than the existing, small Touch ID sensor--which only looks at part of your fingerprint when you touch it.

Face ID is not really "using your face as your password." Your security password remains the passcode you select and change as needed. Remote attackers can't present a 3D picture of your face or a fingerprint scan to log in as you. In fact, our initial attempts to unlock iPhone X by holding it up to an enrolled presenter's face failed to unlock the device, even when standing next to the person.

"There's a distance component. It's hard to do it when somebody else is holding it," the presenter noted before taking the phone and promptly unlocking it at a natural arms' length. In the video clip below, we had present Face ID authentication in slo-mo because it was so fast once it was in the hand of the enrolled user.



Face ID (like Touch ID before it) simply provides a more secure way to conveniently skip over entering your passcode by proving to the device who you are in a way that's difficult to fake--or to coerce from another person, as we discovered in Apple's hands-on area in September. Apple has made this easy to disable, so anyone trying to exploit the biometric system on a stolen phone would quickly run out of time and chances to do so.
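
Third-party apps tap the same convenience through the LocalAuthentication framework, which never exposes the biometric data itself--only a pass/fail answer. A hedged sketch of gating an app's own content behind Face ID (or Touch ID on older hardware):

```swift
import LocalAuthentication

// A sketch of gating app content behind biometrics. The app never sees
// the face data; it only receives a pass/fail result, and the user's
// passcode remains the fallback credential.
func unlockSensitiveContent(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)  // biometry unavailable, not enrolled, or disabled
        return
    }
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your saved documents") { success, _ in
        completion(success)
    }
}
```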

Biometric ID can also be turned off entirely on iOS. However, it's important to recognize that the result of Apple's biometric ID system since iPhone 5s has not been the increase in vulnerability or stolen data that some pundits once predicted. Instead, it has produced a massive decrease in stolen phones and widespread, effective protection of users' personal data--to the point where national law enforcement has raised concerns about iOS's ability to protect saved content on a phone using very effective encryption.

Despite all the sweaty handwringing, biometric ID on iOS has been implemented correctly due to the thoughtful consideration invested in its technical design and engineering.

Face ID


The same can't be said of Android, where leading licensees including Samsung and HTC first botched their own implementations of fingerprint authentication, then rushed out gimmicky face-photo recognition schemes that were ineffective and easy to exploit.

Using its unique, custom designed and calibrated TrueDepth camera system on iPhone X, Apple has far more data to evaluate to reject fakes and verify the user for convenient yet secure biometric identification. The cost of the sensor system will prevent most Android licensees from being able to adopt similar technology. Google recently noted that it expects one third of the Android phones sold this year to cost less than $100.

In dramatic contrast, Apple is expected to sell its $1,000-and-up iPhone X to more than a third--and perhaps as much as half--of its new buyers this year. There are a number of reasons for users to want to upgrade to Apple's newest iPhone, but the fact that all of them will also get the TrueDepth sensor means that it will immediately get a very large installed base in the tens of millions, enough to attract the attention of third party developers.

In addition to obtaining functional 3D sensing hardware and deploying it in their new phones, Android licensees also face another issue Apple has already solved: building an ecosystem and demonstrating real-world applications for the technology.

A report stating that Chinese vendors were struggling to source 3D sensors built by Qualcomm/Himax also noted that "it would take a longer time for smartphone vendors to establish the relevant application ecosystem, involving firmware, software and apps, needed to support the performance of 3D sensing modules than to support the function of fingerprint recognition or touch control," concluding that "this will constitute the largest barrier to the incorporation of 3D sensors into smartphones."

Android's shallow depth of platform

Other attempts to sell third party 3D camera sensors for mobile devices have made little impact on the market. Google has worked with PrimeSense depth-based imaging for years under its Project Tango, but hasn't been able to convince its price-sensitive Android licensees to adopt the necessary technology in a way that matters.

Once Apple demonstrated ARKit, Google rebranded portions of its Tango platform as "ARCore" to piggyback on Apple's publicity, but again there is no meaningful installed base of Androids capable of functionally using AR, and fewer still can work with real depth data, whether from dual cameras or any type of depth sensor.

Also, the decentralized nature of Android not only introduces problems with fragmentation and device calibration; it also trends toward super cheap devices rather than powerful hardware with specialized cameras and the high-performance local compute required to handle sophisticated AR and depth-based camera imaging.

Rather than developing powerful hardware, Google has for years promoted the idea that Android, Chrome and Pixel products could be cheap devices powered by its sophisticated cloud services. That vision is oriented around Google's desire for users' data, not around the best way to deploy advanced technology to individuals.


Google's latest Pixel 2 uses the same chip as Xiaomi's Mi 6


Apple has increasingly widened its lead in on-device sophistication and processing power, meaning that iOS devices can do more without needing a fast network connection to the cloud. Critical features like biometric authentication can be handled locally with greater security, and other processing of user data and images can be done without any fear of interception.

For years, even premium-priced Android devices remained too slow to perform Full Disk Encryption because Google pursued a slow, software-based implementation aimed at making Android work across lots of different devices, rather than designing hardware-accelerated encryption that would have required better hardware on the same level as Apple's iPhones.

That was bad for users, but Google didn't care because it wasn't making money selling quality hardware to users; instead it was focused on just building out a broad advertising platform that would not benefit from users having effective encryption of their personal content.

Further, Android's implementation of Full Disk Encryption delegated security oversight to Qualcomm, which dropped the ball by storing disk encryption keys in software. Last summer, Dan Goodin wrote for Ars Technica that this "leaves the keys vulnerable to a variety of attacks that can pull a key off a device."

If Google can't manage to get basic device encryption working on Android, what hope is there that its platform of low-end hardware and partner-delegated software engineering will bring its camera features up to speed with Apple--a company with a clear history of delivering substantial, incremental progress rather than the occasional pep rally for vanity hardware that doesn't actually sell in meaningful volumes?

Beyond deep: Vision and CoreML

On-device machine vision is now becoming practical as Apple's mobile devices pack enough computational power to discern objects, position and movement, and even to recognize faces and specific people--technology Apple had already been applying to static photographs in its Photos app, but is now running live in-camera during composition.

Depth-aware camera capture certainly isn't the only frontier in iOS imaging. In addition to the new features that require a dual camera Plus model or the new iPhone X TrueDepth camera, Apple's iOS 11 also introduces new camera intelligence that benefits previous generations of iOS devices.

Apple's new iOS 11 Vision framework provides high-performance image analysis, using computer vision techniques to recognize faces, detect facial features and identify scenes in images and video.

The Vision framework can be used along with CoreML, which applies a trained machine learning model against incoming data to recognize and correlate patterns, suggesting what the camera is looking at with a given degree of certainty. Apple noted this summer that CoreML was already six times faster than existing Androids, and that was on iPhone 7, before it released the radically enhanced A11 Bionic powering iPhone 8 and iPhone X.
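
In practice, the pairing is only a few lines of code. Here's a sketch of classifying an image, where `FlowerClassifier` stands in for any hypothetical image classification model; Xcode generates a class like this for any .mlmodel added to a project:

```swift
import Vision
import CoreML

// A sketch of pairing Vision with a CoreML model. `FlowerClassifier`
// is a hypothetical classifier used for illustration only.
func classify(_ image: CGImage) throws {
    let model = try VNCoreMLModel(for: FlowerClassifier().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first
            else { return }
        // Vision reports what the model thinks it sees, with a confidence.
        print("\(top.identifier): \(top.confidence)")
    }
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```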




Apple has already been using ML in its camera, as well as in Siri and the QuickType keyboard. Now it's opening that capability up to third party developers as well.

CoreML is built on top of iOS 11's Metal 2 and Accelerate frameworks, so its tasks are already optimized to crunch on all available processors. And paired with Vision, it already knows how to do face detection, face tracking in video, landmark finding, text detection, rectangle detection, barcode detection, object tracking and image registration.
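
Those built-in detectors need no trained model at all. A brief sketch of Vision's face and landmark detection, run against a single image:

```swift
import Vision

// A sketch of Vision's built-in face and landmark detection -- no
// custom model required, and everything runs on the device.
func detectFaces(in image: CGImage) throws {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        for face in request.results as? [VNFaceObservation] ?? [] {
            // Bounding boxes come back in normalized (0...1) coordinates.
            print("Face at \(face.boundingBox)")
            if let leftEye = face.landmarks?.leftEye {
                print("Left eye traced with \(leftEye.pointCount) points")
            }
        }
    }
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```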

Those sophisticated features are things users can feel comfortable having their personal device do internally, rather than as cloud services that expose what your camera sees to a network service that might collect and track that data, or leak sensitive details to corporate hackers or nefarious governments.

There's also another layer of camera-based imaging technology Apple is introducing in iOS 11, described in the next segment of this series.

Comments

  • Reply 1 of 30
    Avieshek Posts: 100, member
    The telephoto lens needs to be given equal importance and emphasis as the wide-angle lens. Starting with f/1.8 Aperture.

    It's also prime time that other areas are addressed, like the 6-element lens, which is made of plastics. During the iPhone 4s-5 period Apple planned to move to glass, citing Nokia's example, but it wasn't considered cost effective at the time (~$1). Now LG has replaced one of the 6 elements with glass as a start, but Apple is already familiar with replacing all of them with glass, which doesn't absorb natural colours and light the way plastics do.

    Apple had even planned out successors with crystals like sapphire and eventually diamond, but taking the first step with glass is like moving from transmissive to emissive displays; Quantum Dots & microLED are just an evolution of OLED. If the plastics are replaced, then truer colours can be retained instead of artificially saturated, and purer light can be achieved, eliminating noise even at 50 MP. Software-based artificial saturation can be replaced with natural colour and more light.

    Sadly, it seems Apple forgot this crucial exploration after Steve Jobs expired.
    edited October 2017
  • Reply 2 of 30
    tmay Posts: 6,329, member
    Avieshek said:
    The telephoto lens needs to be given equal importance and emphasis as the wide-angle lens. Starting with f/1.8 Aperture.

    It's also time that other areas are addressed, like the 6-element lens, which is made of plastics. During the iPhone 4s-5 period Apple planned to move to glass but it wasn't cost effective at the time. Now LG has replaced one of the 6 elements with glass.

    One could even go with crystals like sapphire and eventually diamond, but taking the first step with glass is like moving from transmissive to emissive displays. If the plastics are replaced, then not only can truer colours be achieved, but purer light will eliminate noise even at 50 MP.
    I'm always happy to see an Optical Engineer comment on the design, and manufacturing, of a smartphone camera module.  /s

    I'm guessing that building a telephoto module with glass element(s), large aperture, and a larger sensor in a stack height less than 8 mm, and manufacturing that in volumes of at least 50 million units is non trivial.
  • Reply 3 of 30
    Avieshek Posts: 100, member
    tmay said:
    Avieshek said:
    The telephoto lens needs to be given equal importance and emphasis as the wide-angle lens. Starting with f/1.8 Aperture.

    It's also time that other areas are addressed, like the 6-element lens, which is made of plastics. During the iPhone 4s-5 period Apple planned to move to glass but it wasn't cost effective at the time. Now LG has replaced one of the 6 elements with glass.

    One could even go with crystals like sapphire and eventually diamond, but taking the first step with glass is like moving from transmissive to emissive displays. If the plastics are replaced, then not only can truer colours be achieved, but purer light will eliminate noise even at 50 MP.
    I'm always happy to see an Optical Engineer comment on the design, and manufacturing, of a smartphone camera module.  /s

    I'm guessing that building a telephoto module with glass element(s), large aperture, and a larger sensor in a stack height less than 8 mm, and manufacturing that in volumes of at least 50 million units is non trivial.
    Has it become second nature on internet forums to be readily pessimistic and try to be sarcastically smart as well? Will you offer the same excuse 100 years later, when the population has multiplied and it's hard for Apple to meet the scale?

    However, fanboyism aside, the discussion mentioned above is not the random idea of an individual but an explorative discovery evaluated & portrayed by Apple alone. I hope Apple is conscious enough not to live in the same bubble its worshippers develop. The Pixel 2 XL made quite a good show today.
    Apple saved that idea for when they might feel the need to increase resolution. They have avoided that road for some time now. Largan started production of an 18 MP image sensor this October.
    edited October 2017
  • Reply 4 of 30
    It is amazing what Apple and others have been able to do with camera phones, but as someone who used to make their living as a photographer during the film era I have a real problem with the idea of investing so much into something that will be traded or given away in about 2-3 years. Maybe it’s generational, but $1,000 for a throwaway point and shoot camera- even nice ones like the iPhone 8 and X are- is just not something I have any interest in.

    I would love to see what Apple could do if they tried to make a serious stand alone camera that could work with and be controlled by a Mac or iOS device. If they built a camera on the Micro 4/3rds platform there would be a whole universe of great glass that could find new life.
  • Reply 5 of 30
    Avieshek Posts: 100, member
    It is amazing what Apple and others have been able to do with camera phones, but as someone who used to make their living as a photographer during the film era I have a real problem with the idea of investing so much into something that will be traded or given away in about 2-3 years. Maybe it’s generational, but $1,000 for a throwaway point and shoot camera- even nice ones like the iPhone 8 and X are- is just not something I have any interest in.

    I would love to see what Apple could do if they tried to make a serious stand alone camera that could work with and be controlled by a Mac or iOS device. If they built a camera on the Micro 4/3rds platform there would be a whole universe of great glass that could find new life.
    Considering the expense of licensing Adobe Photoshop and having a Mac to run it, along with the 'variety' of lens kits for the camera and the other equipment as a minimum investment, I guess it's fairly economical at today's technical age & capability. I won't put my money on the 2017 iPhones yet, seeing what Google could do with limited hardware; the competition just heated up. But yeah, I could just halt OS updates after 3 years, use the device solely as a camera, and have it run for a whole 5-7 years. The iPhone 4s & 5 are still running.
    edited October 2017
  • Reply 6 of 30
    bill42 Posts: 131, member
    It is amazing what Apple and others have been able to do with camera phones, but as someone who used to make their living as a photographer during the film era I have a real problem with the idea of investing so much into something that will be traded or given away in about 2-3 years. Maybe it’s generational, but $1,000 for a throwaway point and shoot camera- even nice ones like the iPhone 8 and X are- is just not something I have any interest in.

    I would love to see what Apple could do if they tried to make a serious stand alone camera that could work with and be controlled by a Mac or iOS device. If they built a camera on the Micro 4/3rds platform there would be a whole universe of great glass that could find new life.
    I'm 47 and I grew up with film SLR cameras. It took many years for digital cameras to surpass the image quality of a good film camera, but now even the cheaper ones leave our old film cameras in the dust. Now we are approaching the era when phone cameras take sharper and less noisy/grainy photos than our old film SLR cameras. Is $1000 a lot? Maybe compared to my pocket SONY camera, but an iPhone is even more convenient as it fits in my jeans pocket. But you aren't just buying a camera, right? $1000 gets you a 4K video camera as well in your pocket. And a computer more powerful than desktop computers before 2013. And a photo viewer that contains every single photo and video I have ever taken since I started scanning photos or shooting with digital cameras. (I have well over 40,000 photos in my pocket.) Add a GPS navigator for your car. I even use my iPhone for my motorcycle GPS. Add a date planner, a jukebox of every song I ever bought in my life, a personal assistant, an encyclopedia of every fact known to mankind, a pocket store to order anything I could ever want with free 2-day shipping, an instant messaging device and oh yeah, a phone and a video phone to call anyone on the entire planet for free. If we paid $10,000 that would still be cheaper than it should be. And, in 3 years you can sell that iPhone X for $400.
    edited October 2017
  • Reply 7 of 30
    JWSC Posts: 1,203, member

    Face ID is the science fiction we see in the movies come to life.  Who needs a fingerprint or retinal scanner when you’ve got the whole face including thermal maps to work with.  And Apple made it work!

    That, and power efficient desktop class processors are more examples of technologies Apple’s competitors will find difficult to replicate.

  • Reply 8 of 30
    Ofer Posts: 241, unconfirmed member
    ....and that was on iPhone 7, before it released the RACIALLY enhanced A11 Bionic powering iPhone 8 and iPhone X.

    Now that’s technological advancement. Using race to power a chip. Apple is waaaay ahead of the competition :wink: 
  • Reply 9 of 30
    entropys Posts: 4,166, member
    I am not even going to ask.
    edited October 2017
  • Reply 10 of 30
    tmay Posts: 6,329, member
    Avieshek said:
    tmay said:
    Avieshek said:
    The telephoto lens needs to be given equal importance and emphasis as the wide-angle lens. Starting with f/1.8 Aperture.

    It's also time that other areas are addressed, like the 6-element lens, which is made of plastics. During the iPhone 4s-5 period Apple planned to move to glass but it wasn't cost effective at the time. Now LG has replaced one of the 6 elements with glass.

    One could even go with crystals like sapphire and eventually diamond, but taking the first step with glass is like moving from transmissive to emissive displays. If the plastics are replaced, then not only can truer colours be achieved, but purer light will eliminate noise even at 50 MP.
    I'm always happy to see an Optical Engineer comment on the design, and manufacturing, of a smartphone camera module.  /s

    I'm guessing that building a telephoto module with glass element(s), large aperture, and a larger sensor in a stack height less than 8 mm, and manufacturing that in volumes of at least 50 million units is non trivial.
    Has it become second nature on internet forums to be readily pessimistic and try to be sarcastically smart as well? Will you offer the same excuse 100 years later, when the population has multiplied and it's hard for Apple to meet the scale?

    However, fanboyism aside, the discussion mentioned above is not the random idea of an individual but an explorative discovery evaluated & portrayed by Apple alone. I hope Apple is conscious enough not to live in the same bubble its worshippers develop. The Pixel 2 XL made quite a good show today.
    Apple saved that idea for when they might feel the need to increase resolution. They have avoided that road for some time now. Largan started production of an 18 MP image sensor this October.
    Pessimism?

    I've seen some of your previous comments. You seem more interested in listing off bleeding edge specs than actually taking advantage of newly released technology.

    Yeah, the Pixel 2 XL made a nice show, but so has the entire line of new Apple iPhones, and frankly, so have most of the premium smartphones available today. In fact, the differences are so slight in actual practice that people don't even have to worry about making a misinformed decision.

    I'm not buying an iPhone this year, as I'm completely happy with my iPhone 7 Plus until this time next year, when I'll likely opt for an X2. Still, I have to say that today I would much prefer the iPhone's imaging features over the Pixel's, whatever the minor differences in IQ.
  • Reply 11 of 30
    StrangeDays Posts: 12,877, member
    Avieshek said:

    Apple had even planned out successors with crystals like sapphire and eventually diamond, but taking the first step with glass is like moving from transmissive to emissive displays; Quantum Dots & microLED are just an evolution of OLED. If the plastics are replaced, then truer colours can be retained instead of artificially saturated, and purer light can be achieved, eliminating noise even at 50 MP. Software-based artificial saturation can be replaced with natural colour and more light.

    Sadly, it seems Apple forgot this crucial exploration after Steve Jobs expired.
    I've read that sapphire transmits less light than glass and is worse for camera element optics, not better. I'm not familiar with any SLR lenses using sapphire elements. Why do you want Apple to?

    Sorry but I think anything you can think of Apple has as well, and hasn't forgotten about. They have entire teams of people working on this stuff. But as always, ideas are the easy part and implementation is the hard part. Especially at scale of millions and millions and millions and...
  • Reply 12 of 30
    k2kw Posts: 2,075, member
    Well, DED should do a review of the Pixel 2.
    Give us a real assessment of it. You know the Android reviewers will ignore the problems.
  • Reply 13 of 30
    foggyhill Posts: 4,767, member
    Avieshek said:
    The telephoto lens needs to be given equal importance and emphasis as the wide-angle lens. Starting with f/1.8 Aperture.

    It's also prime time that other areas are addressed, like the 6-element lens, which is made of plastics. During the iPhone 4s-5 period Apple planned to move to glass, citing Nokia's example, but it wasn't considered cost effective at the time (~$1). Now LG has replaced one of the 6 elements with glass as a start, but Apple is already familiar with replacing all of them with glass, which doesn't absorb natural colours and light the way plastics do.

    Apple had even planned out successors with crystals like sapphire and eventually diamond, but taking the first step with glass is like moving from transmissive to emissive displays; Quantum Dots & microLED are just an evolution of OLED. If the plastics are replaced, then truer colours can be retained instead of artificially saturated, and purer light can be achieved, eliminating noise even at 50 MP. Software-based artificial saturation can be replaced with natural colour and more light.

    Sadly, it seems Apple forgot this crucial exploration after Steve Jobs expired.
    You’re mention of 50 mp in a god damn smart phone and then propping Jobs corpse  means you are full of crap
  • Reply 14 of 30
    foggyhill Posts: 4,767, member
    It is amazing what Apple and others have been able to do with camera phones, but as someone who used to make their living as a photographer during the film era I have a real problem with the idea of investing so much into something that will be traded or given away in about 2-3 years. Maybe it’s generational, but $1,000 for a throwaway point and shoot camera- even nice ones like the iPhone 8 and X are- is just not something I have any interest in.

    I would love to see what Apple could do if they tried to make a serious stand alone camera that could work with and be controlled by a Mac or iOS device. If they built a camera on the Micro 4/3rds platform there would be a whole universe of great glass that could find new life.
    The phones still have value after 3 years and have 100 other uses, unlike the camera. A desktop computer of 2000 had far fewer uses compared to a current smartphone, yet people paid as much for it. People don't use inflation-adjusted prices... when you factor that in, a cell phone is a miracle of usefulness.
  • Reply 15 of 30
    EngDev Posts: 76, member
    So what is it now with the A11?

    The Kirin 970 is quite a bit faster than the iPhone 7 Plus.


  • Reply 16 of 30
    Portrait mode with the front facing camera:


  • Reply 17 of 30
    tmay Posts: 6,329, member
    EngDev said:



    So what is it now with the A11?

    The Kirin 970 is quite a bit faster than the iPhone 7 Plus.


    Laughable.

    We used to get real trolls around here. 

    Here's an Android fella to explain it to you.

    http://www.androidauthority.com/why-are-apples-chips-faster-than-qualcomms-gary-explains-802738/
    edited October 2017
  • Reply 18 of 30
    EngDev Posts: 76, member
    tmay said:
    EngDev said:



    So what is it now with the A11?

    The Kirin 970 is quite a bit faster than the iPhone 7 Plus.


    Laughable.

    We used to get real trolls around here. 

    Here's an Android fella to explain it to you.

    http://www.androidauthority.com/why-are-apples-chips-faster-than-qualcomms-gary-explains-802738/
    That didn't answer my question. It also didn't discuss the Kirin 970, a chip that has a 1.92 TFLOPS neural processing unit.
  • Reply 19 of 30
    foggyhill Posts: 4,767, member
    tmay said:
    EngDev said:



    So what is it now with the A11?

    The Kirin 970 is quite a bit faster than the iPhone 7 Plus.


    Laughable.

    We used to get real trolls around here. 

    Here's an Android fella to explain it to you.

    http://www.androidauthority.com/why-are-apples-chips-faster-than-qualcomms-gary-explains-802738/
    EngDev truly doesn't give a hoot that what he says is pure shill; considering there is no real standard for what an NPU actually is, this sounds like pure unadulterated marketing pixie dust straight from the mouth of their pixie-dust-producing department. What does this Unicorn Processing Unit actually do? Well, who knows really, since it seemingly didn't come out clearly in that presentation... beyond vague spec-jockey points... And Android natively has no real support for those activities.

    2,000 images recognized per minute is the most laughable thing... unless it's a standard suite of images used in computer vision.
    If that's the case, well, they'll just point me to that info.

    There are many coprocessors on the A11; maybe Apple should just define some random activity and say how much of it the chip can do per second.
    Seems that's all it takes... Set up the goalposts exactly where you want so you can kick X balls in X seconds. Voila! It has been done.

    Funny that when Apple says 25% and 70%, and you actually get verifiable benchmarks on it, it's EXACTLY that.

  • Reply 20 of 30
    tmay Posts: 6,329, member
    foggyhill said:
    tmay said:
    EngDev said:



    So what is it now with the A11?

    The Kirin 970 is quite a bit faster than the iPhone 7 Plus.


    Laughable.

    We used to get real trolls around here. 

    Here's an Android fella to explain it to you.

    http://www.androidauthority.com/why-are-apples-chips-faster-than-qualcomms-gary-explains-802738/
    EngDev truly doesn't give a hoot that what he says is pure shill; considering there is no real standard for what an NPU actually is, this sounds like pure unadulterated marketing pixie dust straight from the mouth of their pixie-dust-producing department. What does this Unicorn Processing Unit actually do? Well, who knows really, since it seemingly didn't come out clearly in that presentation... beyond vague spec-jockey points... And Android natively has no real support for those activities.

    2,000 images recognized per minute is the most laughable thing... unless it's a standard suite of images used in computer vision.
    If that's the case, well, they'll just point me to that info.

    There are many coprocessors on the A11; maybe Apple should just define some random activity and say how much of it the chip can do per second.
    Seems that's all it takes... Set up the goalposts exactly where you want so you can kick X balls in X seconds. Voila! It has been done.

    Funny that when Apple says 25% and 70%, and you actually get verifiable benchmarks on it, it's EXACTLY that.

    He's a measurebator; they get off on arcane spec sheet comparisons, and when they can't win in standard benchmarks, they always fall back to those "special" benchmarks that help their team win. Pretty pathetic. Apple dropping the A11 on them is like a red flag.

    You see them hanging out on all the tech sites. There was another one today--a short timer, Avieshek, posting right in this thread--that's exactly the same. Give them a perfectly fine, state of the art smartphone, and they complain because it isn't bleeding edge, or they have to talk about what it's missing, or some arcane feature that some Chinese OEM has.

    Really tired of them, but I've eased up on telling them to fuck off. It never worked for sog35, why would it work for anyone else?

    Yeah, that image processing benchmark is a winner...
    edited October 2017