Samsung's Exynos 9810 mobile processor follows Apple's A11 chip with machine learning features


Comments

  • Reply 41 of 42
    avon b7 Posts: 7,667 member
    tmay said:
    avon b7 said:
    avon b7 said:
    yonis said:
    Emphases mine:
    Samsung has launched its newest application processor for mobile devices, the Exynos 9 Series 9810, but claimed new features of the processor line, including depth-sensing face detection and deep learning capabilities, suggest Samsung may be borrowing some concepts from Apple's A11 Bionic, used in the iPhone 8 and iPhone X.
    ...
    Notably for this launch, Samsung is touting the chip's capabilities relating to image processing and "neural network-based deep learning," in what could be considered an attempt to match Apple and the new features introduced with the iPhone X.
    The design lifecycle for these products is measured in years, not months. In other words, Samsung was working on this stuff in parallel to Apple, not serially.
    If we are to believe Apple, the X wasn't even planned for 2017. That would have put Samsung ahead of Apple (early 2018 release for the S9) but, like I said, this is about roadmaps and product release cycles, not much more (unless Apple borrowed the NPU from Huawei, of course  ;) )
    Nope. Apple said the X with its final feature set, namely Face ID, wasn’t originally expected to be ready so early, but they never said whether or not the A11 was. 
    If the X hadn't been released this year, it would have put Samsung ahead - yes, on depth sensing for FaceID-style use.

    First paragraph of this article:

    "Samsung has launched its newest application processor for mobile devices, the Exynos 9 Series 9810, but claimed new features to the processor line, including depth-sensing face detection and deep learning capabilities, suggest Samsung may be borrowing some concepts from in Apple's A11 Bionic, used in the iPhone 8 and iPhone X."



    Your supposition is that Apple was "fortunate" to be able to incorporate depth sensing and deep learning into the A11 Bionic for the iPhone X, whereas I suspect that what actually gated the iPhone X was the availability of VCSELs of sufficient capability and in sufficient volume. 

    I would note, again, that Apple's new models in the fall will likely be all Face ID, with second-generation VCSELs. For that, Apple is funding Finisar to renovate a facility in Texas to create and produce the next generations of VCSELs.

    https://www.cnet.com/news/apple-invests-390m-in-finisar-iphone-laser-chip-maker/

    "VCSELs power some of the most sophisticated technology we've ever developed and we're thrilled to partner with Finisar over the next several years to push the boundaries of VCSEL technology and the applications they enable," said Jeff Williams, Apple's chief operating officer, in a statement. 

    "We're extremely proud that our involvement will help transform another American community into a manufacturing powerhouse."

    I have no idea where you got the notion that depth sensing was, as you say, 'fortunate', nor do I have any 'suspicion' beyond the information that Apple itself has provided.

    You seem to place much importance on VCSELs.

    VCSELs are nothing new. Nor is Finisar. 

    You make a point of Apple partnering with them. There is no real news here. Not even in terms of handsets.

    I would point out that Huawei has had major dealings with Finisar since before the iPhone even existed and has carried out joint research with them on VCSELs for years.

    Just one reference (from 2008):

    http://investor.finisar.com/releasedetail.cfm?releaseid=351504

    And just one reference on VCSELs and joint research between Huawei and Finisar:

    http://ieeexplore.ieee.org/document/7121732/

    It's only an abstract but look at the dates, authors and keywords. 

    Apple's investment in Finisar is logically in the interests of Apple, and only Apple, but they are a late entrant to this field as a whole. That isn't a criticism. Apple, unlike Huawei, had little need for VCSELs until relatively recently, even if Huawei's needs were related to other fields. If Apple wants to move in that direction, it makes sense to invest, but the rest of the handset industry is using depth sensing too, just on different roadmaps.

    Huawei has been working on its own depth sensing options for a while. Of course, given its very long-standing relationship with Finisar involving optical component technology, as well as its own research and other partners, it should not surprise you that just weeks after Apple revealed Face ID, Huawei took the wraps off its own, similar approach, with a system using potentially ten times the precision of Apple's.

    https://www.gizmochina.com/2017/11/29/huawei-demonstrates-iphone-x-like-3d-facial-recognition-feature-animated-emoji/

    This hardware is nothing really new, save for a few modifications here and there for size and power efficiency. There are lots of companies working in the field of depth sensing. The fact that Apple bought PrimeSense a few years ago didn't alter the playing field.

    Although the hardware is a key element, the real success (or failure) lies with the implementation and the software behind it.

    Huawei had, for example, facial recognition tied to AI running in December 2015, including eye tracking and multiple sensors to determine whether or not the user was holding the phone. The difference was that they reserved it for screen unlocking. They included things like turning the screen off when the user wasn't looking at it, or only revealing onscreen notifications to the owner of the phone as opposed to any other user who might pick it up.

    The software angle is what completes the picture, especially the AI being run locally through the NPU. Apple appears to have done a great job here. Without that intelligence, I doubt things would have been the same. 
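    To make the software point concrete, here is a deliberately simplified sketch of the matching step these systems ultimately rest on: a neural network (run locally, for example on an NPU) turns a capture into an embedding vector, and "recognition" reduces to comparing that embedding against the enrolled template. This is a generic, hypothetical illustration, not Apple's or Huawei's actual pipeline; the 128-dimension embedding size and the 0.8 threshold are assumptions made for the example.

        # Toy illustration only: a generic embedding comparison, not any vendor's real pipeline.
        import numpy as np

        def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
            """Cosine similarity between two embedding vectors."""
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        def matches_enrolled(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
            """Accept the new capture only if its embedding is close enough to the enrolled template."""
            return cosine_similarity(probe, enrolled) >= threshold

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            enrolled = rng.normal(size=128)                          # embedding stored at enrollment
            same_user = enrolled + rng.normal(scale=0.1, size=128)   # a fresh capture of the owner
            stranger = rng.normal(size=128)                          # a different face entirely
            print(matches_enrolled(same_user, enrolled))             # True: embeddings are similar
            print(matches_enrolled(stranger, enrolled))              # False: embeddings are dissimilar

    The hard engineering is in the model that produces those embeddings from a depth/IR capture and in running it quickly and efficiently on-device, which is exactly where the NPU comes in.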


  • Reply 42 of 42
    wizard69 Posts: 13,377 member
    Who knows if this 2x gain will show up in the real world. I suspect it is a real possibility. What I think is great is that these extremely low-power chips are at the point where they can effectively replace power-hungry Intel chips in laptops and tablets. Here it doesn't matter whether they are from Samsung, Apple, or somebody else; there is no embarrassment in putting these chips into things other than cell phones. In other words, ARM-based chips have come a very long way in a short period of time. Just imagine what this year's iPad will be like with an A11X.

    As for machine learning hardware, it isn't really a good idea to try to credit any of these companies, as AI-type research has literally been going on for decades. So you have to credit the universities, and probably DARPA, for not giving up on the technology when everybody was laughing at the early results. So we have the likes of Apple, Samsung, and others leveraging research that had gone on long before these chips were designed. In fact, some of that research likely started before these chip designers were born. So in that respect, the 6-12 month delta between Apple's and Samsung's hardware releases doesn't mean much.