Apple considers bringing iPhone camera sensor designs in-house


Comments

  • Reply 21 of 24
    danox Posts: 3,272 member
    avon b7 said:
    Xed said:
    avon b7 said:
    Perhaps in the distant future, or maybe the rumours are wrong. 
    When you're designing something as advanced as a camera component (or a cellular baseband chip), you assume the distant future, not something arriving in the next product release.
    Absolutely. However, timeframes depend on when the rumoured strategy got the green light: whether it was already slow-cooking somewhere and waiting for the OK, or whether they have only now decided to shoot for their own sensor.

    Job listings in sensor imaging might point to something, although I'm inclined to think this kind of move would spring from a company they may have quietly snapped up in the past. 

    There are several candidates dating back to 2005. And given their recent work on the Apple Vision Pro, combined with the last few iPhones, Apple has probably come across several areas where they want to tightly integrate the OS with the camera sensor. After all, how does that get done? And more importantly, how does the sensor get smaller and more energy efficient if you don’t take a hand in creating it, especially if the eventual goal is a device that fits onto a frame of glasses?

    https://en.wikipedia.org/wiki/List_of_mergers_and_acquisitions_by_Apple
  • Reply 22 of 24
    danox Posts: 3,272 member
    Normally, I would say that a bit of competition is beneficial for the climate of innovation. In Sony's case, however, despite their near-complete dominance of the camera sensor market, they have consistently impressed by repeatedly disrupting their own products, in a very positive sense and almost entirely without any pressure from competition.

    With that said, I still think Apple could find some upside in designing their own camera sensors. Perhaps they have found a completely new and clever way of integrating the sensor with other technologies that no company has yet thought of.

    I’m just giving one idea here: the first thing that happens after a picture is taken on an iPhone today is that it’s sent through various AI processing stages. That means the signals move from light capture on the sensor plate, which is parallel, through shift-out serialization, and then back to parallel processing in the neural net. It would be so much more efficient if the camera sensor elements (or bins of them) were directly connected to the AI circuitry. Of course I’m not talking about simply slapping the two chips onto each other; this would require a bit more chip engineering, but it would revolutionize image capture and processing in ways that Sony couldn’t match. It would resemble how a biological eye-brain system works.
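
    To make that parallel-serial-parallel point concrete, here is a toy NumPy sketch. It is not Apple's or Sony's actual architecture: the sensor size, tile size, and the "processing" (just a per-tile mean, standing in for a neural net) are all invented for illustration. Both paths compute the same binned result, but the hypothetical per-tile wiring moves far less data off the sensor.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    frame = rng.random((8, 8))   # stand-in for the sensor plate (invented size)
    TILE = 4                     # 4x4 pixel bins (invented size)

    def serialized_path(sensor):
        """Today's flow: every pixel crosses the serial readout link,
        is reassembled into a frame, and only then gets processed."""
        h, w = sensor.shape
        stream = sensor.flatten()                     # parallel -> serial
        image = stream.reshape(h, w)                  # serial -> parallel again
        binned = image.reshape(h // TILE, TILE, w // TILE, TILE).mean(axis=(1, 3))
        return binned, sensor.size                    # values moved off-sensor

    def per_tile_path(sensor):
        """Hypothetical flow: each tile is wired to its own compute element,
        reduces locally, and only the per-tile results leave the die."""
        h, w = sensor.shape
        tiles = sensor.reshape(h // TILE, TILE, w // TILE, TILE)
        binned = tiles.mean(axis=(1, 3))
        return binned, binned.size                    # far fewer values moved

    a, moved_a = serialized_path(frame)
    b, moved_b = per_tile_path(frame)
    assert np.allclose(a, b)
    print(f"same result, data moved: {moved_a} vs {moved_b}")  # 64 vs 4
    ```

    The assertion confirms both paths produce identical results; the difference is purely in how much data crosses the readout link (64 values versus 4 in this toy case), which is where the energy-efficiency argument comes from.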

    In short, someone has to build it, the combination of OS and hardware/sensor. That someone has to have deep pockets, be motivated to do so, and be in a position to sell it at a profit, in order to stay in the game of making slight iterations on the product/device as time goes on, with the goal of getting that device into that frame of glasses. Who might that be?
    edited November 2023
  • Reply 23 of 24
    melgross Posts: 33,600 member
    To burst some people’s bubbles, Canon has had global shutter sensors for years, but they don’t make them for consumer cameras, just for high-speed industrial uses. While some have been proclaiming that this is a game-changing thing, it’s not. Sony’s sensor has some problems related to this type of design. I’m not going to go into all of that here, but this is a first attempt on their part, and it isn’t perfect. It did surprise everyone, though.
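
    For anyone wondering why global shutter gets so much attention: a rolling shutter reads rows out one at a time, so a moving subject is sampled at a slightly different instant on each row. A tiny sketch (toy sensor size, one invented time step per row; this models no real product) shows the resulting skew:

    ```python
    import numpy as np

    H = W = 8  # toy sensor; assume one time step per row readout

    def scene(t):
        """A vertical bright bar that moves one column per time step."""
        img = np.zeros((H, W))
        img[:, t % W] = 1.0
        return img

    # Global shutter: every row sampled at the same instant (t = 0).
    global_shot = scene(0)

    # Rolling shutter: row t is sampled at time t, while the bar keeps moving.
    rolling_shot = np.stack([scene(t)[t] for t in range(H)])

    print(global_shot)    # straight vertical bar
    print(rolling_shot)   # bar smeared into a diagonal: rolling-shutter skew
    ```

    A global shutter avoids that skew entirely, which is why the high-speed industrial uses mentioned above adopted it first.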

    I imagine the second version, two or three years from now, will be better. But this doesn’t affect anything Apple may do. I’m far more interested in higher s/n ratios, wider dynamic range, and the possibility of moving away from quad sensors to dual sensors.
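
    For a sense of what "higher s/n ratios and dynamic range" mean numerically, here is a back-of-the-envelope sketch. The full-well and noise figures are invented, not taken from any real sensor, and it assumes a purely shot-noise-limited signal (ignoring dark current and fixed-pattern noise):

    ```python
    import math

    full_well = 6_000   # electrons a photosite holds before clipping (invented)
    read_noise = 2.0    # electrons RMS of readout noise (invented)

    # Engineering dynamic range: brightest non-clipped signal vs. noise floor.
    dr_stops = math.log2(full_well / read_noise)
    dr_db = 20 * math.log10(full_well / read_noise)
    print(f"{dr_stops:.1f} stops, {dr_db:.1f} dB")   # ~11.6 stops, ~69.5 dB

    # Why quad (2x2) binning helps s/n: summing four photosites quadruples
    # the signal while shot noise grows only by sqrt(4), so SNR doubles.
    signal = 1_000   # electrons per photosite (invented)
    snr_single = signal / math.sqrt(signal)
    snr_binned = (4 * signal) / math.sqrt(4 * signal)
    print(f"SNR gain from 2x2 binning: {snr_binned / snr_single:.1f}x")  # 2.0x
    ```

    That trade-off, binning for s/n versus keeping full resolution, is exactly the tension behind the quad-sensor designs mentioned above.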