Apple considers bringing iPhone camera sensor designs in-house
Apple may continue its trend of designing more of its components in-house by coming up with its own camera sensors, it is rumored, giving it more control over how the iPhone takes and processes photographs.

iPhone 15 Pro Max cameras
Over the years, Apple has gradually brought the design of many of its components in-house, relying on its own internal teams instead of third-party component suppliers. Now, it seems that cameras could be the next component to be targeted by the iPhone maker.
In Sunday's "Power On" newsletter for Bloomberg, Mark Gurman writes that Apple is looking towards creating an "in-house strategy" for designing camera sensors. Photography is a major selling point for iPhones, reasons Gurman, thanks to initiatives such as Shot on iPhone.
However, the design work may go beyond just the iPhone. With cameras important to fields such as mixed reality and self-driving vehicles, creating its own sensors could be an opportunity to improve future models of the Apple Vision Pro and the often-rumored Apple Car.
Bringing a design process in-house gives Apple opportunities to not only improve how a component functions, but also to better plan future developments and to deeply integrate the hardware with its software.
This is far from the first time Apple has shifted design duties away from suppliers and to its own engineers. It has famously designed its own processors for use in iPhones, as well as Apple Silicon for Mac hardware.
Its efforts to produce a modem may result in new components arriving in late 2025. Work on battery cells could allow iPhones to last even longer in the future, though this is at an exploratory stage for the moment.
Then there's the development of a noninvasive blood glucose sensor which could eventually make its way to the Apple Watch. That project is said to be led by Apple platform architecture group head Tim Millet.
Comments
I always hoped Apple would buy Sony, after ESPN that is
I bet Apple Silicon would be great for PS6.
As for rumours of phone manufacturers brewing their own image sensors, there have been rumours of Huawei doing the same, but as a way of protecting itself against sanctions shenanigans.
That isn't the case with Apple, and Sony's new shutter technology really seems to be game-changing on the sensor front, so it's hard to see why Apple might want to go that route without that kind of technology, especially as more and more weight has been put behind post-processing of images over the last few years. Sony has also been very accommodating of custom modifications to its sensors.
Perhaps in the distant future or maybe the rumour is wrong.
Job listings in sensor imaging might point to something although I'm inclined to think this kind of move would play off a company they may have quietly snapped up in the past.
Specifically to 12Strangers’ concerns: nothing in this report talked about lenses at all, but I would remind you that Apple may ALREADY be in the lens business with the forthcoming Apple Vision Pro, which will feature custom lenses for those with imperfect eyesight.
https://www.zeiss.com/vision-care/en/highlights/home.html
They don't need to beat Sony; they need to beat them to ideas that suit Apple but might not be in Sony's focus at this point. To keep the development heading in the direction Apple wants it to move.
Sony is best in class and will be very hard to beat. The question is: aside from unit price, what is currently a pain point with Sony’s sensors?
I just do not see any limitations at this stage, and it seems like the ISP component inside the SoC, combined with the Neural Engine, may be a better area for deep innovation and investment, given how much more is being done in computational photography.
A photo sensor is not a piece of silicon that can be integrated into the Apple Silicon chip, as it is physically separate. The baseband modem seems like a more logical SoC integration piece, and they are already working on that.
Vision Pro may throw up a range of pain points worthy of investment though.
Personally, I am hoping they are doubling down on GPU, ISP and Neural Engine advancements instead. More direct-to-satellite tech in the iPhone would also be fantastic.
Doing their own would be advantageous not necessarily because Apple could design a better sensor, but because they could design one that matches the way their SoC and software manipulate the information coming off the sensor. Right now, they have to work with whatever the sensor provides, which means the specific limitations of the sensor have to be understood and worked around. Their own sensor could be tuned to give better results where Apple believes they can do less with the output, and slightly worse results where they can do better with it. I know that sounds like stepping backwards, but it isn’t. It’s playing to Apple’s strengths.
I’d like to see Apple design a sensor with more onboard processing than they have now. This has been a trend over the years, but it has a long way to go. Apple can do this much better than Sony could.
Nikon has an agreement with Sony to modify its designs, which Sony then produces for them. It’s one reason why Nikon is understood to be able to pull slightly more image quality out of a sensor than Sony does in its own cameras with the same part. That shows that Sony’s designs, by themselves, aren’t the best that can be done. There’s plenty of room for improvement in camera sensors. I’d love to see if Apple can make those improvements.
With that said, I still think Apple could find some upside in designing their own camera sensors. And perhaps they have found a completely new and clever way of integrating this with other tech skills that no company has yet thought of.
I’m just giving one idea here: the first thing that happens after a picture is taken on an iPhone today is that it’s sent through various AI processing. This means the signals move from light capture on the sensor plane, which is parallel, to shift-out serialization, and then back to parallel processing in the neural net. It would be so much more efficient if the camera sensor elements (or bins of them) were directly connected to the AI circuit. Of course, I’m not talking about simply slapping the two chips onto each other. This would require a bit more chip engineering, but it would revolutionize image capturing and processing in ways that Sony couldn’t match. It would resemble how a biological eye-brain system works.
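The data-path argument above can be sketched with a toy model. This is not Apple's or Sony's actual pipeline; the sensor size, 2x2 binning, and "one value per transfer" cost model are all illustrative assumptions, used only to show why moving an aggregate per bin instead of a value per photosite shrinks the serialization step the commenter describes:

```python
import numpy as np

# Hypothetical tiny 8x8 sensor; values stand in for photosite readings
# captured in parallel on the sensor plane.
H, W = 8, 8
sensor = np.arange(H * W).reshape(H, W)

# Step 1: serial readout -- in this toy cost model, every photosite value
# crosses the chip interface one at a time.
serial_stream = sensor.flatten()
transfers_serial = serial_stream.size  # one transfer per photosite

# Step 2: re-parallelize the stream into a tensor for neural processing.
tensor = serial_stream.reshape(H, W)

# Alternative: a sensor with per-bin compute (here, hypothetical 2x2 bins
# feeding an AI circuit directly) would move one aggregate per bin instead
# of one value per photosite.
bins = sensor.reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3))
transfers_binned = bins.size  # one transfer per 2x2 bin

print(transfers_serial, transfers_binned)  # 64 16: a 4x cut in this model
```

The point of the sketch is only the ratio: aggregating on-sensor trades raw pixel fidelity for a smaller, already-structured stream, which is the "do less here, do better there" trade-off discussed above.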