Apple ramping up hiring to bolster in-house camera hardware, software teams
The number of open positions in Apple's camera engineering group has risen dramatically in recent weeks, as the company looks to maintain its lead in an increasingly competitive mobile imaging market.

Apple advertised nine openings in February and has added at least 15 more just 12 days into March, meaning more than half of the positions have been posted in the last two weeks. A majority are located at the company's Cupertino, Calif., corporate headquarters, while a smattering of other positions, mostly related to manufacturing and vendor relations, are available in Tokyo, Taipei, Shanghai, and Tel Aviv, AppleInsider discovered.
While the camera modules that make it into Apple products are manufactured by partners like Sony, Apple is said to make significant contributions to the components' design. The unique five-element lens system that debuted in the iPhone 5, for instance, is an in-house innovation.
In that spirit, recent postings touch on nearly every facet of the camera design process. Sensor design, lens metrology, camera module integration, mechanical engineering, firmware, and computational imaging software positions are all among those represented.
One posting in particular, for a camera prototyping engineer, gives a unique insight into the way Apple develops its products.
"The Camera Experience Prototyping team is responsible for the early prototyping of the potential experience of new products or features to the wider team," the posting reads. "The technologies used for the demonstrations do not have to be representative of what will be used in production. They should be able to provide a real demonstration of the user experience defined by the User Experience Leader that allows the user value of the feature to be correctly assessed."
This process is representative of late Apple CEO Steve Jobs's goal "to start with the customer experience and work backwards to the technology," and likely explains Apple's desire to bring as much of its component development as possible in-house rather than relying on off-the-shelf hardware from third-party suppliers.
Among the mobile imaging upgrades Apple is known to be working on are interchangeable lenses, refocusable light-field modules, multi-device synchronized flash, and multi-sensor, multi-lens cameras. The company has also made a number of acquisitions in the field, most notably last year's $360 million deal for Israeli 3D imaging company PrimeSense.
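For readers wondering what a "refocusable light-field module" would actually do: light-field cameras record many slightly offset views of a scene, and focus is chosen after capture by re-aligning and averaging those views. Here is a minimal shift-and-add sketch in plain NumPy (the function, the (u, v) grid convention, and the integer-pixel shifts are illustrative assumptions, not details of Apple's work):

import numpy as np

def refocus(subviews, alpha):
    # subviews: dict mapping (u, v) sub-aperture offsets (ints centred on 0)
    # to equally sized 2-D grayscale arrays; alpha selects the focal plane.
    acc = None
    for (u, v), img in subviews.items():
        # Shift each view in proportion to its baseline from the array centre.
        # Points at the depth selected by alpha line up across views and
        # average to a sharp image; points at other depths smear into bokeh.
        shifted = np.roll(np.roll(img.astype(float), int(round(alpha * u)), axis=0),
                          int(round(alpha * v)), axis=1)
        acc = shifted if acc is None else acc + shifted
    return acc / len(subviews)

Sweeping alpha re-renders the same exposure at different focus depths, which is the whole appeal of the approach.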

Comments
Yeah, these two articles and the comments should be combined.
FWIW, I believe the standalone camera and/or the iDevice camera are a much bigger deal than an AppleTV Motion Sensor.
I do wish that Apple would create a dedicated camera to rival the Galaxy camera. It's a nice camera, but the OS is terrible, nothing is intuitive, and transferring photos with the Kies software is traumatic.
Several years ago one of the major compact digital camera manufacturers brought to market a camera with the lens mounted "sideways", for want of a better term: instead of having to squeeze the optics into a camera less than 3/4 of an inch thick, they were able to incorporate a longer zoom with the mechanics fitted along the camera's longer dimension, so there was no sacrifice of either focal length or thinness. The lens viewed the scene through a 45-degree mirror, so the camera still "pointed" at the subject the same as conventional cameras, and the viewfinder was still on the back, as in other cameras and phones.
Might that be a strategy for iPhones? Does anyone remember that camera? Does anyone know more about that tech?
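As a rough thin-lens sketch of why the folded layout helps (the specific numbers below are illustrative assumptions): the lens-to-sensor track length is on the order of the focal length f, so an unfolded module must fit f within the body thickness t, while a 45-degree mirror lets f run along the body's long dimension L:

f_{\text{unfolded}} \lesssim t, \qquad f_{\text{folded}} \lesssim L, \qquad \frac{f_{\text{folded}}}{f_{\text{unfolded}}} \approx \frac{L}{t}

For a body roughly 19 mm (3/4 inch) thick and 100 mm long, that is about a 5x longer budget for the zoom's telephoto end.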
You got two subsequent stories out of this?
Page clicks trump all other considerations...evidently.
The feature I would most welcome in a new camera design is high-quality zoom across a wider focal-length range, while maintaining low-light capability so that the zoom is actually usable in a wide array of settings. This comes back to being a physics problem, because a wide zoom range plus low-light capability leads to larger optics.
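To put back-of-the-envelope numbers on that physics (nothing here is specific to any Apple design): the f-number is N = f/D for focal length f and aperture diameter D, and the light delivered to each unit of sensor area scales as 1/N^2:

N = \frac{f}{D}, \qquad E \propto \frac{1}{N^2} = \left(\frac{D}{f}\right)^{2}

Holding N, and with it low-light performance, constant while tripling the focal length from, say, 4 mm to 12 mm therefore means tripling the aperture diameter, and every element in front of it grows accordingly.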
The other possibility here is that Apple can leverage new sensor technologies to improve low-light performance and possibly help keep the optics small. Here I'm thinking quantum dots.
In any event, I could see Apple simply molding a bump into the new iPhone to accommodate better optics. Done right, it could become an attractive design element.
Well, that "sideways" mirror tech is old, really old. That's a reflex mirror. The origin of the reflex mirror dates back to 1676 and the camera obscura.
There are inherent limitations and compromises when using a reflex mirror. The retrofocus lens design typically results in expense and bulk (additional lens elements and groupings to provide adequate optical performance), particularly for wide-angle optics. For an SLR, you just make the lens housing bigger, put the lens motor in the lens itself, and tell the user to get a monopod or tripod if the lens weighs too much (and to buy some extra rechargeable batteries).
When you get into zoom optics, the problem is exacerbated, since a lens design that provides both good wide-angle and good zoom performance requires lens groupings that are costly (lots of elements, lots of groups, lots of exotic components like rare-earth glass or aspherical elements).
These are not reasonable options for the maker of a cellphone camera module.
Also, for reflex optical designs, you really need to put the primary lens group in front of the reflex mirror. Again, that doesn't work well for the cellphone concept.
I don't remember the camera itself, but that's probably why that point-and-shoot reflex camera design died ignominiously.
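The constraint described above fits in one line (textbook optics, not anything Apple-specific): a reflex mirror needs clearance s between the last lens element and the image plane, so the back focal distance must satisfy BFD >= s, while a wide-angle lens has focal length f < s, and a simple lens cannot have a back focal distance much longer than its focal length:

f \;<\; s \;\le\; \mathrm{BFD}

The only way to satisfy this is a retrofocus design, a negative front group ahead of a positive rear group, which is exactly where the extra elements, cost, and bulk come from.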
No bumps. Samsung has bumps. Bumps are for chumps.
And yes, I have in fact done comparisons, calling my home phone from my iPhone 4 and several different phones, including a Galaxy S3, an HTC model I don't recall, my daughter's "dumb" phone, a Nokia 521, and one of the new Lumias. The only one that sounded worse than the iPhone was the old Nokia 521.
It's a great device, but the phone is not what it could be, and I wish they'd concentrate on improving that before mucking with a camera that is already pretty darn good.