So my one and only big complaint is: why are they still at 12 megapixels?!
I was surprised by this too. There has been a move to pixel binning from larger sensors over the last couple of years, and the results have consistently put those phones ahead of Apple in areas like low-light shooting.
However, I didn't see the presentation, so I don't know if they tried to justify not going for more megapixels in some way.
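For anyone unfamiliar with the pixel binning mentioned above: it averages neighboring photosites, trading resolution for lower noise (e.g. a 48 MP quad-Bayer sensor binned 2x2 outputs 12 MP images). This is a toy sketch in numpy to illustrate the idea, not any vendor's actual pipeline:

```python
import numpy as np

def bin_pixels(sensor, factor=2):
    """Average non-overlapping factor x factor blocks of a 2D sensor readout."""
    h, w = sensor.shape
    h, w = h - h % factor, w - w % factor  # crop to a multiple of the bin size
    blocks = sensor[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

rng = np.random.default_rng(0)
signal = np.full((8, 8), 100.0)              # flat grey scene
noisy = signal + rng.normal(0, 10, (8, 8))   # per-pixel read noise, sigma = 10

binned = bin_pixels(noisy)
print(binned.shape)        # (4, 4): a quarter of the pixel count
print(noisy.std(), binned.std())  # averaging 4 pixels roughly halves the noise
```

Averaging four samples of independent noise cuts the standard deviation by about a factor of two, which is why binned sensors do better in the dark despite the lower nominal resolution.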
In an era of computational photography, a higher megapixel count is more about bragging rights than anything else, and it's just more data that has to be processed. That Apple has such advanced realtime image and video processing, and post-processing, now with ProRAW and Dolby Vision, is likely because they aren't buying into the "more MP is always better" argument.
Let's wait to see what the actual image quality (IQ) is before dismissing Apple's iPhone, but from the images I saw in the presentation, Apple doesn't need to apologize for the IQ.
Maybe you should actually watch the presentation before worrying about comparative tech specs.
I'm not an expert or a professional photographer, but this matches the reading I've done. It's not unlike clock speed with processors - in the early days, higher clock speeds meant faster, better devices. Now, clock speed is only part of the story. The same goes for camera sensors. The number of pixels plays a role, but so do the quality of the sensor and the software processing.
Several years ago we went on a trip. I brought a point-and-shoot camera we had along with my iPhone (I think it was an iPhone 6). The camera nominally had a better lens and a higher-resolution sensor, but the phone took better pictures.
The 12 seems like a solid device, but I’m not really bowled over by it. I don’t need Dolby HDR video recording. 5G is a big ‘meh’ as well - there’s precious little coverage and no actual use for it, so I see no need to pay money for it. The A14 is an impressive chip, but my Xs is still plenty fast enough, so I won’t notice any difference there, either. The new camera features are nice but not really worth $1,000-1,400 when the phone I have is paid for and I don’t have anybody who needs a hand-me-down, so I’ll be waiting another year or two.
This may end up being a bigger issue than we (and the analysts) realize. The X was so good (and the 11 only a marginal improvement) that I can see many people who don't particularly care about the new, whiz-bang camera features saying, "why don't I wait another year or two?" The main things I like about it are the retro form factor (I absolutely loved that style) and the new dark blue option in the Pro series.
(Anyone else seriously bothered by the stupid iOS keyboard in 13+ that constantly inserts a capitalized first letter when one is editing a sentence by inserting a new word? On what f'in planet does that make sense?! Is there no AI of any kind? Surely Apple employees use the same keyboard? AAARGH...)
I think this has been a growing problem with smartphones for a few years, and I'm sure Apple and other manufacturers are aware. Early on, there were significant upgrades from one device to the next. Even in the 'off' years, the speed difference from an initial version to an 's' version was significant and noticeable. Now we're hitting a plateau for many people. Even if the 12 is an awesome device, for many consumers the differences are not enough to justify the cost of upgrading. Apple is also hurt by the fact that their devices tend to have a longer useful lifespan than many other manufacturers'. Good for us as consumers, but it can hurt sales. Sites like this have a disproportionate number of people who will upgrade regardless.
Regarding the random beginning-or-end-of-word capital insertion - me. It’s driving me bonkers (well, more so than usual). I’m still on (unlucky) 13, and am disappointed to hear that it infects 14 as well. It’s a bug, not a feature. At least, I hope it’s a bug; otherwise the developers responsible have lost their minds.
Accurate cursor placement has become more difficult from the mid-13 updates onwards as well (despite fiddling with the settings). I would be happy to see the magnifying glass make a comeback, as my fat fingertips (not really) often obscure where the cursor is.
Having whinged about that, the iPhone 12 Pro looks like being a reason to mothball my ageing 7 Plus. I can see LiDAR (eventually) having applications in, e.g., 3D modelling and printing.
The removal of the magnifying glass was strange because they didn't highlight what you're supposed to do instead. To place the cursor now, you long-press on the spacebar and the keyboard turns into a trackpad for moving the cursor. This is more accurate than the magnifying glass but not intuitive to discover. I feel like they should show a popup message when you try to long-press or tap repeatedly to position the cursor, at least the first few times, explaining how to use the spacebar instead.
Or they could add a button at the top of the keyboard, like the ones in the Notes app, that toggles the whole screen into a trackpad for moving the cursor around, plus a marker pen for select/copy/paste.
A lot of the time this will be for URL editing, which could be handled better: when the cursor is in a URL, treat each part of the URL as a block, and tapping a block would edit just that part. Deleting a block higher up could remove the rest of the URL, which likely came from an autocomplete. The spacebar could add a block after the current one, saving a trip to the symbol panel for /, and URL query strings could be split the same way.
The auto-capital insert isn't very useful either. If you type something like "This is what Tim", then delete "Tim" and type something else, it keeps the shift key highlighted even when a name isn't going in its place, and it sometimes does this for autocorrected words too. It would probably be best not to insert capitals at all; it's easy enough to hit the shift key again when it's needed. Autocorrect on spacebar usually inserts the wrong thing for me.
Turning off both auto-capitalization and autocorrect in settings makes for a nicer experience, and the correction suggestions still show so they can be tapped when needed:
https://www.howtogeek.com/693539/how-to-disable-auto-correct-on-iphone-and-ipad/
You couldn't be more wrong, and clearly haven't done enough reading on the situation.
Apple has a disastrously poor record with low-light photography and noise, and it is largely down to the hardware, not the software.
But seeing as you mention software algorithms embedded in ISPs and computational photography in general, Apple was late to that party too, with Huawei and Google squeezing the most out of their 'specs' via computational photography, even before 'computational photography' was a term in its own right.
And by such a large margin that Apple really wasn't even in the game when it came to low-light results.
Apple will move to higher specs - I can guarantee you that - but it is moving at its usual slow pace and making users wait.
That's fine, of course, but don't try to rewrite the history of hardware improvements in photography over the last few years. Basically none of them have come from Apple. Just look at optical zoom for another example. Or at the AI used to stabilise handheld camera shake while taking those industry-leading mobile night shots on Huawei phones a few years ago.
https://twitter.com/asymco/status/1316326234612269058
It lays out all of the price points by year for Apple, and shows the ASP for iPhones for the years that Apple provided it.