Google's Pixel 8 series offers extended software support & AI camera features


Comments

  • Reply 21 of 25
    tmay Posts: 6,453 member
    gatorguy said:
    danox said:
    tmay said:
    gatorguy said:
    Marvin said:
    byronl said:
    The camera and AI camera features on the 7 Pro were amazing; I'm looking forward to watching reviews of this.

    Their video enhancement feature also looks promising and may even finally be able to compete with the iPhone's video.
    The video enhancement uploads the footage to Google's servers for processing, then downloads it back onto the phone.

    https://www.androidpolice.com/google-pixel-8-pro-best-video-feature-magically-processed-cloud/

    I wouldn't trust sending private videos to any tech company, certainly not an ad company known for multiple privacy violations. The bandwidth requirements to upload 4K footage mean either the footage is heavily compressed to begin with or it's not feasible to upload/download over cellular. People will most likely be stuck with a loading bar for an hour, get fed up waiting, and never use the feature again.

    This won't compete with recording ProRes HDR to an SSD and being able to do high-end color grading on it. Apple gives people practical features they will use; companies like Google just need bullet points for their marketing pages and more ways to collect personal data.
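    For scale on the bandwidth point above, here is a minimal back-of-envelope sketch in Python; the clip length, recording bitrate, and uplink speed are round numbers assumed purely for illustration, not figures from Google or from any review of the Pixel 8 Pro feature.

        # Rough estimate only: every number here is an assumption for illustration.
        def upload_minutes(clip_minutes: float, bitrate_mbps: float, uplink_mbps: float) -> float:
            """Estimated time in minutes to upload a clip of the given length."""
            clip_megabits = clip_minutes * 60 * bitrate_mbps   # total data to move
            return clip_megabits / uplink_mbps / 60            # transfer time, in minutes

        # Assumed: a 3-minute 4K clip recorded at ~50 Mbps, sent over a ~10 Mbps cellular uplink.
        print(upload_minutes(clip_minutes=3, bitrate_mbps=50, uplink_mbps=10))  # -> 15.0

    Under those assumptions the upload alone would take on the order of 15 minutes, before any cloud processing or re-download, which is the scenario being described above.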
    Marvin, while I generally respect your opinions, aren't you simply guessing at the quality, how it will be accomplished, and the privacy implications? As far as I know the service has not been reviewed at all, not even in beta. Even then, the number of videos a typical Apple user high-end color grades must be a teeny-tiny percentage of the ones captured. Post-processing is not an easy task, particularly for the impatient or the newbie. If this makes it easier to improve a video someone takes (a birthday party, company event, some wedding or baby shower you attend), how could it not be a good thing for those of us who don't have the post-processing skills required to call ourselves professionals at it?

    So, as far as what quality you get back from an uploaded video goes, no one knows. It may be amazing, crappy, or somewhere in between. In between is my expectation, which would still be a no-work, no-skill-needed improvement over the original. I suspect the end game is eventually moving this on-device, but to begin with, Google is testing various algorithms and encouraging user feedback. Google seems to be all-in on on-device AI processing, and I expect this one will make its way there at some point.
    Many years ago, Google innovated imaging features (low-light and night modes) that required cloud computing, so it's obvious that Google will eventually move this processing onto a future Pixel phone. Apple was late to the party on that, but when it arrived, it arrived on the phone, with very low latency, which was an obvious benefit to the user.

    My recollection is that Apple has never offloaded imaging or video processing to the cloud, so in some ways this was advantageous to Android OS early on, and later to the Pixel. Siri was certainly an exception to that, as were other Apple services, and yet Apple didn't seem to suffer from adding features only when the silicon was there to allow it, presumably because consumers could extrapolate Apple's advantage in silicon.

    One of the constant talking points of the Android OS world is that Android almost always has features before the iPhone does, and that is certainly true. But it is also true that Apple, with a limited number of models, stresses the supply chain even to add a folded lens to its current iPhone 15 Pro Max, which will sell in the tens of millions of units, sales that no other flagship phone model would ever see.

    While I'm happy to see Google's success with the Pixel, it's still apparent to me that there is notable differentiation between the iPhone's video capability and the Pixel phone's, a great example of consumer choice at work.

    Choice has worked. Apple dominates the upper end, and Google gets to do me-too projects. Apple, as a vertical hardware/software company, has never been rootless like a software-only company, and Google is just dabbling in hardware. Google's (Pixel) me-too position on Geekbench (53rd) says it all. And when Google fails with a hardware project, they have shown that they will just cut the cord and run.

    https://browser.geekbench.com/mobile-benchmarks

    Rather than promoting raw speed as Apple does, which says little about a phone's capabilities and features, hasn't Google's Pixel all along put a greater emphasis on photography and smart features as its differentiator? I've often seen comments from AI members that if they had to buy an Android phone, it would probably be a Google Pixel.
    iPhone 15 Pro murders any Android OS device for video quality, thanks to its A Series "raw speed" and ProRes Log video, and that doesn't require access to the cloud. Pack an iPad along and edit what you shoot, wherever you are. According to DXOMark, iPhone 15 Pro nearly matches the Huawei P60 Pro for imaging, so probably not much of an advantage to the Pixel 8, but that's to be seen.

    It would be an easy argument that computational imaging doesn't require the same level of performance, but gee, I can imagine that using the Pixel without cell coverage is comparatively "limiting". I live in the foothills of the Sierra, so outdoor excursions may not have coverage at times.

    Again, choices.

    https://halide.cam/

    https://www.lux.camera/halide-for-ios-17/
    watto_cobra
  • Reply 22 of 25
    gatorguy Posts: 24,564 member
    danox said:
    gatorguy said:
    danox said:
    tmay said:
    gatorguy said:
    Marvin said:
    byronl said:
    The camera and AI camera features on the 7 Pro were amazing; I'm looking forward to watching reviews of this.

    Their video enhancement feature also looks promising and may even finally be able to compete with the iPhone's video.
    The video enhancement uploads the footage to Google's servers for processing, then downloads it back onto the phone.

    https://www.androidpolice.com/google-pixel-8-pro-best-video-feature-magically-processed-cloud/

    I wouldn't trust sending private videos to any tech company, certainly not an ad company known for multiple privacy violations. The bandwidth requirements to upload 4K footage mean either the footage is heavily compressed to begin with or it's not feasible to upload/download over cellular. People will most likely be stuck with a loading bar for an hour, get fed up waiting, and never use the feature again.

    This won't compete with recording ProRes HDR to an SSD and being able to do high-end color grading on it. Apple gives people practical features they will use; companies like Google just need bullet points for their marketing pages and more ways to collect personal data.
    Marvin, while I generally respect your opinions, aren't you simply guessing at the quality, how it will be accomplished, and the privacy implications? As far as I know the service has not been reviewed at all, not even in beta. Even then, the number of videos a typical Apple user high-end color grades must be a teeny-tiny percentage of the ones captured. Post-processing is not an easy task, particularly for the impatient or the newbie. If this makes it easier to improve a video someone takes (a birthday party, company event, some wedding or baby shower you attend), how could it not be a good thing for those of us who don't have the post-processing skills required to call ourselves professionals at it?

    So, as far as what quality you get back from an uploaded video goes, no one knows. It may be amazing, crappy, or somewhere in between. In between is my expectation, which would still be a no-work, no-skill-needed improvement over the original. I suspect the end game is eventually moving this on-device, but to begin with, Google is testing various algorithms and encouraging user feedback. Google seems to be all-in on on-device AI processing, and I expect this one will make its way there at some point.
    Many years ago, Google innovated imaging features (low-light and night modes) that required cloud computing, so it's obvious that Google will eventually move this processing onto a future Pixel phone. Apple was late to the party on that, but when it arrived, it arrived on the phone, with very low latency, which was an obvious benefit to the user.

    My recollection is that Apple has never offloaded imaging or video processing to the cloud, so in some ways this was advantageous to Android OS early on, and later to the Pixel. Siri was certainly an exception to that, as were other Apple services, and yet Apple didn't seem to suffer from adding features only when the silicon was there to allow it, presumably because consumers could extrapolate Apple's advantage in silicon.

    One of the constant talking points of the Android OS world is that Android almost always has features before the iPhone does, and that is certainly true. But it is also true that Apple, with a limited number of models, stresses the supply chain even to add a folded lens to its current iPhone 15 Pro Max, which will sell in the tens of millions of units, sales that no other flagship phone model would ever see.

    While I'm happy to see Google's success with the Pixel, it's still apparent to me that there is notable differentiation between the iPhone's video capability and the Pixel phone's, a great example of consumer choice at work.

    Choice has worked. Apple dominates the upper end, and Google gets to do me-too projects. Apple, as a vertical hardware/software company, has never been rootless like a software-only company, and Google is just dabbling in hardware. Google's (Pixel) me-too position on Geekbench (53rd) says it all. And when Google fails with a hardware project, they have shown that they will just cut the cord and run.

    https://browser.geekbench.com/mobile-benchmarks

    Rather than promoting raw speed as Apple does, which says little about a phone's capabilities and features, hasn't Google's Pixel all along put a greater emphasis on photography and smart features as its differentiator? I've often seen comments from AI members that if they had to buy an Android phone, it would probably be a Google Pixel.


    What it explains is why Google is sending the information to the cloud, and then ricocheting that info back to the phone...
    It would be an easy argument that computational imaging doesn't require the same level of performance, but gee, I can imagine that using the Pixel without cell coverage is comparatively "limiting".
    (The rest is TLDR)
    You haven't done your research. All Pixel computational imaging is done on-device, not over a cell signal to Google's home, and has been for years.

    Almost none of the services on the latest Pixels are done in the cloud and then sent back. Call screening? On-device. Zoom Enhance? On-device. Voice recognition and text entry? On-device. Magic Editor? On-device. Audio Eraser? On-device. Hey Google? On-device. This particular "coming soon" video enhancement feature is an outlier; the vast majority of data processing happens on-device, including essentially all information requests, voice recognition, and generative AI searches. Check and verify if you wish to keep me honest. Google's hardware is plenty fast enough to handle it.

    https://technews180.com/googles-pixel-8-pro-to-pioneer-on-device-ai-power/
    edited October 2023
  • Reply 23 of 25
    danox Posts: 3,240 member
    gatorguy said:
    danox said:
    gatorguy said:
    danox said:
    tmay said:
    gatorguy said:
    Marvin said:
    byronl said:
    The camera and AI camera features on the 7 Pro were amazing; I'm looking forward to watching reviews of this.

    Their video enhancement feature also looks promising and may even finally be able to compete with the iPhone's video.
    The video enhancement uploads the footage to Google's servers for processing, then downloads it back onto the phone.

    https://www.androidpolice.com/google-pixel-8-pro-best-video-feature-magically-processed-cloud/

    I wouldn't trust sending private videos to any tech company, certainly not an ad company known for multiple privacy violations. The bandwidth requirements to upload 4K footage mean either the footage is heavily compressed to begin with or it's not feasible to upload/download over cellular. People will most likely be stuck with a loading bar for an hour, get fed up waiting, and never use the feature again.

    This won't compete with recording ProRes HDR to an SSD and being able to do high-end color grading on it. Apple gives people practical features they will use; companies like Google just need bullet points for their marketing pages and more ways to collect personal data.
    Marvin, while I generally respect your opinions, aren't you simply guessing at the quality, how it will be accomplished, and the privacy implications? As far as I know the service has not been reviewed at all, not even in beta. Even then, the number of videos a typical Apple user high-end color grades must be a teeny-tiny percentage of the ones captured. Post-processing is not an easy task, particularly for the impatient or the newbie. If this makes it easier to improve a video someone takes (a birthday party, company event, some wedding or baby shower you attend), how could it not be a good thing for those of us who don't have the post-processing skills required to call ourselves professionals at it?

    So, as far as what quality you get back from an uploaded video goes, no one knows. It may be amazing, crappy, or somewhere in between. In between is my expectation, which would still be a no-work, no-skill-needed improvement over the original. I suspect the end game is eventually moving this on-device, but to begin with, Google is testing various algorithms and encouraging user feedback. Google seems to be all-in on on-device AI processing, and I expect this one will make its way there at some point.
    Many years ago, Google innovated imaging features (low-light and night modes) that required cloud computing, so it's obvious that Google will eventually move this processing onto a future Pixel phone. Apple was late to the party on that, but when it arrived, it arrived on the phone, with very low latency, which was an obvious benefit to the user.

    My recollection is that Apple has never offloaded imaging or video processing to the cloud, so in some ways this was advantageous to Android OS early on, and later to the Pixel. Siri was certainly an exception to that, as were other Apple services, and yet Apple didn't seem to suffer from adding features only when the silicon was there to allow it, presumably because consumers could extrapolate Apple's advantage in silicon.

    One of the constant talking points of the Android OS world is that Android almost always has features before the iPhone does, and that is certainly true. But it is also true that Apple, with a limited number of models, stresses the supply chain even to add a folded lens to its current iPhone 15 Pro Max, which will sell in the tens of millions of units, sales that no other flagship phone model would ever see.

    While I'm happy to see Google's success with the Pixel, it's still apparent to me that there is notable differentiation between the iPhone's video capability and the Pixel phone's, a great example of consumer choice at work.

    Choice has worked. Apple dominates the upper end, and Google gets to do me-too projects. Apple, as a vertical hardware/software company, has never been rootless like a software-only company, and Google is just dabbling in hardware. Google's (Pixel) me-too position on Geekbench (53rd) says it all. And when Google fails with a hardware project, they have shown that they will just cut the cord and run.

    https://browser.geekbench.com/mobile-benchmarks

    Rather than promoting raw speed as Apple does, which says little about a phone's capabilities and features, hasn't Google's Pixel all along put a greater emphasis on photography and smart features as its differentiator? I've often seen comments from AI members that if they had to buy an Android phone, it would probably be a Google Pixel.


    What it explains is why Google is sending the information to the cloud, and then ricocheting that info back to the phone...TLDR
    Almost none of the services on the latest Pixels are done in the cloud and then sent back. This particular video enhancement feature is an outlier, with the vast majority of processing happening on-device. Check and verify if you wish. Google's hardware is plenty fast enough to handle it.
    Google is adding that as an upgrade; it was not ready at the time of the intro, somewhat similar to Apple's spatial video recording capability for the 15 Pro and 15 Pro Max, which will be added in an update before the release of the Apple Vision Pro next year. It is a feature Google needs to have. If they don't, they really don't have much going for them with this upgrade to the Pixel 8.
    gatorguy
  • Reply 24 of 25
    There's no way to know now, but I'm curious how seven years of updates will go on the lackluster Tensor processor. Numerous comments in forums point out that the last-gen processor left a lot to be desired for battery life and speed.
  • Reply 25 of 25
    gatorguy Posts: 24,564 member
    frost_0ne said:
    There's no way to know now, but I'm curious how seven years of updates will go on the lackluster Tensor processor. Numerous comments in forums point out that the last-gen processor left a lot to be desired for battery life and speed.
    Early (very early!) indications are that this one is much better, as well as easier on the battery and thermals.

    Another interesting thing: the Pixel 6 and 7 were apparently gifted better battery life and thermal performance on their older Tensor processors via Android 14. I guess, like Apple, Google figured out how to address those issues in firmware post-sale.
    ctt_zh