Godofbiscuitssfca
About
- Username: Godofbiscuitssfca
- Joined:
- Visits: 1
- Last Active:
- Roles: member
- Points: 23
- Badges: 0
- Posts: 5
Reactions
- Twitter Inc. no longer exists, now X Corp.
- Apple TV+ considering bid to stream UK soccer
  MickeN71 said: The name of the sport is FOOTBALL! Who names a game football when you use your hands most of the time?
  It was the Brits who came up with the name soccer in the first place, so get off your high horse. If it’s shown in the USA — which I really hope it is — they’re going to have to avoid a name collision/confusion. That’s just a fact. How well would a chip shop anywhere in Britain or the rest of the UK do if it served “fish & fries” on its menus?
- Samsung expands OLED 4K TV lineup with two new series
- iPhone vs Android: Two different photography and machine learning approaches
  avon b7 said: Godofbiscuitssfca said: You’re not quite understanding the line that’s being crossed here: Samsung is introducing foreign imagery into the shots you are taking. Apple is not. With Samsung, you cannot be sure that what you’re seeing is what YOU captured. This is a completely different class of modification versus what’s going on with Apple’s computational photography. Apple is enhancing or otherwise applying modifications to image features that already exist. Samsung is introducing image features your image MAY NEVER HAVE HAD.
  On Huawei phones you can toggle the AI enhancements on/off, even from the finished image in the Gallery.
  In the specific case of the moon, I’m sure that the vast majority of people who see a great moonscape in real life but end up with a blurry white blob in their photo would take the ‘enhanced’ version every time.
  It might not be what the ‘sensor/ISP’ saw, but it would look more like what the user actually saw.
  Not nearly good enough. Not even close. “Scene optimization”? Sure. That’s what’s going on. This shouldn’t be on by default. It should be explained. It frankly shouldn’t exist as a feature without a secure paper trail to go with it on every photo that’s taken. “You’re sure the vast majority.” “Might not be” what the sensor saw. “Would look more like” what the user actually saw. These qualified statements should scare the living sh*t out of everyone.