analogjack

About

Username: analogjack
Joined:
Visits: 297
Last Active:
Roles: member
Points: 1,772
Badges: 2
Posts: 1,073
  • Compared: Fortnite on the iPhone XR versus Note 9

    I never thought I'd see the day, but I genuinely believe that we've already hit Peak iPhone vs Samsung.
  • Blind comparison of photography on the iPhone XR versus Google Pixel 3 XL

    I started working in a small professional laboratory in 1973, processing E3 by hand in a sink line for some of Sydney's top photographers, most notably Phil Grey. Colour was super critical. For the reversal exposure I had to unroll all of Phil's 120 reels and wave a 500W tungsten lamp over them after they came out of the hardener; otherwise there'd be an almost imperceptible 5R colour shift on the first 3 frames. Nearly everything was clip tested.

    I moved to London in '75 and worked at Morgan & Swan in Camden Town, just when E6 was invented. This was the most anal lab I've ever seen. They specialised in large format duplicates and we used 6121 dupe film, but they were so friggen fussy that potential duplicates were split into ones for the new 6121 dupe film and ones that used tungsten camera film EHB. To use the latter we first had to make highlight and contrast masks, using lith film for the highlight and a pan masking film for the contrast mask. The contrast mask would also have specified filters to further control the colour contrast of the black and white mask. I mention all this because it's burned into my psyche.

    I worked in various other pro labs, and my last proper job was in London doing various stuff for Adam B. One of our clients was Actis, who would explode with rage if there was the slightest colour imbalance in their E6-processed digital images, which sometimes took all night to render. That was in the days when the G3 was king and the G4 had just been released, although Actis were using Sun workstations running at a blindingly fast 300MHz. Ha!

    I mention all this because I feel qualified to make judgements between two images, because this is what my job was: having to weigh up all the various considerations. In the end I had to produce an image that would focus on what was important. This is not something that most people understand. Sometimes the colour of the footpath in a shot that has nothing to do with the footpath is important. Other times it isn't. Sometimes you want to capture the fact that it's late afternoon, and an overly warm shot is obviously right. Other times, in a very similar situation, some or all of that warmth is better removed. It's a cliché, but every shot is different. Often it's instantly recognisable what is right.

    Some of the shots offered up are confusing, for example the one with the bear. One shot has the bear with the background; the other focuses solely on the bear and blurs everything else. The one on the left with the background is better, but if the bear was meant to be the hero, and it's impossible to tell from the shot, then the one on the right is better even if it's a bit warm.

    Some shots are just taken badly: there's the opportunity to use fill flash on a portrait shot, but it's not used. Other times you might be taking portrait shots and would be quite happy to use a simple gobo, which could just be someone with a white shirt nearby, or you might take advantage of a bright wall to help reflect light into the shot. In cases like this we might prefer a camera that is inherently cold, or one that captures the warmth.

    In other words, many of these tests are pretty well useless. But the camera tests that are done in labs with specialised equipment, like DXOmark, are total bullshit. Sure, the science is right, but the reality is entirely different. To make these comparisons meaningful you have to first interrogate the photographer and ask them exactly what it is they want to capture.

    Do you shoot for the lowest common denominator, like the average point-and-shoot person? There's nothing wrong with that, but they are not really going to give a fuck about minor differences in hue. Or do you shoot for someone who is smart enough to know that they are going to get a very blue shot in shade, but knows they can point the camera at something else, lock the exposure and colour, and then take the shot?

    Having said all that, and without knowing which was which, I ended up with 7 of A and 7 of B.


    EDIT: I see there are 15 shots, and I recounted my choices as 7 A and 8 B. For the final shot I wrote 'stupid, but A': stupid because either could be considered better. This is one of those instances where a *small* correction for the late reddish light is better, but it's a tad too dark. It's also dumb because they are two totally different shots. Really it's still 7 of each with one split decision.

    Overall though, I think the Pixel is doing a better job; the iPhone is trying to be all things to all people. Apple's algorithms do not realise that it's better not to pull too much detail out of the shadows: in the shot with the river and pine trees, the iPhone loses the drama, though its shot might be preferable to an amateur. The Pixel is not always successful, but it's clearly aimed at photographers and seems to have put more thought into its algorithms. Yet, looking at Apple's non-pro customers, I think they'd be happier with the iPhone. Horses for courses.

    If you guys want to do a proper test, you have to go out with a photographer, ask them what they hope to achieve, take the shot with both cameras, and then see how each one did compared to what was wanted.

    So in the end, which is the best shot: the one that I know is better, or the one that the client likes? I think Apple is better at turning out a shot that the amateur customer will be happier with, but the Pixel is turning out a shot that is better. However, the Pixel is often comically inept at this, like the portrait of the woman with her hands behind her back, whereas overall Apple makes fewer blunders.
  • Apple's work towards waterproof iPhones continues with new sealing technology

    nunzy said:
    Can't innovate my ass!
    Do not underestimate Apple; if they put their mind to it, they probably could innovate your ass.
  • Messages in iCloud: Everything you need to know

    Apparently Apple stores messages in iCloud, but I have done a clean install on my Mac and iPhone, and now I'm trying to work out how to bring back a text message thread from last week. Is this possible?

    Something unrelated that I also find odd: even though I have 'save for offline viewing' enabled for Reading List on both iOS and Mac, the articles are not available when I go offline.

    jcbigears said:
    I've updated on my Mac and iOS devices and it's not showing up in the preferences on any device. Maybe it's not available outside the USA yet?
    I can confirm that the 'enable Messages in iCloud' button is there in Australia.
  • Apple hit with patent suits over iPhone X camera, iOS 11 'Do Not Disturb While Driving' fe...

    Soli said:
    BTW a 'velocity sensor' on a phone is a violation of Einstein's special relativity. Accelerometer, yes, 'velocity sensor', no. 
    Can you clarify and explain? 
    Well, all *unaccelerated* motion is relative: the laws of physics are the same in all inertial (non-accelerating) reference frames. If you're on a plane travelling at a constant 600 km/h with no turbulence, your iPhone cannot tell that it has a velocity relative to when you were waiting in line at the airport. The GPS can calculate your velocity, but that's not a 'sensor' in the sense we use the word for an iPhone, like a heart rate sensor, which works by sensing an electric pulse, or by shining a light through your finger to read the variation in colour as the blood pulses through, or via other means. An aeroplane has a pitot tube to measure airspeed; there's nothing like that on the phone. I guess the accelerometer could count steps and approximate a velocity, but there's no 'velocity sensor' as such.
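
    To make the distinction concrete, here's a minimal Swift sketch (hypothetical, not from the thread) of the only "velocity" an iPhone app can read: CoreLocation's speed value, which the system derives from successive GPS position fixes rather than from any dedicated velocity-sensing hardware.

    import CoreLocation

    // Minimal sketch: CLLocation.speed is derived by CoreLocation from
    // changes in GPS position over time; it is not read from a sensor.
    final class SpeedReader: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        override init() {
            super.init()
            manager.delegate = self
            manager.requestWhenInUseAuthorization()
            manager.startUpdatingLocation()
        }

        func locationManager(_ manager: CLLocationManager,
                             didUpdateLocations locations: [CLLocation]) {
            guard let fix = locations.last else { return }
            // speed is in metres per second and is negative when no
            // estimate is available, e.g. at constant velocity with no
            // usable GPS fix, which is exactly the case where nothing on
            // the phone can tell that it is moving.
            if fix.speed >= 0 {
                print("GPS-derived speed: \(fix.speed) m/s")
            }
        }
    }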
