sloaah

About

Username: sloaah
Joined:
Visits: 29
Last Active:
Roles: member
Points: 143
Badges: 0
Posts: 31
  • Apple Vision Pro review: six month stasis

    I think the frustrating thing is how hands-off Apple is with development. They're approaching it like the iPhone – build it and the developers will come. But the limited market means that it's in danger of being more like the Apple Watch, which continues to have a lackluster App Store. At least with the Watch the apps aren't mission-critical, since there's already a strong use case with just the in-built health trackers.

    The only reason the Quest etc. have the organic market they do is that Meta poured money into games development for it. None of those games are being ported over to the Vision Pro, because 1) the market is too small, 2) some of them are still Meta exclusives (typically for 3 years), and 3) it requires rethinking input because of the lack of controllers on the Vision Pro.

    As a VR studio ourselves, we'd love to create content for the Vision Pro... but when the top apps have around 10K downloads in total, of which maybe a quarter at best are paying, who would build an app for a device that only has 2,500 paying users? You're talking about $10–25k in revenue after Apple's 30% cut. The major XR experiences all cost high six figures or even seven figures to develop.
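
    To make that concrete, here's the rough arithmetic (the price points are just illustrative guesses, not real app prices):

    ```python
    # Back-of-envelope net revenue for a paid Vision Pro app.
    top_app_downloads = 10_000   # "top apps have around 10K downloads in total"
    paying_share = 0.25          # "maybe a quarter at best are paying"
    apple_cut = 0.30             # Apple's App Store commission

    paying_users = int(top_app_downloads * paying_share)   # ~2,500

    for price_usd in (6, 10, 14):    # illustrative price points
        net = paying_users * price_usd * (1 - apple_cut)
        print(f"${price_usd} app -> ~${net:,.0f} net")
    # Roughly $10k-$25k, against six- or seven-figure development costs.
    ```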

    IMO it's a bit of an own goal from Apple... If they offered even a $50mn fund, you'd have around 50 solid apps for the Vision Pro that push its limits. At the moment there's basically nothing that makes full use of it.
  • Apple Intelligence wasn't trained on stolen YouTube videos

    wdowell said:
        macca said:
            Correction: England doesn't have a Parliament. It's the British Parliament.
        It's not the British Parliament - it's the UK Parliament - (Northern Ireland...) https://www.parliament.uk/
    The majority of Northern Irish (i.e. non-republicans) would describe themselves as British, so colloquially it's fine to say the British Parliament. After all, the island of Ireland is part of the British Isles.
  • Canon: No camera can truly capture video for Apple Vision Pro

    yeah, I'm missing something here too. The Canon virtual 3-D lens they talk about, when coupled with an R5, will do four megapixels for each eye at 30 frames per second. What happens if you show 30 frames per second on an Apple Vision Pro that's rendering at 60? Seems like it would still just be fine.

    I'm a filmmaker and have worked in VR in the past, so I can give some insight.

    The reason the resolution is apparently so high is that this is for 180º VR films. The videos occupy half of a sphere (180º). Though the Apple displays are 3.6k horizontally, that's at roughly a 105º FoV; so 3.6k / 105 × 180 ≈ 6.2k resolution per eye.
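
    In rough numbers:

    ```python
    # Scale the headset's per-eye display resolution from its ~105º field of
    # view up to the 180º that an immersive film has to cover.
    display_h_pixels = 3600    # ~3.6k horizontal pixels per eye
    display_fov_deg = 105      # approximate horizontal FoV of the optics
    film_fov_deg = 180         # a 180º VR film covers half a sphere

    pixels_per_degree = display_h_pixels / display_fov_deg
    required_h = pixels_per_degree * film_fov_deg
    print(f"~{required_h:.0f} px per eye horizontally")   # ~6170, i.e. ~6.2k
    ```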

    If you're recording both frames on one sensor, which is how it's done with the Canon Dual Fisheye lens (and which is the easiest way to keep the lenses at an inter-pupillary distance of 60mm, roughly the distance between our eyes), then you need a resolution of 12.4k (horizontal) x 6.2k (vertical) = 77MP. There is also some resolution loss given that the fisheyes are not projecting onto the full sensor – they project just two circles side by side on a rectangular sensor – so I would imagine 100MP would be roughly right to retain resolution across the scene.
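
    Putting those numbers together (the ~30% allowance for unused sensor area is just a rough guess on my part):

    ```python
    # Sensor size needed to record both 180º fisheye images side by side.
    per_eye_h = 6200           # ~6.2k horizontal pixels per eye (from above)
    per_eye_v = 6200           # a half-sphere is as tall as it is wide

    sensor_h = 2 * per_eye_h   # left and right eye on one sensor
    sensor_v = per_eye_v
    megapixels = sensor_h * sensor_v / 1e6
    print(f"{sensor_h} x {sensor_v} -> ~{megapixels:.0f} MP")   # ~77 MP

    # The fisheyes project circles onto a rectangular sensor, so some pixels
    # are wasted; a ~30% allowance (rough guess) lands near 100 MP.
    print(f"with wasted sensor area: ~{megapixels * 1.3:.0f} MP")
    ```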

    As to frame rate: cinema is 23.98fps with a 180º shutter, which means the shutter is actually closed half the time and open the other half. It leads to a certain strobing which we subconsciously associate with the world of cinema. Nobody really knows why this is so powerful, but maybe it helps remove us a bit from the scene, so our brains treat it more as something we're observing rather than something we're part of. Tbh I'm not really sure.

    But with immersive video, we want to do the opposite. Rather than emphasise detachment, we want to emphasise immersion. And so we want to shoot at a frame-rate which is roughly at the upper end of what the human eye can discern, removing as much strobing as possible. That means roughly 60fps. The fact that there are two frames being shown, one for each eye, doesn't alter this equation. It still needs to be 60fps per eye.
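
    For reference, the shutter maths (keeping a 180º shutter at 60fps is an assumption on my part):

    ```python
    # Exposure time implied by a shutter angle: the shutter is open for
    # (shutter_angle / 360) of each frame interval.
    def exposure_time(fps: float, shutter_angle_deg: float) -> float:
        return (shutter_angle_deg / 360.0) / fps

    # Cinema: ~24fps with a 180º shutter -> 1/48s, closed half of every frame.
    print(f"cinema: 1/{1 / exposure_time(23.98, 180):.0f}s")

    # Immersive video at 60fps (assuming the 180º shutter is kept) -> 1/120s,
    # with far less visible strobing.
    print(f"immersive: 1/{1 / exposure_time(60, 180):.0f}s")
    ```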

    The Canon dual fisheye on an EOS R5C produces two images on an 8k frame. The two images render to a single image which is half of 8k. This suggests two synced 8k cameras could work and that it doesn't all have to occur on a single sensor as is suggested in that statement.
    That is true, but it is difficult to get the lens spacing to match the 60mm inter-pupillary distance that I mentioned. If you remain constrained to this distance, then a single sensor is the most effective way to achieve this, because you don't have any dead space between the sensors and thus you can maximise sensor size. It can also ensure that you don't have any sync drifting between left and right eyes, which can be a tricky problem to solve.

    In theory you could presumably also create some sort of periscope system so that the two sensors can be entirely detached; but I imagine this would be very costly.

    Looking at the BTS shots of the Apple cameras, they interestingly don't follow this inter-pupillary distance rule. Nor does the iPhone 15 Pro for that matter. The Vision Pro isn't available in my region, so I haven't had a chance to see what these spatial videos look like, but I wonder if there is some interesting computational work happening to correct for this. That sort of computational photography work – which essentially repositions the lenses in software by combining the image data with depth data – is definitely implemented in how the Vision Pro does its video pass-through, where the perspective of the cameras at the front of the headset is projected back to where the user's eyes are.

    If there is a computational element going on here, then that's hugely interesting, because a) it effectively solves the issue of needing to use one sensor, and b) it opens up intriguing possibilities of allowing a little bit of head movement and corresponding perspective changes (i.e. true spatial video rather than just 3D video – or what is called 6DoF in the industry).
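
    As a rough illustration of what that kind of depth-based reprojection looks like (a simple pinhole-camera sketch, not Apple's actual pipeline):

    ```python
    # Minimal pinhole-camera reprojection: given a pixel, its depth, and the
    # camera intrinsics, shift the viewpoint sideways (e.g. from a camera on
    # the front of the headset back to the user's eye) and find where the
    # same 3D point lands in the new view. Illustrative only.
    def reproject(u, v, depth_m, fx, fy, cx, cy, baseline_m):
        # Back-project the pixel into a 3D point in the camera's frame.
        x = (u - cx) * depth_m / fx
        y = (v - cy) * depth_m / fy
        z = depth_m
        # Move the viewpoint horizontally by the baseline.
        x_shifted = x - baseline_m
        # Project the point into the shifted camera (same intrinsics assumed).
        return fx * x_shifted / z + cx, fy * y / z + cy

    # Example: a pixel on an object 2m away, viewpoint shifted by 30mm.
    print(reproject(u=1800, v=1200, depth_m=2.0,
                    fx=1500, fy=1500, cx=1920, cy=1080, baseline_m=0.03))
    ```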
  • Apple Vision Pro followup could be 18 months away

    This is purely speculative, but I would imagine the next Vision Pro will be launched alongside or after the release of the M5 chip… end of 2025 or even 2026.

    With everybody treating this version as Series 0, Series 1 would need to iron out a lot of the shortcomings of the current headset, as well as any supply issues. 18 months just doesn't seem enough time for substantive technological progress to do that.
  • Kuo reiterates 120 mm tetraprism camera coming to iPhone 16 Pro

    Agreed on all the comments about 120mm not being useful. 70-90mm is the ideal portrait range; even up to 105mm in some situations. 120mm is too tight for portraits but also not useful for wildlife etc.

    I actively chose the 15 Pro over the Pro Max to avoid the 120mm. Initially the marketing was persuading me in the other direction, but having used the Pro Max, the 120mm is just an underwhelming camera - too tight and (to my eyes) quite noisy.
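
    For context on "too tight", the horizontal angle of view at these full-frame-equivalent focal lengths:

    ```python
    import math

    # Horizontal angle of view on a full-frame (36mm wide) sensor:
    # AOV = 2 * atan(sensor_width / (2 * focal_length)).
    def horizontal_aov(focal_mm: float, sensor_width_mm: float = 36.0) -> float:
        return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

    for f in (70, 90, 105, 120):
        print(f"{f}mm ~ {horizontal_aov(f):.0f}º horizontal")
    # 70mm ≈ 29º, 90mm ≈ 23º, 105mm ≈ 19º, 120mm ≈ 17º
    ```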