Only months after launch, 'Disney Infinity' ditches support for Apple TV


Comments

  • Reply 101 of 109
    volcan Posts: 1,799 member
    auxio said:

    To do interactive 4K smoothly with the A-series architecture, they'd need, at minimum, the same hardware they're using in the 12.9" iPad Pro.  And even it doesn't run at 4K resolution (only around 3K), so that may not be enough.

    If you've got a 4K video on your device, the bottleneck is mostly in moving the data from storage to RAM and then to the GPU. It hardly matters whether the display is 3K or 4K; the GPU still needs to process the full frame and then interpolate to scale it. That is a lot of data to move without dropping frames. Shooting 4K is more CPU intensive, but you still need to move the data into storage. Reading from NAND on the iPhone 6s is around 248 MB/s and writing is around 163 MB/s. So when shooting 4K video it's going to be heavily compressed, and of course when you play it back it's also compressed; otherwise it would never keep up without dropping frames. I/O performance is the primary bottleneck in both situations.
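
    To put rough numbers on that (a back-of-envelope Swift sketch; the ~50 Mbps figure for iPhone 6s 4K H.264 is an approximation):

    // Uncompressed 8-bit 4:2:0 video is ~1.5 bytes per pixel.
    let width = 3840.0, height = 2160.0, fps = 30.0
    let rawBytesPerSecond = width * height * 1.5 * fps        // ≈ 373 MB/s
    // The same stream compressed as H.264 at roughly 50 Mbps:
    let compressedBytesPerSecond = 50_000_000.0 / 8.0         // ≈ 6 MB/s
    print(rawBytesPerSecond / 1_000_000)        // ~373 MB/s, well past the ~163 MB/s NAND write speed
    print(compressedBytesPerSecond / 1_000_000) // ~6 MB/s, easily within the I/O budget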
  • Reply 102 of 109
    auxio Posts: 2,760 member
    sog35 said:

    auxio said:
    How is it a slap in the face if the video quality matches their setup perfectly?  You're in a niche with your 4K setup at this point, and simply delusional in believing that you're not.

    To do interactive 4K smoothly with the A-series architecture, they'd need, at minimum, the same hardware they're using in the 12.9" iPad Pro.  And even it doesn't run at 4K resolution (only around 3K), so that may not be enough.

    I've had an ATV4 since the developer prerelease boxes were handed out.  And I attended an AppleTV tech talk about how to develop apps for it.  I also spoke in-depth with Apple engineers about some more advanced things I wanted to do with it.  But clearly you are the expert here.
    That makes sense.  From a developer standpoint it's much cheaper to produce HD content instead of 4K.
    Not at all.

    Apps are not simply prerecorded videos, they are interactive.  As you move your finger on the remote, the app scrolls the content on the screen to match your movement.  As you press buttons to select things, it changes the screen to match that selection.

    In all but the very simplest apps, it's not possible to prerecord every possible action you could perform and simply show a video of what's happening.  As you interact with it, the app needs to perform calculations and dynamically update the screen (move things around, highlight things, load new content and show it, etc).  To do this fast enough so that it feels fluid and natural at 4K resolutions requires a very powerful CPU & GPU.  This is the difference between interactive applications, and ones which simply show a video.
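
    To make "interactive" concrete, here is a minimal tvOS-style Swift sketch (the names are illustrative, not from any particular app): each directional press from the remote forces the app to update its state and re-render part of the screen before the next frame, which is work a prerecorded video never has to do.

    import UIKit

    final class ShelfViewController: UIViewController {
        private let titleLabel = UILabel()
        private let items = ["Movies", "TV Shows", "Music"]
        private var selectedIndex = 0

        override func viewDidLoad() {
            super.viewDidLoad()
            titleLabel.frame = view.bounds
            titleLabel.textAlignment = .center
            titleLabel.text = items[selectedIndex]
            view.addSubview(titleLabel)
        }

        // Directional button presses from the remote arrive here; the app has to
        // react, update its state, and re-render the affected views on the next frame.
        override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
            super.pressesBegan(presses, with: event)
            for press in presses where press.type == .rightArrow {
                selectedIndex = (selectedIndex + 1) % items.count
                titleLabel.text = items[selectedIndex]   // redrawn on the next frame
            }
        }
    }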
  • Reply 103 of 109
    auxio Posts: 2,760 member

    sog35 said:

    auxio said:
    How is it a slap in the face if the video quality matches their setup perfectly?  You're in a niche with your 4K setup at this point, and simply delusional in believing that you're not.

    To do interactive 4K smoothly with the A-series architecture, they'd need, at minimum, the same hardware they're using in the 12.9" iPad Pro.  And even it doesn't run at 4K resolution (only around 3K), so that may not be enough.

    I've had an ATV4 since the developer prerelease boxes were handed out.  And I attended an AppleTV tech talk about how to develop apps for it.  I also spoke in-depth with Apple engineers about some more advanced things I wanted to do with it.  But clearly you are the expert here.
    I'm done arguing.  I like 4K now. I will LOVE 4K in the future.  I don't want to buy a new streaming box every year.  I'm sick of Apple withholding key technologies to ensure yearly upgrades.  I still love the iPhone/iPad/Mac, but I won't support a crap platform like ATV.
    One last question: if Apple had never opened up the iPhone & iPad to apps and simply left them with the same functionality they had back in 2007/2010 when they first launched (with only small, incremental improvements), would you still be using them?

    It's the same situation with the AppleTV: if they simply made it a video streamer, then in a couple of years' time, when all the other TV boxes have all sorts of cool new features and apps, you'd be dumping it.  They're getting a jump on competitors by putting in the ability to extend it with apps first -- and that comes at the expense of 4K (for now) to make it cheap enough.

    Either way you're probably going to have to update in a couple of years because the current generation of Roku/FireTV boxes aren't going to get many new apps outside of more video streamers (similar to the old AppleTV).  Just because you think you don't need more than that now, doesn't mean that it won't be important in the future as people come up with new ways to make use of a TV.
    edited April 2016
  • Reply 104 of 109
    auxio Posts: 2,760 member

    And I have to say, I don't think it matters if Apple had 4K in the Apple TV because most people don't have 4K televisions. Hell, most people can't tell the difference between 720 and 1080.
    Not even that -- I always have a good laugh when I go over to a neighbour's house and he has a 60" LCD TV, full cable package, and all of the channels he's watching are SD!  They look absolutely terrible stretched out on that 60" TV.

    I tried to explain to him that he should watch the HD channels instead, but he doesn't understand or never remembers or something (I dunno).  There are plenty of people who just don't know and/or don't care about these things.
    edited April 2016
  • Reply 105 of 109
    auxio Posts: 2,760 member
    volcan said:
    auxio said:

    To do interactive 4K smoothly with the A-series architecture, they'd need, at minimum, the same hardware they're using in the 12.9" iPad Pro.  And even it doesn't run at 4K resolution (only around 3K), so that may not be enough.
    If you've got a 4K video on your device, the bottleneck is mostly in moving the data from storage to RAM and then to the GPU. It hardly matters whether the display is 3K or 4K, the GPU still needs to process the full frame and then interpolate to scale it. That is a lot of data to move without dropping frames. Shooting 4K is more CPU intensive but you still need to move the data into storage. Reading from NAND on iPhone 6s is around 248 MB/s and writing is 163 MB/s. So in the case of shooting the 4K video it is going to be heavily compressed, and of course when you play it back it is also compressed. Otherwise it would never be able to keep up without dropping frames. I/O performance is the primary bottleneck in both situations.
    With prerecorded 4K video, you just send the compressed video data to a dedicated hardware decoder chip (in the case of H.264 and H.265 anyways).  That chip decompresses the video data and sends it directly to the GPU so that it can be displayed onscreen.  So the only thing which needs to travel across the system bus is the compressed video data going from RAM to the decoder chip.  Hence why even the cheapest 4K hardware can handle showing video.
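
    That path is roughly what AVFoundation gives you for free; a minimal Swift sketch (the file URL is a placeholder) in which the CPU only shuffles compressed data:

    import AVFoundation
    import AVKit

    // Placeholder URL for a local 4K H.264/HEVC file.
    let url = URL(fileURLWithPath: "/path/to/clip-4k.mp4")
    let player = AVPlayer(url: url)

    // AVPlayer hands the compressed samples to the hardware decoder; the decoded
    // frames are composited by the GPU without the CPU ever touching raw pixels.
    let controller = AVPlayerViewController()
    controller.player = player
    // ...present `controller` from a view controller, then:
    player.play()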

    Shooting 4K video is similar: the camera sensor collects the image data, sends it to the hardware encoder chip to be compressed, and the only thing you need to save to storage is the compressed video.  Though I suppose displaying it onscreen at the same time (composed with an interface for camera controls) is where it gets complicated.  But I'd assume the decoder can route the video to the GPU (to be shown on part or all of the screen) without involving the CPU.  The CPU would only need to update the camera control UI parts so that they could be composed with the video on the GPU.
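
    The capture side looks similar from the app's point of view; a rough Swift sketch, assuming AVFoundation and a 4K-capable back camera (the class name and output path are illustrative):

    import AVFoundation

    final class ClipRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
        let session = AVCaptureSession()
        let movieOutput = AVCaptureMovieFileOutput()

        func start() {
            session.sessionPreset = .hd4K3840x2160   // 4K capture
            guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                       for: .video, position: .back),
                  let input = try? AVCaptureDeviceInput(device: camera),
                  session.canAddInput(input), session.canAddOutput(movieOutput) else { return }
            session.addInput(input)
            session.addOutput(movieOutput)
            session.startRunning()

            // Sensor frames go to the hardware encoder; only the compressed movie hits storage.
            let url = FileManager.default.temporaryDirectory.appendingPathComponent("clip.mov")
            movieOutput.startRecording(to: url, recordingDelegate: self)
        }

        func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL,
                        from connections: [AVCaptureConnection], error: Error?) {
            // outputFileURL now holds the compressed 4K movie.
        }
    }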

    Editing that 4K video on the iPhone is the main reason why you need powerful hardware.  If you simply had to record and save it, any old phone with a hardware encoder chip and a hi-res camera sensor would work.

    Where you really need the powerful/fast hardware is when you have to dynamically compose what's onscreen using data generated by the CPU.  Let's take a case where you can't avoid involving the CPU: smoothly scaling up a hi-res image.  Sure you can use things like mipmaps (pregenerated copies of the image at various resolutions)  to "snap" the scaling to look good at certain points, but let's say you really want it to be as high resolution as possible at all points while scaling.  In which case, you need to use the CPU to re-render the image to a pixel buffer on each frame as the size changes.  Then, you need to copy that pixel buffer to the GPU and compose it with whatever else is showing onscreen (background and whatnot).  That's where the amount of data you're generating and sending to the GPU for each frame at 4K resolution is huge.  Trying to achieve 60 frames per second with that much data being passed around is a very difficult task.
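
    The arithmetic behind that, as an illustrative sketch:

    // One full-screen 4K BGRA buffer, re-rendered on the CPU and uploaded every frame.
    let bytesPerPixel = 4.0                              // BGRA, 8 bits per channel
    let frameBytes = 3840.0 * 2160.0 * bytesPerPixel     // ≈ 33 MB per frame
    let perSecondBytes = frameBytes * 60.0               // ≈ 2 GB/s at 60 fps
    print(frameBytes / 1_000_000, perSecondBytes / 1_000_000_000)
    // ~33 MB per frame, ~2 GB/s -- for a single layer, before compositing anything else.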

    So you're right: not only do you need a powerful CPU (to regenerate the dynamic/changing images) and a powerful GPU (to compose that much pixel data onscreen), but you also need a very fast system bus to move it all around.  Though you can do things like trade CPU/GPU power for bus speed by compressing the pixel buffer before you send it across the system bus (and decompressing it on the GPU).
    edited April 2016
  • Reply 106 of 109
    auxio said:

    And I have to say, I don't think it matters if Apple had 4K in the Apple TV because most people don't have 4K televisions. Hell, most people can't tell the difference between 720 and 1080.
    Not even that -- I always have a good laugh when I go over to a neighbour's house and he has a 60" LCD TV, full cable package, and all of the channels he's watching are SD!  They look absolutely terrible stretched out on that 60" TV.

    I tried to explain to him that he should watch the HD channels instead, but he doesn't understand or never remembers or something (I dunno).  There are plenty of people who just don't know and/or don't care about these things.
    Exactly. The regular person just doesn't care or doesn't know better. My in-laws still have an old TV in one of those wood cabinets, and they won't get a new one until it dies. Besides, 4K televisions are only now coming down into the realm of affordability, and there still isn't much content out there unless you're creating it yourself. It's a future tech that's available now, just like Blu-ray was when it launched. It took years to (mostly) phase out DVDs, and it will take even longer for 4K streaming to become a big thing.
    edited April 2016
  • Reply 107 of 109
    This is an unfortunate choice by Disney.  The game actually is decent, once one figures out the menus and ugly UI.

    I've noticed most of the comments are about the atv4 unit itself though, so I thought I'd share my perspective.

    I absolutely love my atv4.  It is an excellent multi-function TV device, and while I do understand some of the complaints, for me it's still the best game in town.  A few reasons:

    1. Plex and third-party apps continue to excel on the platform.
    2. Lots of truly innovative 'indie' games (check out 'Back to Bed') that get the whole 'casual gaming' approach.
    3. Ability to side load without jailbreaking.
    4. Great use of the existing apple ecosystem.
    5. Extremely high spousal acceptance factor.
    6. The platform is very stable.
    7. The streaming management (see the sketch after this list) lets you do more with less storage and is probably the future of how these things will work.  As games get bigger and bigger, you will probably never have enough storage to hold all the content for all the games you own.  This is innovative, and it works seamlessly.
    8. Mainstream apps and games, of which there are many to choose from, are gorgeous and fluid, with no lag.
    9. Games run alongside other apps without a problem.
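
    The mechanism behind point 7 is tvOS On-Demand Resources; here's a minimal Swift sketch (the tag name is illustrative) of a game pulling down one chunk of content and letting the system purge it later:

    import Foundation

    // Assets in the app's bundle are tagged in Xcode and hosted by the App Store until requested.
    let request = NSBundleResourceRequest(tags: ["level-2-assets"])
    request.beginAccessingResources { error in
        if let error = error {
            print("Could not stream assets: \(error)")
            return
        }
        // Tagged assets are now reachable through Bundle.main as if they shipped with the app.
        // ...load textures, audio, etc...
        request.endAccessingResources()   // allow the system to purge them when space is tight
    }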

    Having run an Android unit before this, I can tell you there is no comparison: the experience blows the doors off a typical streamer.

    Disney deserves a sound kick in the pants for abandoning multiple platforms a few months after launching a product, that is for sure.  I think their loss will be another company's gain. 
  • Reply 108 of 109
    saltyzip Posts: 193 member
    Apple always releases incremental upgrades. They could provide all the latest core bells and whistles now, but that wouldn't help keep the money coming in on a yearly release schedule, as R&D doesn't advance enough to warrant big technology jumps.

    It can't be long before iPhones switch to a two-year upgrade cycle, and then what will that do to Apple's iPhone cash cow? Apple doesn't actually sell that much stuff; could they creep back into being a niche product over the next decade? Google and Tesla are reinventing the car, but a car isn't a mass-production device like a phone. Apple got lucky with the right product at the right time, but that head start only lasts for a while. Look at what happened to BlackBerry and Nokia.
    edited April 2016