Apple's A8X powers iPad Air 2 graphics faster than Google's Nexus 9 with Nvidia Denver Tegra K1


Comments

  • Reply 101 of 181
    I often read references to the Metal API and its tenfold performance increase.

    But is there any benchmark that shows how an app or test runs with and without Metal acceleration? I have not found a relevant one. For me as a regular user it is difficult to find out what is just marketing and what is a real change.

    Thanks for any tips.
  • Reply 102 of 181
    haar Posts: 563 member
    ... wow, Gatorguy is all over this thread!... by implying that DED needs to be corrected, that the article is somehow inaccurate.
    BUT in every DED article, DED brings a gun to a knife fight!... and DED is an excellent shot.

    And even though Gatorguy brings a 9-inch knife, DED still has the gun... and a quick and accurate draw!


    Lollipops (lollypops) are for suckers... LOL, such a cliché, but every post of Gatorguy's seems to be that of a PR person.


    lol... how about licensing the trademark "Chupa Chups" for "Lollipop" this time... I wonder, does Nestlé use iPhones in corporate, or Nexus (Android)? (I ask because the Kit Kat name is a trademark of Nestlé.)...
  • Reply 103 of 181
    relic Posts: 4,735 member
    I presume he's a guy as that's part of his name. Couldn't be clearer.

    Relic is our resident doll.

    Hopefully not one of those sex dolls. I would hate to go through life with my mouth stuck in the "0" position, being passed around a frat house on a slow Saturday night. You guys could at least wash me before re-using, men.
  • Reply 104 of 181
    MacPro Posts: 19,817 member
    relic wrote: »

    Yeah, I was about to say, you've posted some books yourself. Nah, I'll leave it; besides you guys, no one is going to read it anyway. The longest you can hold someone's attention is about 4 lines, unless it's full of pictures.  I can't sleep, these meds are making me loopy as all hell, very trippy.

    Can you say 'you guys' these days? I thought that was the height of bad PC! :D
  • Reply 105 of 181
    MacPro Posts: 19,817 member
    haar wrote: »
    ... wow, Gatorguy is all over this thread!... by implying that DED needs to be corrected, that the article is somehow inaccurate.
    BUT in every DED article, DED brings a gun to a knife fight!... and DED is an excellent shot.

    And even though Gatorguy brings a 9-inch knife, DED still has the gun... and a quick and accurate draw!


    Lollipops (lollypops) are for suckers... LOL, such a cliché, but every post of Gatorguy's seems to be that of a PR person.


    lol... how about licensing the trademark "Chupa Chups" for "Lollipop" this time... I wonder, does Nestlé use iPhones in corporate, or Nexus (Android)? (I ask because the Kit Kat name is a trademark of Nestlé.)...

    Love the gun-to-a-knife-fight analogy when arguing with DED ... 'The Gunslinger'!

    In my head I hear the music of Ennio Morricone and see a Sergio Leone-esque backdrop.
  • Reply 106 of 181
    MacPro Posts: 19,817 member
    I presume he's a guy as that's part of his name. Couldn't be clearer.

    Relic is our resident doll.

    Well of course it's Gator'guy', but that doesn't prove anything these days ... 'Hey you guys' is said to, and by, women just as much as men. I'm sure he is, too, but I didn't want to assume without knowing. It was after his suggestion we meet that I needed to know, so as to know how to arm myself ... :D

    Resident 'doll' ... Wow, that's talk that's asking for a PC lashing these days. ;)
  • Reply 107 of 181
    gatorguy Posts: 24,575 member
    haar wrote: »
    ... wow, Gatorguy is all over this thread!... by implying that DED needs to be corrected, that the article is somehow inaccurate.
    BUT in every DED article, DED Corrections brings a gun to a knife fight!... and DED Corrections is an excellent shot when his bullets don't misfire
    Fixed

    The article is fine. Some of Corrections' thread comments concerning Nexus devices are pretty off-base.
  • Reply 108 of 181
    I presume he's a guy as that's part of his name. Couldn't be clearer.

    Relic is our resident doll.

    Well of course it's Gator'guy', but that doesn't prove anything these days ... 'Hey you guys' is said to, and by, women just as much as men. I'm sure he is, too, but I didn't want to assume without knowing. It was after his suggestion we meet that I needed to know, so as to know how to arm myself ... :D

    Resident 'doll' ... Wow, that's talk that's asking for a PC lashing these days. ;)

    No lashing required.

    As to women referring to themselves as guys: they shouldn't.
  • Reply 109 of 181
    I have just placed a preorder for the Nexus 9. While not superior to the iPad Air 2, I feel the Nexus 9 is superior to the original iPad Air. While the Air 2 is clearly superior, one feature scares me, and I have had no real reassurance: specifically, the anti-reflective coating. I have had a lot of experience with AR coatings on eyeglass lenses, and it has left me skittish. Any contact with fingers transfers oil, which makes the surface unsightly and dirty looking. A tablet inherently will be in contact with fingers. The surface cannot be protected by a cover or the AR property doesn't work. I also fear an AR coating that is constantly being touched is subject to scratching. While I trust that Apple is the best tablet maker, for this particular new technology I will probably wait for the Air 3 or Air 4.
  • Reply 110 of 181
    Quote:
    Originally Posted by Relic View Post

     

    I really like the new iPad; it's fast, beautiful screen, great apps, etc. I also really like gadgets that utilize the Nvidia K1 processor. I currently have a Jetson development board and a Lenovo ThinkVision 4K touch monitor that has an embedded Nvidia K1, and I'm impatiently, but with great anticipation, waiting for the Nexus 9. Why? Because so far every device that uses the K1 has been an open platform, meaning I can install alternative OSes like Linux, specifically Nvidia's development OS, the aptly named Linux4Tegra, based on Ubuntu 14.04. Then there is the fact that it is an extremely quick ARM CPU, regardless of comments made by members of this board, who, mind you, have most likely never used a device containing the K1 but are still so confident in their resolve that they call it crap. Again, why? Well, because some arbitrary benchmark shows that the Apple A8X is king of the hill, and ego dictates that all other competing CPUs are automatically labeled as inferior and crap.

     

    No one questioned why the Nvidia Shield, with the 32-bit variant of the K1, managed to beat the A8X and the 64-bit version of the K1 in these OpenGL tests. Even though the Nexus 9 has the same graphics core and a faster 64-bit CPU, something is definitely off. So once I get mine I will immediately install Linux4Tegra and run some benchmarks of my own, particularly encoding a large media file using CUDA and the 192 CUDA cores in the K1. I have zero doubt that the Nexus 9's GPU performance will be on par with, if not faster than, the Nvidia Shield or the Jetson dev board. In fact I would bet money on it. Either there is something wrong with the drivers in Android, or the GFXBench app used was 32-bit. Regardless, as it stands now, the K1 is still incredibly quick.

     

    Why am I so adamant in defending this chip? Well, like I said before, I have been using the K1 Jetson dev board since its release in April. It has been a fantastic platform to work on and to learn about GPU computing. When I'm home I actually use it as a normal development desktop machine: it has a monitor, keyboard, mouse, external drive, the works. I am so fascinated with GPU computing, streaming processors and parallel computing. I have been using CUDA and Bionic the most, as they're the easiest technologies for me to utilize to implement a working application; it also means I can start dipping into other projects like MilkyWay@home, PrimeGrid, Einstein@home, etc., as they're CUDA based (OpenCL versions also exist for some). I have learned so much because of that little board. Sure, yes, you can use OpenCL or another GPU-centric platform/language, but I've got to tell you, if you're a newbie it's not going to be very easy in the beginning, especially starting to produce viable applications of your own. Who knows, though, you're probably a genius, and if that is the case then I'm sorry I insulted your intelligence.
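
    For anyone wondering what "learning GPU computing on the Jetson" looks like in practice, here is a minimal sketch of a first CUDA kernel with GPU-side timing. This is my own illustrative example, not code from Relic's board; the names and sizes are arbitrary.

    // vecadd.cu -- build with nvcc (unified memory here assumes CUDA 6.0 or later)
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one element per thread
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                           // one million elements
        const size_t bytes = n * sizeof(float);
        float *a, *b, *c;
        cudaMallocManaged(&a, bytes);                    // unified memory, handy on Tegra
        cudaMallocManaged(&b, bytes);
        cudaMallocManaged(&c, bytes);
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        cudaEvent_t start, stop;                         // time the kernel on the GPU
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        cudaEventRecord(start);
        vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        cudaDeviceSynchronize();                         // make results visible to the CPU

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("kernel time: %.3f ms, c[0] = %.1f\n", ms, c[0]);

        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }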

     

    Anyway, the Linux4Tegra OS that is freely available for download not only comes pre-loaded with hundreds of apps, how-tos, example code, help manuals, etc., but if you're like me and learn how to program by deconstructing other programmers' code, ripping out snippets for your own means, then CUDA is a fantastic start, as there is so much code and there are so many how-tos available that it will make your head spin. Not to mention it comes with a fantastic SDK called the VisionWorks computer vision toolkit; spend a good day with it and you will be spewing out code faster than a 13-year-old boy who found his daddy's Playboys.

     

    But why the Nexus 9 then, Relic, it's a tablet? Well, to use just one example from the heap of things I have planned for it, imagine that you're using Blender or Lightwave on your MacBook. You're all finished with your wireframes and it's time to render; wait time, an hour plus. (I haven't done this yet, but it will be one of my priorities after I get the Nexus 9.) No problem, I have my handy portable render box right here in my bag: you grab your Nexus 9, plug in a host cable and a USB-to-Ethernet adapter, and start up the tablet, but since it's dual or even triple boot you choose Linux. Not just any Linux, but a tiny custom stripped-down version of Linux4Tegra specifically designed to start up directly into a render program. I'm still working out what render software I will be using, but for the sake of continuing this fascinating example we'll use LuxRender, as it is free and compatible with ARM, and of course MPI with CUDA to communicate, as that will also play an important part in connecting multiple devices with an Nvidia K1 into a cluster; parallel computing, because you know you can never have too much power. Oh gosh, render wait time just went down to 20 minutes, yippee. I of course left out the exact specifics on how to connect your MacBook to the Nexus 9, but you get the picture. I'm currently doing this and a whole lot more on the Jetson K1.
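
    As a rough illustration of the cluster idea, this is the usual MPI skeleton for splitting a render job across several single-GPU boards like the K1. It is only a sketch under my own assumptions: renderFrame is a placeholder, the frame count is made up, and a renderer such as LuxRender would normally handle its own network rendering rather than needing hand-rolled MPI.

    // cluster.cu -- one MPI rank per K1 board, each rank takes every Nth frame
    #include <cstdio>
    #include <mpi.h>
    #include <cuda_runtime.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank = 0, size = 1;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);    // which node am I?
        MPI_Comm_size(MPI_COMM_WORLD, &size);    // how many nodes in the cluster?

        cudaSetDevice(0);                        // each K1 board exposes a single GPU
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, 0);
        printf("rank %d of %d running on %s\n", rank, size, prop.name);

        const int totalFrames = 240;             // hypothetical animation length
        for (int f = rank; f < totalFrames; f += size) {
            // renderFrame(f);                   // placeholder: each node renders every size-th frame
        }

        MPI_Finalize();
        return 0;
    }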

     

    I went into all of this to try to put a personal connection to this technology, even though most of you will never do, or even think of doing, anything that I do. Did that make sense? The Nvidia Denver K1 is a fantastic CPU, and if you spent just a few minutes in the Nvidia development forums to see what people are doing and creating with this thing, I believe your perceptions would change. The A8X is an absolute beast, but for the time being all of that power is locked away in a single platform; a great platform, but locked away nonetheless. The K1, however, is free to roam around, and like I said at the beginning of my post, regardless of your silly need for Apple to always be on top, the K1 is still one of the fastest ARM chips available; only the A8X beats it, and that's because it has 3 cores versus the K1's 2. Look how many chips are beneath it though, a whole lot, huh? So is it really junk? What's going to happen in Q1 2015 when Nvidia releases its 4-core variant? What I'm trying to get at is: did the K1 really need to be faster than the A8X to not be called junk?

     

    I'm not talking to any one person directly; please forgive the tone or content of this post. I have new meds that are affecting me profoundly. 


     

    I swear that every time you open your mouth I learn something.

     

    Please keep typing!

     

    Edit: I just thought that you should mine some bitcoin while you happen to be in the hospital and use their electricity ;)

  • Reply 111 of 181
    tht wrote: »
    Huh? 5.0 is 32-bit only? (I don't follow Android closely). And a 64-bit release is waiting on a 5.0.x or 5.x release in 2015?

    Yep, Android 5.0 is 32-bit even if compatible with 64-bit CPUs.

    At present there is only one publicly available 64-bit OS image in the SDK, but it is Intel-based (Atom) and experimental. So my guess is that the first 64-bit Android OS will be released with x86 hardware.
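
    One concrete way to see whether a given device's userland is really 64-bit, independent of what the spec sheet says: a tiny native program built with the NDK for the device's ABI and run over adb will report the pointer width it actually gets. This is just an illustrative check on my part, nothing official.

    // abicheck.cpp -- 4-byte pointers mean a 32-bit process, 8-byte pointers a 64-bit one
    #include <cstdio>

    int main() {
        printf("pointer size: %zu bytes -> %s process\n",
               sizeof(void *), sizeof(void *) == 8 ? "64-bit" : "32-bit");
        return 0;
    }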

    tht wrote: »
    For GPU, 32-bit vs 64-bit won't make much difference. Then, with Denver, I kind of wonder what the performance will really be like with 64-bit instructions.

    I don't know, but my feeling is that the K1 is quite close to the A8X and probably even better in some areas, while Android+K1 is definitely inferior to iOS+A8X.
  • Reply 112 of 181
    bradipao wrote: »
    my guess is that the first 64-bit Android OS will be released with x86 hardware.
    Wouldn't x86 be a battery killer? I guess it could happen on a device like Nexus Player where power consumption is less of an issue.

    ARMv8-A looks like a much more likely target for 64-bit Android on handhelds.
  • Reply 113 of 181
    Originally Posted by frankfarmer View Post

    Wouldn't x86 be a battery killer? I guess it could happen on a device like Nexus Player where power consumption is less of an issue.

     

    Isn’t Broadwell set to launch with... what, 10w chips? x86 is definitely worse in that department than ARM, but Android is no stranger to THE WORST BATTERY LIFE IMAGINABLE. Remember the first LTE phones? “Phones”, I guess, since they were the precursors to today’s Palm Pilots. Guess we need a new name for those, though. ‘Phablet’ is stupid and autocorrect even refuses to let me keep it typed. :p

  • Reply 114 of 181
    Dan_Dilger Posts: 1,584 member
     
    Originally Posted by Relic View Post

     

    I really like the new iPad; it's fast, beautiful screen, great apps, etc. I also really like gadgets that utilize the Nvidia K1 processor. I currently have a Jetson development board and a Lenovo ThinkVision 4K touch monitor that has an embedded Nvidia K1, and I'm impatiently, but with great anticipation, waiting for the Nexus 9. Why? Because so far every device that uses the K1 has been an open platform, meaning I can install alternative OSes like Linux, specifically Nvidia's development OS, the aptly named Linux4Tegra, based on Ubuntu 14.04. [...]

     

    I'm not talking to any one person directly; please forgive the tone or content of this post. I have new meds that are affecting me profoundly. 


     

    Don't worry, nobody is taking your K1 development devices.



    However, it's interesting that you find it critically important on an ideological level that mobile devices be able to run an open OS (at least Nvidia's Linux) while you also find it compelling to learn about GPU development using Nvidia's proprietary CUDA language. Sounds more like you're just an Nvidia fan, as revealed in your repeating the idea that the K1 has "192 cores," pure marketing nonsense, as the article stated.

     

    Which is fine, but looking at how few people are using the K1 in shipping products, Nvidia's mobile parts aren't likely to stick around. So if you really want to learn about mobile GPUs, it might be smarter to learn something that's actually open (like OpenGL ES) or something that's going to be commercially viable (Metal).

     

    Also, the idea of using a Nexus 9 for distributed rendering is completely silly. Despite all of Nvidia's marketing, the K1, like the A8X, Snapdragon/Adreno, etc., is a mobile chip optimized for energy-efficient rendering of 2D/3D on relatively small displays with 1-4K pixel resolutions.

     

    The Nvidia/AMD graphics in a laptop or desktop completely blow away the graphics power being installed in mobile devices. Nvidia might like the idea of selling a string of proprietary devices with its Tegra K1, but that doesn't even make sense on any level outside of marketing fluff. More problematically, nobody is using the K1 apart from a couple demonstration tablets designed to look like an iPad rather than a conventional 16:9 Android tab.

     

    And that really makes you wonder: why are Xiaomi and Google leaving the Honeycomb model and adopting Apple's 2010 format for tablets? But the answer is too obvious. 

     

    Also, when you say "I have zero doubt that the Nexus 9's GPU performance will be on par with, if not faster than, the Nvidia Shield or the Jetson dev board. In fact I would bet money on it," it also highlights what else you missed by not actually reading the article: it's not that the Nexus 9's K1 is slower than the Shield's K1, it's that it's doing more. The Shield has a stretched smartphone resolution. The Nexus 9 has 50% more pixels because it has the same resolution as an iPad. So the Nexus 9 (and iPad Air 2) are working on adult tasks while the Shield is doing kid stuff.
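
    A quick back-of-the-envelope on that point, under my own assumption that the Shield's "stretched smartphone" raster is 1920x1080 while the Nexus 9 and iPad Air 2 render at 2048x1536:

    // pixels.cpp -- per-frame pixel counts for the two workloads (resolutions assumed)
    #include <cstdio>

    int main() {
        const long shield = 1920L * 1080L;   // about 2.07 million pixels
        const long nexus9 = 2048L * 1536L;   // about 3.15 million pixels
        printf("Nexus 9 pushes %.0f%% more pixels per frame than the Shield\n",
               100.0 * (nexus9 - shield) / shield);   // prints roughly 52%
        return 0;
    }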

     

    In other benchmarks, the two K1 variants deliver slightly different CPU results (the 64-bit K1 has half the cores, so its two larger CPU cores beat the four smaller cores of the 32-bit K1 in single-core tests; the 64-bit K1 is about as fast as the A8X in single-core, but the A8X has three cores, so it's faster than the Nexus 9 in multi-core CPU tests). 

     

    But as Gatorguy notes, Google only makes Nexus products as a way to spoonfeed the third-rate hardware vendors who can't figure out how to effectively copy Apple on their own, so nobody--not even Google--expects the Nexus 9 to actually sell in meaningful quantities. The non-iPad tablets sold in 2015 will likely be just like those sold this year: some Lenovo Atoms, some Samsung Qualcomm/Exynos, and lots of MediaTek stuff that sells for $50 in China and on walmart.com.

     

    Apart from Google experiments, no mobile makers have expressed any real interest in Nvidia's 64-bit Tegra K1, which is unsurprising after four years of over-promising, under-delivering Tegra flops. 



    And really, Nvidia's Project Denver is looking a lot like another ambitious VLIW project that was similarly hyped to hell before it finally arrived with performance no better than its competitors: Intel's Itanium. Funny how history keeps repeating like that. 
     

  • Reply 115 of 181
    Quote:
    Originally Posted by Tallest Skil View Post

     
    Originally Posted by frankfarmer View Post

    Wouldn't x86 be a battery killer? I guess it could happen on a device like Nexus Player where power consumption is less of an issue.

     

    Isn’t Broadwell set to launch with... what, 10w chips? x86 is definitely worse in that department than ARM, but Android is no stranger to THE WORST BATTERY LIFE IMAGINABLE. Remember the first LTE phones? “Phones”, I guess, since they were the precursors to today’s Palm Pilots. Guess we need a new name for those, though. ‘Phablet’ is stupid and autocorrect even refuses to let me keep it typed. :p


    4.5-watt TDP if you are talking about this Broadwell chip:

     

    http://www.anandtech.com/show/8515/quick-look-at-core-m-5y70-and-llama-mountain

     

    Not too shabby for performance per watt.

     

    Edit:  Hey this was post #100.  Couldn't have gotten here without you Tallest Skil :) woot!

  • Reply 116 of 181
    Originally Posted by TechLover View Post

    4.5 watt TDP if you are talking about this broadwell chip:



    Oh, yes, that’s it. That’s really impressive, in my mind, particularly for x86. Compare to, well, the low end Celeron chip in the first-gen Apple TV that pulled 100 watts (or was that “idled at 100º”... I’m so worthless...) and the current ARM system in the Apple TV that pulls 6 watts at full draw. Anyway, now that Intel has something with comparable power draw to ARM (or will if they ever get the family out) I think the boys at their lab can breathe a sigh of semi-relief for the near future.

  • Reply 117 of 181

    GPU benchmarks in the absence of image quality comparisons are meaningless. In the past you'd never see AnandTech reviewing a desktop discrete GPU without considering what's being rendered. What kind of texture filtering quality is PowerVR doing versus the K1? What kind of multisampling or super-sampling? How is the performance of geometry shaders or tessellation? There have been too many apples-to-oranges cheats in 3D rendering hardware in the past.

     

    GFXBench shows the A8's PowerVR is roughly 50% worse than the K1 in Rendering Quality PSNR tests. This indicates that the PowerVR pipeline is taking shortcuts with image quality.
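
    For anyone unfamiliar with the metric: PSNR just scores how far a rendered frame drifts from a reference image, so a lower number means the GPU's shortcuts are visibly changing the output. A minimal sketch of the computation for two same-size 8-bit grayscale captures (my own illustration; GFXBench's exact methodology is more involved):

    // psnr.cpp -- peak signal-to-noise ratio of a test frame against a reference
    #include <cstdio>
    #include <cstddef>
    #include <cmath>

    double psnr(const unsigned char *ref, const unsigned char *test, std::size_t n) {
        double mse = 0.0;                                // mean squared error
        for (std::size_t i = 0; i < n; ++i) {
            const double d = double(ref[i]) - double(test[i]);
            mse += d * d;
        }
        mse /= double(n);
        if (mse == 0.0) return INFINITY;                 // identical images
        return 10.0 * std::log10((255.0 * 255.0) / mse); // higher dB = closer to reference
    }

    int main() {
        const unsigned char a[4] = {10, 20, 30, 40};     // toy "reference" pixels
        const unsigned char b[4] = {12, 20, 28, 40};     // toy "test" pixels
        printf("PSNR: %.2f dB\n", psnr(a, b, 4));        // prints about 45 dB
        return 0;
    }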

     

    We used to tolerate this on the desktop, but we don't anymore, because it allowed GPU vendors to get away with claiming the benchmark crown by lowering image quality. Many vendors used to cheat (ask for tri-linear filtering, and get a trilinear approximation instead)

     

    The PowerVR Rogue that the A8X has inside it is a DirectX 10-class GPU, whereas the Tegra K1 is a DirectX 12-class GPU, which is another difference. The K1 simply has to spend more transistors on a more powerful, general-purpose rendering block.

     

    One reason why PowerVR lost on the desktop was that it could not deliver performance with equal quality to traditional immediate-mode renderers, and this was especially true as you scaled up the geometry workload, so I'd be really interested in the A8X's tessellation performance versus the K1.

     

    The story is not cut and dried. If you want to see a real comparison between the A8X and the Denver K1, wait for someone like AnandTech to do it, and encourage them to measure frame rates at equal image quality.

     

  • Reply 118 of 181
    Quote:

    Originally Posted by Tallest Skil View Post

     
    Originally Posted by TechLover View Post

    4.5 watt TDP if you are talking about this broadwell chip:



    Oh, yes, that’s it. That’s really impressive, in my mind, particularly for x86. Compare to, well, the low end Celeron chip in the first-gen Apple TV that pulled 100 watts (or was that “idled at 100º”... I’m so worthless...) and the current ARM system in the Apple TV that pulls 6 watts at full draw. Anyway, now that Intel has something with comparable power draw to ARM (or will if they ever get the family out) I think the boys at their lab can breathe a sigh of semi-relief for the near future.


    So this is kind of off topic, but isn't it amazing how much less power-hungry everything has become?

     

    We used to run desktops with CRT monitors 24/7 that consumed 200 watts no matter what was going on.  Now we use laptops, tablets, and phones (well some of us still have desktops) that sip power at a fraction of what they have replaced, all while achieving orders of magnitude more work.

  • Reply 119 of 181
    Originally Posted by TechLover View Post

    So this is kind of off topic, but isn't it amazing how much less power-hungry everything has become?

     

    We used to run desktops with CRT monitors 24/7 that consumed 200 watts no matter what was going on.  Now we use laptops, tablets, and phones (well some of us still have desktops) that sip power at a fraction of what they have replaced, all while achieving orders of magnitude more work.


     

    Indeed. And not only less power, but less waste heat! Computer-wise, conversions aren’t getting all that much more  efficient (I don’t know about the newest Mac Pro, but between the 2006 and 2009 models the PSU went from 86 to 90% efficient), but lightbulbs? Incandescent was roughly 90/10 heat/light. LEDs have flipped that, meaning less power for the same brightness and cooler running to boot.

  • Reply 120 of 181
    Quote:

    but lightbulbs? Incandescent was roughly 90/10 heat/light. LEDs have flipped that, meaning less power for the same brightness and cooler running to boot.


    Totally.  I'll admit it, I am a light bulb dork....

     

    Get this, I have this one light that is outdoors in a covered fixture that I leave on 24 hours a day, 365 days a year.  I write the date on it so I know how long they last.  It goes through the four seasons, hot and cold, and a compact fluorescent burns out in about 10,000 hours. So I always get just over a year with a CFL with this outdoor fixture turned on all the time.  Been doing this for a few years now. 

     

    This last time I replaced it with an LED.  Supposed to last like 5 years.  So we'll see.  Where's that receipt anyway?

     

    Go ahead and make fun :)  pretty sure we're all nerds here.

     

    Edit: I meant to quote @Tallest Skil and managed to bork it up.
