Nvidia unveils new quad-core Tegra 3 processor to challenge Apple's A5


  • Reply 21 of 82
    aizmovaizmov Posts: 989member
    Quote:
    Originally Posted by Apple ][ View Post


    Are Android devices still going to be shit? Plagued by lag and choppiness that comes from using a shitty and inferior OS?



    At the end of the day, the specs don't mean much. Quad-core, schmad-core, what matters is how a device actually works and operates. And if a device is not silky smooth, then who gives a crap about how many cores it has?



    Fandroid: My new Android tablet is so fucking amazing! I just bought it today!

    iPad user: Oh yeah, what's so great about it?

    Fandroid: It's much better than your iPad 2!

    iPad user: Oh really, why is that?

    Fandroid: It has a new octo-core chip, 1.8 GHz!

    iPad user: That's nice, but I notice that the UI on your tablet is still very choppy. Why is that?

    Fandroid: Ever since my mother dropped me on my head when I was an infant, I lost the ability to see more than 10 frames per second, so to me, I can't really tell the difference. And besides, did I mention that it has a new octo-core chip, 1.8 GHz?

    iPad user: Yeah, you did. Are you leaving already?

    Fandroid: Yep, I have to split. I just got a tweet and I found out that they're releasing a newer model of my tablet in three weeks' time, 1.9 GHz! So I'm going to try and sell mine before it becomes obsolete. The new model is going to be running the newest OS, sweet vanilla yogurt muffins mixed with drunken puke, I have to get me that! It's not easy being a Fandroid!



    Hilarious

    May I use this mock dialogue on some other site?
  • Reply 22 of 82
    Why do some sites say Tegra 3 has five cores?



    http://www.ubergizmo.com/2011/09/tegra-3-5-cores/
  • Reply 23 of 82
    cmvsmcmvsm Posts: 204member
    Everyone should have learned by now that higher specs on iPad-type devices do not translate into a better user experience - not by a large margin. The hardware is just part of the equation; the Tegra 2 was a perfect example of that.
  • Reply 24 of 82
    Quote:
    Originally Posted by techpr View Post


    a Quad Core to Challenge a Dual Core LOOL



    I'd go for it.



    The Cortex A15 dual-core is supposed to be a better performer than a Cortex A9 quad-core -- AND use less power too!



    http://www.youtube.com/watch?v=F_wwgTMcGXI



    Then just add PowerVR SGX 600 series (Rogue), and the A5 will be quite the slouch...



    http://www.electronista.com/articles...gx600.details/
  • Reply 25 of 82
    ruel24ruel24 Posts: 432member
    Quote:
    Originally Posted by Apple ][ View Post


    Are Android devices still going to be shit? Plagued by lag and choppiness that comes from using a shitty and inferior OS?



    At the end of the day, the specs don't mean much. Quad-core, schmad-core, what matters is how a device actually works and operates. And if a device is not silky smooth, then who gives a crap about how many cores it has?



    Fandroid: My new Android tablet is so fucking amazing! I just bought it today!

    iPad user: Oh yeah, what's so great about it?

    Fandroid: It's much better than your iPad 2!

    iPad user: Oh really, why is that?

    Fandroid: It has a new octo-core chip, 1.8 GHz!

    iPad user: That's nice, but I notice that the UI on your tablet is still very choppy. Why is that?

    Fandroid: Ever since my mother dropped me on my head when I was an infant, I lost the ability to see more than 10 frames per second, so to me, I can't really tell the difference. And besides, did I mention that it has a new octo-core chip, 1.8 GHz?

    iPad user: Yeah, you did. Are you leaving already?

    Fandroid: Yep, I have to split. I just got a tweet and I found out that they're releasing a newer model of my tablet in three weeks' time, 1.9 GHz! So I'm going to try and sell mine before it becomes obsolete. The new model is going to be running the newest OS, sweet vanilla yogurt muffins mixed with drunken puke, I have to get me that! It's not easy being a Fandroid!



    That's pretty accurate... lol. Android is one choppy OS. When I swipe across the screen, the icons lag behind my finger, and the gap grows as the swipe continues. My finger would reach the edge of the screen while the icons that were under it when I first touched down were only halfway across in the animation. They still haven't fixed this issue after all this time? The iPad and iPhone just feel natural to work with.
  • Reply 26 of 82
    wizard69wizard69 Posts: 13,377member
    Quote:
    Originally Posted by irnchriz View Post


    The A5 is clocked at 800 MHz and performs better than dual-core 1.4 GHz CPUs on Android due to the operating system. Android is the weak link making the silicon look bad. Until this is addressed, nothing is going to change.



    It would be very interesting to see Linux, with a light weight desktop environment, running on this chip.
  • Reply 27 of 82
    ruel24ruel24 Posts: 432member
    Quote:
    Originally Posted by uncore View Post


    Why do some sites say Tegra 3 has five cores?



    http://www.ubergizmo.com/2011/09/tegra-3-5-cores/



    That seems like a pretty novel approach, on paper. Real world experience will tell whether they succeeded or failed.
  • Reply 28 of 82
    blastdoorblastdoor Posts: 3,843member
    Quote:
    Originally Posted by nonimus View Post


    I'd go for it.



    The Cortex A15 dual-core is supposed to be a better performer than a Cortex A9 quad-core -- AND use less power too!



    http://www.youtube.com/watch?v=F_wwgTMcGXI



    Then just add PowerVR SGX 600 series (Rogue), and the A5 will be quite the slouch...



    http://www.electronista.com/articles...gx600.details/



    That makes a lot of sense to me.
  • Reply 29 of 82
    ruel24ruel24 Posts: 432member
    Quote:
    Originally Posted by wizard69 View Post


    It would be very interesting to see Linux, with a light weight desktop environment, running on this chip.



    It'll be a slug and a battery waster. Linux is not optimized to run on battery-powered devices with such limited resources.



    http://www.phoronix.com/scan.php?pag..._regress&num=1



    There will have to be significant work done to Linux to make it viable. Otherwise, it'll be more like a Windows Slate device, where battery runtime is very low and the OS very sluggish. The problem with Linux (and I'm a big fan and user of it, actually writing this from PCLinuxOS right now) is that the kernel is predominantly developed by groups whose interest is in the server arena. Groups like Novell, Red Hat, IBM, and HP have very little interest in the desktop. Con Kolivas used to write patches for the Linux kernel to improve desktop performance, which was largely ignored by the mainline developers; he quit in frustration in 2007, then in 2009 released the BFS scheduler to improve desktop performance. Google has undoubtedly worked on the Linux kernel to improve battery life to some degree; I'm not sure how much of that code got sent upstream.
  • Reply 30 of 82
    ronboronbo Posts: 669member
    Quote:
    Originally Posted by GalaxyTab View Post


    I've never understood the obsession with calling pre-announced products "Vapor". It screams of people with their head in the sand or acting like a child with its eyes shut and fingers in its ears screaming to try to ignore what is happening.



    Speaking of obsession, indeed. In the future, it would help to remember that such points work much better if you wait until someone actually mentions your pet peeve before you go off on them about it.
  • Reply 31 of 82
    I noted that they have the half-speed processor in there for regular duty - which is probably why they can claim 10 hours of battery life "in average use" for the Asus tabbie. It will be interesting to see what the wattage is for the quad-core in actual use. Then again, given Android devs' "freedom" to do what they want on the platform, I expect that a number of games will leap immediately to using the full-speed cores and graphics processing and erode that battery life to a couple of hours. For the average Android user this shouldn't be an issue, but the incessantly self-congratulatory geek users/gamers will be running the bad boy tethered to power to save battery.



    I'm hoping that the Android team is looking this over carefully so that Ice Cream Sandwich, or Cotton Candy, or Cream Puff, or whatever the next "dessert" is, can provide users with some dedicated core-management controls to maintain battery life on the phones and tabbies. The number-one complaint from my less rabid Android-owning friends is battery life under the current distros (Eclair/Froyo/Gingerbread); stuttery controls, their next-favorite gripe, are a moot point at best. In fact, I have to caution my average-user friends not to talk too much about battery life or controls around my phAndroid phriends - or else they will get a 10-minute tirade about how sweet everything is and how they should be reveling in their "phreedom" and not trapped like all the Apple users. Ironically, a recent tirade like that caused one of my friends to go try out and ultimately purchase an iPad 2. Boy, did that unleash a follow-up floor-kicking tantrum! Sigh.
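    For the curious, the kind of "dedicated core-management control" being asked for already has a rough analogue in Linux's cpufreq governors. Here's a minimal sketch of ondemand-style logic with a user-set ceiling - all thresholds and frequency steps below are invented for illustration, not real Tegra 3 numbers:

```python
# Sketch of an "ondemand"-style cpufreq governor decision with a
# user-settable ceiling. Thresholds and frequency steps are invented
# for illustration; they are not real Tegra 3 numbers.

def pick_frequency(load_pct, available_khz, user_cap_khz=None):
    """Return the lowest frequency that keeps ~20% headroom over the load."""
    candidates = sorted(available_khz)
    if user_cap_khz is not None:
        # A battery-saver control could simply cap the ceiling.
        candidates = [f for f in candidates if f <= user_cap_khz]
    top = candidates[-1]
    for f in candidates:
        if load_pct / 100 * top <= 0.8 * f:
            return f
    return top

steps = [500_000, 1_000_000, 1_300_000]  # kHz
pick_frequency(20, steps)             # light load -> 500_000 (lowest step)
pick_frequency(90, steps)             # heavy load -> 1_300_000
pick_frequency(90, steps, 1_000_000)  # user cap keeps it at 1_000_000
```

    A user-facing battery control would essentially just expose that cap (the real knob on Linux is `scaling_max_freq` in sysfs).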



    Anyway, will Intel be able to drive competition for this level of development with the Atom series?
  • Reply 32 of 82
    drdoppiodrdoppio Posts: 1,132member
    Quote:
    Originally Posted by nonimus View Post


    I'd go for it.



    The Cortex A15 dual-core is supposed to be a better performer than a Cortex A9 quad-core -- AND use less power too!



    http://www.youtube.com/watch?v=F_wwgTMcGXI



    ...



    He was comparing a quad-core A9 to a dual-core A9, not to a dual-core A15. Newer processors will always be better, duh...
  • Reply 33 of 82
    ronboronbo Posts: 669member
    I'm delighted to see people pushing the ARM forward. I think Apple is pragmatic enough that if nVidia does come up with a really terrific design that blows away Apple's next candidate processor, they could just switch to the nVidia chip. They're all just ARMs, right? From a developer standpoint, nothing would need to change. We wouldn't even need to recompile.



    Can someone with knowledge on the matter answer these questions: are these things pin compatible? Do they all have identical instruction sets, or does nVidia or Apple get to add custom instructions without breaking their licenses?
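    Partial answer on the instruction-set side: licensed Cortex-A9 implementations share the ARMv7-A base ISA, and the differences that matter to software are mostly optional extensions (NEON, VFP revisions - Tegra 2 famously shipped without NEON), which Linux/Android advertise in the "Features" line of /proc/cpuinfo. A sketch of checking them, run against an illustrative Cortex-A9-style sample rather than real Tegra 3 output:

```python
# Checking optional ISA extensions the way Linux/Android exposes them:
# the "Features" line of /proc/cpuinfo. The sample text below is an
# illustrative Cortex-A9-style string, not captured from a real device.

SAMPLE_CPUINFO = """\
Processor       : ARMv7 Processor rev 9 (v7l)
Features        : swp half thumb fastmult vfp edsp thumbee neon vfpv3
CPU implementer : 0x41
"""

def cpu_features(cpuinfo_text):
    """Return the set of feature flags from a /proc/cpuinfo dump."""
    for line in cpuinfo_text.splitlines():
        if line.lower().startswith("features"):
            return set(line.split(":", 1)[1].split())
    return set()

feats = cpu_features(SAMPLE_CPUINFO)
"neon" in feats   # NEON code paths would be safe on this part
"idiva" in feats  # hardware integer divide is absent here
```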
  • Reply 34 of 82
    Quote:
    Originally Posted by ruel24 View Post


    It'll be a slug and a battery waster. Linux is not optimized to run on battery-powered devices with such limited resources.



    http://www.phoronix.com/scan.php?pag..._regress&num=1



    There will have to be significant work done to Linux to make it viable. Otherwise, it'll be more like a Windows Slate device, where battery runtime is very low and the OS very sluggish. The problem with Linux (and I'm a big fan and user of it, actually writing this from PCLinuxOS right now) is that the kernel is predominantly developed by groups whose interest is in the server arena. Groups like Novell, Red Hat, IBM, and HP have very little interest in the desktop. Con Kolivas used to write patches for the Linux kernel to improve desktop performance, which was largely ignored by the mainline developers; he quit in frustration in 2007, then in 2009 released the BFS scheduler to improve desktop performance. Google has undoubtedly worked on the Linux kernel to improve battery life to some degree; I'm not sure how much of that code got sent upstream.



    I dunno. Google talked a lot about pushing its innovations back to Linux, but since the Android team itself is pretty small, I don't think they have the resources - nor do I think Google would, for appearances'/PR's sake, spend the dollars to staff for Linux support. Commentary out there is not promising:



    Linus Torvalds (via ZDNet): he thought that Android could be merged back into Linux as a common kernel, but not effectively for 4-5 years.



    Linux.com and Greg Kroah-Hartman: Linux kernel maintainers felt that Google did not show any intent to maintain their own code. G K-H also indicated he was concerned that Google was no longer trying to get their code changes included in mainstream Linux.



    Computerworld: Android devs have claimed that "the Android team was getting fed up with the process" and felt upstreaming was a low priority, given more pressing Android core-development needs and inadequate staffing to address it.



    The Linux teams haven't been sitting still, though: last year Rafael Wysocki patched the mainline Linux wakeup-events framework so that it was easier to merge Android device drivers that use wakelocks into mainline Linux, but recommended that Android's opportunistic suspend not be included.



    So the question is, is Google serious about maintaining compatibility with Linux, or is it just pandering to the Linux community as a PR tool?
  • Reply 35 of 82
    Quote:
    Originally Posted by GalaxyTab View Post


    http://en.wikipedia.org/wiki/Vaporware

    Nope.

    Nope.

    Nope.

    I've never understood the obsession with calling pre-announced products "Vapor". It screams of people with their head in the sand or acting like a child with its eyes shut and fingers in its ears screaming to try to ignore what is happening.



    Kal-El is due out in December. It's very much real.



    Just to be clear, it's "vapor", not "vaporware".



    "Vaporware" refers to something that is announced but the commenter presumes won't be released. For example "almost every article predicting what the next iDevice will be like is useless because they are just describing vaporware".



    "Vapor" is something that will be released, but the commenter believes it may not live up to the hype. Kind of like saying "I'll believe it when I see it". For example "I know you said your car was that fast, but until I see it on the track it's just vapor".



    I'm not sure who coined the term, but they must have been an idiot. It sounds stupid, it doesn't make sense and there was always going to be confusion between "vapor" and "vaporware".
  • Reply 36 of 82
    Quote:
    Originally Posted by DrDoppio View Post


    Just finished reading the review of the ASUS Transformer Prime (on another website) - a truly fantastic device! I have played with the current Transformer at Best Buy, and it was the tablet I enjoyed most. IMO, with the keyboard, the upcoming Prime offers usability somewhere between the iPad 2 and the Air.



    The fifth core in the Kal-El chip is an interesting feature. Reportedly, it will offer great battery savings upon lower usage, including audio (and even video, although I suppose not HD) playback.



    The Kal-El chip is an interesting design. There is a race right now to get the most performance out of a low-power ARM design without sacrificing energy-consumption metrics. The older Tegra 2 looked good on paper but was absolutely trounced by the A5, especially when it came to graphics, which is (ironically) Nvidia's home turf. Using a five-core design in which one core is a special low-power version has some advantages:



    1. Great performance with 4 cores if the application is multithreaded



    2. The low power core can extend the battery life considerably. It might be possible to use it exclusively in low battery situations.



    Disadvantages:



    1. You need to rework Android's kernel to prioritize which cores are used for which tasks.



    2. The extra core makes the die larger, which makes the chip more expensive to produce.



    It's also unknown whether the graphics performance will rival what Apple has in the A5 (or the upcoming, likely four-core A6). The one thing about this design is that I doubt Apple would have solved the performance/power problem the same way. I've been wondering how much the chip-making skills/IP Apple purchased over the last few years have been separating it from other ARM designs, and we're starting to see the answer, especially in the A5. The A5 in the iPad, clocked at 1 GHz, routinely matches rival chips clocked at 1.2-1.3 GHz, and the iPhone 4S at 800 MHz does just fine against other phones. Apple's battery life for iOS is the envy of the tech world (iPhone 4S bugs notwithstanding).



    It's said that the A4 and A5 can dynamically shift their clock speed depending on the task, which may make special low-power cores like Kal-El's unnecessary (and the chip cheaper to build). The A6 should have similar features, and a 28 nm process would only make it more power-friendly. Still, the A6 is not likely to see the light of day until the iPad 3... whenever that happens. Kal-El will have its day in the sun for the time being. I also believe that Nvidia likely has other plans for this chip: the embedded market (media streamers, TVs, and other devices) might have good use for it, even if the Android tablet market continues to falter.
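    The scheduling problem in disadvantage 1 can be made concrete with a toy model. Everything here is hypothetical (the 0.6 threshold especially); the real handoff lives in firmware/kernel code and is invisible to applications:

```python
# Toy model of the cluster-migration policy a companion-core design
# implies: stay on the single low-power core while demand is light,
# hand the whole workload to the quad cluster once it can't keep up.
# The 0.6 threshold is hypothetical.

COMPANION_MAX_LOAD = 0.6  # fraction of the slow core's capacity

def choose_cluster(runnable_threads, per_thread_load):
    total = runnable_threads * per_thread_load
    if runnable_threads == 1 and total <= COMPANION_MAX_LOAD:
        return "companion"  # audio playback, idle screen, background sync
    return "quad"           # games, video encoding, multithreaded apps

choose_cluster(1, 0.2)  # -> "companion"
choose_cluster(4, 0.8)  # -> "quad"
```

    The hard part in practice is everything this model leaves out: migration cost, hysteresis so the chip doesn't ping-pong between clusters, and predicting load before it arrives.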
  • Reply 37 of 82
    Quote:
    Originally Posted by mdriftmeyer View Post


    By the time Nvidia's Tegra is ready to be released Apple will be gearing up for the iPad 3 ramp up.



    The Transformer Prime is going to be released next month, and it will be running one. How about you wait until you get one in hand before you judge it?
  • Reply 38 of 82
    Why are they spinning this as if Nvidia is catching up? The A5 came out, and the Tegra 2 came out at A4 speeds. Now the A6 will be out soon, and the Tegra 3 is coming out at A5 speeds. They are not any closer to catching up. Apple always underclocks its chips for battery savings; if Nvidia does the same, they may actually perform worse than the A5.
  • Reply 39 of 82
    Quote:
    Originally Posted by Ronbo View Post


    I'm delighted to see people pushing the ARM forward. I think Apple is pragmatic enough that if nVidia does come up with a really terrific design that blows away Apple's next candidate processor, they could just switch to the nVidia chip. They're all just ARMs, right? From a developer standpoint, nothing would need to change. We wouldn't even need to recompile.



    Can someone with knowledge on the matter answer these questions: are these things pin compatible? Do they all have identical instruction sets, or does nVidia or Apple get to add custom instructions without breaking their licenses?



    The problem for Nvidia is that the Tegra 2 didn't even really challenge the A5...it was pretty much dead on arrival in terms of graphics performance, which is Nvidia's strong suit. They say that Kal-El is twice as fast as the A5 at tasks like video encoding. Considering they are benchmarking four cores against two with a test that is easy to split among cores, I would certainly hope so. The real test will be whether they can advance the field in single-core performance and especially in graphics performance. Just equaling or nominally improving on the A5 won't cut it...you know Apple's A6 will blow that away. When the benchmarks are finally published, it will be interesting to see what Kal-El is really capable of.



    And to answer your question, Apple has invested billions in their own processor designs (acquisitions, research, paying for fabs). It's unlikely they would just dump it now and go to Nvidia...not when they have been the performance/power leader for the last few years.
  • Reply 40 of 82
    jragostajragosta Posts: 10,473member
    Quote:
    Originally Posted by shompa View Post


    Nvidias solution with a companion core is interesting..



    That's what caught my eye. It sounds like the low-power companion core will be running most of the time, and they'll only fire up the quad-core cluster for really CPU-intensive stuff. That could save energy, although possibly at the expense of performance when in 'low speed' mode.



    I believe that the A5 and A6 have on-demand cores - meaning they only need to run one core unless the workload demands more. That achieves much of the same thing without requiring a separate core. Time will tell which approach is better.
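    A rough way to see why the two approaches can differ at all - every number below is invented for illustration. A fast core slowed down by DVFS still pays its (high-leakage) static power, while a separate core built on a low-leakage process can undercut it at the light loads it can actually sustain:

```python
# Back-of-the-envelope comparison of DVFS-only vs. a companion core.
# All coefficients are invented for illustration.

def dvfs_power(load, leak_w=0.4, dyn_w_at_max=1.2, min_ratio=0.3):
    """Fast core scaled down by DVFS; dynamic power ~ (f/f_max)^3."""
    ratio = max(load, min_ratio)  # can't scale frequency below 30% here
    return leak_w + dyn_w_at_max * ratio ** 3

def companion_power(load, leak_w=0.05, dyn_w_at_max=0.3):
    """Slow low-leakage companion core at the same (light) load."""
    return leak_w + dyn_w_at_max * load ** 3

companion_power(0.1) < dvfs_power(0.1)  # True: companion wins when idle-ish
```

    Which approach actually wins depends on real leakage numbers and on how much time the device spends near idle - which is exactly why "time will tell" is the right call.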



    Quote:
    Originally Posted by uncore View Post


    Why do some sites say Tegra 3 has five cores?



    http://www.ubergizmo.com/2011/09/tegra-3-5-cores/



    A quad-core Tegra plus one companion (low-speed) core.



    Quote:
    Originally Posted by Sevenfeet View Post


    The problem for Nvidia is that the Tegra 2 didn't even really challenge the A5...it was pretty much dead on arrival in terms of graphics performance, which is Nvidia's strong suit. They say that Kal-El is twice as fast as the A5 at tasks like video encoding. Considering they are benchmarking four cores against two with a test that is easy to split among cores, I would certainly hope so. The real test will be whether they can advance the field in single-core performance and especially in graphics performance. Just equaling or nominally improving on the A5 won't cut it...you know Apple's A6 will blow that away. When the benchmarks are finally published, it will be interesting to see what Kal-El is really capable of.



    Yes, but history suggests that skepticism is appropriate. Nvidia made all sorts of performance claims in the past that weren't true, so it makes sense to take their current claims with a grain of salt.



    Quote:
    Originally Posted by Sevenfeet View Post


    And to answer your question, Apple has invested billions in their own processor designs (acquisitions, research, paying for fabs). It's unlikely they would just dump it now and go to Nvidia...not when they have been the performance/power leader for the last few years.



    But IF Nvidia were able to offer a clear performance advantage, there would be nothing stopping Apple from switching.