One year after Apple's A7, Nvidia announces first 64-bit ARM CPU for Android

Comments

  • Reply 41 of 114
    relic Posts: 4,735 member
    Quote:
    Originally Posted by jungmark View Post

    Take care, Relic. I do enjoy skipping over your longer posts.

    64 bit is just for marketing. /s

    I don't blame you as I do get a little carried away at times. I don't know if it's just marketing or not, who cares, it's cool to talk about it.

  • Reply 42 of 114

    Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with the specs of certain components). Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone. All the Android manufacturers have to do is add a few more cores and crank the clock speed up towards 3GHz to break every benchmark record possible.

    I'm willing to bet the analysts are going to jump in now saying how much more competition the iPhone has, because they're thinking every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down. Almost nothing Apple has come out with has ever been treated as a clear advantage that gives the company potential future value in the eyes of Wall Street.

    I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core part, and they'll complain that Apple's measly 1GB of RAM isn't enough to do a decent job when Android smartphones already come with 3GB and will soon be bumped to 4GB. The industry makes a big deal about everything running on Android because market share is the most important metric.

  • Reply 43 of 114
    relic wrote: »
    I don't blame you as I do get a little carried away at times. I don't know if it's just marketing or not, who cares, it's cool to talk about it.

    The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.

    I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

    Apple has the trickle down effect they started before anyone else. First year every new thing had the same processor.
  • Reply 44 of 114
    tht Posts: 5,452 member
    For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.

    It appears Nvidia is still touting VLIW, with some kind of instruction morphing or translating going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.

    That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?
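    For anyone wondering what that "instruction morphing or translating" would actually involve, here is a minimal sketch of the general dynamic-binary-translation idea in plain C. Everything in it (names, sizes, the fake translator) is illustrative only, not Nvidia's actual Denver design: hot guest code blocks get translated once into optimized native form, the result is cached by guest address, and later executions reuse the cached translation.

    /* Toy sketch of a dynamic code-optimization cache, loosely in the spirit of
     * an "instruction morphing" core: hot guest (ARM) code regions are
     * translated once and the result is cached, so later executions skip the
     * translation step.  All names and sizes here are illustrative. */
    #include <stdint.h>
    #include <stdio.h>

    #define CACHE_SLOTS 1024          /* toy cache size, purely illustrative */

    typedef struct {
        uint64_t guest_pc;            /* address of the original ARM code block */
        int      valid;
        char     native[64];          /* stand-in for translated/optimized native code */
    } CacheEntry;

    static CacheEntry cache[CACHE_SLOTS];

    /* Pretend translator: real hardware/firmware would emit optimized native
     * instruction bundles; here it just records a description. */
    static void translate_block(uint64_t guest_pc, char *out, size_t outlen)
    {
        snprintf(out, outlen, "optimized code for block @0x%llx",
                 (unsigned long long)guest_pc);
    }

    /* Look up a guest block; translate and cache it on a miss. */
    static const char *get_translation(uint64_t guest_pc)
    {
        CacheEntry *e = &cache[guest_pc % CACHE_SLOTS];
        if (!e->valid || e->guest_pc != guest_pc) {      /* miss: translate once */
            translate_block(guest_pc, e->native, sizeof e->native);
            e->guest_pc = guest_pc;
            e->valid = 1;
        }
        return e->native;                                /* hit: reuse cached result */
    }

    int main(void)
    {
        /* Executing the same "hot" block repeatedly only pays the translation cost once. */
        for (int i = 0; i < 3; i++)
            printf("run %d: %s\n", i, get_translation(0x1000));
        return 0;
    }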
  • Reply 45 of 114
    Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with the specs of certain components). Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone. All the Android manufacturers have to do is add a few more cores and crank the clock speed up towards 3GHz to break every benchmark record possible.

    I'm willing to bet the analysts are going to jump in now saying how much more competition the iPhone has, because they're thinking every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down. Almost nothing Apple has come out with has ever been treated as a clear advantage that gives the company potential future value in the eyes of Wall Street.

    I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core part, and they'll complain that Apple's measly 1GB of RAM isn't enough to do a decent job when Android smartphones already come with 3GB and will soon be bumped to 4GB. The industry makes a big deal about everything running on Android because market share is the most important metric.

    Good lord, you're back? Did you unload your AAPL shares? You seem (for this moment) to be less of an annoying whiner and modestly "up" on Apple.
  • Reply 46 of 114
    Quote:

    Originally Posted by TimmyDax View Post

    The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.

    I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

    Apple has the trickle down effect they started before anyone else. First year every new thing had the same processor.

     

    What a crock of shit. There are Apps that take advantage of 64bit and I've been using them for some time. While it's true most Apps don't need (or take advantage of) the A7, that doesn't mean that none of them do or that it's marketing.

     

    Most people don't use more than a fraction of the processing power in their PC; that doesn't mean it's marketing to actually have that power available.

     

    There is no "trickle down effect" for Apple. They control everything (hardware, OS and App approval process). Apple will make the transition to 64bit in record time (2 years tops). "Trickle down" is something that applied to Windows and their move to 64bit. There was no incentive for developers to make the switch and MS didn't have any power to move them along.
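    To make the "apps that take advantage of 64bit" point concrete, here is a minimal sketch in C. The __LP64__ macro is a standard predefined compiler macro on 64-bit LP64 targets (including arm64 with Clang/GCC); the rest of the example is purely illustrative. The same source builds as a 32-bit slice and a 64-bit slice, and the 64-bit build gets 64-bit pointers and native 64-bit integer math without app-level changes.

    /* Minimal illustration of code that benefits from an LP64 (64-bit) build.
     * __LP64__ is a standard predefined macro on 64-bit Unix-style targets;
     * everything else here is just a toy example. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
    #if defined(__LP64__)
        puts("built as a 64-bit slice: pointers and long are 64 bits wide");
    #else
        puts("built as a 32-bit slice");
    #endif

        /* 64-bit multiply: a single native instruction on arm64, but a
         * multi-instruction sequence when compiled for a 32-bit ARM target. */
        uint64_t a = 0x123456789ULL, b = 1000003ULL;
        printf("a * b = %llu\n", (unsigned long long)(a * b));

        printf("sizeof(void *) = %zu bytes\n", sizeof(void *));
        return 0;
    }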

  • Reply 47 of 114
    tundraboy Posts: 1,885 member
    Quote:
    Originally Posted by wizard69 View Post

    I think you miss the point; this has little to do with Apple, as Apple's market is captive.

    I have never ever heard anyone say or write "I really prefer Android but the only smartphone I am able to, or am allowed to purchase, and this is completely against my will, is an iPhone".   That is what an iPhone 'captive market' would be.

  • Reply 48 of 114
    tundraboy Posts: 1,885 member
    Quote:

    Originally Posted by TimmyDax View Post

    The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.

    I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

    Apple has the trickle down effect they started before anyone else. First year every new thing had the same processor.

     

    Sooooo, if you were running Apple, you would tell the developers to upgrade their apps to 64 bit first and only when they're done would you release a 64 bit iPhone?

  • Reply 49 of 114
    allenbf Posts: 993 member
    Quote:

    Originally Posted by jungmark View Post

    64 bit is just for marketing. /s

     

    Ha!  I'd forgotten this is what the haters were saying.  Too funny

  • Reply 50 of 114
    sockrolid Posts: 2,789 member

    Originally Posted by AppleInsider View Post

    "Apple kicked everybody in the balls with this," a Qualcomm employee said at the time. "It's being downplayed, but it set off panic in the industry."

     

    Get used to it.  

    Apple is pretty good at bollocking "the industry."

  • Reply 51 of 114
    blastdoor Posts: 3,305 member
    Quote:

    Originally Posted by wizard69 View Post

    How did VLIW architectures get pulled into this?

    As for Apple I wouldn't be surprised to see them deviate from the norm and start to consider merging CPUs with neural networks, FPGA and the like.

     My impression is that Denver is an evolution of VLIW -- that is, using the compiler to extract parallelism from the code and bundling those instructions together to be executed simultaneously. 
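    For what that means in practice, here is a toy C loop with comments sketching how a VLIW compiler could pack two independent operations into one wide instruction word. The bundle notation in the comments is made up for illustration and isn't any real ISA.

    /* Toy illustration of the VLIW idea: the compiler (not the hardware) finds
     * independent operations and packs them into one wide instruction word
     * that issues in a single cycle. */
    #include <stdio.h>

    int main(void)
    {
        int a[4] = {1, 2, 3, 4};
        int b[4] = {10, 20, 30, 40};
        int sum[4], prod[4];

        for (int i = 0; i < 4; i++) {
            /* These two statements do not depend on each other, so a VLIW
             * compiler could schedule them into the same wide word, e.g.:
             *   { ADD sum[i], a[i], b[i]  |  MUL prod[i], a[i], b[i] }
             * A dependent chain (each step needing the previous result) could
             * not be packed this way, which is why the compiler's static
             * scheduling matters so much for VLIW performance. */
            sum[i]  = a[i] + b[i];
            prod[i] = a[i] * b[i];
        }

        for (int i = 0; i < 4; i++)
            printf("sum=%d prod=%d\n", sum[i], prod[i]);
        return 0;
    }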

  • Reply 52 of 114
    blastdoor Posts: 3,305 member
    Quote:

    Originally Posted by THT View Post

    For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.

    It appears Nvidia is still touting VLIW, with some kind of instruction morphing or translating going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.

    That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?

    It sure sounds VLIW-ish to me, so it would appear that it's not dead yet. 

  • Reply 53 of 114
    Quote:
    Originally Posted by Blastdoor View Post

    Indeed... I would think that if VLIW were ever to succeed in the market, it would be in the context of a tightly integrated stack in which one company controls everything from the silicon up to software distribution. Yet Apple has not chosen to go VLIW, at least not yet. If Apple doesn't think it's a good idea, with their total control over compilers, language, OS, APIs, etc etc... it's a little hard to see how anyone else can make it work.

    HP, back when it actually designed its own processors/compilers/software, went down the VLIW path. That rabbit hole became known as Itanium as they partnered with Intel rather than go it alone as they had in the past (Intel made HP-specific implementations of the architecture and HP focused on the compilers). HP's efforts on VLIW computing never did pan out as hoped, as the compiler technology that is central to the idea simply never delivered on the promise. This is not to say that VLIW is fatally flawed, but it has yet to prove its worth or capabilities in the real world of commercial computing.
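    To give a feel for why that compiler job is so hard, here is a small illustration (the code is purely illustrative): a VLIW compiler must decide at compile time which operations share a wide instruction word, but a load with unpredictable latency or a data-dependent branch forces it to schedule conservatively, whereas an out-of-order core resolves those hazards at run time.

    /* Illustrative only: the kind of code that makes static VLIW scheduling hard.
     * The arithmetic depends on a load whose latency the compiler cannot know
     * (cache hit vs. miss), and the branch depends on data, so the compiler
     * must either schedule conservatively or speculate. */
    #include <stdio.h>
    #include <stdlib.h>

    int walk(const int *data, const int *index, int n)
    {
        int acc = 0;
        for (int i = 0; i < n; i++) {
            int v = data[index[i]];   /* load with unpredictable latency/address */
            if (v & 1)                /* data-dependent branch */
                acc += v * 3;         /* depends on the load result */
            else
                acc -= v;
        }
        return acc;
    }

    int main(void)
    {
        enum { N = 8 };
        int data[N], index[N];
        for (int i = 0; i < N; i++) {
            data[i] = rand() % 100;
            index[i] = rand() % N;    /* scrambled access pattern */
        }
        printf("acc = %d\n", walk(data, index, N));
        return 0;
    }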

  • Reply 54 of 114
    blastdoor Posts: 3,305 member
    Quote:

    Originally Posted by dtidmore View Post

    HP, back when it actually designed its own processors/compilers/software, went down the VLIW path. That rabbit hole became known as Itanium as they partnered with Intel rather than go it alone as they had in the past (Intel made HP-specific implementations of the architecture and HP focused on the compilers). HP's efforts on VLIW computing never did pan out as hoped, as the compiler technology that is central to the idea simply never delivered on the promise. This is not to say that VLIW is fatally flawed, but it has yet to prove its worth or capabilities in the real world of commercial computing.


    But was the problem HP's compilers not living up to expectations, or was it that Itanium was killed by lack of economy of scale? IIRC, Intel never fabbed Itanium on their latest process, new versions of Itanium were substantially delayed, and Itanium was incredibly expensive. Maybe all of those disadvantages were too much for a great compiler to overcome? 

     

    I really don't know the answer to that question. Maybe VLIW is a fatally flawed concept. I'm just not sure we've had a clean test of that hypothesis yet. Unfortunately, I doubt Denver will be a clean test of the hypothesis either. 

  • Reply 55 of 114
    pscooter63 Posts: 1,080 member
    Quote:
    Originally Posted by jungmark View Post

    64 bit is just for marketing. /s

    Quote:
    Originally Posted by TimmyDax View Post

    The best use of the A7 last year was marketing.

    Only four posts later... :lol:

  • Reply 56 of 114
    dtidmore Posts: 145 member
    Quote:
    Originally Posted by Blastdoor View Post

     

    But was the problem HP's compilers not living up to expectations, or was it that Itanium was killed by lack of economy of scale? IIRC, Intel never fabbed Itanium on their latest process, new versions of Itanium were substantially delayed, and Itanium was incredibly expensive. Maybe all of those disadvantages were too much for a great compiler to overcome? 

     

    I really don't know the answer to that question. Maybe VLIW is a fatally flawed concept. I'm just not sure we've had a clean test of that hypothesis yet. Unfortunately, I doubt Denver will be a clean test of the hypothesis either. 




    At least part of the issue was delays on Intel's part in delivering the processor technology that HP needed. This started in the very earliest days of the HP/Intel partnership and caused HP to rely on its PA-RISC architecture long after it had intended to be well down a new path. But HP admitted that the task of creating the necessary VLIW compilers, which started almost as soon as PA-RISC was released (i.e. years before the anticipated need), proved far more difficult than anticipated. HP hired virtually every CS PhD with an interest in compiler design, but it still proved a task beyond the abilities of man, money, and time.

    Intel did eventually deliver the exact processors that HP had spec'd, and they pretty much delivered the raw performance that HP had intended... the devil turned out to be that the compilers were never able to produce code that could exploit the architecture efficiently enough to make the venture pan out. I well remember that HP anticipated that even the first release of the VLIW hardware/software architecture would exceed native PA-RISC while running PA-RISC emulation... that turned out to be something that did not happen until almost the very end of the project's life. Itanium/VLIW never did significantly exceed what PA-RISC delivered, even in native mode.

    While few have PA-RISC in their bag of experience or knowledge, it was well ahead of its time, and I remember attending an Intel architecture briefing where they flatly stated that they (Intel) did NOT believe that anyone was going to actually supersede PA-RISC for many years, including Intel (this was several years before the HP/Intel joint venture into Itanium). All of this played out during the Fiorina years, during which HP lost its way; it is only now showing a few glowing embers of returning to its former self (i.e. "The Machine").

  • Reply 57 of 114
    evilution Posts: 1,399 member

    64bit means nothing when it's cobbled together with an OS that is bad in 32bit. It'll be even worse when they add a 64 bit layer over the top of the touchwiz layer, over the top of the butter layer, over the top of the Knox layer, over the top of the other layers that Android/Google/Samsung add to desperately try to keep up with iOS.

     

    They should all take 2 years out and rewrite Android from scratch for a full touch screen instead of persisting with rewrites and layers on an OS that was originally designed for a phone with buttons.

  • Reply 58 of 114

    Chromebooks are likely what Nvidia is looking to get these chips into.

  • Reply 59 of 114
    tallest skil Posts: 43,388 member
    Originally Posted by TimmyDax View Post

    The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.



    Maybe stop lying, please?

  • Reply 60 of 114
    relic Posts: 4,735 member
    tht wrote: »
    For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.

    It appears Nvidia is still touting VLIW, with some kind of instruction morphing or translating going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.

    That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?

    For the record, Charlie Demerjian worked for the Inquirer before leaving to set up his own shop. Now, instead of writing stories about how the Loch Ness monster had Bigfoot's love child and is suing for child support, he writes ridiculous tech opinions; his site is called SemiAccurate, for goodness sake, red flag maybe. Charlie is also a hard-core AMD fanboy and takes every opportunity to bad-mouth Nvidia. Frankly the man is a buffoon, and I'm flabbergasted that you would use him as a credible source, especially when he is also known for bad-mouthing Apple every chance he gets. Next time you use Google to find negative information about a company or one of its products, I highly recommend you read some of the author's work before passing off one of their ignorant stories as credible or highly likely.

    -Charlie Demerjian
    "Have you noticed that OSX releases of late, well, to be blunt, suck? It’s not that they suck as a stand alone OS, but they take away a lot of the freedoms and flexibility that the Mac desktop OS user has come to expect. Bit by bit Apple is removing all of the parts that make OSX something other than a phone class OS. The UI changes, the App store, and the overall feel of the new OS move it more and more toward a slightly open iOS, not a UNIX core with a slick GUI on top. It is being progressively closed down and phone-ized. Any guesses as to why?"

    About the Samsung vs. Apple lawsuit:
    "Any naysayers who think the whole court fight is just a pissing match between two petulant children have no choice but to eat their words after this show of force. If Samsung coming out and laying down their design law like this doesn’t sink any Apple claims, I don’t know what will. Game, set, and match, Samsung for the crushing win."

    "Apple’s (AAPL) endless quest to make computing easier for the masses has included thousands of innovations over the years. The GUI was borrowed from Xerox Parc, the Smart Phone from Palm, the Tablet from Microsoft… Apple is good at taking another company’s idea, reworking it, and selling a lot of kit"

    iCloud release:
    "We think that Apple may have plans in the future for its cloud service, but for now, the only plus the service brings is the get-out-of-jail-free card that the music service brings."

    "Flash is the next in line, expect that to be transitioned away from Apple before the fruity iThingy company is comfortable with the move. Once again, no one can supply Apple with the quantity, quality, and price that Samsung can. Luckily Samsung has a buyer for this too, it is called Samsung. Nice how that circle closes, eh?"

    More Apple vs. Samsung quotes:
    "CPUs may be untouchable once Apple moves to TSMC, but there is still time for a toner cartridge to be dropped in the Steve Jobs Memorial Wing of the Samsung fabs. But that probably won’t happen, there are subtler ways. Screens are definitely being used as a weapon now, and flash will likely follow suit because, well, because they can. Where can Apple go? Nowhere. What can Apple do to make nice? At this point not much, Hallmark doesn’t make a card for this particular situation, or at least there isn’t one on their web site. Heck, there isn't even an applicable category of cards."

    "In the end, Apple suing Samsung was incredibly stupid and self-destructive. In attacking the Korean giant, they antagonized the supplier of not just one but three critical component categories. To make matters worse, none of the three has a second source that is anything close to Samsung in quantity, quality, and price. By suing, Apple may very well have killed itself, and Samsung’s actions today are proof of that."