One year after Apple's A7, Nvidia announces first 64-bit ARM CPU for Android


Comments

  • Reply 61 of 114
    Quote:

    Originally Posted by Mac-sochist View Post



    For a purported expert on all the tech that's better than Apple, (which is all of it) and what's wrong with every Apple product, (everything) to say she "wouldn't mind" if the iPhone went to a 16:9 aspect ratio, but 4:3 like they are now is better—well...that's just sloppy.

     

    I promised myself I wouldn't respond to you anymore... why do I do this?

     

    First, there's a world of difference between relatively objective comparison, which is what Relic does, and mindlessly poo-pooing everything that doesn't come from your preferred vendor. You'll note that Relic's experience with both hardware and software is considerably more extensive than most here. That's a result of an open-mindedness and willingness to explore alternatives that is exactly the opposite of how you're portraying her.

     

    Second, obviously her personal preferences play into her assessments, as do those of anyone, including you. Even if she were an Apple hater, which clearly she isn't, I'd still be interested in her posts because they provide perspective for my own planning exercises.

     

TL;DR: Broad perspectives on what other vendors are doing benefit even Apple enthusiasts.

  • Reply 62 of 114
    Quote:

    Originally Posted by Constable Odo View Post

     

    Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone. 

     

    [...]

     

Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry trade press makes a big deal about everything running on Android because market share is their most important metric.


    Had to fix that last sentence.

     

Pundits and analysts don't impact the buying public. The investing public, maybe, but if the phone/laptop/tablet is good, people will buy it.

     

Market share is how you measure commodity markets: soda pop, gasoline, computer chips.

     

Now, some may say that Android could shift that, in that billions of users (as with Windows) will drive developers to make their apps for Android first, sell them for a nickel, and delay selling the app on an Apple device with 300+ million users.

     

But that model was pre-Internet. Post-Internet, the local app is selling the razor; the razor blades are in the cloud.

     


The problem with that logic is that Apple is easier to develop for, and its buyers are willing to pay for quality; the iOS ecosystem is more profitable per unit of effort at this time. If I can charge $0.99 for the app versus that nickel, I'm making more money per unit, and that is by nature the measure of capitalism (margin).


     


    Profits are how you measure the 'success' of any industry/company.  

     

  • Reply 63 of 114
    Quote:

    Originally Posted by Constable Odo View Post

     

    Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.


     

Wall Street isn't necessarily as stupid as you think. BUYERS are stupid, and Wall Street knows it. Analysts know that morons like me respond to gobs of RAM and multiple cores and 3GHz clock speeds. We're not well-versed in the subtleties of product design that allow some software to do more with less.

  • Reply 64 of 114
blastdoor Posts: 3,845 member
    Quote:

    Originally Posted by dtidmore View Post

     



At least part of the issue was delays on INTEL's part in delivering the processor technology that HP needed. This started in the very earliest days of the HP/INTEL partnership and caused HP to rely on their PA-RISC architecture long after they had intended to be well down a new path. But HP admitted that the task of creating the necessary VLIW compilers, which started almost as soon as PA-RISC was released (i.e. years before the anticipated need), proved FAR more difficult than anticipated. HP hired virtually every PhD CS major with an interest in compiler design, but still it proved a task beyond the abilities of man/money/time.

INTEL did eventually deliver the exact processors that HP had spec'd, and they pretty much delivered the raw performance that HP had intended...the devil turned out to be that the compilers were never able to deliver code that could exploit the architecture efficiently enough to make the venture pan out. I well remember that HP anticipated that even the first release of the VLIW hardware/software architecture would be able to exceed native PA-RISC while running PA-RISC emulation...that turned out to be something that did not happen until almost the very end of the project's life. Itanium/VLIW never did significantly exceed what PA-RISC delivered even in native mode.

While few have PA-RISC in their bag of experience or knowledge, it was well ahead of its time, and I remember attending an INTEL architecture briefing where they flatly stated that they (INTEL) did NOT believe that ANYONE was going to actually supersede PA-RISC for many years, including INTEL (this was several years before the HP/INTEL joint venture into Itanium). All of this played out during the Fiorina years, during which HP lost its way; it is only NOW showing a few glowing embers of returning to its former self (i.e. "The Machine").


     

    Thanks -- interesting stuff! 

     

    Hopefully for HP's sake "The Machine" will be more like PA-RISC and less like Itanium. 

  • Reply 65 of 114
    wizard69 wrote: »
No problem! Honestly, I'd rather see a few screw-ups from you than not hear from you at all! You are greatly missed when you don't post.
    relic wrote: »
    I'm currently in the hospital, I have a tube down my throat at the moment, heavily medicated and the worst part is I have to go to the bathroom but that involves two nurses helping me onto a chair potty thing with wheels all the while with them standing next to me, so I ask for a little leniency.

I don't have an iPhone, I've said this many times and why; I have an iPad, and I also apologized for screwing that up. I'm also sorry for mixing AMD up with Nvidia; in my mindset when I wrote that I couldn't see the difference, I'm on fentanyl.

    Oh and !@#$%& you!

    Way to go Relic!

    Oh and hang in there we are all pulling for you.

    ^^^ This!
  • Reply 66 of 114
mcdave Posts: 1,927 member
    B-b-b-but it can't be fast because it's not Quad/Octa-core! Surely it's the silicon your software can't see that counts!

    McD
  • Reply 67 of 114
tht Posts: 6,017 member
    relic wrote: »
For the record, Charlie Demerjian worked for the Inquirer before leaving to set up his own shop. Now, though, instead of writing stories about how the Loch Ness monster had Bigfoot's love child and is suing for child support, he writes ridiculous tech opinions; his site is called SemiAccurate, for goodness' sake. Red flag, maybe? Charlie is also a hard-core AMD fanboy and takes every opportunity to bad-mouth Nvidia. Frankly, the man is a buffoon and I'm flabbergasted that you would actually use him as a credible source, especially when he is also known for bad-mouthing Apple every chance he gets.

Yes. I know of his reputation as well as his history. He's a drama queen and has had a sensationalist style everywhere he's worked, all the way back to the Inquirer.

But he's been in the tech hardware gossip-rag business for over a decade, especially for CPUs, and knows the fundamentals of EE and processors. He does have moles in East Asia with regard to foundries and stuff like that.

His negativity towards Tegra has been right for 4 or 5 years now. Maybe he was lucky, but for that long? Nvidia has missed their promised Tegra roadmaps and perf/watt targets for basically 5 years now. They haven't had any significant OEM wins in 3 years. That's three hype-and-ship cycles there. They can't even get the 32-bit Tegra K1 shipping in a flagship from a major OEM. It's been nothing but Snapdragons for like 2 or 3 years running. Frankly, I'm getting bored with seeing nothing but Qualcomm SoCs in Android flagships. Even the non-flagships don't use Tegra SoCs.

Now, you think Nvidia is going to ship a custom CPU arch, a VLIW one at that, that'll be competitive with ARMH cores or Qualcomm cores? I don't, and I wouldn't be surprised if it is dead. I'm skeptical until they ship. Maybe they'll ship in embedded devices or servers, but the people on this board couldn't care less about that space.
  • Reply 68 of 114
relic Posts: 4,735 member
    Quote:

    Originally Posted by Lorin Schultz View Post

     

     

    I promised myself I wouldn't respond to you anymore... why do I do this?

     

    First, there's a world of difference between relatively objective comparison, which is what Relic does, and mindlessly poo-pooing everything that doesn't come from your preferred vendor. You'll note that Relic's experience with both hardware and software is considerably more extensive than most here. That's a result of an open-mindedness and willingness to explore alternatives that is exactly the opposite of how you're portraying her.

     

    Second, obviously her personal preferences play into her assessments, as do those of anyone, including you. Even if she were an Apple hater, which clearly she isn't, I'd still be interested in her posts because they provide perspective for my own planning exercises.

     

TL;DR: Broad perspectives on what other vendors are doing benefit even Apple enthusiasts.


     

Thank you so much for those kind words, Lorin, luv ya!

  • Reply 69 of 114
tht Posts: 6,017 member
    wizard69 wrote: »
    How did VLIW architectures get pulled into this?

    As for Apple I wouldn't be surprised to see them deviate from the norm and start to consider merging CPUs with neural networks, FPGA and the like.

The 64-bit Tegra K1 is supposedly a dual-core VLIW-architecture CPU with a Kepler-architecture GPU. AI is saying that 128 MB of main memory will be used to store profiled ARM instructions. There likely needs to be a code interpreter that does the code morphing and profiling, etc. So, similar to Transmeta's Crusoe and Efficeon.

I said that Demerjian is gossiping that this arch is dead, but the PR here is saying it's still there. So Nvidia is going full steam ahead. We'll see how it performs; these kinds of architectures don't handle spaghetti code well, which is basically what browsing is. So wait and see.
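For anyone not familiar with the Transmeta-style "code morphing" being described here, the general idea is that guest (ARM) code is interpreted and profiled while it is cold, and once a block runs often enough it gets translated into native bundles (VLIW on Crusoe/Efficeon) that are cached, so later executions skip the interpreter. Below is a minimal illustrative sketch in Python; the names and threshold (HOT_THRESHOLD, interpret_block, translate_block) are made up for illustration and this is not Nvidia's or Transmeta's actual implementation.

```python
# Sketch of profile-guided dynamic binary translation ("code morphing").
# Hypothetical names and numbers throughout; for illustration only.

HOT_THRESHOLD = 50  # interpreted executions before a block is translated

class CodeMorphingLayer:
    def __init__(self):
        self.exec_counts = {}        # block address -> profile count
        self.translation_cache = {}  # block address -> translated native routine

    def run_block(self, addr, arm_block):
        # Fast path: a cached native translation already exists.
        native = self.translation_cache.get(addr)
        if native is not None:
            return native()

        # Slow path: interpret the guest instructions and update the profile.
        result = self.interpret_block(arm_block)
        self.exec_counts[addr] = self.exec_counts.get(addr, 0) + 1

        # Hot block: translate once, reuse the translation from now on.
        if self.exec_counts[addr] >= HOT_THRESHOLD:
            self.translation_cache[addr] = self.translate_block(arm_block)
        return result

    def interpret_block(self, arm_block):
        # Stand-in for a software interpreter of guest (ARM) instructions.
        return sum(arm_block)

    def translate_block(self, arm_block):
        # Stand-in for emitting optimized native (e.g. VLIW) code for the block.
        total = sum(arm_block)
        return lambda: total
```

The "spaghetti code" worry falls out of this structure: if control flow bounces between lots of blocks that never get hot (roughly what browsing workloads look like), you keep paying the interpreter and profiling overhead without ever reaching the cached fast path.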

As for Apple, I think it's all going to be I/O, memory performance and low power. Well, a super-low-power display and super-low-power wireless network performance on top of that, if they are resource constrained.
  • Reply 70 of 114
relic Posts: 4,735 member
    Quote:
    Originally Posted by Lloydbm4 View Post

     

    Chromebooks are likely what NVidia are looking to get these chips into. 


     

Like the new Acer Chromebook 13, yyyyaaaayyyyyyy! Forget Android phones, I want to see Linux laptops and mini computers with the new 64-bit K1.

     

This is only the 32-bit chip but still very fast; in fact, the early benchmarks are showing that the new Tegra K1 is pulling faster SunSpider times than other Chromebooks using a Celeron or Atom.

     

    http://www.slashgear.com/acer-chromebook-13-packs-tegra-k1-for-13hr-battery-11340270/

    http://www.androidcentral.com/acer-announces-tegra-k1-powered-chromebook-13

     

Specs:

Tegra K1, 2.1 GHz, quad-core

13.3" 1080p LCD display

4GB RAM

32GB storage

13+ hour battery

100GB Google Drive storage for 2 years

     

Price is listed at $380, but I'm sure, like the other Acer Chromebooks, you should be able to find one for 350 buckaroonies. Then install Ubuntu and Steam alongside ChromeOS, and you'll have a pretty neat and cheap portable gaming laptop.

     


  • Reply 71 of 114
relic Posts: 4,735 member
    Quote:
    Originally Posted by Dick Applebaum View Post





    ^^^ This!

Thank you, Dick. I rarely ever get angry at people, but that guy just made me so, so mad. I know not everyone enjoys what I say, but after Marvin scolded me on some of my writing techniques, I have honestly made an effort to cut back on the rhetoric. However, I don't think I deserve to be called a troll in my own house; not that I own this place or anything, but I am part of the family, for better or for worse, right? I mean, I have been here for more than a decade, spend an ungodly amount of time here, even write what I hope to be entertaining little stories. I don't know, I just felt hurt and I really don't need to feel any worse than I already do. Again, thank you very much for your kind words, and I will always be here if you ever need something.

  • Reply 72 of 114
wizard69 Posts: 13,377 member
I sometimes believe that for every intelligent person working on Wall Street there are at least two loud-mouthed idiots making them look bad. There really are some extremely brilliant people in the financial industry, but you don't hear much from them.
    Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.

Actually, more and faster RAM could be huge for the iPad. People try to dismiss the importance of RAM, but it is something Apple needs to find an optimal balance point for with each new generation of iOS devices. I'm actually extremely disappointed with Apple in that they have stalled RAM and even flash upgrades in the iPads. I just expect better from them.
  • Reply 73 of 114
wizard69 Posts: 13,377 member
    timmydax wrote: »
    The best use of the A7 last year was marketing.
Apple certainly has a marketing advantage with the A7; however, that is not what made the A7 important this year. To put it in Pentagon terms, they have prepared the battlefield. Effectively, they have put a lot of infrastructure in place.
    There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.
That is misleading, as Apple shipped a complete 64-bit OS, apps included. Obviously third-party apps are behind because 64-bit was a surprise. However, apps are already taking advantage of the chip, and iOS 8 just pushes that forward.
    I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

Apple has the trickle-down effect going, and they started before anyone else. In the first year, every new device had the same processor.
  • Reply 74 of 114
    Quote:

    Originally Posted by Mac-sochist View Post



    Relic has been earning her pay pretty well for a while now, but she's dropping the ball lately.



    For a purported expert on all the tech that's better than Apple, (which is all of it) and what's wrong with every Apple product, (everything) to say she "wouldn't mind" if the iPhone went to a 16:9 aspect ratio, but 4:3 like they are now is better—well...that's just sloppy. That's up there with "I've owned seven of every Mac model since 78 B.C., but damn it, when is Apple going to give us a two-button mouse?"



It's so obvious that most of these trolls have never used, or in some cases even seen, an Apple product.

     

    Relic, troll? No. And I think she owns an Apple share or two. I also think that Relic has probably used dozens of Apple products.

  • Reply 75 of 114
relic Posts: 4,735 member
    Quote:
    Originally Posted by Constable Odo View Post

     

    Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.


You're reading the wrong Wall Street blogs then, my friend; one of the only things a lot of these journalists agree upon is how Apple is changing the world. I think your mindset is still stuck in the '90s. Apple won, game over; they're the most powerful computer company and brand name in the world and will be so for a very long time. Throw a search term about Apple in any direction on the Internet and you'll find more positive stories about Apple than negative. I notice this a lot here; it's like you guys want to hear bad press about Apple just for the opportunity to go after the individual who wrote the story.

     

Shall we play a game? You go find as many negative stories about Apple as you can in 30 minutes, and I will do the same for positive ones. We can then tally the results; we'll need someone to officiate and say go.

  • Reply 76 of 114
cali Posts: 3,494 member
    (......guys....there's a girl in here.......)
  • Reply 77 of 114
wizard69 Posts: 13,377 member
    tht wrote: »
    For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.
If this is all true (I don't follow NVidia closely) then someone at NVidia has fallen off their rocker. Transmeta's chips never lived up to any of their promises, so I don't see much hope for that approach here. There might be some advantages to translating a RISC instruction set, but no matter what, you have a huge overhead to deal with, which isn't nice in a power-restricted space.
It appears Nvidia is still touting VLIW, with some kind of instruction morphing or translation going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.
    That is for sure.
    That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?
Or 128KB? If they can cache a huge number of instructions, it could help performance, at least for benchmarks.

Honestly, though, this just strikes me as a wasted effort, as one can put a little bit of effort into an enhanced ARM core and get better results.
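To put the 128 MB / 128 KB / "128 instructions" guesses in rough perspective, here is a quick back-of-the-envelope calculation. The bytes-per-translated-block figure is purely an assumption for illustration; nothing here is a published Nvidia spec.

```python
# Back-of-the-envelope: how many translated blocks fit in each rumored size?
# BYTES_PER_BLOCK is an assumed, illustrative figure, not an Nvidia number.
BYTES_PER_BLOCK = 32  # assumed average size of one translated block/bundle

for label, size_bytes in [("128 MB", 128 * 1024**2),
                          ("128 KB", 128 * 1024),
                          ("128 instructions", 128 * BYTES_PER_BLOCK)]:
    print(f"{label:>16}: ~{size_bytes // BYTES_PER_BLOCK:,} translated blocks")

# With the assumed block size this prints roughly:
#             128 MB: ~4,194,304 translated blocks
#             128 KB: ~4,096 translated blocks
#   128 instructions: ~128 translated blocks
```

Under any plausible per-block size, 128 MB is whole-application territory, which is why it reads more like a main-memory translation cache in the Transmeta mold than a misprint for "128 instructions", while 128 KB would be closer to a conventional on-chip cache.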
  • Reply 78 of 114
    Quote:
    Originally Posted by cali View Post



    (......guys....there's a girl in here.......)

     

    It's okay. She's more afraid of you than you are of her.

     

     

     

     

    Or at least she should be.

  • Reply 79 of 114
jungmark Posts: 6,928 member
    relic wrote: »
However, I don't think I deserve to be called a troll in my own house; not that I own this place or anything, but I am part of the family, for better or for worse, right? I mean, I have been here for more than a decade, spend an ungodly amount of time here, even write what I hope to be entertaining little stories.

    You are family. Sort of like that creepy aunt with lots of cats. Haha.
  • Reply 80 of 114
    Quote:

    Originally Posted by jungmark View Post

     
    Quote:

    Originally Posted by Relic View Post



However, I don't think I deserve to be called a troll in my own house; not that I own this place or anything, but I am part of the family, for better or for worse, right? I mean, I have been here for more than a decade, spend an ungodly amount of time here, even write what I hope to be entertaining little stories.




    You are family. Sort of like that creepy aunt with lots of cats. Haha.

     

    Whaaaat? Get out of here. Relic's cool.
