
One year after Apple's A7, Nvidia announces first 64-bit ARM CPU for Android - Page 2

post #41 of 115
Quote:
Originally Posted by Relic View Post

I'm currently in the hospital, I have a tube down my throat at the moment, heavily medicated and the worst part is I have to go to the bathroom but that involves two nurses helping me onto a chair potty thing with wheels all the while with them standing next to me, so I ask for a little leniency.

I don't have an iPhone, I've said this many times and why; I have an iPad, and I also apologized for screwing that up. I'm also sorry for mixing up AMD with Nvidia; in my mindset when I wrote that I couldn't see the difference, I'm on fentanyl.

Oh and !@#$%& you!

Take care, Relic. I do enjoy skipping over your longer posts. ;)

64 bit is just for marketing. /s
post #42 of 115
Quote:
Originally Posted by Relic View Post
 

I'm a girl, a beautiful princess, who currently looks like a dyke, so maybe your first impulse was correct, could you imagine though, waking up with different parts. Anyway, the sentiment is the same, thank you.

Oops, Relic. You're a dudette! My bad. :)

 

Best.

post #43 of 115
Quote:
Originally Posted by jungmark View Post


Take care, Relic. I do enjoy skipping over your longer posts. ;)

64 bit is just for marketing. /s

I don't blame you as I do get a little carried away at times. I don't know if it's just marketing or not, who cares, it's cool to talk about it.

When I looked up "Ninjas" in Thesaurus.com, it said "Ninja's can't be found" Well played Ninjas, well played.
post #44 of 115

Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.

post #45 of 115
Quote:
Originally Posted by Relic View Post

I don't blame you as I do get a little carried away at times. I don't know if it's just marketing or not, who cares, it's cool to talk about it.

The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.

I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

Apple has the trickle down effect they started before anyone else. First year every new thing had the same processor.
post #46 of 115
For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.

It appears Nvidia is still touting VLIW, with some kind of instruction morphing or translation going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.

That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?
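For context on the "instruction morphing" mentioned above: Denver is generally described as doing dynamic binary translation, i.e. translating ARM code regions into the core's native internal format once and caching the result for reuse. The sketch below shows only that translate-once, reuse-thereafter pattern; the class, the addresses, and the "translation" itself are all invented for illustration.

```python
class TranslationCache:
    """Translate a guest code block once, then reuse the cached result."""

    def __init__(self):
        self.cache = {}        # entry address -> translated block
        self.translations = 0  # slow-path translation count

    def translate(self, addr, code_block):
        # Stand-in for the real "morphing" of guest instructions
        # into the core's native format.
        self.translations += 1
        return tuple("native_" + insn for insn in code_block)

    def execute(self, addr, code_block):
        if addr not in self.cache:          # slow path: translate once
            self.cache[addr] = self.translate(addr, code_block)
        return self.cache[addr]             # fast path thereafter

tc = TranslationCache()
hot_loop = ["ldr", "add", "str", "b"]
for _ in range(1000):                       # hot code runs 1000 times...
    tc.execute(0x4000, hot_loop)
print(tc.translations)  # → 1  ...but is translated only once
```

The payoff, as with any JIT-style scheme, comes from hot code: the expensive translation cost is amortized across every subsequent execution of the same block.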
post #47 of 115
Quote:
Originally Posted by Constable Odo View Post

Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.

Good lord, you're back? Did you unload your AAPL shares? You seem (for this moment) to be less of an annoying whiner and modestly "up" on Apple.

Proud AAPL stock owner.

 

GOA

post #48 of 115
Quote:
Originally Posted by TimmyDax View Post


The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.

I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

Apple has the trickle down effect they started before anyone else. First year every new thing had the same processor.

 

What a crock of shit. There are apps that take advantage of 64-bit, and I've been using them for some time. While it's true most apps don't need (or take advantage of) the A7, that doesn't mean that none of them do or that it's marketing.

 

Most people don't utilize a fraction of the processing power in their PC - doesn't mean it's marketing to actually have that power available.

 

There is no "trickle down effect" for Apple. They control everything (hardware, OS, and the app approval process). Apple will make the transition to 64-bit in record time (2 years tops). "Trickle down" is something that applied to Windows and its move to 64-bit. There was no incentive for developers to make the switch, and MS didn't have any power to move them along.
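One concrete way a 64-bit ISA pays off, independent of marketing: code that folds data word by word (hashing, checksums, copy loops) does half as many operations with 64-bit words as with 32-bit ones. A toy Python illustration of the operation-count effect; the function and data are invented for the example, and Python itself won't show the hardware speedup, only the halved count:

```python
def xor_checksum(data, word_bytes):
    """XOR-fold `data` in word-sized chunks; wider words mean fewer ops."""
    acc = 0
    for i in range(0, len(data), word_bytes):
        acc ^= int.from_bytes(data[i:i + word_bytes], "little")
    return acc, len(data) // word_bytes

data = bytes(range(256)) * 4        # 1 KiB of sample data
_, ops32 = xor_checksum(data, 4)    # 32-bit words
_, ops64 = xor_checksum(data, 8)    # 64-bit words
print(ops32, ops64)  # → 256 128
```

On real A7-class hardware the win is larger than this suggests, since AArch64 also doubled the general-purpose register count and added new instructions, which is why crypto and media code were among the first to benefit.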

Author of The Fuel Injection Bible

post #49 of 115
Quote:
Originally Posted by wizard69 View Post
 
 
I think you miss the point; this has little to do with Apple, as Apple's market is captive.

I have never ever heard anyone say or write "I really prefer Android but the only smartphone I am able to, or am allowed to purchase, and this is completely against my will, is an iPhone".   That is what an iPhone 'captive market' would be.

post #50 of 115
Quote:
Originally Posted by TimmyDax View Post


The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.

I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

Apple has the trickle down effect they started before anyone else. First year every new thing had the same processor.

 

Sooooo, if you were running Apple, you would tell the developers to upgrade their apps to 64 bit first and only when they're done would you release a 64 bit iPhone?

post #51 of 115
Quote:
Originally Posted by jungmark View Post


64 bit is just for marketing. /s

 

Ha!  I'd forgotten this is what the haters were saying.  Too funny

post #52 of 115
Originally Posted by AppleInsider View Post
"Apple kicked everybody in the balls with this," a Qualcomm employee said at the time. "It's being downplayed, but it set off panic in the industry."

 

Get used to it.  

Apple is pretty good at bollocking "the industry."

Sent from my iPhone Simulator

post #53 of 115
Quote:
Originally Posted by wizard69 View Post

How did VLIW architectures get pulled into this?

As for Apple I wouldn't be surprised to see them deviate from the norm and start to consider merging CPUs with neural networks, FPGA and the like.

 My impression is that Denver is an evolution of VLIW -- that is, using the compiler to extract parallelism from the code and bundling those instructions together to be executed simultaneously. 
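That compiler-extracted-parallelism idea can be sketched concretely: group instructions, in program order, into fixed-width issue bundles, closing a bundle as soon as the next instruction depends on something already in it. A minimal in-order sketch in Python; the instruction encoding and register names are invented for illustration:

```python
def pack_bundles(instructions, width=3):
    """Greedily pack instructions, in order, into issue bundles.

    Each instruction is (name, reads, writes). A bundle closes as soon
    as the next instruction conflicts (RAW/WAW/WAR) with it or the
    bundle is full, mimicking in-order VLIW issue.
    """
    bundles = []
    i, n = 0, len(instructions)
    while i < n:
        bundle, written, read = [], set(), set()
        while i < n and len(bundle) < width:
            name, reads, writes = instructions[i]
            if (set(reads) & written or set(writes) & written
                    or set(writes) & read):
                break  # hazard against this bundle: start a new one
            bundle.append(name)
            written |= set(writes)
            read |= set(reads)
            i += 1
        bundles.append(bundle)
    return bundles

prog = [
    ("add r1", ["r2"], ["r1"]),   # r1 = f(r2)
    ("mul r3", ["r4"], ["r3"]),   # independent of the add
    ("sub r5", ["r1"], ["r5"]),   # needs the add's result
]
print(pack_bundles(prog))  # → [['add r1', 'mul r3'], ['sub r5']]
```

The hard part, and the part that sank Itanium-era compilers, is that a real scheduler must find enough independent work to keep every slot full across branches and memory aliasing, not just within a straight-line snippet like this.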

post #54 of 115
Quote:
Originally Posted by THT View Post

For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.

It appears Nvidia is still touting VLIW, with some kind of instruction morphing or translation going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.

That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?

It sure sounds VLIW-ish to me, so it would appear that it's not dead yet. 

post #55 of 115
Quote:
Originally Posted by Blastdoor View Post
 

 

Indeed... I would think that if VLIW were ever to succeed in the market, it would be in the context of a tightly integrated stack in which one company controls everything from the silicon up to software distribution. Yet Apple has not chosen to go VLIW, at least not yet. If Apple doesn't think it's a good idea, with their total control over compilers, language, OS, APIs, etc etc... it's a little hard to see how anyone else can make it work. 


HP, back WHEN it actually designed its own processors/compilers/software, went down the VLIW path. That rabbit hole became known as Itanium when they partnered with INTEL rather than go it alone as they had in the past (INTEL made HP-specific implementations of the architecture and HP focused on the compilers). HP's efforts on VLIW computing never did pan out as hoped, as the compiler technology that is central to the idea simply never delivered on the promise. This is not to say that VLIW is fatally flawed, but it has yet to prove its worth or capabilities in the real world of commercial computing.

post #56 of 115
Quote:
Originally Posted by dtidmore View Post
 


HP, back WHEN it actually designed its own processors/compilers/software, went down the VLIW path. That rabbit hole became known as Itanium when they partnered with INTEL rather than go it alone as they had in the past (INTEL made HP-specific implementations of the architecture and HP focused on the compilers). HP's efforts on VLIW computing never did pan out as hoped, as the compiler technology that is central to the idea simply never delivered on the promise. This is not to say that VLIW is fatally flawed, but it has yet to prove its worth or capabilities in the real world of commercial computing.

But was the problem HP's compilers not living up to expectations, or was it that Itanium was killed by lack of economy of scale? IIRC, Intel never fabbed Itanium on their latest process, new versions of Itanium were substantially delayed, and Itanium was incredibly expensive. Maybe all of those disadvantages were too much for a great compiler to overcome? 

 

I really don't know the answer to that question. Maybe VLIW is a fatally flawed concept. I'm just not sure we've had a clean test of that hypothesis yet. Unfortunately, I doubt Denver will be a clean test of the hypothesis either. 

post #57 of 115
Quote:
Originally Posted by jungmark View Post

64 bit is just for marketing. /s

 

Quote:
Originally Posted by TimmyDax View Post

The best use of the A7 last year was marketing.

 

Only four posts later...  :lol: 

Quality isn't expensive... it's priceless.

post #58 of 115
Quote:
Originally Posted by Blastdoor View Post
 

But was the problem HP's compilers not living up to expectations, or was it that Itanium was killed by lack of economy of scale? IIRC, Intel never fabbed Itanium on their latest process, new versions of Itanium were substantially delayed, and Itanium was incredibly expensive. Maybe all of those disadvantages were too much for a great compiler to overcome? 

 

I really don't know the answer to that question. Maybe VLIW is a fatally flawed concept. I'm just not sure we've had a clean test of that hypothesis yet. Unfortunately, I doubt Denver will be a clean test of the hypothesis either. 


At least part of the issue was delays on INTEL's part in delivering the processor technology that HP needed. This started in the very earliest days of the HP/INTEL partnership and caused HP to rely on its PA-RISC architecture long after it had intended to be well down a new path. But HP admitted that the task of creating the necessary VLIW compilers, which started almost as soon as PA-RISC was released (i.e. years before the anticipated need), proved FAR more difficult than anticipated. HP hired virtually every CS PhD with an interest in compiler design, but it still proved a task beyond the abilities of man/money/time. INTEL did eventually deliver the exact processors that HP had spec'd, and they pretty much delivered the raw performance that HP had intended... the devil turned out to be that the compilers were never able to deliver code that could exploit the architecture efficiently enough to make the venture pan out. I well remember that HP anticipated that even the first release of the VLIW hardware/software architecture would be able to exceed native PA-RISC while running PA-RISC in emulation... that turned out to be something that did not happen until almost the very end of the life of the project. Itanium/VLIW never did significantly exceed what PA-RISC delivered, even in native mode. While few have PA-RISC in their bag of experience or knowledge, it was well ahead of its time, and I remember attending an INTEL architecture briefing where they flatly stated that they (INTEL) did NOT believe that ANYONE was going to supersede PA-RISC for many years, including INTEL (this was several years before the HP/INTEL joint venture into Itanium). All of this played out during the Fiorina years, during which HP lost its way and is only NOW showing a few glowing embers of returning to its former self (i.e. "The Machine").

post #59 of 115

64-bit means nothing when it's cobbled together with an OS that is bad in 32-bit. It'll be even worse when they add a 64-bit layer over the top of the TouchWiz layer, over the top of the Butter layer, over the top of the Knox layer, over the top of the other layers that Android/Google/Samsung add to desperately try to keep up with iOS.

 

They should all take 2 years out and rewrite Android from scratch for a full touch screen instead of persisting with rewrites and layers on an OS that was originally designed for a phone with buttons.

post #60 of 115

Chromebooks are likely what Nvidia is looking to get these chips into. 

post #61 of 115
Originally Posted by TimmyDax View Post
The best use of the A7 last year was marketing. There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.


Maybe stop lying, please?

Originally Posted by Slurpy

There's just a TINY chance that Apple will also be able to figure out payments. Oh wait, they did already… …and you’re already f*ed.

 

post #62 of 115
Quote:
Originally Posted by THT View Post

For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.

It appears Nvidia is still touting VLIW, with some kind of instruction morphing or translation going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.

That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?

For the record, Charlie Demerjian worked for the Inquirer before leaving to set up his own shop. Now, though, instead of writing stories about how the Loch Ness monster had Bigfoot's love child and is suing for child support, he is writing ridiculous tech opinions; his site is called SemiAccurate, for goodness' sake, red flag maybe. Charlie is also a hard-core AMD fanboy and takes every opportunity to bad-mouth Nvidia. Frankly the man is a buffoon, and I'm actually flabbergasted that you would use him as a credible source, especially when he is also known for bad-mouthing Apple every chance he gets. Next time you use Google to find negative information about a company or one of their specific products, I highly recommend you read some of the author's work before pawning off one of their ignorant stories as credible or highly likely.

-Charlie Demerjian
"Have you noticed that OSX releases of late, well, to be blunt, suck? It’s not that they suck as a stand alone OS, but they take away a lot of the freedoms and flexibility that the Mac desktop OS user has come to expect. Bit by bit Apple is removing all of the parts that make OSX something other than a phone class OS. The UI changes, the App store, and the overall feel of the new OS move it more and more toward a slightly open iOS, not a UNIX core with a slick GUI on top. It is being progressively closed down and phone-ized. Any guesses as to why?"

About Samsung vs. Apple Lawsuit;
"Any naysayers who think the whole court fight is just a pissing match between two petulant children have no choice but to eat their words after this show of force. If Samsung coming out and laying down their design law like this doesn’t sink any Apple claims, I don’t know what will. Game, set, and match, Samsung for the crushing win."

"Apple’s (AAPL) endless quest to make computing easier for the masses has included thousands of innovations over the years. The GUI was borrowed from Xerox Parc, the Smart Phone from Palm, the Tablet from Microsoft… Apple is good at taking another company’s idea, reworking it, and selling a lot of kit"

iCloud Release;
"We think that Apple may have plans in the future for its cloud service, but for now, the only plus the service brings is the get-out-of-jail-free card that the music service brings."

"Flash is the next in line, expect that to be transitioned away from Apple before the fruity iThingy company is comfortable with the move. Once again, no one can supply Apple with the quantity, quality, and price that Samsung can. Luckily Samsung has a buyer for this too, it is called Samsung. Nice how that circle closes, eh?"

More Apple vs. Samsung quotes;
"CPUs may be untouchable once Apple moves to TSMC, but there is still time for a toner cartridge to be dropped in the Steve Jobs Memorial Wing of the Samsung fabs. But that probably won’t happen, there are subtler ways. Screens are definitely being used as a weapon now, and flash will likely follow suit because, well, because they can. Where can Apple go? Nowhere. What can Apple do to make nice? At this point not much, Hallmark doesn’t make a card for this particular situation, or at least there isn’t one on their web site. Heck, there isn't even an applicable category of cards."

"In the end, Apple suing Samsung was incredibly stupid and self-destructive. In attacking the Korean giant, they antagonized the supplier of not just one but three critical component categories. To make matters worse, none of the three has a second source that is anything close to Samsung in quantity, quality, and price. By suing, Apple may very well have killed itself, and Samsung’s actions today are proof of that."
Edited by Relic - 8/12/14 at 12:35pm
When I looked up "Ninjas" in Thesaurus.com, it said "Ninja's can't be found" Well played Ninjas, well played.
post #63 of 115
Quote:
Originally Posted by Mac-sochist View Post

For a purported expert on all the tech that's better than Apple, (which is all of it) and what's wrong with every Apple product, (everything) to say she "wouldn't mind" if the iPhone went to a 16:9 aspect ratio, but 4:3 like they are now is better—well...that's just sloppy.

 

I promised myself I wouldn't respond to you anymore... why do I do this?

 

First, there's a world of difference between relatively objective comparison, which is what Relic does, and mindlessly poo-pooing everything that doesn't come from your preferred vendor. You'll note that Relic's experience with both hardware and software is considerably more extensive than most here. That's a result of an open-mindedness and willingness to explore alternatives that is exactly the opposite of how you're portraying her.

 

Second, obviously her personal preferences play into her assessments, as do those of anyone, including you. Even if she were an Apple hater, which clearly she isn't, I'd still be interested in her posts because they provide perspective for my own planning exercises.

 

TL/dr: Broad perspectives on what other vendors are doing benefit even Apple enthusiasts.

Lorin Schultz (formerly V5V)

Audio Engineer

V5V Digital Media, Vancouver, BC Canada

post #64 of 115
Quote:
Originally Posted by Constable Odo View Post
 

Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone. 

 

[...]

 

 Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry trade press makes a big deal about everything running on Android because market share is their most important metric.

Had to fix that last sentence.

 

Pundits and analysts don't impact the buying public. The investing public, maybe, but if the phone/laptop/tablet is good, people will buy it.

 

Market share is how you measure commodity markets: soda pop, gasoline, computer chips.

 

Now some may say that Android may shift that, in that billions of users (like Windows) will drive developers to make their apps for Android first, and sell them for a nickel, and delay selling the app on an Apple device with 300+ million users.

 

But that model was pre-Internet. Post-Internet, the local app is selling the razor; the razor blades are in the cloud. 

 
The problem with that logic is that the iOS ecosystem is more profitable per unit of effort at this time: Apple is easier to develop for, and the buyers are willing to pay for quality. If I can charge $0.99 for the app vs. that nickel, I'm making more money per unit, and that is by nature the measure of capitalism (margin).
 

Profits are how you measure the 'success' of any industry/company.  
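The per-unit arithmetic in that argument is easy to make concrete. In the sketch below the $0.99 and nickel prices come from the post, the 70% developer share is the standard app-store split of the era, and the user counts are invented purely for illustration:

```python
# Rough developer-revenue comparison for the nickel-vs-$0.99 argument.
# Prices are from the post; the 10x Android install base is hypothetical.
DEV_SHARE = 0.70  # developer's cut after the standard 30% store fee

def developer_revenue(price, units, share=DEV_SHARE):
    """Developer take after the store's cut."""
    return price * units * share

ios = developer_revenue(0.99, 1_000_000)        # fewer buyers, higher price
android = developer_revenue(0.05, 10_000_000)   # 10x buyers, nickel price
print(ios, android)  # → 693000.0 350000.0
```

Even granting Android ten times the buyers in this toy scenario, the $0.99 app earns roughly twice as much per unit of effort, which is the margin point the post is making.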

 
post #65 of 115
Quote:
Originally Posted by Constable Odo View Post
 

Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.

 

Wall Street isn't necessarily as stupid as you think. BUYERS are stupid, and Wall Street knows it. Analysts know that morons like me respond to gobs of RAM and multiple cores and 3GHz clock speeds. We're not well-versed in the subtleties of product design that allow some software to do more with less.

Lorin Schultz (formerly V5V)

Audio Engineer

V5V Digital Media, Vancouver, BC Canada

post #66 of 115
Quote:
Originally Posted by dtidmore View Post
 


At least part of the issue was delays on INTEL's part in delivering the processor technology that HP needed. This started in the very earliest days of the HP/INTEL partnership and caused HP to rely on its PA-RISC architecture long after it had intended to be well down a new path. But HP admitted that the task of creating the necessary VLIW compilers, which started almost as soon as PA-RISC was released (i.e. years before the anticipated need), proved FAR more difficult than anticipated. HP hired virtually every CS PhD with an interest in compiler design, but it still proved a task beyond the abilities of man/money/time. INTEL did eventually deliver the exact processors that HP had spec'd, and they pretty much delivered the raw performance that HP had intended... the devil turned out to be that the compilers were never able to deliver code that could exploit the architecture efficiently enough to make the venture pan out. I well remember that HP anticipated that even the first release of the VLIW hardware/software architecture would be able to exceed native PA-RISC while running PA-RISC in emulation... that turned out to be something that did not happen until almost the very end of the life of the project. Itanium/VLIW never did significantly exceed what PA-RISC delivered, even in native mode. While few have PA-RISC in their bag of experience or knowledge, it was well ahead of its time, and I remember attending an INTEL architecture briefing where they flatly stated that they (INTEL) did NOT believe that ANYONE was going to supersede PA-RISC for many years, including INTEL (this was several years before the HP/INTEL joint venture into Itanium). All of this played out during the Fiorina years, during which HP lost its way and is only NOW showing a few glowing embers of returning to its former self (i.e. "The Machine").

 

Thanks -- interesting stuff! 

 

Hopefully for HP's sake "The Machine" will be more like PA-RISC and less like Itanium. 

post #67 of 115
Quote:
Originally Posted by wizard69 View Post

No problem! Honestly I'd rather see a few screw-ups from you than not hear from you at all! You are greatly missed when you don't post.
Quote:
Originally Posted by Relic View Post

I'm currently in the hospital, I have a tube down my throat at the moment, heavily medicated and the worst part is I have to go to the bathroom but that involves two nurses helping me onto a chair potty thing with wheels all the while with them standing next to me, so I ask for a little leniency.

I don't have an iPhone, I've said this many times and why; I have an iPad, and I also apologized for screwing that up. I'm also sorry for mixing up AMD with Nvidia; in my mindset when I wrote that I couldn't see the difference, I'm on fentanyl.

Oh and !@#$%& you!

Way to go Relic!

Oh and hang in there we are all pulling for you.

^^^ This!
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -
post #68 of 115
B-b-b-but it can't be fast because it's not Quad/Octa-core! Surely it's the silicon your software can't see that counts!

McD
Android proves (as Windows & VHS did before it) that if you want to control people, give us choices and the belief we're capable of making them. We're all 'living' the American dream.
post #69 of 115
Quote:
Originally Posted by Relic View Post

For the record, Charlie Demerjian worked for the Inquirer before leaving to set up his own shop. Now, though, instead of writing stories about how the Loch Ness monster had Bigfoot's love child and is suing for child support, he is writing ridiculous tech opinions; his site is called SemiAccurate, for goodness' sake, red flag maybe. Charlie is also a hard-core AMD fanboy and takes every opportunity to bad-mouth Nvidia. Frankly the man is a buffoon, and I'm actually flabbergasted that you would use him as a credible source, especially when he is also known for bad-mouthing Apple every chance he gets.

Yes. I know of his reputation as well as his history. He's a drama queen and has a sensationalist style wherever he works all the way back to the Inquirer.

But he's been in the tech hardware gossip rag business for over a decade, especially for CPUs, and knows the fundamentals of EE and processors. He does have moles in East Asia in regards to foundries and stuff like that.

His negativity towards Tegra has been right for 4 or 5 years now. Maybe he was lucky, but for that long? Nvidia has missed their promised Tegra road maps and perf/watt targets for basically 5 years now. They haven't had any significant OEM wins in 3 years. That's three hype-and-ship cycles there. They can't even get the 32-bit Tegra K1 shipping in a flagship from a major OEM. It's been nothing but Snapdragons for like 2 or 3 years running. Frankly, I'm getting bored with seeing nothing but Qualcomm SoCs in Android flagships. Even the non-flagships don't use Tegra SoCs.

Now, you think Nvidia is going to ship a custom CPU arch, a VLIW one at that, that'll be competitive with ARMH cores or Qualcomm cores? I don't, and wouldn't be surprised if it is dead. I'm skeptical until they ship. Maybe they'll ship in embedded devices or servers, but the people on this board couldn't care less about that space.
post #70 of 115
Quote:
Originally Posted by Lorin Schultz View Post
 

 

I promised myself I wouldn't respond to you anymore... why do I do this?

 

First, there's a world of difference between relatively objective comparison, which is what Relic does, and mindlessly poo-pooing everything that doesn't come from your preferred vendor. You'll note that Relic's experience with both hardware and software is considerably more extensive than most here. That's a result of an open-mindedness and willingness to explore alternatives that is exactly the opposite of how you're portraying her.

 

Second, obviously her personal preferences play into her assessments, as do those of anyone, including you. Even if she were an Apple hater, which clearly she isn't, I'd still be interested in her posts because they provide perspective for my own planning exercises.

 

TL/dr: Broad perspectives on what other vendors are doing benefit even Apple enthusiasts.

 

Thank you so much for those kind words Lorin, luv ya!

When I looked up "Ninjas" in Thesaurus.com, it said "Ninja's can't be found" Well played Ninjas, well played.
post #71 of 115
Quote:
Originally Posted by wizard69 View Post

How did VLIW architectures get pulled into this?

As for Apple I wouldn't be surprised to see them deviate from the norm and start to consider merging CPUs with neural networks, FPGA and the like.

64 bit Tegra K1 is supposedly a dual core VLIW arch CPU with a Kepler arch GPU. AI is saying that 128 Mbyte of main memory will be used to store profiled ARM instructions. There likely needs to be a code interpreter that does the code morphing and profiling, etc. So, similar to Transmeta's Crusoe and Efficeon.

I said that Demerjian is gossiping that this arch is dead. But the PR here is saying it's still there. So, Nvidia is going full steam ahead here. We'll see how it performs. These kinds of arch don't handle spaghetti code well, which browsing basically is. So wait and see.
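To make the "code interpreter that does the code morphing and profiling" idea concrete, here is a minimal sketch of a Transmeta-style translation cache: guest (ARM) code blocks get translated once to "native" (VLIW) code, cached, and profiled so hot blocks can be re-translated with heavier optimization. This is purely illustrative; the class, names, and threshold here are made up and say nothing about Nvidia's actual design:

```python
# Illustrative sketch (NOT Nvidia's actual design): a code-morphing layer
# keeps a software cache of guest code blocks already translated to native
# code, plus execution counters so hot blocks can be re-optimized.

class CodeMorphCache:
    def __init__(self, hot_threshold=3):
        self.translations = {}   # guest PC -> cached "native" translation
        self.exec_counts = {}    # guest PC -> number of executions
        self.hot_threshold = hot_threshold

    def translate(self, guest_block):
        # Stand-in for a real ARM -> VLIW translator.
        return [f"native({insn})" for insn in guest_block]

    def execute(self, guest_pc, guest_block):
        if guest_pc not in self.translations:
            # Cold path: translate once, reuse on every later hit.
            self.translations[guest_pc] = self.translate(guest_block)
        self.exec_counts[guest_pc] = self.exec_counts.get(guest_pc, 0) + 1
        if self.exec_counts[guest_pc] == self.hot_threshold:
            # Hot block detected: a real morpher would re-translate here
            # with aggressive optimization (scheduling, speculation, etc.).
            self.translations[guest_pc] = self.translate(guest_block)
        return self.translations[guest_pc]

cache = CodeMorphCache()
block = ["add r0, r1", "b loop"]
for _ in range(5):
    native = cache.execute(0x1000, block)
print(cache.exec_counts[0x1000])  # executed 5 times, translated at most twice
```

The point of the sketch is also why browsing ("spaghetti code") hurts such designs: branchy, low-reuse code keeps hitting the expensive cold path instead of amortizing the translation cost.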

As for Apple, I think it's all going to be I/O, memory performance and low power. Well, a super low power display and super low power wireless network performance over that if they are resource constrained.
post #72 of 115
Quote:
Originally Posted by Lloydbm4 View Post
 

Chromebooks are likely what NVidia are looking to get these chips into. 

 

Like the new Acer ChromeBook 13, yyyyaaaayyyyyyy, forget Android phones, I want to see Linux laptops and mini computers with the new 64-bit K1.

 

This is only the 32-bit chip but still very fast; in fact the early benchmarks are showing that the new Tegra K1 is pulling faster times in SunSpider than other Chromebooks using a Celeron or Atom.

 

http://www.slashgear.com/acer-chromebook-13-packs-tegra-k1-for-13hr-battery-11340270/

http://www.androidcentral.com/acer-announces-tegra-k1-powered-chromebook-13

 

Specs;

Tegra K1, 2.1 Ghz, Quad Core

13.3" 1080P LCD Display

4GB Ram

32GB Storage

13+ hour battery

100GB Google Drive storage for 2 years

 

Price is listed at $380.00 but I'm sure, like the other Acer Chromebooks, you should be able to find one for 350 buckaroonies. Then install Ubuntu and Steam alongside ChromeOS, and you'll have a pretty neat and cheap portable gaming laptop.

 

 

post #73 of 115
Quote:
Originally Posted by Dick Applebaum View Post


^^^ This!

Thank you Dick, I rarely ever get angry at people but that guy just made me so, so mad. I know not everyone enjoys what I say, but after Marvin scolded me on some of my writing techniques, I have honestly made an effort to cut back on the rhetoric. However, I don't think I deserve to be called a troll in my own house. Not that I own this place or anything, but I am part of the family, for better or worse, right? I mean I have been here for more than a decade, spend an ungodly amount of time here, even write what I hope are entertaining little stories. I don't know, I just felt hurt and I really don't need to feel any worse than I already do. Again, thank you very much for your kind words and I will always be here if you ever need something.

post #74 of 115
I sometimes believe that for every intelligent person working on Wall Street there are at least two loudmouthed idiots making them look bad. There really are some extremely brilliant people in the financial industry, but you don't hear much from them.
Quote:
Originally Posted by Constable Odo View Post

Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.

Actually more RAM and faster RAM could be huge for iPad. People try to dismiss the importance of RAM but it is something Apple needs to find an optimal balance point for each new generation of iOS devices. I'm actually extremely disappointed with Apple in that they have stalled RAM and even flash upgrades in the iPads. I just expect better from them.
post #75 of 115
Quote:
Originally Posted by TimmyDax View Post

The best use of the A7 last year was marketing.
Apple certainly has a marketing advantage with A7, however that is not what made A7 important this year. To put it in Pentagon terms they have prepared the battle field. Effectively they have put a lot of infrastructure in place.
Quote:
There were no apps that used it, and I don't know if anyone's using it now (newer 3D games?). It was for backwards compatibility in a couple of years, or future-proofing, whichever.
That is misleading, as Apple shipped a complete 64-bit OS, apps included. Obviously third-party apps are behind because 64-bit was a surprise. However, apps are already taking advantage of the chip and iOS 8 just pushes that forward.
Quote:
I doubt any popular (ie. sales) Android phones/tabs will support it for a good few years.

Apple has the trickle-down effect; they started before anyone else. In the first year, every new device got the same processor.
post #76 of 115
Quote:
Originally Posted by Mac-sochist View Post

Relic has been earning her pay pretty well for a while now, but she's dropping the ball lately.

For a purported expert on all the tech that's better than Apple, (which is all of it) and what's wrong with every Apple product, (everything) to say she "wouldn't mind" if the iPhone went to a 16:9 aspect ratio, but 4:3 like they are now is better—well...that's just sloppy. That's up there with "I've owned seven of every Mac model since 78 B.C., but damn it, when is Apple going to give us a two-button mouse?"

It's so obvious that most of these trolls have never used, or in some cases seen an Apple product.

 

Relic, troll? No. And I think she owns an Apple share or two. I also think that Relic has probably used dozens of Apple products.

"If the young are not initiated into the village, they will burn it down just to feel its warmth."
- African proverb
post #77 of 115
Quote:
Originally Posted by Constable Odo View Post
 

Still, the pundits in the smartphone industry are always braying about how Apple has fallen way behind all the Android smartphone manufacturers (which I suppose has to do with some specs of components).  Apple has never been given a nod by Wall Street saying the iPhone has some sort of processing edge over any Android flagship smartphone.  All Android manufacturers have to do is add on a few more cores and crank the clock speed towards 3GHz up to break every benchmark record possible.  I'm willing to bet all the analysts are going to be jumping in now saying how much more competition the iPhone has because the analysts are thinking that every Android smartphone and tablet is going to be equipped with some 64-bit processor aimed at taking Apple down.  Almost nothing Apple has come out with has ever given Apple a clear advantage in order to give the company potential future value in the eyes of Wall Street.  I suppose even the A8 processor isn't going to excite anyone in the industry unless it's a quad-core processor or something and they'll complain that Apple's measly 1GB RAM isn't enough to do a decent job when Android smartphones already come with 3GB RAM and soon to be bumped to 4GB.  The industry makes a big deal about everything running on Android because market share is the most important metric.

You're reading the wrong Wall Street blogs then, my friend; one of the only things a lot of these journalists agree upon is how Apple is changing the world. I think your mindset is still stuck in the 90's. Apple won, game over; they're the most powerful computer company and brand name in the world and will be so for a very long time. Throw a search term about Apple in any which direction on the Internet and you'll find more positive stories about Apple than negative. I notice this a lot here; it's like you guys want to hear bad press on Apple just for the opportunity to go after the individual who wrote the story.

 

Shall we play a game? You go find as many negative stories about Apple in 30 minutes as you can, and I will do the same for positive. We can then tally the results; we'll need someone to officiate and say go.

When I looked up "Ninjas" in Thesaurus.com, it said "Ninja's can't be found" Well played Ninjas, well played.
post #78 of 115
(......guys....there's a girl in here.......)
post #79 of 115
Quote:
Originally Posted by THT View Post

For the record, Charlie Demerjian (semiaccurate.com) thinks Denver is dead, or will never ship. When he says this, he is thinking of the Denver (aka 64 bit Tegra K1) project using something like Transmeta's VLIW Efficeon architecture. One of the rumors was that Nvidia revectored the arch from x64 to ARM. Anyways, Demerjian has been gossiping that Denver was a long ways away from meeting perf/watt targets, and may never ship in a handheld, and thusly, is likely dead.
If this is all true (I don't follow NVidia closely) then someone at NVidia has fallen off their rocker. Transmeta's chips never lived up to any of their promises, so I don't see much hope for it here. There might be some advantages to translating a RISC instruction set, but no matter what you have a huge overhead to deal with that isn't nice in a power-restricted space.
Quote:
It appears Nvidia is still touting VLIW with some kind of instruction morphing or translating going on, so some remnants of the original Denver are still there. Or they've managed to make it work. It's really not big news until they get a win from a significant OEM.
That is for sure.
Quote:
That 128 megabyte instruction comment by AI has got to be wrong. Maybe it's 128 instructions?
Or 128KB? If they can cache a huge amount of instructions it could help performance at least for benchmarks.

Honestly though, this just strikes me as a wasted effort, as one can put a little bit of effort into an enhanced ARM core and get better results.
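For anyone wondering why the "128 megabyte" figure sounds off, a back-of-envelope calculation helps: ARM instructions are a fixed 4 bytes each, so the sizes being debated above differ by three orders of magnitude in how many raw instructions they could hold (an actual translation cache would store translated blocks plus metadata, so these are just upper bounds):

```python
# How many fixed-width 4-byte ARM instructions fit in each proposed size.
# Upper bounds only: a real translation cache stores translated native
# code and bookkeeping, not raw guest instructions.
INSN_BYTES = 4

sizes = {
    "128 MB":           128 * 1024 ** 2,
    "128 KB":           128 * 1024,
    "128 instructions": 128 * INSN_BYTES,
}

for label, size_bytes in sizes.items():
    print(f"{label}: {size_bytes // INSN_BYTES:,} instructions")
```

128 MB works out to over 33 million instructions, which is why it reads as a huge chunk of a phone's main memory, while 128 KB (32,768 instructions) is a far more plausible on-die cache figure.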
post #80 of 115
Quote:
Originally Posted by cali View Post

(......guys....there's a girl in here.......)

 

It's okay. She's more afraid of you than you are of her.

 

 

 

 

Or at least she should be.

Lorin Schultz (formerly V5V)

Audio Engineer

V5V Digital Media, Vancouver, BC Canada
