Go NVidia! Can mass CPU shipments, coupled with widespread support for 64-bit benefits by the OS, great developer tools, and tons of third-party apps in users' hands, be far behind? That's what we on iOS had a year ago, so I'm sure Android 64-bit support is now all caught up with Google, developers, and handset makers alike! August 2014, a great month for Android users! They'll probably have CPUs with well-planned, OS-integrated secure enclaves at the same time, no doubt!
/sarcasm
Screw Android, a new market for powerful mini Linux computers. drrooolllll.....
Relic has been earning her pay pretty well for a while now, but she's dropping the ball lately.
For a purported expert on all the tech that's better than Apple, (which is all of it) and what's wrong with every Apple product, (everything) to say she "wouldn't mind" if the iPhone went to a 16:9 aspect ratio, but 4:3 like they are now is better—well...that's just sloppy. That's up there with "I've owned seven of every Mac model since 78 B.C., but damn it, when is Apple going to give us a two-button mouse?"
It's so obvious that most of these trolls have never used, or in some cases seen an Apple product.
Nvidia says each Denver core is capable of processing up to seven operations per clock cycle, compared with a reported six instructions per clock for the A7…
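For a rough sense of what those widths mean, here's a back-of-envelope peak-throughput comparison. The clock frequencies are my own assumptions for illustration, not figures from the article:

```python
# Peak (never sustained) throughput = issue width (ops/cycle) x clock.
# The clock frequencies below are illustrative assumptions only.
def peak_ops_per_sec(ops_per_cycle, clock_hz):
    return ops_per_cycle * clock_hz

denver = peak_ops_per_sec(7, 2.5e9)  # Denver: up to 7 ops/cycle, ~2.5 GHz assumed
a7 = peak_ops_per_sec(6, 1.3e9)      # A7: reported 6 ops/cycle, ~1.3 GHz assumed
print(f"{denver:.2e} vs {a7:.2e} peak ops/sec")
```

Of course, peak width means nothing unless the compiler or optimizer can keep those issue slots filled, which is where the real argument lies.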
[monotone] Wow. What an amazing chip. The benefits are staggering. [/monotone]
Originally Posted by sog35
I wonder why NVidia is developing a gimmick processor?
Because they’ve hit a roadblock with their next series of GPUs, maybe?
Relic has been earning her pay pretty well for a while now, but she's dropping the ball lately.
For a purported expert on all the tech that's better than Apple, (which is all of it) and what's wrong with every Apple product, (everything) to say she "wouldn't mind" if the iPhone went to a 16:9 aspect ratio, but 4:3 like they are now is better—well...that's just sloppy. That's up there with "I've owned seven of every Mac model since 78 B.C., but damn it, when is Apple going to give us a two-button mouse?"
It's so obvious that most of these trolls have never used, or in some cases seen an Apple product.
I'm currently in the hospital. I have a tube down my throat at the moment and I'm heavily medicated, and the worst part is that going to the bathroom involves two nurses helping me onto a chair potty thing with wheels while they stand next to me, so I ask for a little leniency.
I don't have an iPhone; I've said this many times and why. I have an iPad, and I also apologized for screwing that up. I'm also sorry for mixing AMD up with Nvidia; in my mindset when I wrote that I couldn't see the difference. I'm on fentanyl.
Raise your hand if you actually own an nvidia powered mobile device...
... All two of you must be very excited at this announcement.
I think you miss the point; this has little to do with Apple, as Apple's market is captive. What it is is strong competition for the part of the market Intel wants.
Doesn't pin compatibility with a 32-bit CPU tell us that the 64-bit cores are hobbled with a 32-bit-wide path to the outside world?
All APU-style chips are hobbled by the memory interface. That is one reason why Intel has added an in-package fast cache chip. It is also why on-chip caches are so important in these sorts of SoCs.
So this chip and this OS are basically being developed in vacuums and it will be up to the 'integrators' to make them work together. Who is invested in Performance? (Integration is a 'time to market' problem.)
Indeed... I would think that if VLIW were ever to succeed in the market, it would be in the context of a tightly integrated stack in which one company controls everything from the silicon up to software distribution. Yet Apple has not chosen to go VLIW, at least not yet. If Apple doesn't think it's a good idea, with their total control over compilers, language, OS, APIs, etc etc... it's a little hard to see how anyone else can make it work.
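A toy illustration of the compiler burden being described: a VLIW design pushes instruction scheduling to compile time, so the toolchain must pack independent operations into fixed-width issue bundles. Everything below is invented for illustration; real schedulers (including Denver's dynamic code-optimization layer) are far more sophisticated.

```python
# Greedy packing of independent ops into fixed-width VLIW-style bundles.
WIDTH = 7  # ops per bundle, matching the "up to seven per clock" claim

def bundle(ops, deps):
    """Greedy list scheduling. `ops` is a list of op names in program order;
    `deps` maps an op to the set of ops it must wait for. Assumes the
    dependence graph is acyclic. Returns a list of issue bundles."""
    done, bundles, pending = set(), [], list(ops)
    while pending:
        # An op is ready once everything it depends on has already issued.
        current = [op for op in pending if deps.get(op, set()) <= done][:WIDTH]
        pending = [op for op in pending if op not in current]
        bundles.append(current)
        done |= set(current)
    return bundles

# a and b are independent; c needs both, so it waits for the next bundle.
print(bundle(["a", "b", "c"], {"c": {"a", "b"}}))  # [['a', 'b'], ['c']]
```

The point of the sketch: keeping seven slots full every cycle depends entirely on how much independent work the compiler can find, which is exactly why a vertically integrated stack would have the best shot at it.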
Nvidia is already shipping 64-bit ARM Opteron development kits to manufacturers, so I have no doubt their mobile chips will closely follow. 1.5 years away? Where did you come up with that? It's not like Nvidia saw Apple release a 64-bit version back in 2013 and went, oh my gosh, we need one of those too. Nvidia announced a 64-bit ARM chip roadmap right after ARM themselves announced theirs, with Opteron to be first, back in 2012, to be released in 2014, so it looks like they're right on schedule. Actually, if you look at the roadmap of almost every manufacturer who produces ARM chips, 64-bit was announced in 2012 to be released in 2014, Samsung included.
I just don't understand the necessity to belittle everything that isn't Apple.
I take your point, Relic...and you're right. But in this case, I think Solip makes a good point. I remember Stevo saying words to the effect of, "In tech you have to be 10 years ahead, and Apple is about 5 years ahead." This was around the time of the iPhone release, I think. Solip is right: it's not just matching a product release, but matching the sales volume, too. I hadn't thought of that.
All APU-style chips are hobbled by the memory interface. That is one reason why Intel has added an in-package fast cache chip. It is also why on-chip caches are so important in these sorts of SoCs.
Interesting: the Denver has 128 KB of L1 instruction cache and 64 KB of L1 data cache. Looking good.
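A quick sanity check on what those sizes mean in cache lines. The 64-byte line size is an assumption on my part, since the line size isn't given here:

```python
# L1 sizes from the post; the 64-byte line size is assumed, not quoted.
LINE_BYTES = 64
l1i_lines = 128 * 1024 // LINE_BYTES  # instruction cache lines
l1d_lines = 64 * 1024 // LINE_BYTES   # data cache lines
print(l1i_lines, l1d_lines)  # 2048 1024
```

That oversized instruction cache makes sense for a design that rewrites and caches optimized instruction sequences.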
I'm currently in the hospital. I have a tube down my throat at the moment and I'm heavily medicated, and the worst part is that going to the bathroom involves two nurses helping me onto a chair potty thing with wheels while they stand next to me, so I ask for a little leniency.
I don't have an iPhone; I've said this many times and why. I have an iPad, and I also apologized for screwing that up. I'm also sorry for mixing AMD up with Nvidia; in my mindset when I wrote that I couldn't see the difference. I'm on fentanyl.
I take your point, Relic...and you're right. But in this case, I think Solip makes a good point. I remember Stevo saying words to the effect of, "In tech you have to be 10 years ahead, and Apple is about 5 years ahead." This was around the time of the iPhone release, I think. Solip is right: it's not just matching a product release, but matching the sales volume, too. I hadn't thought of that.
Anyway,
Best.
Yeah, I agree with him too. Nvidia does seem to be on track, though, and I have no doubt we will see a product with a 64-bit Denver this year.
So this chip and this OS are basically being developed in vacuums and it will be up to the 'integrators' to make them work together. Who is invested in Performance? (Integration is a 'time to market' problem.)
I say all this in the vein that CPUs, OSes, and compilers have cycled back to the '70s and '80s, but have learned from their mistakes as well.
1) You don't design long-term backwards compatibility into your chips. You design forward compatibility into your compilers.
2) Optimizing your pipeline is more an exercise in compiler technology.
3) Designing OSes and SoCs separately slows down performance innovation.
Bingo. Who cares how many instructions it can perform in parallel? Where is the compiler coming from to take advantage of it? Is Google going to make changes to the Android SDK to take advantage of this? Is Nvidia going to develop an add-in compiler tool to work with Android?
And when we have Exynos and Snapdragon 64-bit processors, what then? If they all have different widths (very likely) and different methods of dispatching instructions (very likely), then what happens to the Android SDK? Are you going to have 3 different plug-ins to compile/optimize your code based on which processor you use?
Then add in Intel and Android on x86.
The whole "64-bit on Android" thing looks to be a huge fragmented mess for developers.
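For what it's worth, the per-ABI fan-out already exists in today's NDK builds: a standard `Application.mk` lists every target ABI, and the toolchain builds the native code once per ABI (the `arm64-v8a` entry is the 64-bit ARM target). This is stock ndk-build syntax, shown here only to illustrate the build-matrix growth being described:

```makefile
# jni/Application.mk -- each listed ABI gets its own compile pass
# and its own .so packaged into the APK.
APP_ABI := armeabi-v7a arm64-v8a x86 x86_64
```

That handles instruction sets; per-microarchitecture scheduling plug-ins on top of it would multiply the matrix further.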
Indeed... I would think that if VLIW were ever to succeed in the market, it would be in the context of a tightly integrated stack in which one company controls everything from the silicon up to software distribution. Yet Apple has not chosen to go VLIW, at least not yet. If Apple doesn't think it's a good idea, with their total control over compilers, language, OS, APIs, etc etc... it's a little hard to see how anyone else can make it work.
It certainly isn't bad at all; that is a lot of capability when you consider the chip is targeting handheld devices. I'm sitting here remembering the day when you actually had to pay extra for cache chips. It is surprising how far we have come.
The thing here is everybody is expecting Apple to debut a quad-core chip. That may happen, but they could vastly improve the A7 by tweaking things like cache size.
I'm a girl, a beautiful princess, who currently looks like a dyke, so maybe your first impulse was correct, could you imagine though, waking up with different parts. Anyway, the sentiment is the same, thank you.
I'm currently in the hospital. I have a tube down my throat at the moment and I'm heavily medicated, and the worst part is that going to the bathroom involves two nurses helping me onto a chair potty thing with wheels while they stand next to me, so I ask for a little leniency.
I don't have an iPhone; I've said this many times and why. I have an iPad, and I also apologized for screwing that up. I'm also sorry for mixing AMD up with Nvidia; in my mindset when I wrote that I couldn't see the difference. I'm on fentanyl.
Oh and !@#$%& you!
Take care, Relic. I do enjoy skipping over your longer posts.
I'm a girl, a beautiful princess, who currently looks like a dyke, so maybe your first impulse was correct, could you imagine though, waking up with different parts. Anyway, the sentiment is the same, thank you.
Way to go Relic!
Oh, and hang in there; we are all pulling for you.
As for Apple, I wouldn't be surprised to see them deviate from the norm and start to consider merging CPUs with neural networks, FPGAs, and the like.
Hang in there, Bro!
Best.
Gotta run, lunch is about over.
English is difficult, even if it's your first language. Curtis does well, considering it's his second language.
Nvidia: Put the chip in a phone, THEN announce you have something.
Hang in there, Bro!
Best.
64 bit is just for marketing. /s
Oops, Relic. You're a dudette! My bad.
Best.