
Intel's 'Core M' chip announcement suggests Broadwell-based MacBook Pros won't arrive until 2015 - Page 2

post #41 of 111
Quote:
Originally Posted by Danox View Post
 
Bye Bye!

Spoken like a youngster who thinks an iPad is the only computing device anyone would ever need. You have a lot to learn about enterprise computing, lad. When you grow up, perhaps you'll discover there is a world beyond jerking off to porn.

Life is too short to drink bad coffee.
post #42 of 111
Intel needs to keep the PC relevant and cool or they will see their revenue and profits fall. They simply can't sell a chip in a tablet on such thin margins, and as much as they make fat margins on servers, the volume is simply not enough. Apple will continue to use Intel on the Macintosh because the advantages of switching do not outweigh the costs. Most of Intel's efforts lately have focused on core density and power management. Clock speed is still important, but somewhat secondary...
post #43 of 111
Quote:
Originally Posted by Benjamin Frost View Post

There's a Mac called the iMac which fits between the Mini and the Pro; Apple have been selling it for a few years, now.

Did you miss the part about an "Upgradeable GPU" in a PCIe slot?

https://www.ifixit.com/Teardown/iMac+Intel+27-Inch+EMC+2639+Teardown/17828

Even when Apple uses a desktop CPU part, they still use a Laptop GPU. Do you see how much space is just heatsink?

There are literally three and a half tiers of computers that need to exist:
1. Server/Workstation, whose defining feature is ECC memory and replaceable PCIe parts and RAM
2. Desktop, whose defining feature is that it can be replaced every 7 years or upgraded every 2 years with a new video card and/or CPU + RAM.
3. Laptops, which are meant to last 7 years, but not be upgraded beyond RAM.
3.5 Tablet/UltraBook, which are meant to last 3 years and be thrown out when their batteries quit holding a charge. Entire point being battery life at the expense of everything else.

None of these replace each other. Apple doesn't really offer the first two; all Apple offers are laptop-class parts, the iPad included. The Mac Pro is really more of a "Mac mini portable workstation."

The performance gap between the iPad and the MacBook Air is not really that large for the price (http://www.gizmag.com/ipad-air-vs-macbook-air-2014/31940/), so when people start suggesting Apple is going to put their own ARM parts in MacBooks, I still shake my head and laugh. It would be counterproductive, since Intel still has at least one more die shrink up their sleeve. Until Apple can produce an A-series chip at the same die size, Apple has no reason to even consider it.

The PowerPC vs. x86 problem was one of those things where the entire computer industry didn't see where the puck was going. Everyone kept yelling that the x86 ISA was dead, time to find something else. Once everyone hit 3 GHz it was like "oh dude, we can't keep going higher without burning holes in the motherboard." That's when Intel woke up and started building multi-core parts... which is the same time Apple switched over (January 2006, with the Core Solo/Core Duo parts).

Where everyone is going to get burned in the next 5 years is that the SSDs produced this year on TLC processes will wear out; die shrinks for NAND flash actually result in shorter life spans. So factor this into the Mac mini, Mac Pro and MacBook Pro designs. Most of the current Mac mini is the space for the hard drive. Replace that with an SSD and the Mac mini could be half as tall. Or they could restyle it like the Mac Pro and it would be the size of a coffee mug.
post #44 of 111

Intel is obviously trying to stem the post-PC tide long enough to find a new platform to target with their new chips. They obviously see Apple and its A-series chips and iOS as a serious threat. Same goes with Google and its Chrome OS. This is very much following the same disruption pattern that transitioned us from mainframe computers to mini computers and then to microcomputers (PCs). Now we're well on our way to post-PC and transitioning to handheld computers in the form of tablets, smartphones, and wearables. 

 

If history repeats itself, and it's not even an "IF" as some of you would like it to be, the dominant players in the generation that is left behind almost never successfully transition to the next generation or even survive. Near zero survival rate. They simply don't have the corporate value systems in place that support killing their current cash cow to jump onto the next generation. They'd rather ride their cash cow until it's dead. Until the cow actually dies, everything they're doing seems to make good business sense. There's a reason why DEC, Sun, SGI, etc., didn't become dominant PC players even though they dominated the minicomputer market. It wasn't because they didn't know how to build damn fast computers or didn't understand the technology or didn't understand the emerging market. These were not dumb companies led by idiots. They had some of the best and brightest minds and business people - and they crashed and burned. One recurring characteristic of companies that end up going the way of the Dodo and DEC is an attempt to hybridize the old with the new. They just can't bring themselves to let the old cash cow go. Does it work? Rarely if ever, and it only delays the inevitable.

 

Another recurring pattern is the disruptor starting out as inferior in performance to the disrupted. Early PCs were hobby toys compared to mainframes and minis. Did that matter? There are still some mainframes around and PCs have largely displaced minis. Once PCs were "good enough" the mini was dead and the mainframes further squeezed into niche applications. The same thing will happen with what we now consider high-end PCs. They will survive as niche products, but the bulk of the market will move down to tablets and low cost laptops like the ChromeBook. Those are the main choices and challenges Intel has in front of it today.

 

Intel knows this pattern very well, at least under Andy Grove it did. Will they follow DEC and Sun into the sunset? History is not on their side. 

post #45 of 111
Quote:
Originally Posted by Danox View Post

Intel is going to be history, the writing is on the wall.

We are all history... some just take longer to decompose. 

post #46 of 111
Quote:
Originally Posted by DewMe View Post
 

Intel is obviously trying to stem the post-PC tide long enough to find a new platform to target with their new chips. They obviously see Apple and its A-series chips and iOS as a serious threat. Same goes with Google and its Chrome OS. This is very much following the same disruption pattern that transitioned us from mainframe computers to mini computers and then to microcomputers (PCs). Now we're well on our way to post-PC and transitioning to handheld computers in the form of tablets, smartphones, and wearables. 

 

If history repeats itself, and it's not even an "IF" as some of you would like it to be, the dominant players in the generation that is left behind almost never successfully transition to the next generation or even survive. Near zero survival rate.

 

There are still some mainframes around and PCs have largely displaced minis. Once PCs were "good enough" the mini was dead and the mainframes further squeezed into niche applications. The same thing will happen with what we now consider high-end PCs. They will survive as niche products, but the bulk of the market will move down to tablets and low cost laptops like the ChromeBook. Those are the main choices and challenges Intel has in front of it today.

 

Intel knows this pattern very well, at least under Andy Grove it did. Will they follow DEC and Sun into the sunset? History is not on their side. 

err... most computer spending is going into the cloud, not to the desktop.

 

Where I agree with you: CISC mainframes were supplanted by RISC superminis, which were supplanted by x86 'frames'. ARM will have its day in the sun, and then something else that can do mesh computing.

 

Where I disagree: endpoint computing is really all about UX. Almost no one will be running a business on a Chromebook, and no one will have _THEIR_ database wholly on the device they carry or have on their desk. These are VT52s, XTerms, 327x devices, nothing more. Yes, they are computers, but they are not standalone; their value drops to near zero when the cord/wireless is cut. And at a certain point, it's less about raw compute speed than functionality (a faster Mac mini is great, but wait, what I really want is support for four 4K monitors... speed may be needed to support it, but it's not really a 'business requirement').

 

Intel has a lot of challenges, and like Microsoft, they have had a hard time trying to specialize into one of two competing markets. You can't take technology which started on the desktop and has been marching towards both the data center and the wristwatch and win in either domain.

 

Apple learned that in OSes, and has come to a tactical decision point for what chipset to use for desktops and mobile devices. Strategically, it doesn't matter: Intel, A-series, or abacuses; Apple's long-term viability is about getting people to pay Apple for a better way to integrate into the digital world. Tactically, they do know that the more effective the chip is in delivering value (price/performance/power/size envelope), the more advantage it brings, but it is only a means of delivering the value... it isn't the value.

 

But back to agreeing... unless you continually reinvent yourself (e.g. IBM), you only survive one wave of the technology surf.

post #47 of 111
Quote:
Originally Posted by macxpress View Post

What we need...is a Mac mini based off of this. Yes...we're still bitching about this in case anyone is wondering.


No we do not.  The Mac mini is a desktop machine, it needs a more powerful chip, not a less powerful one.

 

But we absolutely do need a new mini.  I've got ancient servers that I'm barely keeping alive, waiting on a new, more powerful mini to replace them.

post #48 of 111
Quote:
Originally Posted by DarkVader View Post

The Mac mini is a desktop machine, it needs a more powerful chip, not a less powerful one.

I'm sure it will get one if it's updated, but it is still a low-power machine so I wouldn't expect anything but a mobile-grade chip to be included.

This bot has been removed from circulation due to a malfunctioning morality chip.
post #49 of 111
Quote:
Originally Posted by DarkVader View Post
 


No we do not.  The Mac mini is a desktop machine, it needs a more powerful chip, not a less powerful one.

 

But we absolutely do need a new mini.  I've got ancient servers that I'm barely keeping alive, waiting on a new, more powerful mini to replace them.


It may be a desktop Mac, but it's always gotten a mobile CPU. I can all but guarantee you that it will never have a desktop-class CPU inside it. If you want more power, get an iMac.

 

I think you're better off just getting what's available for Mac mini servers right now. Otherwise, you're going to be disappointed.

Mac Mini (Mid 2011) 2.5 GHz Core i5

120 GB SSD/500 GB HD/8 GB RAM

AMD Radeon HD 6630M 256 MB
post #50 of 111
Quote:
Originally Posted by SolipsismX View Post

I'm sure it will get one if it's updated, but it is still a low-power machine so I wouldn't expect anything but a mobile-grade chip to be included.

This is probably why we have waited so long for the Mini update. Apple most likely saw a huge opportunity in Broadwell to make the Mini a much more powerful machine while keeping the form factor or even shrinking it a bit. As it is, it isn't the mobile chip that breaks the Mini but rather the limited expansion capability and the lackluster Intel GPUs. Broadwell appears to be able to deal with the GPU issue, and it may also have been a better solution for the Thunderbolt interface, so more pain over the long wait for Broadwell. The only other expansion issue is the internal drive storage, where much could be done to fix the access issues and make servicing those drives more friendly. It is hard to be serious about a machine marketed as a server that has drives as difficult to get to as the Mini's. Lastly, they really need to keep the two drive bays and add a slot for a blade SSD, which would make for a very versatile Mini.

In a nutshell, Intel has really f'ed up this year.
post #51 of 111
Quote:
Originally Posted by wizard69 View Post
 
Quote:
Originally Posted by SolipsismX View Post

I'm sure it will get one if it's updated, but it is still a low-power machine so I wouldn't expect anything but a mobile-grade chip to be included.



In a nutshell, Intel has really f'ed up this year.

 

hav eyo ube enr ead ing too muc hee cum min gsl ate ly?

"If the young are not initiated into the village, they will burn it down just to feel its warmth."
- African proverb
Reply
"If the young are not initiated into the village, they will burn it down just to feel its warmth."
- African proverb
Reply
post #52 of 111

I spent $5000 for my first Apple II+ setup and have spent a good part of my disposable and vacation money on Apple Computer products since 1982. I have often said to people that a person with a moderate amount of money can get the equivalent of a high class sports car and probably get better value by buying most anything Apple. At first I was dismayed by the paltry Mac Mini offerings we got served a week ago.

 

I have to assume (in my naive and hopeful beginner's mind) that the new offering was a patch until the Broadwell chips get cheap and plentiful, or that Apple has something more exciting on the horizon in the same general format as the Mini.

 

I have always preferred a complete all-in-one product, except that for the creative computer user, WE like flexibility. We want Apple to give us something we can mess around with.

 

And we want to continue to be a fan base for Apple, which as a corporation has had its ups and downs, while we saw the genius in Apple and kept touting the essential superiority of the original concept of Apple and saw it eventually prosper beyond our dreams.

 

We are not looking for mediocrity; we see Apple as an exception to the mediocrity we see all around us.

 

Give us a Mac Mini to be proud of!

post #53 of 111
Somehow I feel that even if Apple offered a quad-core Iris Pro mini, people still would not be content and would demand discrete graphics. If discrete graphics were in the mini, people would say the graphics used are not good enough. That aside, I am glad that in my 2011 mini, I got to change the RAM myself (went from 2 GB to 8 GB); however, am I the only one who feels $200 is not that high of a price to pay for the 16 GB upgrade? The SSD I had a bit more trouble with. I missed out on RAM being cheap about two years ago, I guess, but if I want to upgrade my current Mac mini, the prices are listed below and are the same for 1333 MHz and 1600 MHz.

http://www.crucial.com/usa/en/memory-ddr3/ct2k8g3s1339m

http://www.crucial.com/usa/en/memory/ct2k8g3s160bm

As I mentioned before in my other topic, I am stoked for the Broadwell U and H chips. I do hope we get a quad-core mini perhaps next year or the year after though.
post #54 of 111
Quote:
Originally Posted by Winter View Post

Somehow I feel that even if Apple offered a quad-core Iris Pro mini, people still would not be content and demand discrete graphics. If discrete graphics were in the mini, people would say the graphics used are not good enough.
Well, you have a point there. However, with this release it just looks a bit awkward: the high-end machine does not significantly improve the CPU/GPU over the midrange model. I believe that is due to Intel just not having the solution Apple wanted.

Let's face it, a lot of Mini users could have easily justified a quad core. I mean with sound technical reasoning, not some of the techno lust we see here. On the flip side there are way too many that would chase after a quad that would never be able to leverage the processor. So what I'm saying here is that people upset over the lack of a quad core may rightfully be pissed off at Apple. On the other hand the midrange machine is actually a huge upgrade over past machines, so it is ill-advised for people to slam this year's update as totally worthless.
Quote:
That aside, I am glad that in my 2011 mini, I got to change the RAM myself (went from 2 GB to 8 GB) however am I the only one who feels $200 is not that high of a price to play for the 16 GB upgrade?
It isn't a lot of money to pay for RAM from a mainline vendor. However, realize that you are really only paying for 8 GB, as the cost of 8 GB of RAM is built into the midrange Mini. The other thing here is that this is LPDDR3 RAM, which last I knew was a bit more expensive than mainline desktop RAM.
Quote:
The SSD I had a bit more trouble with. I missed out on RAM being cheap about two years ago I guess it was but if I want to upgrade my current Mac mini, the prices are listed below and are the same for 1333 MHz and 1600 MHz.
Strange pricing. It does highlight that Apple's prices are not anywhere near as bad as they once were. The price of RAM in these machines shouldn't impact anybody's purchasing decisions.
Quote:

As I mentioned before in my other topic, I am stoked for the Broadwell U and H chips. I do hope we get a quad-core mini perhaps next year or the year after though.

I'm not sure if Intel will be focusing on quad cores yet with Broadwell. They have been hell-bent on lowering power and increasing GPU functionality to try to protect the low end from ARM hardware. That being said, CPU cores these days are not the power hogs they have been in the past. In my world it is foolish of Apple to not have a quad-core option for the Mini; who knows what the scoop is in Apple's world.

I'm still left with the impression that the Mini is a minimalist engineering effort that has delivered a really nice midrange machine but has failed to provide a suitable performance option. Contrast this with the new iPad Air, where they pulled out all the stops to totally overhaul the machine and attack the performance issues there. It is obvious where the engineering effort has gone these days.
post #55 of 111
They might have offered a 37 W quad-core chip, but those come with Intel HD 4600 rather than Iris Pro, and I guess they didn't want to confuse people with Intel's numbering system of Intel HD 5000, Iris 5100, and Iris Pro 5200.
post #56 of 111
Quote:
Originally Posted by Winter View Post

Somehow I feel that even if Apple offered a quad-core Iris Pro mini, people still would not be content and demand discrete graphics. If discrete graphics were in the mini, people would say the graphics used are not good enough. 

I think the expectation was to see a revised Mini with at least the same quad-core option and user-replaceable RAM. That it didn't appear left a lot of people disappointed in the Mini - and Apple.

post #57 of 111
Quote:
Originally Posted by Header View Post

I think the expectation was to see a revised Mini with at least the same quad-core option and user-replaceable RAM. That it didn't appear left a lot of people disappointed in the Mini - and Apple.

The lack of a quad core is a bit of a disappointment, as that can have a significant bearing on performance for some users. Anybody whining about replaceable RAM is just out of touch. Seriously, what will these people do when RAM gets integrated right into the processor module?

I can't say I'm extremely disappointed with Apple, as Intel is partly to blame here. However, Apple did have the option of boosting the power supply wattage to better support Intel's decent quad-core options. The flip side here is that Apple did focus on GPU performance, which is huge for the majority of Mini users. So I can't use the word disappointed here at all; rather, I see these machines as very nice upgrades if you don't need a quad core. I really think too many focused on the regression at the high end and thus failed to realize just how nice the update is for the midrange machine.
post #58 of 111
Quote:
Originally Posted by wizard69 View Post

Seriously what will these people do when RAM gets integrated right into the processor module?

When is this?
Yeah, well, you know, that's just, like, my opinion, man.
post #59 of 111
Originally Posted by WonkoTheSane View Post
When is this?


Skylake, I believe.

 

If you think people are whining about change now, wait until post-Cannonlake chips drop silicon in favor of indium antimonide. :p 

post #60 of 111
Quote:
Originally Posted by WonkoTheSane View Post

When is this?


Well, that is hard to say exactly, but this may interest you: http://www.anandtech.com/show/8217/intels-knights-landing-coprocessor-detailed. This chip is for 2015, and with an accelerated trickle-down to conventional chips we could see this in desktop chips in three years. However, you also need to consider Haswells with Iris Pro, which is a slightly different technology to deal with the same issue of memory performance. So the basic concept of putting memory in the package to get past bandwidth issues to slow DRAM is already with us. The difference with Knights Landing is that the RAM is application or main memory.

Even if we don't see RAM in the processor modules, in three years, I would expect that high performance memory systems will be soldered in.

As a side note, Knights Landing could make one hell of a Mac Pro chip. Maybe not ideal for everyday Mac Pro usage, but in some situations it would be awesome.
post #61 of 111

 

Quote:
Originally Posted by wizard69 View Post




Even if we don't see RAM in the processor modules, in three years, I would expect that high performance memory systems will be soldered in.
 

What do you mean by RAM in the processor modules? Are you referring to a growth in cache/SRAM to the point where DRAM is obsolete?

post #62 of 111
Quote:
Originally Posted by wizard69 View Post

Well, that is hard to say exactly, but this may interest you: http://www.anandtech.com/show/8217/intels-knights-landing-coprocessor-detailed. This chip is for 2015, and with an accelerated trickle-down to conventional chips we could see this in desktop chips in three years. However, you also need to consider Haswells with Iris Pro, which is a slightly different technology to deal with the same issue of memory performance. So the basic concept of putting memory in the package to get past bandwidth issues to slow DRAM is already with us. The difference with Knights Landing is that the RAM is application or main memory.

Even if we don't see RAM in the processor modules, in three years, I would expect that high performance memory systems will be soldered in.

As a side note, Knights Landing could make one hell of a Mac Pro chip. Maybe not ideal for everyday Mac Pro usage, but in some situations it would be awesome.
Quote:
Originally Posted by Tallest Skil View Post


Skylake, I believe.

If you think people are whining about change now, wait until post-Cannonlake chips drop silicon in favor of indium antimonide. :p

Thanks. Sounds interesting. However, it also sounds a bit like "we are working on a tricorder and hope to release it within a year or two". Honestly, I once met a director of a science institute who told me this in all seriousness. A lot of nice concepts... but that was all. I'm not saying Intel will not deliver. It just seems that the technological envelope is being pushed harder, so putting nice concepts into practice has a significant amount of risk. Just take a look at the G5, and today Broadwell, etc.
Maybe they just need someone who tells them: http://m.youtube.com/watch?v=5OXn88m4NBg ;)
Yeah, well, you know, that's just, like, my opinion, man.
post #63 of 111
Quote:
Originally Posted by hmm View Post

What do you mean by ram in the processor modules? Are you referring to a growth in cache/sram to the point where dram is obsolete?

No, I mean high-speed RAM mounted right in the same module as the processor silicon and closely tied to the processor through a very high-speed bus. The goal is far higher performance out of the RAM.
post #64 of 111
Sorry about some of the auto correct typos in my earlier post!
Quote:
Originally Posted by WonkoTheSane View Post


Thanks. Sounds interesting. However, also sounds a bit like "we are working in a tricorder and hope to release it wothong a year or two".
Well, yeah, sometimes it does sound that way. However, you have to realize a few things. One is that going to RAM slows a processor down significantly today. There is a lot of contention for access to RAM, and more cores and better GPUs integrated in the SoC just make that bandwidth issue worse. The short-term solution for desktop, and probably mobile, is DDR4, which is actually here now in some hardware. Longer term, going faster requires being very close to the processor.

This is where things like Memory Cube technology become very interesting, as the high density and low power combined with high performance make for a very compelling solution to improving the bandwidth problem to RAM. We probably won't see such tech in the desktop for a while, but I see it as inevitable and maybe closer than I think for something like a Mac Pro.

As DDR4-supporting hardware rolls out more generally next year, expect to see all sorts of benchmarking exploring the increased bandwidth. Of course, the faster RAM has to actually arrive, but the impact on APU-based machines should be very noticeable.
Quote:
Honestly, I once met a director of a science institute who told me this in all seriousness. A lot of nice concepts... But that was all.
A lot of the technology for a tricorder and other Star Trek tech is already here; it just needs to be integrated into a device. Take the iPad for example, as it is certainly an analog to many things in that universe. In some ways an iPad is already more sophisticated than anything imagined for a tricorder. It is certainly less bulky than the originals, and frankly it wouldn't take much to add some of the I/O to deliver the analytical functionality.

Think about the analytical capabilities on the space buggies on Mars. Much of that stuff is extremely compact.
Quote:
I'm no saying Intel will not deliver. It just seems that the technological envelope is being pushed harder so putting nice concepts into practice has a significant amount of risk. Just take a look at G5, today broadwell etc.
maybe they just need someone who tells them: http://m.youtube.com/watch?v=5OXn88m4NBg 1wink.gif

There is always risk. Also, there is no substitute for a working product in your hands to test. I noticed in another forum today that Intel has apparently made a stealth release of more Broadwell M chips. Some are speculating that they could go into a new MacBook Air-like machine. That is possible, but one has to be careful with Intel's marketing materials.

For example the processors are being described as 4.5 watt devices. However there are some benchmarks floating about where Intel runs the processor at 6 watts. Also it is unknown just how much the processor will throttle in a fanless design. The point is the proof is in the pudding and we need Apple to ship something to determine if performance will be acceptable.

Technology is certainly interesting but it is far more interesting in a shipping product.
post #65 of 111
Quote:
Originally Posted by wizard69 View Post


No I mean high speed RAM mounted right in the same module as the processor silicon and closely tied to the processor through a very high speed bus. The goal is far higher performance out of the RAM.


So on the cpu package but just not on the die itself?

post #66 of 111
Quote:
Originally Posted by hmm View Post


So on the cpu package but just not on the die itself?

 

Yeah. Standard fare. Intel already does this with "CrystalWell", which has been shipping for a year now. Anything Haswell with GT3e or Iris Pro graphics is a Haswell CPU/GPU with 128 MB of eDRAM on package. One is shipping in the 21.5" iMac.

 

Intel did it with the Pentium Pro 20 years ago. Back then, L2 cache was off-package; you could upgrade your L2 cache. With the Pentium Pro, Intel put a high-speed SRAM L2 cache on the package. Time marches on and lower levels of memory get moved onto the die, and onto the package. I'm not sure if it'll be common practice, as everything is hypersensitive to power consumption these days, but it's certainly an option for desktop and laptop systems.

 

Apple does it with their ARM SoCs, but it isn't "high speed". Just a space savings convenience and the memory bandwidth is just your normal multi-channel multi-data rate memory interface.

post #67 of 111
Quote:
Originally Posted by hmm View Post


So on the cpu package but just not on the die itself?

Yes effectively a multi chip module.
post #68 of 111
Quote:
Originally Posted by THT View Post
 

 

Yeah. Standard fare. Intel already does this with "CrystalWell", which has been shipping for a year now. Anything Haswell with GT3e or Iris Pro graphics is a Haswell CPU/GPU with 128 MB of eDRAM on package. One is shipping in the 21.5" iMac.

 

Intel did it with the Pentium Pro 20 years ago. Back then, L2 cache was off-package. You can upgrade your L2 cache then. With the Pentium Pro, Intel put a high speed SRAM L2 cache on the package. Time marches and lower levels of memory get moved to the die, and onto the package. I'm not sure if it'll be common practice as everything is hypersensitive to power consumption these days, but it certainly an option for desktop and laptop systems.

 

Apple does it with their ARM SoCs, but it isn't "high speed". Just a space savings convenience and the memory bandwidth is just your normal multi-channel multi-data rate memory interface.

I was unaware of those details with the Pentium Pro, mostly because I didn't follow computing very closely until the very late 90s, which seems odd in retrospect. I was aware of it in Apple's SoC's.

 

Quote:
Originally Posted by wizard69 View Post


Yes effectively a multi chip module.


Got it, but that may still be a while. Apple went that route to conserve height. With the mini they reuse design work wherever possible, so I suspect that led to the soldered RAM.

post #69 of 111
Quote:
Originally Posted by THT View Post

Yeah. Standard fare. Intel already does this with "CrystalWell", which has been shipping for a year now. Anything Haswell with GT3e or Iris Pro graphics is a Haswell CPU/GPU with 128 MB of eDRAM on package. One is shipping in the 21.5" iMac.
In this case Crystalwell is effectively a high-speed cache chip. This isn't the same thing as system memory, but the impact on performance is due to it being faster than DDR3 RAM.

What I find shocking (if that is the right word) is that 128 MB of cache RAM is more memory than some of my first few computers had altogether. The industry certainly has come a long way in one lifetime.
Quote:
Intel did it with the Pentium Pro 20 years ago. Back then, L2 cache was off-package; you could upgrade your L2 cache. With the Pentium Pro, Intel put a high-speed SRAM L2 cache on the package. Time marches on and lower levels of memory get moved onto the die, and onto the package. I'm not sure if it'll be common practice, as everything is hypersensitive to power consumption these days, but it's certainly an option for desktop and laptop systems.
The funny thing is you never heard complaints about cache memory moving on-die or into the package back then. If I remember correctly, cache RAM was optional on some motherboards. It is just funny the reaction some have to RAM being soldered onto a motherboard when the ability to install or upgrade a cache RAM array went away years ago.

The other reality here is that modern processors can have many buffers and caches on chip to help deal with the impact of slow RAM. That works well for the current crop of mainstream processors, but the slow path to RAM really impacts performance when you have a lot of cores running.
Quote:
Apple does it with their ARM SoCs, but it isn't "high speed". Just a space savings convenience and the memory bandwidth is just your normal multi-channel multi-data rate memory interface.

This is a slightly different type of implementation and is more along the lines of what I expect Intel is doing with the next Xeon Phi. That is, the RAM is RAM, not a cache implementation. The Apple processors implement 1 GB of RAM, while the apparent goal with Xeon Phi is to have 16 GB of RAM available in the processor module.

It should be noted that Apple's approach with the cell phone processors has other positives. For one, you reduce the pin counts that go to the outside world, which is part of the space-saving advantage. You also have tighter control over the electrical interface. There is also more than one approach to stacking dies in a package like this, each with its own performance characteristics and assembly issues.
post #70 of 111
Quote:
Originally Posted by wizard69 View Post

Sorry about some of the auto correct typos in my earlier post!
Well, yeah, sometimes it does sound that way. However, you have to realize a few things. One is that going to RAM slows a processor down significantly today. There is a lot of contention for access to RAM, and more cores and better GPUs integrated in the SoC just make that bandwidth issue worse. The short-term solution for desktop, and probably mobile, is DDR4, which is actually here now in some hardware. Longer term, going faster requires being very close to the processor.

This is where things like Memory Cube technology become very interesting, as the high density and low power combined with high performance make for a very compelling solution to improving the bandwidth problem to RAM. We probably won't see such tech in the desktop for a while, but I see it as inevitable and maybe closer than I think for something like a Mac Pro.

As DDR4-supporting hardware rolls out more generally next year, expect to see all sorts of benchmarking exploring the increased bandwidth. Of course, the faster RAM has to actually arrive, but the impact on APU-based machines should be very noticeable.
A lot of the technology for a tricorder and other Star Trek tech is already here; it just needs to be integrated into a device. Take the iPad for example, as it is certainly an analog to many things in that universe. In some ways an iPad is already more sophisticated than anything imagined for a tricorder. It is certainly less bulky than the originals, and frankly it wouldn't take much to add some of the I/O to deliver the analytical functionality.

Think about the analytical capabilities on the space buggies on Mars. Much of that stuff is extremely compact.
There is always risk. Also, there is no substitute for a working product in your hands to test. I noticed in another forum today that Intel has apparently made a stealth release of more Broadwell M chips. Some are speculating that they could go into a new MacBook Air-like machine. That is possible, but one has to be careful with Intel's marketing materials.

For example the processors are being described as 4.5 watt devices. However there are some benchmarks floating about where Intel runs the processor at 6 watts. Also it is unknown just how much the processor will throttle in a fanless design. The point is the proof is in the pudding and we need Apple to ship something to determine if performance will be acceptable.

Technology is certainly interesting but it is far more interesting in a shipping product.


Thanks for the elaborate answer. 

 

Regarding the tricorder: I did not want to come across as judging. For sure, you have to aim high and have a lot of stamina if you want to develop breakthrough products and/or technologies. Just take the iPhone, e.g. starting with the Apple Personal Assistant concepts, then the Newton, etc. Also, if I am not mistaken, the iPhone was a "side-product" of the iPad and then priorities were shifted. This kind of energy and focus over several years deserves some deep respect, especially if your company is listed on the stock market and ROI is therefore expected quickly. Also, it is quite easy from a consumer side to expect step changes in tech on a regular basis.

 

Regarding risk: from my professional experience I see a clear correlation between the level of innovation and the risk that something does not work as expected, be it from a design perspective, from a manufacturing perspective, and increasingly from a system interaction perspective. Risk mitigation is often neglected, especially when the time to market is decreasing and marketing is already promising the hell out of the new product. The proclaimed 4.5 W TDP (@ 800 MHz, wasn't it?) could be such an example. But like you said, it is all pudding and we have to wait and see when a final product ships. :-)

Yeah, well, you know, that's just, like, my opinion, man.
post #71 of 111
Quote:
Originally Posted by WonkoTheSane View Post


Thanks for the elaborate answer. 

Regarding the Tricorder: I did not want to come across as judging. For sure, you have to aim high and have a lot of stamina, if you want to develop some breakthrough products and/or technologies. Just take the iPhone, e.g. starting with the Apple Personal Assistant concepts, then the Newton etc. Also, if I am not mistaken, the iPhone was a "side-product" of the iPad and then priorities were shifted.
It is too bad there isn't a more public history of the development of iOS and iPads/iPhones. What is really interesting here is that the detour into the iPhone gave time to debut the iPad with much better hardware, which I see as a significant factor in the success of the iPad. If the iPad had shipped with the first generation of iPhone processors, it would have struggled to obtain consumer acceptance.
Quote:
This kind of energy and focus over several years deserves some deep respect, in addition, if your company is registered in the stock market, and therefore the ROI is expected quickly. Also it is quite easy from a consumer side to expect step changes in tech on a regular basis. 
As a consumer, sometimes Apple drags its feet and this causes me concern as a stockholder. The new iPad Air 2 is an example of Apple pulling out all the stops. The iPod Touch and other items are examples of Apple ignoring a product so much that it can't possibly succeed in the marketplace. Sometimes it would be better if Apple killed off a product rather than ignore it.
Quote:
Regarding risk: From my professional experience I see a clear correlation between the level of innovation and the risks that something does not work as expected, be it from a design perspective, from a manufacturing perspective, and increasingly from a system interaction perspective. Risk mitigation is often neglected, especially when the time to market is decreasing, and marketing is already promising the hell out of the new product. The proclaimed 4,5 W TDP (@ 800 MHz, wasn't it?) could be such an example. But like you said, it is all pudding and we have to wait and see when a final product ships. :-)

I'd love to see an MBA that is fanless and a step up performance-wise relative to what we have now, but I don't see this chip providing that sort of solution. Maybe I'm wrong here, but I've been waiting for the MBA to hit the performance level I want, and frankly Broadwell may do the trick at the current power levels.
post #72 of 111
Quote:
Originally Posted by wizard69 View Post

In this case Crystalwell is effectively a high-speed cache chip. This isn't the same thing as system memory, but the impact on performance is due to it being faster than DDR3 RAM.

 

True, but I view it as more of a semantics game. A computing system has pools of memory with the smallest, but fastest pool closest to the CPU logic. The pools get bigger, but slower the further away they get from the CPU.
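
To put a rough number on that "bigger but slower" idea, here is a small, hypothetical C sketch of my own (not from anyone in this thread; the sizes and iteration counts are arbitrary). It times dependent, randomized reads over growing working sets; once the set spills out of the L1/L2/L3 caches and into DRAM, the nanoseconds per read jump, which is exactly the pool structure being described.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Pointer-chase over a buffer of the given size and report ns per read. */
static double chase_ns(size_t bytes, size_t iters)
{
    size_t n = bytes / sizeof(size_t);
    size_t *next = malloc(n * sizeof(size_t));
    if (!next) return -1.0;

    /* Sattolo's shuffle builds a single random cycle, so every read depends
       on the previous one and the hardware prefetcher can't hide latency. */
    for (size_t i = 0; i < n; i++) next[i] = i;
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;          /* j < i keeps it one cycle */
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    size_t idx = 0;
    clock_t start = clock();
    for (size_t i = 0; i < iters; i++) idx = next[idx];
    clock_t stop = clock();

    volatile size_t sink = idx;                 /* keep the loop from being optimized away */
    (void)sink;
    free(next);
    return (double)(stop - start) / CLOCKS_PER_SEC * 1e9 / (double)iters;
}

int main(void)
{
    size_t sizes[] = { 16u << 10, 256u << 10, 8u << 20, 256u << 20 };  /* roughly L1, L2, L3, DRAM territory */
    for (int i = 0; i < 4; i++)
        printf("%10zu bytes: %6.1f ns/read\n", sizes[i], chase_ns(sizes[i], 20u * 1000 * 1000));
    return 0;
}

On a typical recent machine the first couple of sizes come back in single-digit nanoseconds and the last one an order of magnitude or more slower; that gap is what the on-die caches and on-package eDRAM are there to paper over.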

 

Storage, as in HDD and SSD storage, is memory. It is a lot further away from the CPU, but is ~10 times bigger than system memory (RAM). It affects the performance of a computing system, as anyone who's done an upgrade from an HDD to an SSD will attest. Heck, the "cloud" is another level of memory available to you, and obviously it's slow, but faster access to the cloud will come into play as more and more data is stored there.

 

With higher transistor densities, all these levels of memory get closer and faster access to the CPU/GPU. Like L2 cache, the memory controller used to be off-die, and it is now on-die. (Intel implemented some MCMs with Atom where the PCH/MCH was on-package, but not on-die I think, so they went through that stage too). I/O controllers used to be off-package, now they are on-package or on-die.

 

The modern Mac has basically moved the memory access and memory levels closer to the CPU/GPU than ever before for PCs. L3 cache on-die. Memory controller on-die. SSDs connected by way of PCIe I/O that's on-die.

 

Apple iPhone SoCs have RAM that is on-package. If Apple made CPU/GPUs for their Macs, I would bet, like you, that RAM for some of their systems would be on-package sooner rather than later. But they have to wait on Intel.

 

Lastly, storage will eventually be on-package as well: a triple stack of chips in the package, where the CPU, GPU, higher levels of cache, memory controller, and I/O (including the SSD controller) are all in one chip, RAM chips on top of that, then SSD flash chips on top of that, with the hottest chips closest to the surface.

 

If you look at the Watch S1 "computer-in-package", they are almost all the way there. It's a PCB with a multitude of chips, so it's more of a miniature logic board than an MCM, but the whole thing is encased in some kind of resin and, in effect, appears to be the lone "package" in the system. I can see them maintaining the interfaces and shape of it so that Watch upgrades will become easy. Four years down the road, the thing will have 2x to 4x the CPU, GPU, memory, storage, and radio performance, in the same basic "package".

 

If they start moving that type of design (encase the PCB in resin) to iPhones, hmm...

 

What I find shocking (if that is the right word) is that 128 MB of cache RAM is more memory than some of my first few computers had altogether. The industry certainly has come a long way in one lifetime.
 

Yup. I feel old, and I don't think I'm that old. Way back, "storage" used to be mobile and we carried it around with us.

post #73 of 111
Quote:
Originally Posted by THT View Post

True, but I view it as more of a semantics game. A computing system has pools of memory with the smallest, but fastest pool closest to the CPU logic. The pools get bigger, but slower the further away they get from the CPU.
What is neat is that 128 MB of RAM is still enough to keep the core of an app in that cache. In some cases the on-chip caches are big enough to keep the core of an app on chip. This has a very profound impact on performance.
Quote:

Storage, as in HDD and SSD storage, is memory. It is a lot further away from the CPU, but is ~10 times bigger than system memory (RAM). It affects the performance of a computing system, as anyone who's done an upgrade from an HDD to an SSD will attest. Heck, the "cloud" is another level of memory available to you, and obviously it's slow, but faster access to the cloud will come into play as more and more data is stored there.
Yeah it is storage but there is a big difference in the way it is addressed.
Quote:
With higher transistor densities, all these levels of memory get closer and faster access to the CPU/GPU. Like L2 cache, the memory controller used to be off-die, and it is now on-die. (Intel implemented some MCMs with Atom where the PCH/MCH was on-package, but not on-die I think, so they went through that stage too). I/O controllers used to be off-package, now they are on-package or on-die.
Yep! This is what I was getting at when using the word "closer", as closeness is very important to speeding up a computer. Speed and distance are wedded in computer design; increase one and you effectively lower the other. I believe it was Grace Hopper who came up with the term "light foot", the distance light travels in a nanosecond. This physical reality is why computers have gotten much faster as they have gotten much smaller. It is the fundamental point I'm trying to get across here.
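To put an actual number on that "light foot" (my own back-of-the-envelope arithmetic, not part of the original post):

\[
c \times 1\,\mathrm{ns} \approx 3.0\times10^{8}\,\mathrm{m/s} \times 10^{-9}\,\mathrm{s} \approx 0.30\,\mathrm{m} \approx 11.8\,\mathrm{inches}
\]

which is why the roughly foot-long pieces of wire Hopper handed out worked as props for a nanosecond.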
Quote:
The modern Mac has basically moved the memory access and memory levels closer to the CPU/GPU than ever before for PCs. L3 cache on-die. Memory controller on-die. SSDs connected by way of PCIe I/O that's on-die.
Yes, the constant evolution of computer hardware has always shrunk parts to increase speed. Even on chip there are high-speed caches and low-speed caches. This is due to the ALU core being so fast that accessing the slower and sometimes more distant parts of the chip leads to real slowdowns. It is actually more complex than that, especially in multi-core processors, but in the end today's RAM is extremely slow relative to what is happening in the processor proper.
Quote:
Apple iPhone SoCs have RAM that is on-package. If Apple made CPU/GPUs for their Macs, I would bet, like you, that RAM for some of their systems would be on-package sooner rather than later. But they have to wait on Intel.
Unfortunately I don't have a crystal ball. I do know what Intel is doing with Xeon Phi and such approaches eventually work their way down to the workstation level. Given that I can only say that the path here is pretty clear and even if you are not up to speed on tech you can look at the history of computers to see where we are still headed.
Quote:
Lastly, storage will eventually be on-package as well: a triple stack of chips in the package, where the CPU, GPU, higher levels of cache, memory controller, and I/O (including the SSD controller) are all in one chip, RAM chips on top of that, then SSD flash chips on top of that, with the hottest chips closest to the surface.

If you look at the Watch S1 "computer-in-package", they are almost all the way there. It's a PCB with a multitude of chips, so it's more of a miniature logic board than an MCM, but the whole thing is encased in some kind of resin and, in effect, appears to be the lone "package" in the system. I can see them maintaining the interfaces and shape of it so that Watch upgrades will become easy. Four years down the road, the thing will have 2x to 4x the CPU, GPU, memory, storage, and radio performance, in the same basic "package".
I was actually at one point wondering if Apple's 2015 debut was to wait for a 14 nm process to pack even more stuff into the watch at an even lower power point. At this point it doesn't look like that is the case. However, we still have a few more process shrinks ahead of us and other techniques to lower power. So we could be looking at iPhone-type power in a watch in a few short years.
Quote:
If they start moving that type of design (encase the PCB in resin) to iPhones, hmm...
That would fix bendgate too. An iPhone that is one solid blob of whatever would be durable, that is for sure. Most potting materials are epoxy-like, so they could even reinforce the mix with glass fiber. The problem is such hardware would be awfully expensive. At least the potted hardware I buy or see at work is.
Quote:
Yup. I feel old, and I don't think I'm that old. Way back, "storage" used to be mobile and we carried it around with us.

Yeah, I was a teenager when I first started reading Byte magazine. That was all I could afford to do for years. Some of my computers came with memory measured in kilobytes, not even megabytes.

In any event, I have a sense of history here, which is why I comment on the new Mac mini and try to balance all the negativity that is seen in the forums about this machine. People need to realize that we are in another epoch here, where current hardware will quickly become outdated as new hardware comes online supporting the latest standards. Here, DDR4 should be well established by the time anybody seriously thinks about replacing the current Mini with a new one. DDR4-based platforms should provide significant enough improvements over current machines to make upgrades of old hardware pretty silly. It can be likened to the move from the 386 to the 486; the step up was so great that old hardware quickly became doorstops.
post #74 of 111
I'm thinking that some of you may be interested in what Intel is up to with Xeon Phi. In a nutshell, you can get some of the Xeon Phi cards at extremely large discounts if you look around a bit. We are talking somewhere in the range of 200-500 dollars. This discounting apparently is going on until the end of the year.

One theory has it that they want to get hardware into the hands of developers. My theory is that they have a surplus to zero out before the debut of Knights Landing, or whatever the next-generation Phi is. Obviously this isn't Apple related unless Apple decides to offer the Mac Pro with a Xeon Phi installed. I just thought that some following this thread might be interested.
post #75 of 111
We may be disappointed with Intel's Haswell lineup for MacBooks and Minis, but Intel hasn't given up on researching new tech. Here is an article about research into new RAM tech: http://www.taipeitimes.com/News/biz/archives/2014/11/19/2003604752. That tech would be very interesting in a cell phone.

On another note, things aren't going too well for mobile at Intel. The division for mobile is apparently being merged into another group.
post #76 of 111

Intel's Mobile group is being merged with its PC group, and the combined group will be led by the Ultrabook chief.

 

This only screams to me that Intel is trying to hide the massive cash drain that their mobile division is. A $1 billion loss was the latest result for it.

post #77 of 111
Quote:
Originally Posted by Jexus View Post

Intel's Mobile group is being merged with its PC group, and the combined group will be led by the Ultrabook chief.
The way I see this, Intel has two problems. One is that they're a manufacturer of one-size-fits-all chipsets in an industry that absolutely needs customizable SoCs. I can see Apple eventually moving into a position where they have no choice but to go to ARM in their laptops if Intel isn't willing to customize solutions to Apple's needs. The second issue is that they are too hung up on the past with backwards compatibility. Intel should have come out with a 64-bit architecture that dropped all support for x86 legacy modes: a truly lean 64-bit x86 chip.
Quote:
This only screams to me that Intel is trying to hide the massive cash drain that their mobile division is. A $1 billion loss was the latest result for it.

Yep, sweeping a failure under the rug. Intel is hurting, as they have laid off people this year, and frankly it is likely to get worse for them. They really have to come to grips with industry needs and also have to get with the program as far as 14 nm products go.
post #78 of 111
Quote:
Originally Posted by wizard69 View Post


The way I see this, Intel has two problems. One is that they're a manufacture of one size fits all chip sets in an industry that absolutely needs customizable SoCs. I can see Apple eventually moving into a position where they have no choice but to go to ARM in its laptops if Intel isn't willing to customize solutions to Apples needs.

Yes, this is a huge problem. Intel won't customize chips for anyone but their largest clients, and even then, they seem to have pretty stingy limits.

 

As for ARM, I see this as another potential way for AMD to encroach on Intel in the Mac line, as AMD is pretty open to customization of IP and, most importantly, offers access to some beefy solutions (both x86 and ARM). I don't know how well the current PowerVR graphics in Apple's mobile chips perform in OpenCL benchmarks, but I do know that AMD's GCN arch (which will be integrated into K12) is a compute monster. Integrating those with Apple's current ARM design IMO may yield interesting benefits on the Mac side. If there is one thing I imagine Apple would be happy to continue to push, OpenCL is one of my votes, at least mobile-wise as per the above. Though I could easily see Apple just sticking with PowerVR on mobile anyway, I'm curious as to whether they'll reconsider AMD post-Zen for the AIO Macs.

 

Of course, the obvious problem would still be supply. I do remember reading that Apple was strongly considering AMD's Llano chips a few years back but ultimately backed out because AMD couldn't supply the chips to meet Apple's demands quantity-wise. They would surely have improved since then, yield-wise?

 

Quote:
Yep, sweeping a failure under the rug. Intel is hurting, as they have laid off people this year, and frankly it is likely to get worse for them. They really have to come to grips with industry needs and also have to get with the program as far as 14 nm products go.

I used to believe that Intel Mobile had a shot, but I'm only ever seeing the same pattern. Outside of Windows tablets, Intel is basically non-existent, and the performance of their chips on other platforms (à la Android) simply isn't enough to justify their non-subsidy premiums.

post #79 of 111
Quote:
Originally Posted by wizard69 View Post


Anybody whining about replaceable RAM is just out of Touch. Seriously what will these people do when RAM gets integrated right into the processor module?

 

If this innovation is truly performance-based and not simply meant to turn computers into disposable commodities to be replaced every three years, shouldn't it begin with the Pro and not the Mini? Why can I replace the RAM in a Pro and not a Mini?

 

Did making the Mini's RAM non-upgradable this year make it significantly faster?

The evil that we fight is but the shadow of the evil that we do.
post #80 of 111
Quote:
Originally Posted by Jexus View Post

Yes, this is a huge problem. Intel won't customize chips for anyone but their largest clients, and even then, they seem to have pretty stingy limits.
The industry will leave them behind if they don't get their head out of the sand.
Quote:
As for ARM, I see this as another potential way for AMD to encroach in on Intel in the mac line. As AMD is pretty open to customization of IP, and most importantly, access to some beefy solutions(Both X86 and ARM).
I was kinda hoping for an AMD chip in the Mini as that would be the ideal product for the chip.
Quote:
I don't know how well the current PowerVR graphics in Apple's mobile chips perform in openCL benchmarks, but I do know that AMD's GCN arch(which will be integrated into K12) is a compute monster. Integrating those with Apple's current ARM design IMO may yield potential for interesting benefits on the mac side.
It is AMD's superior integrated GPUs that have me wanting them in a Mini. Admittedly Intel is catching up, so it is more of a mixed bag now.
Quote:
If there is one thing I imagine apple would be happy to continue to push, OpenCL is one of my votes.  At least mobile wise as per the above anyway. Though I could easily see Apple just sticking with PowerVR mobile anyway, though I'm curious as to whether they'll reconsider AMD post Zen for the AIO Macs.
OpenCL is nice, but I find that a lot of people don't understand it and have misplaced expectations. Applied right, it is a massive advantage though.
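To make that concrete, here is a minimal, generic OpenCL C kernel sketch of my own (kernel source only, host-side setup omitted; the name vadd and the float arrays are just placeholders, nothing Apple- or AMD-specific). The point is the programming model OpenCL rewards: one tiny work-item per data element, so the runtime can spread the work across however many GPU or CPU cores the device has.

/* Minimal illustrative OpenCL C kernel: element-wise vector add. */
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out)
{
    size_t i = get_global_id(0);  /* each work-item handles one element */
    out[i] = a[i] + b[i];
}

Workloads that decompose this way (image filters, video effects, physics) fly on OpenCL; branchy, serial code does not, which is usually where the misplaced expectations come from.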
Quote:
Of course the obvious problem would still be supply. I do remember reading that Apple was strongly considering AMD's Llano chips a few years back but ultimately backed out because AMD couldn't supply the chips to meet Apple's demands quantitive wise. They would surely have improved since then yield wise?
I'm not sure how much truth there was in that rumor. However, it does look like Global has overcome most of their manufacturing problems.
Quote:
I used to believe that Intel Mobile had a shot, but I'm only ever seeing the same pattern. Outside of Windows Tablets, Intel is basically non existent and the performance of their chips on other platforms(Ala android) simply isn't enough to justify their non subsidy premiums.
They most certainly need to get Broadwell out in a way that is suitable for the mass market. I really don't think they have a grip on the changing industry. I just hope they don't turn into the Kodak of the electronics world.