Comments

The Mac accounts for an increasingly smaller percentage of Apple's revenue each year, so why waste money buying Intel or AMD? Apple probably wouldn't get the chips any sooner anyway.
Actually, twice the width AND twice the height mean that the display will be exactly four times the number of pixels.
STOP IT.
The article is not wrong. You are not wrong. PLEASE understand this. It certainly wasn't worth breaking your eight-year vow of silence to erroneously correct.
Also, who the frick actually thought we'd be getting retina Macs this early? No way.
It certainly wasn't worth breaking your eight-year vow of silence to erroneously correct.
2004 and one comment? Is he posting from a monastery?
ISTM that there are many, many accounts on AI that are held in reserve by some people. Are they the same person?
Does this delay have any bearing on the pending iMac release?
Since we never had a launch date for Ivy Bridge desktop chips and never expected it earlier than summer, I'd imagine those are pushed back, too.
Afraid of that -- thanks. (Guess it gives me more time to save up!)
Are you saying that Apple is one of the dogs, and is getting wagged by Intel?
Is that the best possible position for Apple to be in?
That's like saying wouldn't GM be in a better position if they made their tires and had their own steel mills and refined their own gas. Talk about boat anchors.
There are exactly two possible sources of chips for Apple in 2012: AMD and Intel. What is your solution? Apple making their own chips is the polar opposite of the approach behind their ascent to uber-profitability. And you called Macs boat anchors?
I wonder what would happen if they bought AMD?....
They'd just be pissing away money on a dying company.
AMD doesn't fabricate chips any more; GlobalFoundries does that for them. AMD designs have generally been a poor match for the computers Apple makes. Llano may be an exception; we don't really know. But the fabrication deficiencies that have plagued AMD, the poor yields and the lag behind Intel in shrinking the process, would still continue even if Apple bought them.
I wonder what would happen if they bought AMD? Apple's management style and corporate philosophy should be able to turn them into a serious threat to Intel. All the pieces are there; they just need to be brought into the proper focus.
Apple focuses on making excellent consumer electronics, not on supplying processors and chipsets to competitors.
With a fraction of its cash, Apple could have purchased AMD yesterday if doing so led to greater Apple products.
AMD allegedly had an opportunity to impress Apple last year and blew it: they failed to deliver a better product.
Some posters seem to have already forgotten why Apple switched to x86 in the first place: neither IBM nor Motorola could deliver enough suitable, competitive chips for Apple's computers.
None of the people moaning about Intel can point to any credible, superior alternatives.
With apps like iTeleport, I am finding myself using my Mac less and less every day ;-)
That's like saying wouldn't GM be in a better position if they made their tires and had their own steel mills and refined their own gas. Talk about boat anchors.
If GM products were held up by tires, steel, or gas, then it might be an apt comparison.
As of now, that would be senseless.
A retina display that has double the dpi has FOUR TIMES THE RESOLUTION, not double the resolution, because pixel count goes by the square. For example, if you double the dpi, each 1 by 1 pixel becomes a 2 by 2 block; a 2 by 2 block contains 4 pixels while a 1 by 1 contains 1. As you see, the resolution, i.e. the pixel count, is quadrupled, not doubled.
As you can see, this can be solved without the use of advanced math; yet AI gets this CONSISTENTLY WRONG. Would the editors please take note?
Will you guys EVER learn math?
A retina display that has double the dpi has FOUR TIMES THE RESOLUTION, not double the resolution
Twice the resolution = four times the pixels.
Super Hi-Vision, for example, has four times the resolution and 16 times as many pixels as 1080p. Get it?
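The arithmetic both posters are circling is easy to sanity-check in a few lines of Python; the panel dimensions below are purely illustrative, not any particular Mac's actual spec:

```python
# Doubling the linear resolution (dpi) at a fixed physical size doubles
# BOTH the horizontal and vertical pixel dimensions, so the total pixel
# count goes up by a factor of four, not two.

def pixel_count(width_px: int, height_px: int) -> int:
    """Total number of pixels in a width x height panel."""
    return width_px * height_px

base = pixel_count(1440, 900)               # example base panel
doubled_dpi = pixel_count(2 * 1440, 2 * 900)  # "retina": 2x in each dimension

print(doubled_dpi // base)  # -> 4
```

Whether you call that "double the resolution" (linear) or "quadruple the resolution" (pixel count) is the terminology fight above; the factor-of-four pixel count itself is not in dispute.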
This has nothing to do with manufacturing process issues. Intel could go straight to 11 nm and beyond (nanotechnology) if they wanted right now. It is just milking the process as much as possible.
Sometimes you read a thread and you see something so utterly stupid that you just want to become a hermit away from all the rest of humanity.
This is nearly one of those times.
Only Intel knows the reason for Ivy Bridge delays.
Problems with the 22nm process:
"Maloney attributes the "adjustment" to "the new manufacturing process needed to make the smaller chips."
Any such takeover would almost certainly be blocked by the US and EU regulators as being anti-competitive.
I reached out to Intel this morning and received the same feedback. Jon Carvill of Intel Media Relations reiterated via email that the reports of an eight-week delay were inaccurate and that the schedule had been impacted by only a few weeks. He went on to say that they were on track with their launch guidance to be in market for spring, and that they expected to ship 50% more units of Ivy Bridge in the first two quarters of production as compared to Sandy Bridge parts.
http://www.forbes.com/sites/patrickm...ase-calm-down/
And a few other blogs are updating their reports (well, re-reports) consistent with this timeline.
This has nothing to do with manufacturing process issues. Intel could go straight to 11 nm and beyond (nanotechnology) if they wanted right now. It is just milking the process as much as possible.
Sometimes you read a thread and you see something so utterly stupid that you just want to become a hermit away from all the rest of humanity.
This is nearly one of those times.
I second that. The technology to profitably mass-produce chips at smaller feature sizes than 22 nm simply does not exist yet. It's all still experimental. That Intel is even managing to produce at 22 nm can almost be called a miracle (and considering these delays, apparently they aren't actually managing it all that well, at least not yet).
Source: my employer, who makes about 80% of the optical lithography gear used by foundries and chip manufacturers around the world, including Intel.
I agree. The continued ability to produce finer and finer lines absolutely astounds me.
I remember when all the experts were saying that 100 nm might be the limit - and the same statement has been made half a dozen times since then.