Sadly, a lot of problems with this article! Many have already been pointed out, but I will add one more. The Melbourne Design Center is not related to the fingerprint sensor company. It is my understanding, anyway, that Melbourne is focused on advancing Apple's GPU chops. As such, they have hired away many AMD engineers.
The general idea expressed in the article is correct: Apple is rapidly advancing its design capability. Frankly, this is to an extent that few grasp. As I've mentioned before in other threads, Apple really has no choice, because silicon is where innovation takes place these days. Silicon is the printed circuit board of the '80s. Apple would need to hire these guys even if IBM were doing a bang-up job.
In this sense, hiring these sorts of people is no different than hiring Woz and the engineers of those days who pieced together computers from stock parts. Today's stock parts are IP blocks and the blocks Apple's custom engineers design. Apple isn't any closer to being vertically integrated than they were when the first Mac came out. They may get there, but they have a huge amount of work to do yet.
In the end, a good point in an otherwise terrible article.
Dilger is and always will be a hack. He cobbles together the flimsiest of conclusions based on whatever biased sources or predetermined headline he has in mind.
I consider him no better than the Bill O'Reilly of Apple reporting.
As far as Wall Street is concerned, Apple doesn't even have a processor design division. Apple certainly gets no premium for designing its own processors, and they're certainly not considered innovative by the news media. Even when the A7 was announced as the world's first 64-bit mobile processor, it was greeted with jeers as being pretty much useless. Apple's chip division is invisible to everyone except Apple. Wall Street definitely considers Qualcomm's Snapdragon and Nvidia's Tegra processors far more impressive than Apple's A-series processors, maybe because Apple doesn't put on the impressive demos that those other companies show at their processor events. The news media is more interested in benchmarks spouting insane numbers about which company makes the most powerful processors. Apple doesn't give out much benchmark information about its A-series processors, so they're overlooked by the mobile industry. Since Apple doesn't have to sell its processors to anyone, I guess they can take a low-key approach and say as little as possible about them. Apple has no octa-core processors, nor does it have processors running at nearly 3 GHz. The news media likes to drool over those big numbers because they sound very impressive.
If you compare DED to Bill O'Reilly, I don't think you're getting any benefit from what the word "hack" is supposed to convey.
There are real hacks, like Blodget, Lyons, even Daisey, that you could better spend your outrage on. They're on the wrong side. DED is on the right side, and he turns his readers on to some, many, good views from that perspective. He doesn't have to get everything right, just keep the Irish fighting spirit going. Why? Because Apple is still the underdog in the bizarre world of mainstream tech and business journalism.
I consider Philip Elmer-DeWitt a reputable journalist on Apple matters. Also Jason Snell.
I think it's really interesting that Apple is becoming a 21st century version of what IBM once was -- the preeminent vertically integrated computer company. In 1970 the computers were in big rooms, today they slip into your pocket. But both then and now, the most successful computer company designs/controls the silicon, OS, compilers, programming language, and more.
In the 1990s it was taken as a given by many industry observers that vertical integration was a failed business model. But I wonder if the failure was not in the model but in the implementation of that model by IBM and Apple in the '80s and '90s, while Microsoft and Intel simply did a much better job of implementing their approach.
Conversely, I wonder if we should be careful to avoid making the opposite mistake today. That is, maybe the reason Microsoft has done poorly in recent years is not so much that their business model is inferior, but that their implementation of the model has been inferior (Google seems to be doing a better job with a revised version of that model).
Perhaps the market can support both models simultaneously... I hope so. I think we are all better off by having companies successfully implementing both models. I personally like Apple's approach and products better, but I'm glad the competition is out there.
IMO, the problem with vertical integration is the "what," not the "why."
A company like Apple buying companies to support its primary businesses (e.g. chip designers, software developers, etc.) makes sense. A company like Comcast, which owns both media and data networks, does not, because that puts one of those business arms in conflict with the other. There are customers (e.g. businesses) that need the data and do not wish to subsidize the media arm. Comcast can't make a customer choose between it and "the other guy" just because it owns some media networks, when the truth is that owning those media networks is what makes them expensive. Apple, on the other hand, isn't selling anything but hardware and "cloud services," so buying all these companies only puts the other customers of those companies on notice that they might lose them as a supplier. The average customer of these companies is not a general consumer.
Which comes back to the IBM comparison. IBM (and AT&T) at various points in their history were instrumental in getting technology to where it is, which is what Apple did with the iPod, iPhone, and iPad. AT&T is responsible for Unix and the C language getting out, and IBM is responsible for the "PC" getting out, though not completely without a fight. These were companies that had technology and didn't know what to do with it. Apple's version of this was Google looting its iPhone IP and it ending up in Samsung's hands.
I think the Android platform is fundamentally flawed and literally runs in the opposite direction from Metal. I fully expect the Android platform defenders to cry foul about it, as they are going to be stuck with OpenGL ES, much in the same way earlier feature phones were stuck with Java ME. Much of the software that is portable between iOS and Android is written using Unity, and it's not the most efficient thing to use, as it suffers from exactly the problems Apple pointed out: too much abstraction and inefficiency.
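To put the abstraction point in concrete terms, here is a minimal sketch in Swift against the standard Metal framework (nothing here comes from the article, and pipeline setup, shaders, and actual draw calls are omitted): Metal hands the app an explicit command queue and command buffers, so per-frame work is recorded directly instead of being funneled through the thick, state-tracking driver layer that sits behind classic OpenGL ES calls.

```swift
import Metal

// Minimal sketch only: a real renderer would also build a pipeline state
// object and a render command encoder and record draw calls here.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let commandBuffer = queue.makeCommandBuffer() else {
    fatalError("Metal is not available on this device")
}

// The application, not the driver, owns this command stream and decides
// when it is submitted to the GPU.
commandBuffer.addCompletedHandler { _ in
    print("GPU finished executing this command buffer")
}
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```

The design point is that expensive validation happens once, when pipeline state objects are created, rather than being repeated inside every draw call, which is roughly the kind of overhead the "too much abstraction" complaint refers to.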
Even the late, great Steve Jobs said not to speculate on what Steve Jobs would do....
For once, Crowley is right. Because we know that Steve was against stealing employees from other companies, if only because he didn’t want HIS employees stolen.
But he wouldn’t at all have been against hiring competitors’ people if it meant keeping his own.
Dilger is and always will be a hack. He cobbles together the flimsiest of conclusions based on whatever biased sources or predetermined headline he has in mind.
I consider him no better than the Bill O'Reilly of Apple reporting.
Your repugnant personal attacks are out of line. Stop posting garbage or I'll request that your account be terminated for violating TOS and basic decency.
Comment on topic on a professional level or leave.
That's probably why he gets such recognition that Phil Schiller would put AI and 'Daniel Eran Dilger' on one of his presentation slides! How'd you swing that one, Daniel?
I took the photo at the iPhone 5s event and added site & photographer credits as a watermark because everything we post is immediately stolen by everyone else. But you knew that.
DED is the cheerleader that Apple neither needs nor wants.
And in other news, Google is using IBM's new POWER8 chip to build their new servers (red motherboard). So much for "Amazon and Google now represent 20 percent of the server market, and they want cheap servers rather than IBM's service-oriented premium servers targeting traditional businesses."
Interesting article. The move to ARM is continuing apace. It seems likely to me that ARM will find a new home in the Mac within the next three years. The benefits to developers will be great: they will be able to sell a single app that works on all three of your devices, thereby increasing the synergy between iOS and Mac OS X. The Mac's visibility will be significantly increased, and Mac developers will be able to reduce the cost of their apps as a result.