Wouldn't it be nice if Apple made their own servers, running on their own OS...
I guess this is telling us that Apple CAN'T make enterprise-level equipment. They are so concerned about making cradle-to-grave hardware for their customers but are unable to make enterprise servers by the thousands to meet all their data center needs. It doesn't have to run OS X Server, but it can be custom-tailored to be power-efficient and robust enough to meet all the demands of iCloud, iTunes, and the App Store. You can't tell me that the company whose servers created the fastest supercomputer cluster in 2008 at Virginia Tech, using only G5 Xserves, isn't able to build an enterprise-level server with all their added in-house experience eight years later.
Apple can make just about anything, but servers, for the most part, aren't seen by the public and only need to work. It's very easy to get a company to throw some generic CPUs together, add a power supply and various I/O devices, and call it a server. These don't require graphics cards in the consumer sense but can use GPUs where additional computing speed is required. For the majority of cloud services, e.g., mail, messaging, etc., the most important aspect of a server is its ability to handle many simultaneous requests, so multiple very fast Ethernet interfaces, fast disk access, and plenty of disk storage are all that's required. You only need a bare-bones server OS to run them. Why would Apple want to waste their time building something like this when servers are a commodity? Well, lots of Mac users want them, for one, but that isn't the direction Apple is going in at the moment.
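That "handle many simultaneous requests" property is the whole game for this class of server. As a purely hypothetical sketch (nothing to do with what Apple actually runs), Python's standard-library `socketserver` shows the idea in a few lines: a threading server gives each connection its own thread, so one slow client doesn't block the rest.

```python
import socket
import socketserver
import threading

class EchoHandler(socketserver.StreamRequestHandler):
    """Reads one line from the client and echoes it back."""
    def handle(self):
        line = self.rfile.readline()
        self.wfile.write(line)

# ThreadingTCPServer spawns a thread per connection -- the
# "many simultaneous requests" property described above.
server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Quick round trip against the server we just started:
host, port = server.server_address
with socket.create_connection((host, port)) as conn:
    conn.sendall(b"hello\n")
    reply = conn.makefile().readline()
server.shutdown()
server.server_close()
```

A real mail or messaging front end would use an event loop rather than a thread per connection, but the point stands: the hard part is concurrency and I/O, not graphics.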
As for the VT supercomputer cluster, that was a different time. The G5 was a very nice general-purpose computer that hit the market at the right time. Cray and IBM were behind the times, and (personally) I believe they were embarrassed at the numbers VT was able to get using off-the-shelf hardware. If you look at what happened afterwards, Apple stands no chance of creating off-the-shelf hardware to compete with the current crop of supercomputers. The current top500.org leader has 10.6M cores (at only 1.45GHz each). We're complaining about the cost of a 12-core CPU for the Mac Pro. The only way Apple could compete would be to waste a bunch of money and time stringing together 2M unreleased 6-core A10 CPUs and see what happens. Even if they were able to run a LINPACK benchmark that came close, they would never sell a complete system. This isn't Apple's business.
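The "2M chips" figure checks out as a back-of-envelope estimate. The 10.6M-core machine is Sunway TaihuLight from the June 2016 top500.org list; the "6-core A10" is the commenter's hypothetical (the shipping A10 Fusion was a 4-core part):

```python
# Sunway TaihuLight: 40,960 SW26010 chips x 260 cores at 1.45 GHz.
taihulight_cores = 10_649_600

# Hypothetical 6-core A10, per the comment -- not a real shipping chip.
hypothetical_a10_cores = 6

chips_needed = taihulight_cores / hypothetical_a10_cores
print(f"{chips_needed:,.0f} six-core chips needed")  # ~1.77 million
```

So roughly 1.8 million such chips just to match the core count, before even asking about interconnect, memory bandwidth, or sustained LINPACK efficiency.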
If Apple is greatly expanding their server farms, they could purchase parts directly from the manufacturers and contract out assembly, à la Amazon and Google (and the Apple iPhone, etc.) ... in other words, Apple already does this well -- probably better than anyone!
Then, if the required volume of servers is high enough, it might make sense to use one or more A10 chips per server instead of Intel chips. Potentially, there could be considerable savings in the initial cost of the servers -- as well as in power, cooling, etc.
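The claimed savings are a standard total-cost-of-ownership argument: cheaper silicon up front, plus lower power draw compounded by cooling overhead over the server's life. Here's a sketch of that arithmetic; every number below is an illustrative placeholder, not real Apple, Intel, or ARM pricing:

```python
def tco(capex, watts, years=5, pue=1.2, dollars_per_kwh=0.08):
    """Capital cost plus electricity over the server's service life.

    pue: power usage effectiveness -- total facility power divided by
    IT power, so cooling/distribution overhead is folded in.
    """
    hours = years * 365 * 24
    energy_kwh = (watts / 1000) * pue * hours
    return capex + energy_kwh * dollars_per_kwh

intel_node = tco(capex=4000, watts=350)  # placeholder x86 box
arm_node   = tco(capex=2500, watts=120)  # placeholder A10-class box
savings = intel_node - arm_node          # ~ $2,467/node on these inputs
```

The interesting part is that the power term scales with fleet size and electricity price, which is why savings that look marginal per node can dominate at data-center scale.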
Hell, they all will be running Swift code, anyway ...
Wouldn't it be nice if Apple made their own servers, running on their own OS...
Apple DID sell their own servers, running Mac OS X Server, and they quickly killed them off! There is simply no reason for Apple to sell their own servers, as it's a low-margin commodity business. Apple buys servers from the same contract manufacturers that build their products, so they have insight into the associated cost structures. Apple is basically following the same model as Google, Facebook, Amazon, and Microsoft for its iCloud services...
kennmsr said: You can't tell me that the company whose servers created the fastest supercomputer cluster in 2008 at Virginia Tech, using only G5 Xserves, isn't able to build an enterprise-level server with all their added in-house experience eight years later.
System X (the Virginia Tech G5 cluster) was never the fastest cluster in the world. Not even close. At the time it was created, the Earth Simulator held the top position, and System X came in at 29% of its Rmax and 43% the Rpeak. It was number three when it was built (https://www.top500.org/lists/2003/11/).
A year later, they managed to improve the Rmax by 20% and the Rpeak by 15%. That got them 7th place in that year's list (https://www.top500.org/lists/2004/11/). The pole position had been taken by IBM's BlueGene/L beta system at the DoE, though. 5.8x the Rmax and 4.5x the Rpeak.
The impressive bit about System X wasn't the performance. It was the cost. The Earth Simulator cost ¥60 billion to build in 1999, which comes out to about $530 million. System X cost $5.2 million to build in 2003. Roughly 1% the cost for roughly 30% the performance. The Earth Simulator also consumed many times more power (3.2 megawatts, but I wasn't able to find a precise figure for System X).
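Those ratios are easy to sanity-check against the published top500.org numbers (Rmax/Rpeak in TFlop/s, taken from the 2003 and 2004 lists linked above; costs as discussed in this thread):

```python
earth_sim  = {"rmax": 35.86, "rpeak": 40.96, "cost_musd": 530}
sysx_2003  = {"rmax": 10.28, "rpeak": 17.60, "cost_musd": 5.2}
sysx_2004  = {"rmax": 12.25, "rpeak": 20.24}
bluegene_l = {"rmax": 70.72, "rpeak": 91.75}  # DoE beta system, Nov 2004

print(f"System X vs Earth Simulator: "
      f"{sysx_2003['rmax'] / earth_sim['rmax']:.0%} Rmax, "
      f"{sysx_2003['rpeak'] / earth_sim['rpeak']:.0%} Rpeak")
print(f"Cost ratio: {sysx_2003['cost_musd'] / earth_sim['cost_musd']:.1%}")
print(f"2004 upgrade: +{sysx_2004['rmax'] / sysx_2003['rmax'] - 1:.0%} Rmax")
print(f"BlueGene/L lead: {bluegene_l['rmax'] / sysx_2004['rmax']:.1f}x Rmax")
```

Which is exactly the story above: about 1% of the money for about 30% of the Rmax, and then BlueGene/L pulling away by nearly 6x a year later.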
If Apple made their own servers, it would be an incentive to support their own OS development (on the efficiency side); it would let them create native Mac OS software to manage that data processing, controlling both ends of the data-management software ecosystem (is Linux really the best place to invest development costs? They could sell servers and server software to other businesses to get deeper into enterprise); and it would eliminate paying someone ELSE for the computers and software.
Isn't that all kind of like Tim Cook's supply chain control thing?
I guess this is telling us that Apple CAN'T make enterprise-level equipment. They are so concerned about making cradle-to-grave hardware for their customers but are unable to make enterprise servers by the thousands to meet all their data center needs.
Apple can make just about anything, but servers, for the most part, aren't seen by the public and only need to work. It's very easy to get a company to throw some generic CPUs together, add a power supply and various I/O devices, and call it a server. These don't require graphics cards in the consumer sense but can use GPUs where additional computing speed is required... Why would Apple want to waste their time building something like this when servers are a commodity? Well, lots of Mac users want them, for one, but that isn't the direction Apple is going in at the moment.
If Apple is greatly expanding their server farms, they could purchase parts directly from the manufacturers and contract out assembly, à la Amazon and Google (and the Apple iPhone, etc.) ... in other words, Apple already does this well -- probably better than anyone!
Then, if the required volume of servers is high enough, it might make sense to use one or more A10 chips per server instead of Intel chips. Potentially, there could be considerable savings in the initial cost of the servers -- as well as in power, cooling, etc.
Google has made one recent and very significant change: they're designing and building their own chips, beginning with the TPU (Tensor Processing Unit) for their neural-network servers. Microsoft may end up doing something similar.
I don't know. iCloud works pretty damn well for me right now. What's your problem, other than parroting troll talking points?
It works for me too ... mostly. And then it doesn't. When that happens it's a real pain -- and almost impossible to get any help (AppleCare is the only possibility, and it's not set up for this).
gatorguy said: Google has made one recent and very significant change: they're designing/building their own server chips. As one writer said, time for Intel to freak out.
Data center design is constantly changing. A decade ago we had 1U servers, then half-depth 1U boxes that doubled rack capacity, then 240VAC blades, then multiple VMs, and now ARM servers. Same with infrastructure: raised floors, then containers, then water-cooled cabinets. Currently servers typically last about five years. Whatever they do today will be obsolete in no time at all. I suspect we will have some new OS running on some new Chinese-designed chips within the next few years. The internet has only been in widespread use since about 1994. I remember when the LA Times ran an article about the newest advancement in technology, called the World Wide Web. We're just getting started. Intel has seen the writing on the wall for several years now. They either adapt or die. Survival of the fittest, baby!
So is the opinion that cloud will be a cheap commodity?
Funny how Wall Street is pinning all its hopes on cloud services for Amazon and Microsoft. Both are getting ridiculously high valuations because of cloud revenue growth. Crazy. In a few years cloud providers will be just like gas stations. Commodity.
I really can't believe how insanely stupid Wall Street is.
One should not look at other tech companies through Apple glasses.
Compared to Apple, Amazon has a completely different skill set, structure, business model, and culture. So it makes perfect sense that Amazon can build a profitable cloud business but not a profitable smartphone business, while Apple does the opposite. There is nothing wrong with making profits in a commodity business.
If Apple made their own servers, it would be an incentive to support their own OS development (on the efficiency side); it would let them create native Mac OS software to manage that data processing, controlling both ends of the data-management software ecosystem (is Linux really the best place to invest development costs? They could sell servers and server software to other businesses to get deeper into enterprise); and it would eliminate paying someone ELSE for the computers and software.
Isn't that all kind of like Tim Cook's supply chain control thing?
Did you buy someone else's word processor, or did you write your own to avoid paying for one?
So is the opinion that cloud will be a cheap commodity?
Funny how Wall Street is pinning all its hopes on cloud services for Amazon and Microsoft. Both are getting ridiculously high valuations because of cloud revenue growth. Crazy. In a few years cloud providers will be just like gas stations. Commodity.
I really can't believe how insanely stupid Wall Street is.
One should not look at other tech companies through Apple glasses.
Compared to Apple, Amazon has a completely different skill set, structure, business model, and culture. So it makes perfect sense that Amazon can build a profitable cloud business but not a profitable smartphone business, while Apple does the opposite. There is nothing wrong with making profits in a commodity business.
While the cloud itself can be a commodity, the offerings from a specific cloud service need not be. For example: Watson, Dropbox, iTunes, the Amazon Store ...
I suspect that coming Apple hardware/software will have exclusive capabilities that will exploit exclusive Apple cloud services.
A similar thing is already happening with IBM's MobileFirst offerings -- exclusive software running on exclusive hardware requiring/exploiting exclusive cloud services.
Exclusive boom!