Amazon introduces native Mac instances for AWS, powered by Intel Mac mini

Posted in General Discussion · edited December 2020
At AWS re:Invent, Amazon introduced new Mac instances for its Amazon Elastic Compute Cloud (EC2), enabling developers to natively run macOS in Amazon Web Services for the first time.

[Image: Mac mini]
Announced late Monday, the new capability harnesses Intel-powered Mac mini hardware to run on-demand macOS workloads in the AWS cloud.

Developers building apps for iPhone, iPad, Mac, Apple Watch, Apple TV, and Safari can use the service to provision and access macOS environments, dynamically scale capacity with AWS, and take advantage of pay-as-you-go pricing, Amazon says. Basically, app makers can create and test in the AWS cloud. In addition, customers can consolidate development of cross-platform Apple, Windows, and Android apps using Amazon's cloud.

"Apple's thriving community of more than 28 million developers continues to create groundbreaking app experiences that delight customers around the world," said Bob Borchers, Apple's VP of Worldwide Product Marketing. "With the launch of EC2 Mac instances, we're thrilled to make development for Apple's platforms accessible in new ways, and combine the performance and reliability of our world-class hardware with the scalability of AWS."

The instances use Mac mini hardware with a 3.2GHz Intel Core i7 CPU and 32GB of RAM. Each Mac mini connects to Amazon's Nitro System over Thunderbolt 3, which provides up to 10 Gbps of VPC network bandwidth and 8 Gbps of EBS storage bandwidth.

Amazon is entering an arena dominated by small companies like MacStadium and Mac Mini Vault. If they opt for Amazon's solution, developers will be granted access to more than 200 AWS services, including Amazon Virtual Private Cloud (VPC), Amazon Elastic Block Store (EBS), Elastic Load Balancing (ELB), and Amazon Machine Images (AMIs), not to mention the sheer scalability offered by a large cloud provider.

"The speed that things happen at [other Mac mini cloud providers] and the granularity that you can use those services at is not as fine as you get with a large cloud provider like AWS," VP of EC2 David Brown told TechCrunch. "So if you want to launch a machine, it takes a few days to provision and somebody puts a machine in a rack for you and gives you an IP address to get to it and you manage the OS. And normally, you're paying for at least a month -- or a longer period of time to get a discount. What we've done is you can literally launch these machines in minutes and have a working machine available to you. If you decide you want 100 of them, 500 of them, you just ask us for that and we'll make them available."
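The "launch these machines in minutes" workflow Brown describes follows EC2's Dedicated Host model: you allocate a host, then launch an instance onto it. The sketch below is purely illustrative Python that builds the parameter sets a script might pass to boto3's `allocate_hosts` and `run_instances` calls; the function names, Availability Zone strings, and IDs are hypothetical, and the AMI ID would come from your own account.

```python
# Illustrative sketch (not Amazon's documented workflow): build the request
# parameters for provisioning an EC2 Mac instance. Mac instances run on
# Dedicated Hosts, so a host is allocated first, then an instance is
# launched onto it with host tenancy.

def host_allocation_request(availability_zone):
    """Parameters one might pass to boto3's ec2_client.allocate_hosts(...)."""
    return {
        "InstanceType": "mac1.metal",   # the Intel Mac mini instance type
        "AvailabilityZone": availability_zone,
        "Quantity": 1,
    }

def run_instances_request(host_id, ami_id):
    """Parameters one might pass to boto3's ec2_client.run_instances(...)."""
    return {
        "ImageId": ami_id,              # e.g., a macOS Mojave or Catalina AMI
        "InstanceType": "mac1.metal",
        "MinCount": 1,
        "MaxCount": 1,
        "Placement": {"Tenancy": "host", "HostId": host_id},
    }
```

Scaling to the "100 of them, 500 of them" case would then be a matter of raising `Quantity` and `MaxCount` rather than racking physical machines.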

Amazon is working to integrate M1 Mac mini units into its data centers, with current plans targeting a go-live date sometime in the first half of 2021, according to TechCrunch. Big Sur support is also in the works, though EC2 will be limited to macOS Mojave and Catalina at launch.

Mac instances are available On-Demand at $1.083 per hour, with discounts available through Savings Plans. Supported regions include U.S. East (N. Virginia), U.S. East (Ohio), U.S. West (Oregon), Europe (Ireland), and Asia Pacific (Singapore), with more to come.
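For back-of-the-envelope budgeting at that rate, note that AWS's launch materials describe Mac Dedicated Hosts as having a 24-hour minimum allocation period (an assumption worth verifying for your region and pricing plan). A hypothetical cost helper:

```python
# Hypothetical cost sketch based on the quoted On-Demand rate and the
# 24-hour minimum allocation mentioned in AWS's launch announcement.
ON_DEMAND_RATE = 1.083          # USD per hour, from the article
MIN_ALLOCATION_HOURS = 24       # assumed minimum billing period per host

def mac_instance_cost(hours):
    """On-Demand cost for one Mac instance, honoring the 24-hour minimum."""
    billable = max(hours, MIN_ALLOCATION_HOURS)
    return round(ON_DEMAND_RATE * billable, 2)

# A 2-hour CI run still bills a full day: mac_instance_cost(2) -> 25.99
# A 30-day month of continuous use: mac_instance_cost(24 * 30) -> 779.76
```

The minimum-allocation rule matters for short jobs: a burst of quick test runs is cheapest when batched onto a host you already hold for the day.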

Comments

  • Reply 1 of 20
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

    Is this something the software developers build into the software; like a switch users could trigger through clicking an option to allow access to a pipeline to cloud services which allow the spread of computations simultaneously across multiple cloud-based computers?
  • Reply 2 of 20
    Rayz2016 Posts: 6,957 member
    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

    Someone will hopefully correct me if I'm wrong, but I don't think so, no.

    That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

    I think @StrangeDays will know a bit more about this.

  • Reply 3 of 20
    blastdoor Posts: 3,295 member
    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

    Is this something the software developers build into the software; like a switch users could trigger through clicking an option to allow access to a pipeline to cloud services which allow the spread of computations simultaneously across multiple cloud-based computers?
    I might be missing something, but I don’t think there are direct implications for end users. This sounds more relevant to developers who want/need to create apps for apple platforms but don’t want to buy and maintain their own Macs. Remember that to develop apps for Apple platforms, you need a Mac. But, crazy as it may sound, there do exist people who don’t want to own a Mac. 

    I’ve been secretly hoping that Apple will provide an “iCloud Pro” service that provides a much better UI/UX experience for Apple platform users than is possible with AWS. That is, a seamless/transparent way for Apple pro users to access additional computational/storage resources in an Apple cloud. Apple silicon might make the economics of that work in a way that it never could with Intel, but we shall see. 

    This AWS thing sounds like the bizarro world version of what I’ve been hoping for — a non-seamless way for non-Apple users to access Macs in the cloud. 
  • Reply 4 of 20
    dewme Posts: 5,370 member
    Rayz2016 said:
    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

    Someone will hopefully correct me if I'm wrong, but I don't think so, no.

    That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

    I think @StrangeDays will know a bit more about this.

    Yes and no. The yes part: This is a platform as a service (PaaS) flavor of cloud computing where you purchase access to additional Mac machines (and supporting AWS infrastructure) for a period of time to do with whatever your needs dictate. Yes, if you have a big job or process pipeline to execute that would benefit from distributing it across multiple machines, you’ll need to have a way to divide the work up, parcel it out to multiple machines, and aggregate the results. There are several software development automation tools on the market that are explicitly designed for this purpose, e.g., Jenkins, Xcode Server, and some development shops build their own automation.
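    The divide-the-work, parcel-it-out, aggregate-the-results pattern just described can be sketched in a few lines. This is purely illustrative Python using local threads as stand-ins for remote Mac instances; `render_chunk` is a hypothetical placeholder for whatever per-machine work (rendering, building, testing) the real pipeline would do.

```python
from concurrent.futures import ThreadPoolExecutor

def render_chunk(chunk):
    """Stand-in for the real per-machine work (e.g., rendering frames)."""
    return [frame * 2 for frame in chunk]

def split(work, n):
    """Divide the work into roughly n equal chunks."""
    k = max(1, len(work) // n)
    return [work[i:i + k] for i in range(0, len(work), k)]

def distribute(work, n_workers=4):
    """Parcel chunks out to workers and aggregate the results in order."""
    chunks = split(work, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(render_chunk, chunks)
    return [frame for chunk in results for frame in chunk]
```

    In a real cloud deployment the `ThreadPoolExecutor` would be replaced by whatever mechanism dispatches jobs to the provisioned machines, but the divide/dispatch/aggregate shape stays the same.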

    That’s only one type of distributed processing use case, but there are many others especially in modern software development processes, like having a pool of dedicated compile machines, build machines, automated test machines, configuration management machines, deployment machines, etc., in a pipeline such that every time a block of code gets committed to a development branch from an individual developer it is compiled/built, integrated, unit tested, and regression tested against the entire software system that it is a part of. Each of the machines participating in the individual tasks along the pipeline feeds its output to the next machine in the process. These pipelines or processes run continuously and are event driven, but can also be scheduled. Thus the term continuous integration (CI) and continuous delivery (CD).

    There is nothing preventing an individual user from purchasing access to a cloud based machine (Mac, Windows, Linux, etc.) to do whatever they want to do with it. The inhibitor tends to be the cost and terms of service, which are typically geared towards business users with business budgets. The huge benefit of PaaS cloud computing tends to be the ability to acquire many machines in very short order, and the ability to add/delete many machines nearly instantaneously as your needs change, i.e., elasticity. Try asking your IT department to spin up 125 new Macs overnight if they have to order physical machines from Apple or a distributor. They will laugh, you will cry.

    The no part: If you need to deploy micro services you may not need complete, dedicated machines to deploy your micro services. You may just need a container to run your service, which could be a shared computing resource that is hosting several “tenants” running completely isolated in their own container. If you are building out the service hosting platform as part of your solution, then sure, you could have cloud based machines for your homegrown hosting platform, but this reverts to the previous use case that I mentioned.

    I don’t get the “bizarro” comment from Blastdoor. This type of PaaS cloud computing has been in very widespread use for a very long time, with Amazon’s AWS and Microsoft’s Azure being two of the major players. You may want to see if AWS offers a test drive to get a feel for how it works. Microsoft used to and may still allow you to test drive their Azure PaaS solution. There’s nothing at all bizarre about how it works. You’re sitting in front of a Mac with full access to everything you can get to via the keyboard, mouse, and monitor. Instead of sitting under your desk it is sitting in the cloud.

    Nothing bizarre looking here: https://aws.amazon.com/blogs/aws/new-use-mac-instances-to-build-test-macos-ios-ipados-tvos-and-watchos-apps/

    My only concern with the Mac version of EC2 is that it relies on VNC for remoting the desktop. VNC is not as secure or as performant as RDP that is used for Windows remoting. Any way you cut it, Amazon providing support for Mac in AWS is a very big deal and brings Apple another step closer to competing at the enterprise level against Windows and Linux. 
    edited December 2020
  • Reply 5 of 20
    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

    Is this something the software developers build into the software; like a switch users could trigger through clicking an option to allow access to a pipeline to cloud services which allow the spread of computations simultaneously across multiple cloud-based computers?

    As others above have explained, this is geared more towards developers than end users, and on a first read it seems primarily targeted as an app development environment rather than a server solution. Although I would guess there's nothing preventing developers from building a (distributed) multi-machine video processing solution like you're asking about on top of this.
    Linux is much more popular and prevalent as a server solution, so there are many more tools and much more software available to build a solution "which allow the spread of computations simultaneously across multiple cloud-based computers".

    edited December 2020
  • Reply 6 of 20
    rob53 Posts: 3,251 member
    So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. Amazon will simply undercut the traditional Mac cloud providers, forcing them out of business then when things aren't profitable, they'll dump AWS hosting on Mac hardware. I see this as predatory actions by a monopoly. Microsoft did the same thing. Apple had a half-baked server product (still has a quarter-baked server OS product) that they dumped because of cheap server hardware and free (linux) server OS. AWS is not free but it's huge. I'd rather see Apple enhance what's left of their server software and release a competitor to AWS instead of seeing AWS released on a desktop device (Mac mini). Apple could easily build a true server box with a version of the M1 that could compete with other server hardware and work with linux distributions for a very good server OS complete with better security than any server hardware and software available today. 
  • Reply 7 of 20
    rob53 said:
    So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. Amazon will simply undercut the traditional Mac cloud providers, forcing them out of business then when things aren't profitable, they'll dump AWS hosting on Mac hardware. I see this as predatory actions by a monopoly. Microsoft did the same thing. Apple had a half-baked server product (still has a quarter-baked server OS product) that they dumped because of cheap server hardware and free (linux) server OS. AWS is not free but it's huge. I'd rather see Apple enhance what's left of their server software and release a competitor to AWS instead of seeing AWS released on a desktop device (Mac mini). Apple could easily build a true server box with a version of the M1 that could compete with other server hardware and work with linux distributions for a very good server OS complete with better security than any server hardware and software available today. 
    Unfortunately, I see this as well. Companies like Amazon will drive smaller competitors out of the market. The Mac mini on EC2 is specifically targeted at Mac and iOS/iPadOS developers, not end users. Much of the content (audio, video, images) will be stored on AWS's object storage service, S3. One significant thing about this announcement is that unlike Amazon's use of custom-built x86 hardware, they must purchase hardware from Apple. 
  • Reply 8 of 20
    dewme Posts: 5,370 member
    rob53 said:
    So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. Amazon will simply undercut the traditional Mac cloud providers, forcing them out of business then when things aren't profitable, they'll dump AWS hosting on Mac hardware. I see this as predatory actions by a monopoly. Microsoft did the same thing. Apple had a half-baked server product (still has a quarter-baked server OS product) that they dumped because of cheap server hardware and free (linux) server OS. AWS is not free but it's huge. I'd rather see Apple enhance what's left of their server software and release a competitor to AWS instead of seeing AWS released on a desktop device (Mac mini). Apple could easily build a true server box with a version of the M1 that could compete with other server hardware and work with linux distributions for a very good server OS complete with better security than any server hardware and software available today. 
    Huh? I totally cannot follow the conspiracy theories you’re spinning. This is big win for Apple in its quest to attract more developers, especially those who target enterprise apps, to the Mac platform. As far as Apple’s investment (or disinvestment) in their server products, it’s no different than all of the other things at Apple that get cut down or nixed as part of Apple’s strategy of saying “no” to anything that’s not in their own best interest to do themselves.

    When you look at bringing capabilities to market you can loosely divide them into three categories, adopt, adapt, or create. Adopt means using other people’s stuff as-is. Adapt means using at least part of other people’s stuff but tweaking it to fit your needs. Create means creating something yourself. The cost between adopt and create rises exponentially so you have to be very strategic about where you invest your resources for the value that you return from your investments. Having the right mix is very important and it can change over time. 

    A good case in point is the strategy behind the M1 SoC. Apple adopted general-purpose Intel chips when it made sense to do so and as they built out their core products. Apple adapted ARM technology to tailor the compute resources for products like iPhone, iPad, Apple TV, and Apple Watch to specifically fit their needs. In the process of the ARM adaptation Apple learned enough to create a new class of SoCs that allowed them to replace general-purpose CPUs in their mainstream desktop/mobile Mac products with a purpose-built engine that crushes anything they could have done using “generic” parts from other suppliers. They did this because they were very strategic about how they allocated their investments across the different adopt/adapt/create investment categories. They will continue to do this moving forward.

    Apple has a robust cloud platform, but it’s one that is specifically tailored to their needs and customers. There is no incentive for Apple to compete against AWS or Azure, or any “generic” cloud services platforms. But they can still benefit from those other services if doing so helps advance Apple’s overall strategic objectives. Leaving some things on the table for others to go after is always part of the mix. Any company that thinks it can do everything for everyone and still come out on top is delusional. Apple has done exceptionally well at picking its strategic and tactical targets and walking away with massive profits, and in some cases, doing so with a fraction of the market share of its competitors.   
  • Reply 9 of 20
    blastdoor Posts: 3,295 member
    dewme said:
    Rayz2016 said:
    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

    Someone will hopefully correct me if I'm wrong, but I don't think so, no.

    That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

    I think @StrangeDays will know a bit more about this.

    Yes and no. The yes part: This is a platform as a service (PaaS) flavor of cloud computing where you purchase access to additional Mac machines (and supporting AWS infrastructure) for a period of time to do with whatever your needs dictate. Yes, if you have a big job or process pipeline to execute that would benefit from distributing it across multiple machines, you’ll need to have a way to divide the work up, parcel it out to multiple machines, and aggregate the results. There are several software development automation tools on the market that are explicitly designed for this purpose, e.g., Jenkins, Xcode Server, and some development shops build their own automation.

    That’s only one type of distributed processing use case, but there are many others especially in modern software development processes, like having a pool of dedicated compile machines, build machines, automated test machines, configuration management machines, deployment machines, etc., in a pipeline such that every time a block of code gets committed to a development branch from an individual developer it is compiled/built, integrated, unit tested, and regression tested against the entire software system that it is a part of. Each of the machines participating in the individual tasks along the pipeline feeds its output to the next machine in the process. These pipelines or processes run continuously and are event driven, but can also be scheduled. Thus the term continuous integration (CI) and continuous delivery (CD).

    There is nothing preventing an individual user from purchasing access to a cloud based machine (Mac, Windows, Linux, etc.) to do whatever they want to do with it. The inhibitor tends to be the cost and terms of service, which are typically geared towards business users with business budgets. The huge benefit of PaaS cloud computing tends to be the ability to acquire many machines in very short order, and the ability to add/delete many machines nearly instantaneously as your needs change, i.e., elasticity. Try asking your IT department to spin up 125 new Macs overnight if they have to order physical machines from Apple or a distributor. They will laugh, you will cry.

    The no part: If you need to deploy micro services you may not need complete, dedicated machines to deploy your micro services. You may just need a container to run your service, which could be a shared computing resource that is hosting several “tenants” running completely isolated in their own container. If you are building out the service hosting platform as part of your solution, then sure, you could have cloud based machines for your homegrown hosting platform, but this reverts to the previous use case that I mentioned.

    I don’t get the “bizarro” comment from Blastdoor. This type of PaaS cloud computing has been in very widespread use for a very long time, with Amazon’s AWS and Microsoft’s Azure being two of the major players. You may want to see if AWS offers a test drive to get a feel for how it works. Microsoft used to and may still allow you to test drive their Azure PaaS solution. There’s nothing at all bizarre about how it works. You’re sitting in front of a Mac with full access to everything you can get to via the keyboard, mouse, and monitor. Instead of sitting under your desk it is sitting in the cloud.

    Nothing bizarre looking here: https://aws.amazon.com/blogs/aws/new-use-mac-instances-to-build-test-macos-ios-ipados-tvos-and-watchos-apps/

    My only concern with the Mac version of EC2 is that it relies on VNC for remoting the desktop. VNC is not as secure or as performant as RDP that is used for Windows remoting. Any way you cut it, Amazon providing support for Mac in AWS is a very big deal and brings Apple another step closer to competing at the enterprise level against Windows and Linux. 
    https://en.wikipedia.org/wiki/Bizarro

    I was not suggesting that AWS/Azure/et al are "bizarre" in any absolute sense.

    I was saying that relative to my personal hopes for a hypothetical "iCloud Pro" service, which would provide seamless/transparent access to cloud resources for Mac users, this is in a sense the opposite (and therefore Bizarro) -- it's providing non-seamless / non-transparent access to Macs for people who aren't necessarily Apple users. 

    Actually, as I re-read my post, I don't understand what you don't understand. I was quite clear. Reading comprehension fail on you. 
  • Reply 10 of 20
    rob53 said:
    So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. Amazon will simply undercut the traditional Mac cloud providers, forcing them out of business then when things aren't profitable, they'll dump AWS hosting on Mac hardware. I see this as predatory actions by a monopoly. Microsoft did the same thing. Apple had a half-baked server product (still has a quarter-baked server OS product) that they dumped because of cheap server hardware and free (linux) server OS. AWS is not free but it's huge. I'd rather see Apple enhance what's left of their server software and release a competitor to AWS instead of seeing AWS released on a desktop device (Mac mini). Apple could easily build a true server box with a version of the M1 that could compete with other server hardware and work with linux distributions for a very good server OS complete with better security than any server hardware and software available today. 
    1. Why on earth would things not be profitable? AWS has offered Windows 10/8/7 instances for years. Why would Amazon be any more likely to dump Macs than Windows?
    2. See 1: you can only develop macOS, iPadOS, iOS, watchOS, tvOS, HealthKit and HomeKit on macOS. There are no other alternatives. As Apple platform app development is booming, again why?
    3. See 1 again: There was a major article a few months back on Deadline, Variety etc. about how entertainment companies are moving animation, movie and TV production and post-production to the cloud, which will remove the requirement to live in Los Angeles if you want to do technical work on Hollywood projects. (Or to live in Tokyo if you want to do similar in Japan, Seoul for Korea, Beijing for China, London for the UK etc.) Now, what is the #1 PC platform for Hollywood types and other creators? Exactly.
    4. Microsoft did the same thing? How? Microsoft does not make server hardware. Microsoft most certainly does not make Linux. Quite the contrary, Microsoft under Ballmer tried to crush Linux using patent lawsuits and market pressure. Had Microsoft not recently already been reined in by an antitrust ruling, they would have succeeded. Don't blame Microsoft: they had nothing to do with it. And don't blame Linux as Microsoft's competing server products thrived. Instead, blame Apple for making a bad product. You would have needed to be absolutely nuts to buy that server product: expensive hardware, bad software and it only supported Apple products!

    Apple's failures in the enterprise are due to their own abject refusal to care about software beyond the absolute basics - 20 years and they still don't have an equivalent to Microsoft Active Directory when Google created their own to manage Chromebooks in less than a year - and their dragging their feet to embrace open platforms and standards. Look, Apple could dominate the ARM-based small server market tomorrow. Just come out with a 12 or 16 core version of the M1 and do what Google did with ChromeOS - start with a version of Debian (for maximum compatibility), inject some proprietary macOS libraries in it and ship it. They would utterly dominate.

    In fact, that is going to happen already. The guy who is working to put Linux on M1 Macs? What do you think they are going to use it for? Exactly. However, adoption is going to be limited because once you put Linux on the M1 Mac, it will be out of support. So not very many enterprises are going to put their mission-critical server applications on unsupported stacks because they won't receive timely security patches or help when something breaks.

    Good grief ...
  • Reply 11 of 20
    StrangeDays Posts: 12,879 member
    Rayz2016 said:
    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 
    Someone will hopefully correct me if I'm wrong, but I don't think so, no.

    That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

    I think @StrangeDays will know a bit more about this.

    Sorry, I do mostly boring enterprise software, no video rendering.
  • Reply 12 of 20
    dysamoria Posts: 3,430 member
    Amazon is entering an arena dominated by small companies ...

    What a shock. 🙄
  • Reply 13 of 20
    dewme Posts: 5,370 member
    Re: Amazon is entering an arena dominated by small companies ...

    Realistically, from Amazon's perspective, and when you consider their scale, dominant position, and maturity level, the total collection of all Mac-centric cloud services companies, not including Apple's in-house cloud footprint, doesn't even qualify as an "arena." It's more like a backyard play set that you set up for your kids to play on. Even Microsoft with Azure was at a distinct technical disadvantage going up against Amazon on huge government cloud contracts. It was only Jeff Bezos' ownership of a "hostile" media outlet that gave Microsoft a political advantage in the bake-off against Amazon.

    This is a very small increase in market share for Amazon, but a big opportunity for Apple to get their DevOps solutions in front of more enterprise customers, especially those who want to go after the Mac market using the same cloud based development approaches that they've enjoyed with Microsoft and Linux for several years. This helps level the playing field just a little bit more for Apple and Apple developers. 
  • Reply 14 of 20
    cloudguy said:
    rob53 said:
    So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. Amazon will simply undercut the traditional Mac cloud providers, forcing them out of business then when things aren't profitable, they'll dump AWS hosting on Mac hardware. I see this as predatory actions by a monopoly. Microsoft did the same thing. Apple had a half-baked server product (still has a quarter-baked server OS product) that they dumped because of cheap server hardware and free (linux) server OS. AWS is not free but it's huge. I'd rather see Apple enhance what's left of their server software and release a competitor to AWS instead of seeing AWS released on a desktop device (Mac mini). Apple could easily build a true server box with a version of the M1 that could compete with other server hardware and work with linux distributions for a very good server OS complete with better security than any server hardware and software available today. 
    1. Why on earth would things not be profitable? AWS has offered Windows 10/8/7 instances for years. Why would Amazon be any more likely to dump Macs than Windows?
    2. See 1: you can only develop macOS, iPadOS, iOS, watchOS, tvOS, HealthKit and HomeKit on macOS. There are no other alternatives. As Apple platform app development is booming, again why?
    3. See 1 again: There was a major article a few months back on Deadline, Variety etc. about how entertainment companies are moving animation, movie and TV production and post-production to the cloud, which will remove the requirement to live in Los Angeles if you want to do technical work on Hollywood projects. (Or to live in Tokyo if you want to do similar in Japan, Seoul for Korea, Beijing for China, London for the UK etc.) Now, what is the #1 PC platform for Hollywood types and other creators? Exactly.
    4. Microsoft did the same thing? How? Microsoft does not make server hardware. Microsoft most certainly does not make Linux. Quite the contrary, Microsoft under Ballmer tried to crush Linux using patent lawsuits and market pressure. Had Microsoft not recently already been reined in by an antitrust ruling, they would have succeeded. Don't blame Microsoft: they had nothing to do with it. And don't blame Linux as Microsoft's competing server products thrived. Instead, blame Apple for making a bad product. You would have needed to be absolutely nuts to buy that server product: expensive hardware, bad software and it only supported Apple products!

Apple's failures in the enterprise are due to their own abject refusal to care about software beyond the absolute basics - 20 years on and they still don't have an equivalent to Microsoft Active Directory, when Google created its own to manage Chromebooks in less than a year - and their foot-dragging on embracing open platforms and standards. Look, Apple could dominate the ARM-based small server market tomorrow. Just come out with a 12- or 16-core version of the M1 and do what Google did with ChromeOS - start with a version of Debian (for maximum compatibility), inject some proprietary macOS libraries into it, and ship it. They would utterly dominate.

In fact, that is already starting to happen. The guy working to put Linux on M1 Macs? What do you think it will be used for? Exactly. However, adoption is going to be limited, because once you put Linux on an M1 Mac, it will be out of support. Not many enterprises are going to put their mission-critical server applications on unsupported stacks, because they won't receive timely security patches or help when something breaks.

    Good grief ...
That's a lot of words to say that Apple isn't, and never has been, an enterprise company. These are for developers and companies that need automated testing. There is very little other use for them, but that alone is going to be very useful.
  • Reply 15 of 20
flydog Posts: 1,124 member
    rob53 said:
So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. […]
    A visit to a competent mental health practitioner would not be a waste of your time. 
  • Reply 16 of 20
flydog Posts: 1,124 member

    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

Is this something software developers build into their apps - like a switch users could flip to open a pipeline to cloud services, spreading the computation simultaneously across multiple cloud-based computers?
The main benefit is that an iOS developer can build, test, and deploy on a large scale. See, e.g., https://www.macstadium.com
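To illustrate the distinction flydog and pembroke are circling: the application itself has to be written to split the work up. A minimal sketch of that fan-out/aggregate pattern, with every function name hypothetical and local threads standing in for remote machines:

```python
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame: int) -> str:
    """Stand-in for a real per-frame render job (hypothetical)."""
    return f"frame-{frame:04d}.png"

def render_in_parallel(frames: range, workers: int = 10) -> list[str]:
    # Divide the job, dispatch each piece to a worker, aggregate the
    # results in order. Swapping threads for cloud instances changes
    # the transport, not the pattern -- which is why the software has
    # to be written for distribution in the first place.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, frames))
```

Ten EC2 Mac instances could play the role of the ten workers here, but only if the renderer exposes this kind of work-splitting; nothing in EC2 itself parallelizes a single-machine app.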

  • Reply 17 of 20
Rayz2016 Posts: 6,957 member
    cloudguy said:
    rob53 said:
So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. […]
1. Why on earth would things not be profitable? […]

Good grief ...
    Could someone tell me what this word soup has to do with anything?
  • Reply 18 of 20
Rayz2016 Posts: 6,957 member

    blastdoor said:
    dewme said:
    Rayz2016 said:
    pembroke said:
I’m not a developer and don’t really understand the implications for the end user. […]

    Someone will hopefully correct me if I'm wrong, but I don't think so, no.

    That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

    I think @StrangeDays will know a bit more about this.

Yes and no. The yes part: This is a platform as a service (PaaS) flavor of cloud computing where you purchase access to additional Mac machines (and supporting AWS infrastructure) for a period of time, to do with whatever your needs dictate. Yes, if you have a big job or process pipeline to execute that would benefit from being distributed across multiple machines, you’ll need a way to divide the work up, parcel it out to multiple machines, and aggregate the results. There are several software development automation tools on the market explicitly designed for this purpose, e.g., Jenkins and Xcode Server, and some development shops build their own automation.

That’s only one type of distributed processing use case, but there are many others, especially in modern software development processes: having a pool of dedicated compile machines, build machines, automated test machines, configuration management machines, deployment machines, etc., in a pipeline, such that every time a block of code gets committed to a development branch it is compiled/built, integrated, unit tested, and regression tested against the entire software system it is a part of. Each machine participating in the individual tasks along the pipeline feeds its output to the next machine in the process. These pipelines run continuously and are event driven, but can also be scheduled. Thus the terms continuous integration (CI) and continuous delivery (CD).

    There is nothing preventing an individual user from purchasing access to a cloud based machine (Mac, Windows, Linux, etc.) to do whatever they want to do with it. The inhibitor tends to be the cost and terms of service, which are typically geared towards business users with business budgets. The huge benefit of PaaS cloud computing tends to be the ability to acquire many machines in very short order, and the ability to add/delete many machines nearly instantaneously as your needs change, i.e., elasticity. Try asking your IT department to spin up 125 new Macs overnight if they have to order physical machines from Apple or a distributor. They will laugh, you will cry.

The no part: If you need to deploy microservices, you may not need complete, dedicated machines for them. You may just need a container to run each service, which could be a shared computing resource hosting several “tenants” running completely isolated in their own containers. If you are building out the service hosting platform as part of your solution, then sure, you could use cloud-based machines for your homegrown hosting platform, but that reverts to the previous use case I mentioned.

    I don’t get the “bizarro” comment from Blastdoor. This type of PaaS cloud computing has been in very widespread use for a very long time, with Amazon’s AWS and Microsoft’s Azure being two of the major players. You may want to see if AWS offers a test drive to get a feel for how it works. Microsoft used to and may still allow you to test drive their Azure PaaS solution. There’s nothing at all bizarre about how it works. You’re sitting in front of a Mac with full access to everything you can get to via the keyboard, mouse, and monitor. Instead of sitting under your desk it is sitting in the cloud.

    Nothing bizarre looking here: https://aws.amazon.com/blogs/aws/new-use-mac-instances-to-build-test-macos-ios-ipados-tvos-and-watchos-apps/

My only concern with the Mac version of EC2 is that it relies on VNC for remoting the desktop. VNC is not as secure or as performant as the RDP protocol used for Windows remoting. Any way you cut it, Amazon providing support for Macs in AWS is a very big deal and brings Apple another step closer to competing at the enterprise level against Windows and Linux. 
    https://en.wikipedia.org/wiki/Bizarro

    I was not suggesting that AWS/Azure/et al are "bizarre" in any absolute sense.

    I was saying that relative to my personal hopes for a hypothetical "iCloud Pro" service, which would provide seamless/transparent access to cloud resources for Mac users, this is in a sense the opposite (and therefore Bizarro) -- it's providing non-seamless / non-transparent access to Macs for people who aren't necessarily Apple users. 

    Actually, as I re-read my post, I don't understand what you don't understand. I was quite clear. Reading comprehension fail on you. 
Actually, there are a lot of enterprise Java developers who cut and test code exclusively on Macs, even if the final deployment ends up on Linux cloud servers. 
If we now have Mac machines serving these apps, then it’s going to be a shot in the arm for server-side Swift technologies like Vapor. 
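The commit-triggered pipeline dewme describes above (build, test, deploy, each stage feeding its output to the next) can be sketched as a simple chain of functions; every name here is a hypothetical stand-in for a real CI stage:

```python
# Toy CI pipeline: each stage consumes the previous stage's output,
# mirroring the machine-to-machine handoff dewme describes.

def build(commit: str) -> str:
    # A real stage would invoke xcodebuild or similar on a build machine.
    return f"artifact({commit})"

def unit_test(artifact: str) -> str:
    # A real stage would run the test suite against the built artifact.
    if not artifact.startswith("artifact("):
        raise ValueError("nothing to test")
    return artifact

def deploy(artifact: str) -> str:
    # A real stage would push to TestFlight, a device farm, etc.
    return f"deployed:{artifact}"

def run_pipeline(commit: str) -> str:
    # The handoff: each stage's output becomes the next stage's input.
    output = commit
    for stage in (build, unit_test, deploy):
        output = stage(output)
    return output
```

In practice, tools like Jenkins express this chain declaratively, and each stage can be pinned to a different machine, which is where on-demand Mac instances slot in.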
  • Reply 19 of 20
Beats Posts: 3,073 member
    I just like seeing competitors come together and contribute to Apple.
  • Reply 20 of 20
cropr Posts: 1,124 member
    dewme said:
    rob53 said:
So what we're talking about is a company like Walmart taking over the small to medium cloud service providers. […]
Huh? I totally cannot follow the conspiracy theories you’re spinning. This is a big win for Apple in its quest to attract more developers, especially those who target enterprise apps, to the Mac platform. As for Apple’s investment (or disinvestment) in their server products, it’s no different from all of the other things at Apple that get cut down or nixed as part of Apple’s strategy of saying “no” to anything that’s not in their own best interest to do themselves.

    I see 2 use cases for this offering:
     - Web developers who don't have Macs and want to test a developed website using Safari.
 - iOS and Mac developers who have Macs and want to run automated tests in a Continuous Integration environment.
In both cases the developers won't buy any additional Apple hardware, so I don't see a big win for Apple. On the contrary: up to now, the former group had to buy a Mac, but that is no longer necessary, so Apple's Mac sales might even be impacted.

No developer wants to develop iOS or Mac apps on a remote system. That is just too slow and cumbersome.