But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
As a guy who built PCs for decades, I love my iMac AIOs. My last one lasted 8 years; by the time it was done, even the screen was out of date. My new one is 10 lbs lighter and is VESA arm-mounted, giving me an ultra-clean desk free of any clutter or cables.
Never cared about the chin. Such complaints seem to be academic ones held by people not actually using the product. You’re in that category too, aren’t you?
You may not have had problems due to AIO design issues - and that's what they were - but I and many, many others did.
Try looking into the 2009 27" i7 models, which were literally 'designed' to slow-cook themselves. Or, put another way, they were nowhere near efficient enough with the thermals.
'Hidden' air inlets prone to clogging with dust. Impossible to get dust out of the machines. An inaccessible entry point requiring glass and display removal. Poor thermals in general. System fans not running fast enough, and the killer: throwing Radeon graphics cards into that soup and seeing them fail as a result.
Under warranty, Apple would gladly throw another identical card into that soup, but it would almost certainly meet the same fate.
Back in the day I did some research on the issue and found it to be VERY widespread.
I still have that machine but it will only boot into safe mode.
The screen is perfect but unusable because it's tied to the machine. RAM is perfect. Disk is perfect, etc.
The root problem is the design. Then the fans not running nearly fast enough (probably so as not to 'annoy' users), and the combination of the i7 and the Radeon.
Now, I would love for someone to dig into this particular scenario and force Apple to hand over data on exactly how many systems failed in this setting according to its records (in or out of warranty).
The chin is very dated. That's why they'll surely get rid of it. A move that is long overdue.
What are you on about? You just listed a bunch of problems - some specific to a model from over a decade ago, some general issues you have with AIOs - but none of it seems to have anything to do with chins, or to negate the value proposition that exists for AIO computers.
The chin exists as part of the form factor, to provide room for components & cooling. It goes without saying that as miniaturization continues and components are consolidated further, it won't be necessary. This isn't a prophetic observation. Until that becomes true: I still have never run into an issue using my iMacs with chins. I haven't ever spent even a single second lamenting that my machine has one. Again, I find it's an academic "issue" held by people who don't own the device. I get the impression you don't use an iMac, so you confirm the pattern... shrug. I don't think Apple is designing computers around what fans of Chinese knockoffs are using, but who knows.
Also, your screen from 2009 would look pretty awful compared to a current screen. I sure wouldn't want to work on it.
1. iMac has a 32" display.
2. iMac is compatible with the Pro Display XDR stand and has a VESA option.
3. The external monitor is just the iMac 32 without the computing guts.
4. The Mac Half Pro is modular, with room for an MPX module, 2 more PCIe slots, and 2 HDDs.
5. The 24" iMac has a fanless option.
I'm actually hoping that the Half Pro, as you call it, is actually an XMac that has been dreamed of for decades. I don't see it as a Mac Pro replacement but rather a high-performance desktop machine that starts at around $1200 to $1500, for a model that doesn't suck. Basically a place to give us a higher-wattage CPU and GPU than can be had in an iMac. These days no HDDs, but rather M.2 SSD slots for expansion. Such a machine could easily fit into an enclosure half the size of a Mac Pro. In fact, one could do just as well by bumping up the size of the Mini some.
I would love a headless design for a simple reason. I already have two perfectly good monitors, therefore I could spend more on the actual computer components (faster processor, more RAM, better storage, etc.) without having to factor in the not insignificant cost of an attached display.
I’m hanging on to my 2013 trash can a bit longer to see what Apple does.
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
As a guy who built PCs for decades, I love my iMac AIOs. My last one lasted 8 years; by the time it was done, even the screen was out of date. My new one is 10 lbs lighter and is VESA arm-mounted, giving me an ultra-clean desk free of any clutter or cables.
Never cared about the chin. Such complaints seem to be academic ones held by people not actually using the product. You’re in that category too, aren’t you?
I have to agree with you, with the exception of the soldered-in memory. Part of the reason I buy Macs is because they are well built and last. My son wanted a gaming computer, so I helped him put one together. A year later he started having problems and replaced the motherboard. Then the processor. He's had more problems with his PC in one year than I've had with my Macs in 15.
As far as the chin goes: meh. You generally need to move the screen up, so the chin isn't exactly lost space. My only complaint about iMacs is that having all the ports in back is rather asinine. Put at least one USB port in an accessible spot!
As far as this rumor goes, if true, I wonder how they are going to get the performance for a Mac Pro. Either they have some major improvements in store for the M2 or they are going to have a multi-processor architecture. It seems they would need significant gains on the graphics side, too.
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
Your complaint is moot. I think the new iMacs will be essentially a giant iPad with everything soldered in. Gone will be the days where one could at least upgrade the RAM.
What's your beef with removing the display? Is opening that display once (or twice) in its life too much? I can remove and reinstall my iMac display in minutes. It's a non-issue.
You expect average people to remove the iMac's display? That's pretty unreasonable.
And from my perspective you're missing the point. If something goes wrong with the CPU or another core component while the screen is still fine, then there's no point in removing the screen. The whole point of having a separate screen is that you don't have to throw out the monitor when the CPU dies. Both of my last two iMacs have given me blue screens of death every week after the first three years (with nothing but Apple's OS on the system). Yes, I've reinstalled the OS and taken them in to Apple for testing, and they could find nothing. It's just random crashing, and it feels like the Intel CPU. I will never buy an Intel CPU again, even though I can't prove that that's the cause of the problem.
Every time the CPU dies, I've had to throw out my iMac's screen along with it, because they're built together and can't be separated. What a waste.
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
Your complaint is moot. I think the new iMacs will be essentially a giant iPad with everything soldered in. Gone will be the days where one could at least upgrade the RAM.
What's your beef with removing the display? Is opening that display once (or twice) in its life too much? I can remove and reinstall my iMac display in minutes. It's a non-issue.
You expect average people to remove the iMac's display? That's pretty unreasonable.
And from my perspective you're missing the point. If something goes wrong with the CPU or another core component while the screen is still fine, then there's no point in removing the screen. The whole point of having a separate screen is that you don't have to throw out the monitor when the CPU dies. Both of my last two iMacs have given me blue screens of death every week after the first three years (with nothing but Apple's OS on the system). Yes, I've reinstalled the OS and taken them in to Apple for testing, and they could find nothing. It's just random crashing, and it feels like the Intel CPU. I will never buy an Intel CPU again, even though I can't prove that that's the cause of the problem.
Every time the CPU dies, I've had to throw out my iMac's screen along with it, because they're built together and can't be separated. What a waste.
Buy a Mac Mini.
I just found out today that there is nothing user-serviceable on the M1 Mac Mini. Even the SSD is soldered to the motherboard.
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
Your complaint is moot. I think the new iMacs will be essentially a giant iPad with everything soldered in. Gone will be the days where one could at least upgrade the RAM.
What's your beef with removing the display? Is opening that display once (or twice) in its life too much? I can remove and reinstall my iMac display in minutes. It's a non-issue.
I've looked at the steps on iFixit and OWC to remove the display. Honestly, I've built systems, I was a sysadmin for over a decade, I know computers inside and out - and it scares the hell out of me.
I thought that too, but I had to tackle my parents' computer and found it surprisingly easy.
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
Your complaint is moot. I think the new iMacs will be essentially a giant iPad with everything soldered in. Gone will be the days where one could at least upgrade the RAM.
What's your beef with removing the display? Is opening that display once (or twice) in its life too much? I can remove and reinstall my iMac display in minutes. It's a non-issue.
You expect average people to remove the iMac's display? That's pretty unreasonable.
And from my perspective you're missing the point. If something goes wrong with the CPU or another core component while the screen is still fine, then there's no point in removing the screen. The whole point of having a separate screen is that you don't have to throw out the monitor when the CPU dies. Both of my last two iMacs have given me blue screens of death every week after the first three years (with nothing but Apple's OS on the system). Yes, I've reinstalled the OS and taken them in to Apple for testing, and they could find nothing. It's just random crashing, and it feels like the Intel CPU. I will never buy an Intel CPU again, even though I can't prove that that's the cause of the problem.
Every time the CPU dies, I've had to throw out my iMac's screen along with it, because they're built together and can't be separated. What a waste.
Buy a Mac Mini.
I already own two new Mac Minis as backups to my iMac. I want to stay faithful to Apple's hardware, so I want an Apple-made monitor, but their current $5000 monitor is too costly for me. Rumours on this website say that there may be a new Apple monitor in the works. If nothing comes out by March, I will buy a third-party display.
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
As a guy who built PCs for decades, I love my iMac AIOs. My last one lasted 8 years; by the time it was done, even the screen was out of date. My new one is 10 lbs lighter and is VESA arm-mounted, giving me an ultra-clean desk free of any clutter or cables.
Never cared about the chin. Such complaints seem to be academic ones held by people not actually using the product. You’re in that category too, aren’t you?
You may not have had problems due to AIO design issues - and that's what they were - but I and many, many others did.
Try looking into the 2009 27" i7 models, which were literally 'designed' to slow-cook themselves. Or, put another way, they were nowhere near efficient enough with the thermals.
'Hidden' air inlets prone to clogging with dust. Impossible to get dust out of the machines. An inaccessible entry point requiring glass and display removal. Poor thermals in general. System fans not running fast enough, and the killer: throwing Radeon graphics cards into that soup and seeing them fail as a result.
Under warranty, Apple would gladly throw another identical card into that soup, but it would almost certainly meet the same fate.
Back in the day I did some research on the issue and found it to be VERY widespread.
I still have that machine but it will only boot into safe mode.
The screen is perfect but unusable because it's tied to the machine. RAM is perfect. Disk is perfect, etc.
The root problem is the design. Then the fans not running nearly fast enough (probably so as not to 'annoy' users), and the combination of the i7 and the Radeon.
Now, I would love for someone to dig into this particular scenario and force Apple to hand over data on exactly how many systems failed in this setting according to its records (in or out of warranty).
The chin is very dated. That's why they'll surely get rid of it. A move that is long overdue.
What are you on about? You just listed a bunch of problems - some specific to a model from over a decade ago, some general issues you have with AIOs - but none of it seems to have anything to do with chins, or to negate the value proposition that exists for AIO computers.
The chin exists as part of the form factor, to provide room for components & cooling. It goes without saying that as miniaturization continues and components are consolidated further, it won't be necessary. This isn't a prophetic observation. Until that becomes true: I still have never run into an issue using my iMacs with chins. I haven't ever spent even a single second lamenting that my machine has one. Again, I find it's an academic "issue" held by people who don't own the device. I get the impression you don't use an iMac, so you confirm the pattern... shrug. I don't think Apple is designing computers around what fans of Chinese knockoffs are using, but who knows.
Also, your screen from 2009 would look pretty awful compared to a current screen. I sure wouldn't want to work on it.
Actually, the current iMacs have pretty much nothing but speakers in the chin.
I have a 2010 Apple LED Cinema Display hooked up to my 2019 27” iMac. The colours aren’t bad, and obviously the resolution isn’t as good, but it’s plenty good enough as a secondary display.
Smaller (cheaper) Mac Pro might be the fabled xMac??! We can but dream...
Who said cheaper? One can only hope this is going to be better priced than a hackintosh.
I might be wrong, but I would hope even Apple doesn’t have the gall to charge the same price for the full size Mac Pro as the Mac Mini Pro.
Wasn't that basically the reason the G4 Cube failed - that it was essentially the same price as the Power Mac G4, with no benefits other than a pretty design?
Smaller (cheaper) Mac Pro might be the fabled xMac??! We can but dream...
Could be! However, I never saw the XMac as a Mac Pro replacement.
In any event, if the rumor is correct and that new Cube is in fact a Mac Pro replacement, then Apple has fallen into the same old trap of not realizing that pros need certain features and capabilities. There is a huge difference between a pro desktop computer and a pro workstation-type machine. Apple hasn't gotten either one right in decades, so I'd be shocked to see them suddenly start marketing machines people will buy.
The current Mac Pro sorta wants to fill the role of a workstation-type computer. It fails miserably, though, for many reasons.
As for the fabled XMac, the key there is it being a desktop machine with a decent video card and SSD expansion capability. That isn't asking a lot, and for the life of me I've never understood why Apple hasn't been able to produce such a machine. There is almost zero overlap with respect to potential customers of the XMac and the Mac Pro (or what the Mac Pro should be).
Finally, I have this big fear that Apple is going to move away from supporting discrete GPUs, believing that their integrated stuff will be good enough. That will be a sad day, because a discrete GPU will always have a performance advantage in some situations simply due to die area.
True, perhaps more of a Mac Mini Pro. Or a Mac Pro Mini.
Let's hope they'll listen to their customers as they sort of did with the new Mac Pro, rather than being "courageous" and introducing another dead-end Mac like the 2013 Mac Pro or the Cube. That's the thing - people here say an xMac doesn't have enough of a market, but I would wager that it'd have a much bigger market than a $5000 machine with $600 wheels…
I agree though, the current MP is more of a tech demo than a real workstation or a desktop. I had a 2006 Mac Pro, which was great. Much more sensible price and had decent expansion and upgrade capability.
I suspect part of the reason they don't want an expandable xMac is that people adding their own GPUs requires driver support. Since Apple writes the AMD drivers, it traditionally knows exactly which GPUs it has to support, and it would have to expand that effort to cover more AMD cards.
I feel the same way about integrated graphics too. There's no way Apple's integrated GPUs will ever come close to discrete ones. Apple had an unbelievably childish spat with Nvidia over a leak years ago, so it refuses to use their GPUs - nor will it sign the Nvidia drivers for Mojave+. Surely an antitrust issue right there. Not only were the Nvidia drivers updated for 7 or 8 years after a card's release, they were always *much* better than the drivers Apple writes for the AMD cards. Before the Nvidia spat, Apple refused to use AMD (then ATI) cards due to a leak, but had to go back to them, tail between their legs, post-Nvidia leak. It's absurd that a $2 trillion company behaves like a spoilt child.
I feel the same way about integrated graphics too. There's no way Apple's integrated GPUs will ever come close to discrete ones. Apple had an unbelievably childish spat with Nvidia over a leak years ago, so it refuses to use their GPUs - nor will it sign the Nvidia drivers for Mojave+. Surely an antitrust issue right there. Not only were the Nvidia drivers updated for 7 or 8 years after a card's release, they were always *much* better than the drivers Apple writes for the AMD cards. Before the Nvidia spat, Apple refused to use AMD (then ATI) cards due to a leak, but had to go back to them, tail between their legs, post-Nvidia leak. It's absurd that a $2 trillion company behaves like a spoilt child.
I'm not entirely sure what I think about the Apple/Nvidia spat, because the rationale behind it is not publicly available, but I'm confused by something in your post. You said that Apple writes the drivers for AMD (and signs them), but that Nvidia writes its own drivers. I don't see why that's an antitrust issue, because you stated the obvious difference yourself: Apple writes the AMD drivers but not the Nvidia drivers, so there is no difference in treatment of the two companies by Apple. If Apple were signing AMD's drivers but not Nvidia's, the claim might make sense, but Apple is signing its own drivers, and nobody can force Apple to write drivers for Nvidia. The fact that Nvidia writes its own drivers is irrelevant, because AMD also writes its own drivers and Apple doesn't sign those either - it writes its own instead. At least according to you.
I was always under the impression that Apple had a problem with Nvidia because Nvidia would not write Metal drivers for their cards, whereas AMD would. Nvidia supports only OpenGL, and that would make Apple's computers slower because Apple's hardware is designed for Metal. There is no technical reason for Nvidia to refuse to write Metal drivers other than that they think they are big enough to refuse to cooperate. But the fact that Nvidia refuses to support Metal is all the evidence I need that there's no antitrust issue. You can't force a video card maker to support every computer manufacturer's APIs, and you can't force Apple to support every video card out there.
Finally, I have this big fear that Apple is going to move away from supporting discrete GPUs, believing that their integrated stuff will be good enough. That will be a sad day, because a discrete GPU will always have a performance advantage in some situations simply due to die area.
I think you’re right that there is a very good chance that there will not be user-upgradable GPUs in ASi systems. Apple’s rhetoric with the M1 intro seemed to strongly suggest it.
But is that really something to fear? I’m not so sure.
You're right that a discrete GPU could always be faster for some use cases. But it's also true that many professional (non-gaming) uses for a GPU can benefit from parallelization across computers (e.g., a render farm). So, many pros could be satisfied with a cluster of these mini-max Macs.
Suppose this alleged mini-max Mac maxes out with a 32-core CPU and 32-core GPU (so, 4 times faster than the M1). What use case can benefit from, say, a 64-core Threadripper with the highest-end discrete GPU available BUT cannot also benefit from a cluster of those mini-max Macs?
It might end up being a tiny, tiny sliver of the market that needs all of that parallel computing power in the same box.
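To make the cluster point concrete, here's a minimal sketch of the embarrassingly parallel pattern a render farm uses - the hostnames and the 'renderer' command are hypothetical placeholders, not a real tool:

```python
# Minimal sketch of a DIY "render farm": every frame is independent,
# so frames can be farmed out to several small machines instead of
# rendered on one big box. Hostnames and the 'renderer' CLI are made up.
from concurrent.futures import ThreadPoolExecutor
import subprocess

HOSTS = ["mini1.local", "mini2.local", "mini3.local", "mini4.local"]
FRAMES = range(1, 241)  # a 10-second shot at 24 fps

def render_frame(host: str, frame: int) -> None:
    # Run the (hypothetical) renderer on a remote host via ssh.
    subprocess.run(["ssh", host, "renderer", "--frame", str(frame)],
                   check=True)

with ThreadPoolExecutor(max_workers=len(HOSTS)) as pool:
    # Static round-robin assignment for simplicity; a real farm would
    # use a work queue so faster machines pick up more frames.
    for i, frame in enumerate(FRAMES):
        pool.submit(render_frame, HOSTS[i % len(HOSTS)], frame)
```

When the work splits cleanly like this, four mid-range boxes can stand in for one monster workstation.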
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
As a guy who built PCs for decades, I love my iMac AIOs. My last one lasted 8 years; by the time it was done, even the screen was out of date. My new one is 10 lbs lighter and is VESA arm-mounted, giving me an ultra-clean desk free of any clutter or cables.
Never cared about the chin. Such complaints seem to be academic ones held by people not actually using the product. You’re in that category too, aren’t you?
You may not have had problems due to AIO design issues - and that's what they were - but I and many, many others did.
Try looking into the 2009 27" i7 models, which were literally 'designed' to slow-cook themselves. Or, put another way, they were nowhere near efficient enough with the thermals.
'Hidden' air inlets prone to clogging with dust. Impossible to get dust out of the machines. An inaccessible entry point requiring glass and display removal. Poor thermals in general. System fans not running fast enough, and the killer: throwing Radeon graphics cards into that soup and seeing them fail as a result.
Under warranty, Apple would gladly throw another identical card into that soup, but it would almost certainly meet the same fate.
Back in the day I did some research on the issue and found it to be VERY widespread.
I still have that machine but it will only boot into safe mode.
The screen is perfect but unusable because it's tied to the machine. RAM is perfect. Disk is perfect, etc.
The root problem is the design. Then the fans not running nearly fast enough (probably so as not to 'annoy' users), and the combination of the i7 and the Radeon.
Now, I would love for someone to dig into this particular scenario and force Apple to hand over data on exactly how many systems failed in this setting according to its records (in or out of warranty).
The chin is very dated. That's why they'll surely get rid of it. A move that is long overdue.
My 2009 27" i7 still runs fine after 11 years of use... and 99.999999% of that time it has been powered on - I haven't had any heat issues. And currently, it is also being used as an external display for my new M1 Mac mini.
Over the years, I have upgraded the memory, hard drive, and Bluetooth/WiFi card, and replaced the DVD drive with an internal SSD to create a Fusion Drive.
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
Your complaint is moot. I think the new iMacs will be essentially a giant iPad with everything soldered in. Gone will be the days where one could at least upgrade the RAM.
What's your beef with removing the display? Is opening that display once (or twice) in its life too much? I can remove and reinstall my iMac display in minutes. It's a non-issue.
You expect average people to remove the iMac's display? That's pretty unreasonable.
And from my perspective you're missing the point. If something goes wrong with the CPU or another core component while the screen is still fine, then there's no point in removing the screen. The whole point of having a separate screen is that you don't have to throw out the monitor when the CPU dies. Both of my last two iMacs have given me blue screens of death every week after the first three years (with nothing but Apple's OS on the system). Yes, I've reinstalled the OS and taken them in to Apple for testing, and they could find nothing. It's just random crashing, and it feels like the Intel CPU. I will never buy an Intel CPU again, even though I can't prove that that's the cause of the problem.
Every time the CPU dies, I've had to throw out my iMac's screen along with it, because they're built together and can't be separated. What a waste.
It's extremely unreasonable to believe that average people actually do anything to upgrade their systems. Most don't, and those that do have someone else do it.
And, I think you're missing the point... If you don't like AIO then don't buy one. A lot of people do like them and it's nice HAVING A CHOICE.
Also, aren't all laptops AIO systems? Are you going to argue that they are a waste as well?
But I've had it with all-in-ones. I'll never buy a desktop Mac with the screen glued onto it again.
Your complaint is moot. I think the new iMacs will be essentially a giant iPad with everything soldered in. Gone will be the days where one could at least upgrade the RAM.
What's your beef with removing the display? Is opening that display once (or twice) in its life too much? I can remove and reinstall my iMac display in minutes. It's a non-issue.
You expect average people to remove the iMac's display? That's pretty unreasonable.
And from my perspective you're missing the point. If something goes wrong with the CPU or another core component while the screen is still fine, then there's no point in removing the screen. The whole point of having a separate screen is that you don't have to throw out the monitor when the CPU dies. Both of my last two iMacs have given me blue screens of death every week after the first three years (with nothing but Apple's OS on the system). Yes, I've reinstalled the OS and taken them in to Apple for testing, and they could find nothing. It's just random crashing, and it feels like the Intel CPU. I will never buy an Intel CPU again, even though I can't prove that that's the cause of the problem.
Every time the CPU dies, I've had to throw out my iMac's screen along with it, because they're built together and can't be separated. What a waste.
It's extremely unreasonable to believe that average people actually do anything to upgrade their systems. Most don't, and those that do have someone else do it.
And, I think you're missing the point... If you don't like AIO then don't buy one. A lot of people do like them and it's nice HAVING A CHOICE.
I wasn't arguing against choice. My problem is that Apple doesn't give me one: I insist on buying from Apple, but Apple doesn't sell a display for under $5000. That was my point; I guess I could have been clearer.
My 2009 27" i7 still runs fine after 11 years of use... and 99.999999% of that time it has been powered on.
99.999999% works out to only about three and a half seconds of downtime over 11 years - that's amazing. I managed a system with 99.9% uptime, and even that was difficult. We even maintained the system when the province-wide electrical power went off for a week. Which was kind of funny, because nobody in the province could actually use our system while the province had no electricity. But we were up.
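For what it's worth, the arithmetic behind those figures is easy to check - a quick Python sanity check (assuming 365.25-day years):

```python
# How much downtime does a given uptime percentage allow over 11 years?
SECONDS_PER_YEAR = 365.25 * 24 * 3600

for uptime_pct in (99.9, 99.999999):
    downtime = (1 - uptime_pct / 100) * 11 * SECONDS_PER_YEAR
    print(f"{uptime_pct}% uptime over 11 years -> {downtime:,.1f} s of downtime")

# 99.9%      -> ~347,000 s (roughly four days)
# 99.999999% -> ~3.5 s
```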