Apple shipped the same 5W charger in millions of iPhone boxes for ten years.
Isn't that 'anti' innovation in charging technology?
You lost the plot line there. How did Apple bundling 5W chargers with their phones prevent other manufacturers from innovating their own charging technology? In fact, you contradicted yourself in the very next sentence:
Although it has upped its game in the last couple of years, it still lags behind competitors in charging technology.
So competitors did innovate and create better charging technology.
I'm still failing to see the point of this regulation. I don't know many people who have both an Android phone and an iPhone, so how is this saving the environment? And once Apple switches from Lightning to USB-C on the iPhone, most iPhone owners are going to have to buy new charging cables, thus adding more environmental waste.
I get that there's a loud minority of tech enthusiasts who own a plethora of devices, already have USB-C on at least one of them, and want a standard. But there are many more who simply own a single phone these days.
I'm definitely not contradicting myself.
What I mentioned (admittedly somewhat tongue-in-cheek) was Apple specific with regards to charging innovation.
It has done little to nothing and shipped a 5W charger in every iPhone box for 10 years while being privy to exactly who was a new client and who was upgrading.
Competitors have been literally years ahead in every single aspect of charging. From battery chemistry, cooling, safety gates...
And that includes wireless charging.
But my post was not about Android manufacturers.
Not shipping a charger in the box could have been done, for the environmental concerns Apple cited, many years prior. That didn't happen, and IMO the likely reason for shipping such a poor default charging option in the box was an 'upsell' strategy: 'your phone can actually charge a bit faster, and we can sell you such a charger if you want one'. AFAIK there was never an option to return the 5W charger at purchase time for a discount on the phone itself or against a faster charger. So much for the environment.
I can understand Apple trying to spin things to its advantage, even if virtually no one actually swallowed the environmental line. It was something for them to put on a banner.
Now, this EU proposal is a general move which goes way beyond phones. It aims to harmonise a huge swathe of the small consumer electronics market and it admits that there is no perfect solution.
If you read through the impact assessments, you will find different combinations of pros and cons to be evaluated. No single combination will bring the best result for everyone, so the real goal here is to improve the current situation.
That is a good enough start.
Innovation and accommodating future improvements is absolutely taken into account.
And yet, the battery life (hours of usage between charges) on an iPhone has managed to keep up with, and often exceed, that of an Android smartphone. Not bad for a company that has done nothing to advance battery charging tech in the past 10 years, while Android smartphone companies' battery charging tech is years ahead of Apple's.
That 5W charger first shipped was meant to charge an iPhone overnight. Average users back then only needed to charge their iPhone once every 2 or maybe 3 days. It was usually done overnight and didn't take more than 4 hours. Even heavier-than-average users only needed to charge their iPhone overnight for it to last all day. And even back then, there were external battery packs that let you keep using your smartphone if the internal battery died. They were not as slim and powerful as today's external battery packs, but they were often more practical to carry around than a charger.
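The overnight-charging claim is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming an early-iPhone-class battery of roughly 5.5 Wh and a ballpark 80% end-to-end charging efficiency (both illustrative figures, not Apple specs):

```python
# Rough charge-time estimate. Battery size and efficiency are
# illustrative assumptions, not measured or published figures.

def charge_hours(battery_wh: float, charger_watts: float,
                 efficiency: float = 0.8) -> float:
    """Hours to fill an empty battery at a given charger wattage."""
    return battery_wh / (charger_watts * efficiency)

# ~5.5 Wh battery on a 5 W charger: well inside an overnight window.
print(round(charge_hours(5.5, 5.0), 2))   # 1.38 (hours)
```

In practice the charge curve tapers as the battery fills, so real times run longer than this linear estimate, which is still consistent with the 'under 4 hours' figure above.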
Did it ever occur to you that there are other ways to increase battery life that matter more than charging rate, besides increasing the size of the battery and then having to come up with better charging tech to charge that bigger battery in a reasonable amount of time? Android is not nearly as energy efficient as iOS, so from the get-go Apple didn't have to concentrate as much on battery charging tech as Android smartphone companies did to increase the battery life of an iPhone. I bet improved screen tech has increased battery life on a smartphone way more than any improvement in battery or battery charging tech. If your phone can last from the time you leave home to the time you get back, does it really matter that much whether you can charge it back to full in less than 30 minutes rather than in 4 hours while you're sleeping?
That ten-year-old 5W charger that came with an iPhone can still be used to charge today's iPhone 13. (It's slower than the newer iPhone chargers and much slower than a USB C fast charger, but it will still charge it fully overnight.) Can you say the same for the 10-year-old Android smartphone charger with USB mini? How about the 5-year-old Android smartphone with USB micro? Were they much more than 5W? Do they still work on the newest Android phones? So even after ten years, that 5W iPhone charger can still be used to charge every iPhone (that Apple still supports) in use. The 10W charger that came with the first iPad Air 9 years ago can still be useful today with the proper adapter cable and will work fine charging today's iPhones and iPads, and even Android smartphones and the iPad Pro with the USB C port.
Some Samsung phones still come with a 15W USB C charger. Why? Why not a 25W USB C charger so that they can "fast charge"? Oh, I get it. This is so that Samsung can "upsell" a 25W (or higher) charger separately, because the Samsung phone can charge faster. A 15W charger will barely work on a laptop (with USB C); most require at least 25W. So another USB C charger will be needed for that laptop. No eventual E-waste saving there, even though both phone and laptop have a USB C port.
And think about this. Over 85% of new iPhone purchasers every year are buying at least their 2nd iPhone. Which means that for 85% of iPhones sold, the buyer most likely already has a charger for it, and more than likely several. There was some outcry, even from some consumers in the EU, when Apple stopped providing a charger with each new iPhone purchase 2 years ago. But not nearly enough for Apple to change that policy. That did more to reduce E-waste globally than this EU policy of forcing all smartphones to have a standard USB C port for charging. But now, when Apple is forced to use a USB C port for charging, a very large percentage of iPhone buyers will have to buy a USB C charger (or a couple, to have a spare), or Apple might be forced to provide one for at least several years.
This has nothing to do with USB C charging technology. The EU couldn't care less about charging rate. All they care about is that every smartphone has the same port. The port itself has nothing to do with charging tech. The charging tech is in the charging board inside the device and/or inside the charger, and in software. They are not requiring smartphones to use a USB C charger. One can charge any iPhone with any USB C charger now. One can use a USB C charger to "fast charge" newer model iPhones through the lightning port. (Not as fast as Android phones with a USB C port, but 30 minutes for a 50% charge is not bad.) The best policy would be for the EU to force companies to stop providing chargers and let consumers determine what charger they need, if they need to buy one at all. A charger becomes E-waste when it becomes obsolete, no matter what port it's for. The 15W USB C chargers that still come with some Samsung phones are not likely to be any more useful in 10 years than that 10-year-old 5W Apple charger. But if it still has some use, that will save it from being E-waste. Better to have it sitting in a drawer full of other chargers than in a landfill.
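The separation between the port and the charging tech can be sketched as a toy power negotiation, loosely modelled on how USB Power Delivery works: the charger advertises voltage/current profiles and the phone picks one. The profile table and the phone's wattage cap are illustrative values, not figures from the USB-PD specification:

```python
# Toy power negotiation: the charger (source) advertises profiles,
# the phone (sink) picks the best one it can accept. Values are
# illustrative, not taken from the USB-PD specification.

CHARGER_PROFILES = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 2.25)]  # (volts, amps)

def negotiate(profiles, phone_max_watts):
    """Pick the highest-wattage profile within the phone's limit."""
    usable = [(v, a) for v, a in profiles if v * a <= phone_max_watts]
    if not usable:  # nothing fits: fall back to the weakest profile
        return min(profiles, key=lambda p: p[0] * p[1])
    return max(usable, key=lambda p: p[0] * p[1])

v, a = negotiate(CHARGER_PROFILES, phone_max_watts=27)
print(f"{v} V @ {a} A = {v * a:.0f} W")   # 9.0 V @ 3.0 A = 27 W
```

The point falls out of the sketch: the same physical port carries whichever profile the two ends agree on, so mandating the port says nothing about the charging tech behind it.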
Apple has stopped providing chargers, and Samsung has stopped providing them with most of its higher-end phones. Since for about the past 5 years roughly 85% of both iPhone and Android phone purchasers have stuck with the same platform, they most likely already have a charger for their new phone. So this policy will not have as much impact on reducing E-waste as the EU politicians think it will. The reason there are so many chargers stored in consumers' drawers (though not technically considered E-waste yet) is that at one time they got a charger with every purchase of a new phone, and not so much that the chargers in their drawer no longer work with their new phone because it has a different charging port. In many cases, even that can easily be remedied by just buying a new adapter cable rather than a whole charger.
Watch: several years from now, when the EU measures a reduction in E-waste from chargers, they will take credit for it with this policy of forcing all smartphones to use the same USB-C port. When in reality, the reduction will be the result of Apple's and Samsung's (and probably others') policy of no longer providing chargers with each new phone purchase, which had nothing to do with any EU policy. I'm willing to bet that if Apple and Samsung were to offer new phone buyers a coupon for 50% off the standard charger, to make up for not including one, nearly every purchaser would use the coupon, whether they needed a charger or not. Wouldn't you? Even if just to have another spare. The thinking being: if every rechargeable device is eventually going to use a USB C port, then there's no harm in having extra USB C chargers on hand. Or how about this one: Apple and Samsung provide a new charger with every new phone purchase but offer a $10 rebate if it's mailed back unopened. I bet most would still keep the charger, even if they don't need it right then, and would toss it unopened into the drawer with their other unused chargers.
From an E-waste point of view, does it make sense to you that Apple can't just supply a new iPhone with a USB C to lightning cable? This would be the best of both worlds. Consumers with a USB C charger can charge the iPhone, and even fast charge it if it's one with a high enough wattage. And consumers with an Apple charger can still charge the iPhone, even with the 10-year-old 5W Apple charger. In the EU, about 25% of smartphone users already have a charger for an iPhone. And in the UK, it's over 40% of smartphone users.
The iPhone has managed to keep up?
No. It has simply got its act together over the last couple of years. Prior to that it was a different story.
Now, as for the rest of your essay, it really doesn't have much to do with my point.
Battery capacities have increased as users have squeezed more usage out of them.
As for efficiencies between operating systems, I very much doubt you could get anywhere near a good comparison. There are simply too many variables involved and there isn't 'one' Android to compare with. Many Android vendors don't simply 'skin' the system. They have entire runtime environments operating on their phones. They have very deep battery optimisation schemes in place.
Batteries and charging permeate the entire system. Charger, protocol, cables (including chips inside the cables), thermal and voltage management, chemistry, safety gates.
Let's go back to 2017. An Android vendor attempted to run a photo comparison to the iPhone X in Alaska. The iPhone X came back with no photos because the battery couldn't handle the cold and the phone shut down.
A year earlier (2016) the same vendor announced a major breakthrough in battery composition and extreme temperature scenarios.
When you have a research center dedicated to battery technology you can apply breakthroughs to all manner of fields. Cell tower operations in desert or arctic climates. Mobile electronics, data centers (UPS solutions), PV solutions, the automotive industry...
Car battery tech can now warn of potentially catastrophic events as far as twenty four hours in advance.
The use of graphene has also allowed for major advances in technology.
The original 5W Apple charger may have been just fine for a couple of years but should not have stuck around as long as it did.
There is no getting away from that.
The iPhone 6 battery-gate fiasco could probably have been avoided if it had shipped with a larger capacity battery from the get go. Users might not have burnt through their cycles as fast and perhaps upgraded before they knew they had issues (or before the Apple system update brought them to the fore).
Fast charging allows users to change habits in a good way. I've seen countless iPhone users change their usage habits over the years to try to avoid running out of juice in mid afternoon, and countless others carrying around not an external battery but the charger itself, hunting for sockets.
Battery backups are nice get-out-of-jail cards for all users, but Apple ended up selling battery packs in cases. Clearly the on-board batteries weren't fulfilling needs for a while.
Not in terms of capacity out of the gate nor charging speed.
As I said, it has only been recently that Apple has begun to offer what is regarded as standard in the industry and it still lags in some areas.
If the charging function on any phone is standards compliant then yes, an old charger will still charge a modern phone. At least in my experience. You won't get the satisfaction of literally seeing the charge percentage surge as you watch the screen but it will charge.
The bottom line is that Apple took its eye off the ball in battery terms and has played catch up recently.
As for the EU and e-waste, we know (through the impact assessments) that the move will have an impact. It will also provide the basis for clarity on the user side as it will not be limited to phones and will aim to make the current charging mess decipherable to users.
Unbundling of chargers is a given - industry wide.
That's a lot of words just to fixate on one technical detail myopically: the battery.
You only look at the materials, capacity, and charging ability. What you likely don't understand is that the way the operating system performs power management is also incredibly important.
I was at the tech talk where Bud Tribble explained how iOS minimized power loss by coalescing interrupts to minimize on/off transitions, which is where the majority of power is wasted in devices. I knew very little about power management, but the clarity and simplicity of his design was brilliant. Something someone who was only fixated on one way of looking at the world, like battery materials, would never come up with.
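The coalescing idea described above can be sketched in a few lines: instead of one CPU wakeup per timer, deadlines that land close together are batched, cutting the number of on/off power transitions. This is an illustrative toy, not Apple's implementation; the leeway value is an assumption:

```python
# Timer coalescing sketch: batch deadlines within a leeway window so
# several timers fire in a single wakeup. Purely illustrative.

def coalesce(deadlines, leeway=0.1):
    """Group sorted deadlines; each group costs one wakeup."""
    groups = []
    for d in sorted(deadlines):
        if groups and d - groups[-1][0] <= leeway:
            groups[-1].append(d)   # close enough: ride the same wakeup
        else:
            groups.append([d])     # too far out: schedule a new wakeup
    return groups

# Five timers collapse into two wakeups instead of five.
print(len(coalesce([1.00, 1.05, 1.08, 2.00, 2.04])))   # 2
```

Each avoided wakeup is one fewer on/off transition, which is exactly where the talk said most of the power goes.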
Those runtimes I was mentioning, together with a myriad of other optimisation technologies (both in hardware and software) are present on Android and some vendors squeeze efficiencies out of everything they can. Efficiency isn't an Apple only thing.
But the thing about vendors who use Android on their systems is that they are limited in what they can do because there are some fundamental design decisions which cannot be changed. For instance, the decision to use Java as the main development language for Android apps from day 1. I'll expand after your next point...
It is literally impossible to judge which 'system' is more efficient in real-world terms because users determine how much energy will be used and there is no 'one' usage pattern. You simply try to give users the best experience possible, and overnight charging doesn't cut it in today's world and hasn't for quite some time.
The point is, efficiencies are baked into everything. A huge effort is made to save energy - by everyone. Looking at a particular runtime or language or hardware isn't going to help determine overall values.
Oh but it does if nearly every single app running on the system is using that language/runtime. It doesn't matter how someone is using their Android phone, if every app is written in Java (or Kotlin, which is built on Java), then they'll be subject to the same inefficiencies. Now I get that Java has been optimized a lot since the days of purely interpreted bytecode (which is definitely far less efficient than native CPU code), but it still has core inefficiencies like garbage collection for memory management, which has been proven to be a less efficient way to manage memory in apps than automatic reference counting (ARC), which is what's used by Swift/Obj-C.
Let me reiterate the fact that nearly every single app uses this. So it is important no matter how someone is using their phone. I get that it's only one part of optimization, but it's an important part nonetheless given how central it is to everything. And it's not something vendors can change without breaking every app.
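The GC-versus-reference-counting trade-off argued above can be demonstrated in CPython, which, like ARC, frees most objects the instant their last reference drops but needs a separate collector pass for reference cycles. This is an analogy to illustrate the mechanism, not a claim about Android's or iOS's actual runtimes:

```python
# CPython frees acyclic objects immediately by reference counting
# (ARC-like, deterministic), but reference cycles keep refcounts
# above zero and are only reclaimed by a garbage-collector pass.

import gc

class Node:
    def __init__(self):
        self.peer = None

# Acyclic case: deallocated deterministically, no collector involved.
solo = Node()
del solo

# Cyclic case: each node still references the other after `del`,
# so neither refcount ever reaches zero on its own.
a, b = Node(), Node()
a.peer, b.peer = b, a
del a, b

reclaimed = gc.collect()   # the cycle needs an explicit GC pass
print(reclaimed >= 2)      # True: both Nodes were unreachable garbage
```

The collector pass is the part with non-deterministic timing and pause cost; the refcount path is the cheap, predictable one, which is the essence of the ARC argument.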
What you may gain in one area, you may lose in another.
You simply cannot look at one element and say that somehow swings it. What counts is the balance you end up with (however that is achieved) and except for the last couple of iPhone revisions, Android vendors have been sitting at the top of the endurance tests.
And the main point wasn't efficiency. It was battery technology and fast charging.
Apple shipped the same 5W charger in millions of iPhone boxes for ten years.
Isn't that 'anti' innovation in charging technology?
You lost the plot line there. How did Apple bundling 5W chargers with their phones prevent other manufacturers from innovating their own charging technology? In fact, you contradicted yourself in the very next sentence:
Although it has upped its game in the last couple of years, it still lags behind competitors in charging technology.
So competitors did innovate and create better charging technology.
I'm still failing to see the point of this regulation. I don't know many people who have both an Android phone and an iPhone, so how is this saving the environment? And once Apple switches from Lightning to USB-C on the iPhone, most iPhone owners are going to have to buy new charging cables, thus adding more environmental waste.
I get that there's a loud minority of tech enthusiasts who own a plethora of devices, already have USB-C on at least one of them, and want a standard. But there are many more who simply own a single phone these days.
I'm definitely not contradicting myself.
What I mentioned (admittedly somewhat tongue-in-cheek) was Apple specific with regards to charging innovation.
The word "anti" would imply that what Apple did somehow prevented innovation. It didn't, tongue-in-cheek or otherwise.
It has done little to nothing and shipped a 5W charger in every iPhone box for 10 years while being privy to exactly who was a new client and who was upgrading.
Competitors have been literally years ahead in every single aspect of charging. From battery chemistry, cooling, safety gates...
And that includes wireless charging.
Let's not forget the haphazard nature of this "innovation". Like the Samsung exploding battery fiasco. As I mentioned in my previous post, tech companies and engineers often get caught up in the race to get products and "innovations" to market as quickly as possible to gain a competitive edge. Apple is rarely the first to market with technology, but they're usually the first to get it right with respect to real world benefit to consumers (not just checkbox comparisons).
But my post was not about Android manufacturers.
Not shipping a charger in the box could have been done, for the environmental concerns Apple cited, many years prior. That didn't happen, and IMO the likely reason for shipping such a poor default charging option in the box was an 'upsell' strategy: 'your phone can actually charge a bit faster, and we can sell you such a charger if you want one'. AFAIK there was never an option to return the 5W charger at purchase time for a discount on the phone itself or against a faster charger. So much for the environment.
Could they have dropped the charger earlier? Sure. But regardless, they were still the first in the industry to do it.
Could they have put a more powerful charger in the box? Sure, but that wouldn't have changed the environmental impact of shipping a charger with every phone. Not to mention that the vast majority of consumers weren't going out and buying a more powerful charger. Only tech enthusiasts took the time to look into details like that. Most of the non-technical people I know either didn't know/care, or they used the USB charger from their iPad or power bar/splitter that they already had purchased (despite your implication of Apple trying to upsell).
Now, this EU proposal is a general move which goes way beyond phones. It aims to harmonise a huge swathe of the small consumer electronics market and it admits that there is no perfect solution.
If you read through the impact assessments, you will find different combinations of pros and cons to be evaluated. No single combination will bring the best result for everyone, so the real goal here is to improve the current situation.
That is a good enough start.
The move to USB-C was already happening, so the legislation is redundant and a waste of time IMO.
In battery technology on phones, Apple has done next to nothing in terms of innovation. That was advanced through Android manufacturers, and in a big way.
Samsung screwed up with one design error on one battery revision. That is not indicative of progress made in the industry in general. It isn't indicative of Samsung's efforts either. After all, a great deal of Apple's screen innovation piggybacks on Samsung display innovation.
The line that Apple isn't usually first with something but usually gets it right is questionable and in a big way.
Android vendors get a lot right too. The difference is that there are many vendors trying different things and some of those won't gain market traction. Again, this isn't indicative of much as they also get a lot of things right. Probably more than Apple. That doesn't make Android innovation any more haphazard than Apple’s. In fact, virtually all of the iPhone's camera 'innovation' was following on from Android vendors' efforts that were done right. Many are still to appear on the iPhone.
Why hasn't FaceID been able to unlock phones in landscape orientation for years when competitors have?
With all that processing power, why hasn't the depth sensing array been able to do 3D small object modeling and animation? Why isn't that array used for eyes on display?
ADC (the Apple Display Connector)? Good for the consumer in terms of clutter. Bad for the consumer in terms of flexibility and standardisation. Disastrous for the consumer when it gets dropped.
Non-standard connectors. Good for the consumer? Disastrous for upgradability.
Butterfly keyboard? This is indicative of something very, very Apple. Going to great pains for something and then doing the complete opposite in another design.
So, in some areas the focus was on keeping ambient noise levels down (fanless designs, for example). How could they then justify probably the most clackety keyboard design they ever produced? How did that pass the noise test? Ah! The travel was shorter so the machine could be thinner. Typing would be more accurate. But what about the noise! That clackety keyboard was probably more irritating than any fan might be when it kicked in. And then there's the integration, from a design perspective: replacing the keyboard required replacing the top cover and the battery. Let's not forget that the design itself was a time bomb for users and could 'break' at any time. Not great for consumers.
Is that innovation done right?
Competitors had spillproof keyboard designs that could stop even your coffee getting into the machine, much less a pesky crumb or other particle getting in there and causing an expensive repair.
Apple has been adding features from Android devices that were done right (or better!) on those devices for years.
Yet here we are, still waiting for an always on display, a secondary biometric option, periscope lens and a host of other features and designs that are already 'done right' on Android phones.
Sometimes it is not so much the design but the 'politics' behind it.
The 5W charger for ten years is simply a classic example.
FaceID was never a one-size-fits-all solution, but it was the sole biometric offered.
When competitors moved to 3D depth sensing they did not eliminate the fingerprint biometric option (wherever that might be implemented). As a result, Android users haven't suffered nearly as much as iPhone users during the pandemic.
FaceID has never been able to 'learn' my wife's face while she is wearing glasses.
USB-C is another example. Some say that it is the cash cow that is 'lightning' that has prevented users seeing faster transfer speeds up to now.
AirDrop? Cloud, Bluetooth and Wi-Fi to send a file to another device? Why? Why don't they let you use the Bluetooth profile for file transfer on iPhones?
I can simply tap my phone onto my tablet and see its contents on the screen. I can drag and drop files in either direction at, pardon the pun, lightning speed. Multitasking, multiple windows...
NFC? Why can't my wife use NFC via her bank's wallet app on her iPhone just like I can on mine?
Remind me how many years it took to get reasonable local file management on iOS. Years! It was impossible to download a simple email attachment locally.
All that is politics. Company politics but it can have a direct impact on innovation.
The EU proposal is far from redundant as it tackles the charging situation beyond Apple iPhones and seeks to harmonise charging across a very wide range of devices. It also seeks to lay the groundwork for future designs and make the industry move in unison.
The impact assessments have already determined the environmental savings.
As for Apple, it's easy for people to say they are moving to USB-C anyway. Apple has made no public announcement, and it is very likely that any move away from lightning was actually accelerated by the EU, as it has been in different phases of consultation and assessment for years now.
The previous MoU with industry was never limited to micro-USB. The goals of the EU were to resolve a huge problem. This is just another step along that road.
It is also a mere piece of a larger puzzle with much more regulation to come.
Regulations that will put the consumer and the environment centre stage.
So, repairability (design for repair) will be encouraged. Consumers will get clear information on how 'repairable' devices are. Companies will have to inform users how long devices will be supported by software. New 'functionality' that gets added through updates will have to be user reversible etc.
Assuming everything gets approval as currently envisaged of course.