Apple prepares HomeKit architecture rollout redo in iOS 16.3 beta

Posted in iOS, edited January 14
After halting the rollout of HomeKit's new architecture in iOS 16.2, Apple has resumed testing of the platform, with the upgrade resurfacing in the iOS 16.3 beta.

In December, Apple withdrew the option to upgrade HomeKit to the new architecture, following reports that the update wasn't working properly for users. It now seems that Apple is preparing to try again with the next set of operating system updates.

Screenshots from the iOS 16.3 beta show there is a message in the Home app confirming there is a "Home Upgrade Available," with a "new underlying architecture that will improve the performance of your home." This is the same update message that appeared in iOS 16.2 before being pulled.

Screenshots sent in by Anthony Powell


The inclusion of the notification in the beta is a strong indication that Apple believes all is fine with the update, and that it will try releasing it to the public once again.

For the previous attempt, users reported seeing devices stuck in an "updating" state after the upgrade completed, with some finding devices unresponsive or failing to update fully. At the time, it was unclear what had caused the problems, as there weren't any obvious commonalities between accounts of the issue.

With the appearance in the iOS 16.3 beta, it seems Apple is confident it's worked out the problems and is willing to give it a second try.

Read on AppleInsider

Comments

  • Reply 1 of 10
    I’ll be the first person to let a whole bunch of other people try this for several days before I feel confident in trying myself.
  • Reply 2 of 10
    I’ll be the first person to let a whole bunch of other people try this for several days before I feel confident in trying myself.

    I’ll be the first person to let a whole bunch of other people try this for several years before I feel confident in trying myself.

It would break HomeKit on the old iPhones I've got scattered around the house to use as lighting controllers.
  • Reply 3 of 10
    darkvader said:
    I’ll be the first person to let a whole bunch of other people try this for several days before I feel confident in trying myself.

    I’ll be the first person to let a whole bunch of other people try this for several years before I feel confident in trying myself.

It would break HomeKit on the old iPhones I've got scattered around the house to use as lighting controllers.
    That’s true for me, as well, but in my case I’ll be fine not using the older devices for HomeKit control. As it is now the devices that won’t work after the upgrade are barely used for HomeKit already, so an acceptable loss. 
  • Reply 4 of 10
elijahg Posts: 2,669
    I'm stuck in limbo where I can't invite anyone who has ever opened the Home app to my home, because no one can join the upgraded architecture without upgrading their own home first for some ridiculous reason. Therefore since the rollout was cancelled people can't upgrade their home, and so it's impossible for them to join.

I can't downgrade my home even if I reset everything, because despite the upgrade being cancelled new homes still use the new architecture. And besides that, resetting doesn't work properly anymore either, even with the special HomeKit reset profile. It's a mess. 

    Apple's software QA is abysmal these days, it used to be top notch. It's extremely disappointing for a "premium" brand. Some things are nearly as bad as OS 9 - though the kernel seems to be rock solid at least. 

That said, the upgrade seemed to improve the responsiveness and reliability of HomeKit. I'm sure they could have used the HomeKit hubs as a bridge between old and new versions - though less incentive to upgrade of course. 
edited January 14
  • Reply 5 of 10
dewme Posts: 4,660
    elijahg said:
    I'm stuck in limbo where I can't invite anyone who has ever opened the Home app to my home, because no one can join the upgraded architecture without upgrading their own home first for some ridiculous reason. Therefore since the rollout was cancelled people can't upgrade their home, and so it's impossible for them to join.

I can't downgrade my home even if I reset everything, because despite the upgrade being cancelled new homes still use the new architecture. And besides that, resetting doesn't work properly anymore either, even with the special HomeKit reset profile. It's a mess. 

    Apple's software QA is abysmal these days, it used to be top notch. It's extremely disappointing for a "premium" brand. Some things are nearly as bad as OS 9 - though the kernel seems to be rock solid at least. 

That said, the upgrade seemed to improve the responsiveness and reliability of HomeKit. I'm sure they could have used the HomeKit hubs as a bridge between old and new versions - though less incentive to upgrade of course. 

Apple's quality challenges are actually very typical of most large-scale software projects of the past couple of decades. The whole notion of software "QA" as it once was no longer exists: something performed by a dedicated team that descended upon a product development process, usually late in the cycle, with a great deal of zeal to make sure that nothing bad leaked out the door, that everything promised was actually delivered to an acceptable level of quality (i.e., verification), and that it actually solved the intended end-user problem (i.e., validation).

Don't get me wrong, software is still tested. Heavily tested, in fact. It's tested at many levels, from the nuts & bolts deep in the code as unit testing as part of test-driven development, to integration testing, to system testing, and of late security testing, e.g., penetration testing, interface fuzzing, etc. The lower levels of testing are done repeatedly, very often automated, typically on every code commit and build. Collections of tests that comprise a cross section of wider functionality are often declared "regression tests," which are essentially a high-level smoke test to provide a level of confidence that the most recent changes introduced into the code base didn't break what was already working.
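To make the distinction concrete, here is a minimal, purely illustrative sketch in Python (function names invented, not from any real codebase) of a low-level unit test versus the broader regression-style smoke check described above:

```python
# Hypothetical example: the kind of automated checks run on every
# commit. Unit tests exercise one function in isolation; a
# regression/smoke test exercises several pieces working together.

def parse_version(s):
    """Parse a dotted version string like '16.2' into a tuple of ints."""
    return tuple(int(part) for part in s.split("."))

def is_upgrade(current, candidate):
    """Return True if candidate is a strictly newer version than current."""
    return parse_version(candidate) > parse_version(current)

# Unit tests: nuts-and-bolts checks of a single function.
assert parse_version("16.2") == (16, 2)
assert parse_version("16.3.1") == (16, 3, 1)

# Regression/smoke test: a cross-section of behavior that must keep
# working after every change to either function.
assert is_upgrade("16.2", "16.3")
assert not is_upgrade("16.3", "16.2")
assert not is_upgrade("16.2", "16.2")
```

The unit tests catch a break inside one function; the regression checks catch a change that silently alters how the pieces behave together.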

So why do software products and systems that are supposedly so extensively tested still seem to break so often? Imho, and as someone who's worked at pretty much every level of product and system development engineering, it's a combination of continuously increasing complexity, never-ending releases (the software is never really done, so we can fix the broken things in the next release which may be tomorrow or next month), monotonic accumulation of technical debt (anomalies that don't get addressed), and an insufficient number of team members who fully understand the problem domain and the customer challenges the product is intended to solve, which ultimately results in validation failures.

Of course there are several other factors like schedule pressure, cost pressure, poorly defined specifications (nowadays, maybe no specifications at all), late breaking changes, bad planning, over-promising, actual bad code, naive software development processes, and all manner of management problems. But from an engineering standpoint, the lack of a clear understanding of the problem domain, the customer's needs/concerns/pain points/cost concerns (acquisition and lifecycle, TCO), the larger system the software has to live within, and all of the things that define whether the software meets the validation bar are what's missing. There's testing aplenty, but verifiably "good code" that does the wrong thing, or that sacrifices quality attributes low-level testing doesn't exercise, like security, privacy, transactional integrity, etc., still dooms the software product.

What's missing today are the system engineers, product owners, and the problem-domain-aware architects and system testers who in the past would collaborate upfront on making sure the product to be built was the "right product" and would be subjected to the appropriate validation standards for the problem being solved. Most of these things have been eliminated or scaled back for the sake of time-to-market, velocity, cost, resource availability and cost (outsourcing, contracting, "anyone can code" farms, etc.), and "agility." Building the wrong thing quickly and iterating over it sixteen times is okay, because you know, the software is never really done.

    Learning how to "build software right" is a much easier challenge than learning how to "build the right software." The latter requires decades of investment.

Has Apple fallen into these traps? Probably. I suppose a "glass half empty" interpretation of their greatly expanded Beta Testing programs is a soft admission that they simply don't have the full scope of internal expertise in the primary problem domains that they are targeting with their software. This kind of soft admission is much better than them foisting well-intentioned but poorly executed software on their customer base or trying to hide their recognized shortcomings. I think it's an admirable approach. The constantly increasing complexity of software in general may make Apple's approach inevitable for many more software products. Nobody likes scary surprises. Adding actual customers to the feedback loop is a good thing. Hopefully it frees up their internal teams to give more attention to the "build software right" aspects of the task at hand.
  • Reply 6 of 10
This stuff is insanely complicated, and there are now so many products from so many different manufacturers on the market that testing anything like all possible scenarios is simply unthinkable. And that’s not even to mention the nightmare of troubleshooting systems actually installed in the real world. I run a strictly HomeKit-powered smart home, but the more stuff I add to my network, the more peculiar failures I get. Even though I’m rocking a very recent mesh router setup and have been scrupulous about keeping all the verkakte software and firmware up-to-date, there are only a few of my attached devices that I can rely on to stay functioning as advertised for anywhere near 100% of the time. If I had a nickel for every failed geofencing routine on a device that was working perfectly fine four hours ago, or had Siri randomly tell me “You can only make one request at a time,” or “I’m having trouble connecting to the Internet,” or settled down on the couch to watch something on HBO only to have the HomePods linked to my TV suddenly decide not to play any sound, I’d already be richer than Tim Cook.

I was one of the poor saps who somehow managed to successfully install the 16.2 HomeKit upgrade before it was pulled. The way it organizes and groups devices does seem to be more rational, but it hasn’t significantly improved their reliability. Maybe the work being done on Matter and Thread will bring us to within sight of the Holy Smart Home Grail, but I’m not holding my breath. For now IoT installations of any complexity remain a crazy-making series of random failures. The whole mess reminds me of those legendary struggles in the 90s with Windows printer and video drivers every time some pointy-haired boss decided it was time to switch from Dell to Gateway, or some hotshot in IT sold him on the idea that it was going to be super cool and painless to switch from Windows 3.1 to Windows NT in the middle of a company-wide restructuring. I’m honestly beginning to think I’ve lived too long.
edited January 15
  • Reply 7 of 10
Anyone who wants to downgrade / reset the new architecture, check here:
    https://www.reddit.com/r/HomeKit/comments/10bm6ba/cant_use_ipad_with_162_running_as_home_hub_after/
  • Reply 8 of 10
I ditched the Microsoft Xbox for the Apple TV box. Hoping to see some more updates soon for games & HomeKit on it. 
  • Reply 9 of 10
StrangeDays Posts: 12,333
    dewme said:
    elijahg said:
    I'm stuck in limbo where I can't invite anyone who has ever opened the Home app to my home, because no one can join the upgraded architecture without upgrading their own home first for some ridiculous reason. Therefore since the rollout was cancelled people can't upgrade their home, and so it's impossible for them to join.

I can't downgrade my home even if I reset everything, because despite the upgrade being cancelled new homes still use the new architecture. And besides that, resetting doesn't work properly anymore either, even with the special HomeKit reset profile. It's a mess. 

    Apple's software QA is abysmal these days, it used to be top notch. It's extremely disappointing for a "premium" brand. Some things are nearly as bad as OS 9 - though the kernel seems to be rock solid at least. 

That said, the upgrade seemed to improve the responsiveness and reliability of HomeKit. I'm sure they could have used the HomeKit hubs as a bridge between old and new versions - though less incentive to upgrade of course. 
So why do software products and systems that are supposedly so extensively tested still seem to break so often? Imho, and as someone who's worked at pretty much every level of product and system development engineering, it's a combination of continuously increasing complexity, never-ending releases (the software is never really done, so we can fix the broken things in the next release which may be tomorrow or next month), monotonic accumulation of technical debt (anomalies that don't get addressed), and an insufficient number of team members who fully understand the problem domain and the customer challenges the product is intended to solve, which ultimately results in validation failures.
For me it's this -- complexity. Even in my own career, focusing on this century, complexity has vastly increased. I am a senior engineer, and many of the senior engineers I work with don't really understand even our own deployment models of Docker images, Kubernetes clusters & pods, etc. Architecture & infrastructure go hand-in-hand now. Microservices have advantages over monolithic applications, but there is a cost in added complexity. So many more moving pieces, small cogs in the watch...few understand how they all work together. Modern operating systems (not my domain) are likely even more complex than what I see in SOA & SaaS. 
    dewme said:
    elijahg said:
    I'm stuck in limbo where I can't invite anyone who has ever opened the Home app to my home, because no one can join the upgraded architecture without upgrading their own home first for some ridiculous reason. Therefore since the rollout was cancelled people can't upgrade their home, and so it's impossible for them to join.

I can't downgrade my home even if I reset everything, because despite the upgrade being cancelled new homes still use the new architecture. And besides that, resetting doesn't work properly anymore either, even with the special HomeKit reset profile. It's a mess. 

    Apple's software QA is abysmal these days, it used to be top notch. It's extremely disappointing for a "premium" brand. Some things are nearly as bad as OS 9 - though the kernel seems to be rock solid at least. 

That said, the upgrade seemed to improve the responsiveness and reliability of HomeKit. I'm sure they could have used the HomeKit hubs as a bridge between old and new versions - though less incentive to upgrade of course. 
What's missing today are the system engineers, product owners, and the problem-domain-aware architects and system testers who in the past would collaborate upfront on making sure the product to be built was the "right product" and would be subjected to the appropriate validation standards for the problem being solved. Most of these things have been eliminated or scaled back for the sake of time-to-market, velocity, cost, resource availability and cost (outsourcing, contracting, "anyone can code" farms, etc.), and "agility." Building the wrong thing quickly and iterating over it sixteen times is okay, because you know, the software is never really done. 

    Learning how to "build software right" is a much easier challenge than learning how to "build the right software." The latter requires decades of investment.
    I dunno about this part. I'm a fan of agile software development -- it came into being exactly for the reason of "building the right thing & building the thing right". It's specifically for dealing with high amounts of complexity & uncertainty -- challenges that made plan-based waterfall projects very difficult and ripe for failure with modern, complex software and shifting requirements. 

    That being said, many engineering teams think they know agile development, without actually knowing agile development -- lacking in training and true understanding of the concepts & practices. This is to their detriment.
edited January 15
  • Reply 10 of 10
Looks like the HomeKit architecture update did not make it to the Final Cut?!