dewme
About
- Username: dewme
- Joined
- Visits: 932
- Last Active
- Roles: member
- Points: 15,798
- Badges: 2
- Posts: 6,115
Reactions
-
Apple's bug-squashing week was part of its efforts to minimize errors
chasm said:
tophatnosocks said:
One week to fix all the bugs they have? That just doesn't sound like a true commitment. Are they just placating us?
a. The "bug fix week" was not actually intended for public consumption; it was an internal matter that leaked.
b. Again, FTA, the purpose of the week of focus on bug-fixing was for the OSes' first feature-complete internal build. Get rid of the bugs in the framework of the house, and you are less likely to have system-level flaws later.
c. Nobody has said there won't be another such "focus week" as development moves along. Don't know where you jumped from to get to that conclusion.
Mark Gurman, writing for the Bloomberg "Power On" newsletter on Sunday, points out that this isn't the first time Apple has done this sort of thing, as bugs have been a problem for the company in the past.
This is reassuring and makes sense to me. I too was a bit surprised when I first heard about the pause because one week is way too little time to make a serious dent in their bug backlog. Many software teams intersperse these "maintenance" sprints into their software development process.
As chasm points out, there is something called a "cost of quality" that evaluates the cost to fix bugs discovered over the software product's development and release cycle. The least expensive anomalies to fix are those discovered during design and architectural review, before any code is written. The next least expensive anomalies are those discovered by the developer as they test their code, ideally as part of a test driven development process that builds unit testing test cases at the same time or even before the functional code is written. The relative cost to fix bugs between the two least expensive phases is typically around 5X. On the other extreme, the most expensive bugs to fix are those that leak into the released product and have to be addressed after the fact. The cost ratio between the two extremes can rise above 30X (according to NIST) when you factor in business losses associated with critical functionality being unavailable.
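To make the test-driven development point concrete, here is a minimal sketch of what writing the unit tests alongside (or before) the functional code looks like. The parseVersion function and its tests are purely hypothetical illustrations, not anything from a real codebase:

```swift
import XCTest

// Hypothetical function under test (a made-up example): parse a version string
// like "17.1.2" into its numeric components. In a TDD flow the tests below are
// written first and fail until this implementation exists.
func parseVersion(_ text: String) -> [Int]? {
    let parts = text.split(separator: ".").map { Int($0) }
    guard parts.allSatisfy({ $0 != nil }) else { return nil }
    return parts.compactMap { $0 }
}

// Runs as part of a normal XCTest target.
final class VersionParsingTests: XCTestCase {
    func testParsesWellFormedVersion() {
        XCTAssertEqual(parseVersion("17.1.2"), [17, 1, 2])
    }

    func testRejectsMalformedVersion() {
        // Catching this at unit-test time is the cheap end of the
        // cost-of-quality curve described above.
        XCTAssertNil(parseVersion("17..beta"))
    }
}
```

Catching the malformed-input case here, rather than in a shipped release, is exactly the 5X-versus-30X difference in cost of quality.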
Speaking of bugs: the easiest way to prevent them is to not write the code in the first place, for example by culling requirements. If I had to pick a feature Apple should have culled, in my opinion it would be the "Reactions" feature added to FaceTime. It is currently broken on iOS, but beyond that, I consider it a worthless feature: it provides no real value, and in its broken state it is annoying enough to make you question why it was put in at all, since it gets in the way even when you're not using it. Everyone is entitled to their opinion, but these fluffy features just add bloat to an already quality-challenged codebase. -
Apple insists 8GB unified memory equals 16GB regular RAM
I always laugh when I see or hear someone describe a base machine with 8 GB of main memory and 256 GB of solid state storage as if it’s a “kiddie” computer only suitable for web surfing and email. That’s crazy talk.

A closer rendition of a kiddie computer is a Raspberry Pi with 4 GB of RAM and a 16 GB micro SD card. But even with this type of setup, and an all-in price of less than $100 without a monitor (any TV will suffice), these little computers are embarrassingly effective at many of the same use cases that are unrealistically assumed to be a good fit for a $1000 base-configured Mac.

Like many others have noted, Apple Silicon based Macs are no slouches, even at the base level configuration. I would not feel encumbered having to use a base level M1 Mac mini for software development. Some things, like live debugging or doing full local rebuilds, would be slower on an 8 GB Mac compared to a higher spec Mac, but it wouldn’t be crippling from a productivity standpoint, because the longest poles in my software development process are the times I spend thinking, organizing my thoughts, and deciding how to proceed, up to the point where I start physically interacting with the development environment by typing or setting the right breakpoints.

In other words, every computer I’ve ever used spends most of its cycles waiting for me. If my computer could talk, it would mostly be asking “Is he still there?”
For the most part, every increase in computing performance has been offset by increasing demands from the operating system, applications, tools, and always-running bloat. The wow factor doesn’t last forever and you always normalize to the new baseline.

To be honest, the biggest step-increase in software developer productivity that I’ve ever experienced came from having multiple monitors and much faster storage, i.e., solid state drives. If I had to choose between stepping up one level on the CPU, increasing memory, increasing storage space, or adding a second monitor, I would choose the second monitor, assuming the starting point was sufficient for my needs. Of course I’d want everything plus a fast SSD.
Buying a new computer is all about maximizing the things you find most essential at a cost you can afford, with consideration for future needs. But I’ll never buy into the notion that any Apple Silicon Mac is a lightweight machine only suited for trivial tasks. It’s not for everyone, but a vast majority of computer users will find even the base machine configurations to be more than satisfactory. -
Apple insists 8GB unified memory equals 16GB regular RAM
This is a bit of spin and wordplay, because they are focusing primarily on performance and efficiency while glossing over the reality that the size of your running applications’ working sets cannot be ignored. Yes, having a very fast backing store improves memory virtualization, i.e., swapping, but it’s still not as fast as having more real memory available.
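To put a rough number on why swap is not a substitute for real memory, here is a back-of-the-envelope sketch; the latency figures are illustrative assumptions, not measurements of any Apple Silicon machine:

```swift
import Foundation

// Back-of-the-envelope effective memory access time when part of the working
// set spills to swap. The latency numbers below are assumptions for
// illustration only.
let ramAccessNs = 100.0        // assumed DRAM access cost (ns)
let swapFaultNs = 100_000.0    // assumed cost to pull a page back from a fast SSD (ns)

for missRate in [0.0, 0.001, 0.01] {   // fraction of accesses that hit swap
    let effectiveNs = (1.0 - missRate) * ramAccessNs + missRate * swapFaultNs
    print(String(format: "swap rate %.1f%% -> effective access ≈ %.0f ns", missRate * 100, effectiveNs))
}
// Even a 1% swap rate makes the average access roughly 11x slower than pure RAM
// in this sketch, which is why more real memory still wins for large working sets.
```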
They can certainly say that they are getting superior performance and efficiency with 8 GB compared to other competing platforms or architectures running with only 8 GB. But if you’re doing an Apple Silicon-to-Apple Silicon comparison and you would benefit by having 16 GB available due to the combined working set of your running applications, the benefits of having more memory available are real and there is no equivalence between 8 GB and 16 GB.
Apple Silicon effectively rebases my expectations. I already know it’s better on so many levels (but not all) than other platforms. I don’t want to compare it to lesser platforms. Upping the base level Unified Memory to 16 GB would make the Apple Silicon argument even more pronounced, no song and dance required. We’re paying a premium for choosing Apple, so why not make the perceived value and useful lifetime of the products stand out from the crowd even further? Software is not getting smaller. -
First M3 benchmarks show big speed improvements over M2
Most run-of-the-mill productivity applications don’t benefit much from adding more cores beyond a certain point, because their speed-up is greatly hampered by the non-parallel portions of their code execution. However, these apps still benefit from increasing the clock frequency or the number of instructions per clock cycle, as long as everything else, like memory and I/O, doesn’t impose limitations and move the bottleneck. Reducing power consumption per clock cycle allows the clock frequency to be pushed higher before running into power and heat issues, which is seen in the increased single-threaded scores.

Increasing the number of GPU cores and their efficiency greatly benefits applications that have a high degree of parallelism in their execution, but the improvements are not linear with the increase in core count, because at some point all of the results of the processing done in parallel need to be consolidated, synchronized, or coordinated with shared resources that can be blocked by other executing processes. Even small amounts of serialization cause a large reduction in the actual speed-up, as the quick sketch below illustrates.

As several others have said, buy whatever best fits your current and anticipated needs for however long you plan to keep the machine you’re buying. I can’t think of a time when an Apple product didn’t live up to my expectations and prompted me to purchase a newer model sooner than I anticipated. I’m sure it happens with some folks, especially those who are nearly maxing out their system’s capabilities at the time of purchase. But time is money, so there are inevitably some buyers who have no choice but to buy whatever machine maximizes their profitability, regardless of their expectations going into their previous purchase. -
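Here is the quick sketch mentioned above, a rough Amdahl’s-law illustration of how little serial work it takes to cap multi-core speed-up; the serial fractions and the 16-core count are assumptions for illustration, not measurements of any particular chip:

```swift
import Foundation

// Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
// fraction of the work and n is the number of cores.
func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

let cores = 16.0
for serialFraction in [0.01, 0.05, 0.10] {
    let speedup = amdahlSpeedup(parallelFraction: 1.0 - serialFraction, cores: cores)
    print(String(format: "%2.0f%% serial work on %.0f cores -> %.1fx speedup", serialFraction * 100, cores, speedup))
}
// 1% serial work already limits 16 cores to ~13.9x; 10% serial drops it to ~6.4x.
```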
References to iPad mini in tvOS 17.2 beta revive 'HomePod with display' rumors
eightzero said:
AppleInsider said:
Apple already has a lot going for it in the smart home category, with HomeKit and poducts [sic] like a HomePod and Apple TV helping users set up and manage a smart home network in their abode.
OK, I gave away my last ATV, as I don't need or use it (I have a Roku TV that has the ATV app on it for the rare occasion I watch something on ATV), but I don't recall the ATV actually having any networking capability. What is this reference to? Because if the rumor might include such an ATV with a touchscreen to actually work as a new AirPort...well...whoa. Take my money.

Certain versions of the Apple TV 4K also act as Thread border routers (LAN-to-Thread routing) to bring Thread/Matter smart home device support into Apple HomeKit. This is distinctly different from the higher level networking functionality (WAN-to-LAN routing) provided by Apple's AirPort products.

A reasonable model for what Apple is potentially envisioning with iPad mini + HomePod integration is more akin to Amazon's 3rd Gen 10" Echo Show, which resembles a HomePod with an iPad mini permanently stuck to it. If you think about it, Apple could achieve comparable functionality using separate HomePod and Apple TV devices. The differentiator from Amazon's approach would be that you could still use the iPad mini as a standalone device. Seems reasonably straightforward.