dewme

About

Username: dewme
Joined:
Visits: 560
Last Active:
Roles: member
Points: 10,556
Badges: 2
Posts: 4,250
  • Editorial: WSJ Jony Ive story scoffed at by Apple experts, delicious to critics

    I'm glad to see you hit the industrial design topic very hard. This is one of Apple's core competencies, and it goes largely unappreciated by those who don't understand full-scale product development. Too many neophytes and clueless pundits believe that all the magic occurs in the design studio and just automagically transforms into plastic-wrapped products neatly packed into tidy white boxes with an embossed Apple logo on them. The great product designers, people like Jony Ive and companies like Apple, don't separate design and operations into walled-off enclaves with throw-it-over-the-wall handoffs. They continuously iterate over all phases of the product development cycle, from concept through the various design-for-X disciplines, including industrial design, design for manufacturability (DFM), testability, serviceability, etc.

    Viewing design and operations as separate and independent activities is so 1970s. Just look at where software development practice ended up after 25+ years of believing that design centricity was king. We had structured design, followed by object-oriented design, followed by component-based design, with Gang of Four (GoF) design patterns thrown in and some SOLID principles to go with them. Where did all this design-is-king thinking get us in terms of software maturity? Decades of processors shipped with horrible security holes, blue screens of death on grandma's laptop, a multi-million-dollar space probe cratering into Mars, and an entire generation of defense programs that couldn't get to step one because of the prospect of unmanageable defect rates. Maybe design was not really a King, but more like a Jack - or a deuce.

    So where did software development go to soothe its burned ego? DevOps: the unification of development (including design, of course) and operations. There's a reason that companies like Apple can handle huge beta programs, deliver new builds nearly continuously, and address field issues in hours or days versus what used to be weeks or months. Putting together software builds and releases used to be a really big deal; now it's a routine single-click operation with high-fidelity traceability as to where all the pieces and parts came from and which test cases provided the required verification and validation. Sure, bugs still get out, but when you look at the volume of code shipping today versus the defect rates, the improvement is astounding compared to the bad old days when design was thought to be our savior.

    So yeah, operations is just as important as design, but it's also inseparable from design. 
  • Apple increases credit for returning DTK to $500 following developer outcry

    wood1208 said:
    Apple should have offered DTK at lower price and let them keep it. Not sure what Apple will do with returned DTK unless rip off processor,memory,etc from it and use in Macbooks products because of component shortages.
    The DTKs were leased equipment, not purchases. In all likelihood, the lease payments are fully tax deductible business expenses for companies who run their development as a business and not a hobby. 

    I do realize that in the current mindset of universal entitlement, anything that goes against one’s personal wishes and desires, regardless of anything else, is viewed as an offensive move by an overlord. This lease program was set up by Apple under the expectation that it would be adults dealing with other adults at a business level. 

    Apple knew, going in, that they needed to get these DTKs back, for whatever reason, and structured the terms and conditions of the business arrangement to increase the likelihood that lessees would return Apple’s property to them under the terms that were stipulated in the agreement. Apple has not deviated in the slightest amount from following through on their part of the agreement. They are trying to be adults.

    Hey, I like extra cheddar as much as the next guy, but it does bother me that a great number of people in our society, all the way up to the highest levels of power, are basically children stuffed into adult sized bodies. Anything they don’t like is instantly viewed as a personal affront and categorically labeled as an offense, and of course, they’re now the victim. Business agreements and keeping your word don’t seem to matter. If I’m not happy, it must be wrong. 

    I guess I’ll blame it all on us, the Baby Boomer generation, who have never quite gotten past the “Baby” part of our generational contribution. Now we’re sadly passing it along to subsequent generations who know how to weaponize “whining at scale” across the social media mobosphere until they get their way. It’s nice that Apple, as an indulgent parent, is letting junior have extra cookies just to shut him up, but it’s also a sad commentary on where we are as dysfunctional semi-adults. 
  • Internal Apple memo addresses public concern over new child protection features

    These services are likely a proactive move by Apple to finally quell the barrage of requests from government agencies for Apple to open a backdoor for authorities to go around Apple's privacy and security protection features. It's not at all unusual for those who seek the backdoor solution to bring up child protection as a reason why backdoors are needed. If Apple takes the child protection argument off the table it gives them further justification for not adding the backdoor that authorities so desperately covet. In reality, it just buys Apple (and us) more time because those who seek the backdoor approach are never going to be satisfied with anything less than a master key that allows them unfettered access to whatever they want, whether or not they truly have a legitimate need or even a legal right to it. The standard operating principle in nearly all of these cases of authoritarian overreach is to ask for forgiveness, not to ask for permission.

    I haven't delved into all of the details of Apple's safeguards but from what I've read so far it sounds like classic signature matching, much like the technology behind Shazam. Everyone everywhere should always assume that everything sent over a communication link unencrypted (or easily decrypted) is being scanned and analyzed to extract information of interest (for whatever reasons, bizarre or legitimate) from the raw data. Everyone everywhere should also assume that all data and information that is collected from every source of acquisition in multiple formats is being fused together for additional processing.
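
    The core idea behind this kind of signature matching is simple enough to sketch. Here's a toy Swift example using a difference hash ("dHash") over an image brightness grid; to be clear, this is an illustrative stand-in, not Apple's actual NeuralHash, and the function names, grid size, and match threshold are all invented for the example:

    ```swift
    // Toy difference-hash ("dHash") fingerprinting -- NOT Apple's NeuralHash.
    // A real system would derive the grid from a decoded, normalized image;
    // here an 8x9 grayscale grid is passed in to keep the sketch self-contained.
    func dHash(grayscale grid: [[UInt8]]) -> UInt64 {
        precondition(grid.count == 8 && grid.allSatisfy { $0.count == 9 })
        var hash: UInt64 = 0
        for row in 0..<8 {
            for col in 0..<8 {
                hash <<= 1
                // Set the bit when brightness increases left to right.
                if grid[row][col] < grid[row][col + 1] { hash |= 1 }
            }
        }
        return hash
    }

    // Matching tolerates small distortions by allowing a few differing bits
    // (Hamming distance) rather than requiring exact equality.
    func matches(_ hash: UInt64, blocklist: [UInt64], maxDistance: Int = 4) -> Bool {
        blocklist.contains { (hash ^ $0).nonzeroBitCount <= maxDistance }
    }
    ```

    The key property is that content is compared against fingerprints of known material, with a small tolerance for distortion; nothing in the pipeline needs to "understand" the image.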

    I'm not advocating that anyone live in a constant state of paranoia. I'm simply saying that we no longer live in a world where individuals can rationally sustain a universal expectation of privacy. Once you step outside of your own personal space, physically or virtually, you are sharing some bit of data about yourself with someone or something.  If you're walking or driving in an area with other humans, there are public and private cameras that see you even if you're not carrying a connected device. Last time I went into a popular gas station I counted no fewer than 14 cameras inside the store while I was waiting for my sandwich order to be completed. If you're carrying a connected device you're divulging a heck of a lot more data to be fused with captured images. Traversing the internet is no less private, no matter what you do to limit your exposure. VPNs and companies like Apple that value privacy and security in their products/services are helpful, but I still see their "protection" as temporary and quite fragile, as we've seen with recurring privacy and security breaches.

    All I'm saying is: assume you're always being watched, and if there's something you're planning on doing, physically or electronically, that you wouldn't want anyone else observing, maybe think twice about doing it. This is simply where we are in today's society, whether we like it or not.
  • Amazon introduces native Mac instances for AWS, powered by Intel Mac mini

    Rayz2016 said:
    pembroke said:
    I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

    Someone will hopefully correct me if I'm wrong, but I don't think so, no.

    That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

    I think @StrangeDays will know a bit more about this.

    Yes and no. The yes part: This is an infrastructure as a service (IaaS) flavor of cloud computing where you purchase access to additional Mac machines (and supporting AWS infrastructure) for a period of time, to do with them whatever your needs dictate. If you have a big job or processing pipeline to execute that would benefit from being distributed across multiple machines, you’ll need a way to divide the work up, parcel it out to those machines, and aggregate the results. There are several software development automation tools on the market explicitly designed for this purpose, e.g., Jenkins and Xcode Server, and some development shops build their own automation.

    That’s only one type of distributed processing use case, but there are many others, especially in modern software development processes: having a pool of dedicated compile machines, build machines, automated test machines, configuration management machines, deployment machines, etc., in a pipeline such that every time a block of code gets committed to a development branch by an individual developer, it is compiled/built, integrated, unit tested, and regression tested against the entire software system it is part of. Each machine participating in the individual tasks along the pipeline feeds its output to the next machine in the process. These pipelines run continuously and are event driven, but can also be scheduled. Hence the terms continuous integration (CI) and continuous delivery (CD).
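
    As a toy illustration of that stage chaining, here's a sketch in Swift; the stage names and types are invented stand-ins for real CI tooling (Jenkins jobs, Xcode Server bots, and the like):

    ```swift
    // Toy model of a CI/CD pipeline: each stage consumes the previous stage's
    // output, and a failure anywhere stops the commit from progressing.
    struct Commit { let id: String }

    enum StageError: Error { case failed(stage: String) }

    typealias Stage = (Commit) throws -> Commit

    // Hypothetical stages standing in for real build/test machines; a real
    // stage would shell out to compilers and test runners, and would
    // `throw StageError.failed(stage:)` on a broken build or failing test.
    let compile: Stage        = { commit in commit }
    let unitTest: Stage       = { commit in commit }
    let regressionTest: Stage = { commit in commit }

    func runPipeline(_ commit: Commit, stages: [(String, Stage)]) -> Bool {
        var current = commit
        for (name, stage) in stages {
            do {
                current = try stage(current)
                print("\(name) passed for \(current.id)")
            } catch {
                print("\(name) failed for \(current.id); pipeline halted")
                return false
            }
        }
        return true // commit is cleared for delivery
    }

    // Every commit event drives the same pipeline, end to end.
    _ = runPipeline(Commit(id: "abc123"),
                    stages: [("compile", compile),
                             ("unit-test", unitTest),
                             ("regression-test", regressionTest)])
    ```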

    There is nothing preventing an individual user from purchasing access to a cloud-based machine (Mac, Windows, Linux, etc.) to do whatever they want with it. The inhibitor tends to be the cost and terms of service, which are typically geared toward business users with business budgets. The huge benefit of IaaS cloud computing is the ability to acquire many machines in very short order, and to add/delete machines nearly instantaneously as your needs change, i.e., elasticity. Try asking your IT department to spin up 125 new Macs overnight if they have to order physical machines from Apple or a distributor. They will laugh, you will cry.

    The no part: If you need to deploy microservices, you may not need complete, dedicated machines to run them. You may just need a container to run your service, which could be hosted on a shared computing resource alongside several “tenants,” each running completely isolated in its own container. If you are building out the service hosting platform as part of your solution, then sure, you could use cloud-based machines for your homegrown hosting platform, but that reverts to the previous use case I mentioned.

    I don’t get the “bizarro” comment from Blastdoor. This type of cloud computing has been in very widespread use for a very long time, with Amazon’s AWS and Microsoft’s Azure being two of the major players. You may want to see if AWS offers a test drive to get a feel for how it works; Microsoft used to, and may still, let you test drive its Azure offering. There’s nothing at all bizarre about how it works. You’re sitting in front of a Mac with full access to everything you can get to via the keyboard, mouse, and monitor. Instead of sitting under your desk, it is sitting in the cloud.

    Nothing bizarre looking here: https://aws.amazon.com/blogs/aws/new-use-mac-instances-to-build-test-macos-ios-ipados-tvos-and-watchos-apps/

    My only concern with the Mac version of EC2 is that it relies on VNC for remoting the desktop. VNC is not as secure or as performant as the RDP protocol used for Windows remoting. Any way you cut it, Amazon providing support for Mac in AWS is a very big deal, and it brings Apple another step closer to competing at the enterprise level against Windows and Linux. 
  • Elon Musk uses iPhone email bug to illustrate the importance of software innovation

    All software seems to follow a common lifecycle model over time. As more bugs are addressed by more and more developers who were not part of the original design team, the code starts to accumulate a lot of cruft, quick fixes, and workarounds made to meet release deadlines. This accumulation of cruft, crud, and crappy, shortsighted quick fixes is collectively known as “technical debt” because the current software team has effectively taken out a bad loan to buy a bunch of shitty workarounds that some future team of maintenance developers is going to have to pay for, and pay for at loan-shark interest rates.

    When developers occasionally grow a little piece of spine, they get up in front of management and talk pointedly about the dire need to pay down some of their technical debt, perhaps through “refactoring” or redesign, or god forbid, rearchitecting of the current code base. At some point in the spiel the management team challenges them with something to the effect of “so you’re saying you want us to spend a bunch of man-years of development resources, so many millions of dollars, and so many months of schedule to give us a refreshed code base that does pretty much what the old code does, but without the hanging chads and dingleberries?” At that point the little piece of developer spine turns to jelly with a “well yeah, pretty much.” So much for grandiose plans. The end result is that not only is the trash can of technical debt kicked further down the road, but the development team is tasked with adding a bunch of new money-making features on top of a shaky foundation that is like a rickety bridge waiting to collapse. Of course, the new features introduce more technical debt of their own and have to work around the shortcomings caused by the debt underneath. 

    It’s rather easy for a startup like Tesla to feel emboldened by their software prowess because they haven’t had enough time and customer volume to suffer the indignities that accumulate when you’re serving a billion customers around the world and have teams two and three generations removed poking into a business-critical code base that was handed down to them, a code base tied to business revenue that has to keep flowing no matter what. Designing new software is usually fun and rewarding. Maintaining existing software is usually a grind and a thankless struggle. Technical debt is like a slow-growing cancer, but as long as the supply of band-aids, duct tape, and baling wire is cheaper in the short term than excising the tumors, they’ll keep adding on more layers of bandages until they are forced to blow it all up and start over again. Or maybe buy a bunch of software and people through an acquisition.
  • Why Thread is a game-changer for Apple's HomeKit

    gatorguy said:
    Thread was a very underappreciated invention IMO. I'm honestly surprised it's taken so long to gain traction. Outside of Nest devices I'm not aware of other high-profile devices using it so good on Apple including it with the new Mini. Perhaps that means Nest devices may soon be Homekit friendly. 

     Home automation should be much more straightforward than it has been and Thread will be a major part of making it so.
    The main benefit of Thread is that it is IP based (IPv6, in fact), while Z-Wave Plus and Zigbee each employ their own non-IP protocol that requires a gateway/bridge to connect their unique protocols to IP-based clients and other nodes. The use of IP has major benefits with respect to things like device discovery, device identity, connection management, and routing, because everyone is speaking the same language. Bridging across protocols always introduces more complexity and delay because the intermediary (gateway/bridge) has to be a fully functioning node on both networks at the same time. It's far more than simple language translation; it involves managing communication timeouts, node health status reporting, maintaining routing tables, etc., or basically doing everything a node has to do to be a good citizen on a network - times two.

    Having IP everywhere simplifies everything and positions Thread very well for the emerging IoT challenges and opportunities.   
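
    As a small illustration of why native IP matters, here's a sketch using Apple's Network framework; the IPv6 address and port are made up for the example:

    ```swift
    import Network

    // Because a Thread device is a native IPv6 node, a client can open a
    // standard transport connection straight to it -- no protocol-translating
    // gateway/bridge in the middle. Address and port here are hypothetical.
    let host = NWEndpoint.Host("fd11:22::33")    // made-up mesh-local IPv6 address
    let port = NWEndpoint.Port(rawValue: 5683)!  // made-up application port

    let connection = NWConnection(host: host, port: port, using: .udp)
    connection.stateUpdateHandler = { state in
        if case .ready = state {
            print("Speaking IP end-to-end with the Thread device")
        }
    }
    connection.start(queue: .main)
    // (A real tool would keep the process alive, e.g. with dispatchMain().)
    ```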

    All three of these networks are mesh based, have device profiles to allow auto-configuration, employ self-healing techniques, have accommodations for optimizing their use with battery operated devices, support range extension, and have encrypted communication, so none of those things are really differentiators. The special sauce for Thread is the use of IP and the fact that it builds on 6LoWPAN, which has some degree of maturity.

    I must add that one recurring challenge with device networks is that there are always too many standards from the customer's perspective. Despite claims of being a "game changing" new standard, the new standard rarely replaces all of the existing ones. Instead, it typically fragments the market even further. Every single time I've seen a new "one standard to replace all standards" come along - and I've experienced several over a few decades of involvement with similar standards efforts - the end result is that standard A and standard B do not go away or get replaced by standard C. You end up with standard A and standard B and standard C. In fact, the keepers of standards A and B will even update and improve them over time to make them less vulnerable to being obsoleted. The end result is usually: Dear customer - please pick one, and can I interest you in a nice gateway? 
  • Apple Silicon M1 Mac mini review - speed today and a promise of more later

    The 16GB of RAM is a deal breaker for me.
    My 2020 iMac has 64GB of RAM which I figure will last for 5+ years.
    RAM handling is a bit different with Apple Silicon. If you're often hitting swap space on that 64GB now, it won't help you, but the way the M1 handles RAM basically leads to an "8GB is the new 16, and 16 is the new 32" situation.

    We'll see more with time. I don't expect the 16GB limit to remain on whatever comes after the M1.


    As someone working as a software developer/engineer/whatever since years before appleinsider.com existed, I can state with perfect certainty that if you need 32 GB RAM because of the size of your data, running it on 16 GB RAM and thinking you’ll get comparable performance as running it with 32 GB RAM defines “wishful thinking” because you’ll be swapping horribly, and you’ll be limited to I/O speed and latency for the swap drive.

    Where a fast SSD may make even 8 GB RAM seem much more efficient than you’d expect is where you have an application doing data processing in a linear address/array order, and the processing that’s done takes at least as much time as I/O for input and processed output data, as then that can be implicitly handled via the regular swap file, or more explicitly a memory-mapped file, which maps a file into main memory as needed in the virtual address space in a relatively small window of the physical memory address space.  As soon as you have applications following pointers in data structure in a random memory access pattern (this happens in the majority of applications the majority of the time, especially once there have been enough memory allocations and releases of memory) there isn’t an SSD currently available that’ll make 16 GB RAM remotely as efficient as having 32 GB RAM, as you’ve then taken what would (with enough RAM) be a compute-bound problem and translated it into an I/O-bound problem.
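
    To make the linear, memory-mapped access pattern described above concrete, here's a minimal Swift sketch; the file path is hypothetical:

    ```swift
    import Foundation

    // Map a large file into the address space instead of reading it all into
    // RAM. With a *linear* scan the OS pages data in and out behind the
    // scenes, so the resident working set stays small; the same mapping under
    // *random* pointer-chasing access would fault constantly and become I/O-bound.
    let url = URL(fileURLWithPath: "/tmp/bigfile.bin") // hypothetical path
    do {
        let data = try Data(contentsOf: url, options: .mappedIfSafe)
        var checksum: UInt64 = 0
        for byte in data {          // sequential pass: page-cache friendly
            checksum &+= UInt64(byte)
        }
        print("checksum:", checksum)
    } catch {
        print("read failed:", error)
    }
    ```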
    I agree that if you have an application, or a collection of applications running concurrently, that you have verified through testing and profiling cannot meet its operational requirements without a full 32 GB of memory, regardless of the nature of the paging medium, then yes indeed, you'll need the full 32 GB. I will also say that if I were up against that kind of hard limit with zero wiggle room, I probably wouldn't be running that application on a 32 GB Mac mini to begin with. That represents a pathological use case with no engineering margin to account for the variabilities that are always present when using a general purpose, multiprocessing, time-slice scheduled, shared-resource computing system like a Mac running macOS.

    In every instance that I've encountered with super tight memory constraints, I've reverted to using a custom, statically allocated, fixed-block memory manager with a custom defragmenter rather than the one provided by the operating system. My point here is only to say that if you truly need a certain memory capacity with zero tolerance, you're not going to settle for any of the nondeterminism that a general purpose operating system like macOS introduces. macOS, like Windows, Unix, and Linux, has way too much nondeterminism to run right up against a hard limit of any kind - not just memory capacity. 
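
    For illustration, a toy version of such a fixed-block manager might look like the following sketch; the class name and sizes are invented, and a real implementation would add the custom defragmenter mentioned above:

    ```swift
    // Toy fixed-block memory manager: one statically sized slab carved into
    // equal-size blocks, with a free list and O(1), fully deterministic
    // allocate/release -- no hidden growth, no OS allocator in the hot path.
    final class FixedBlockPool {
        private let blockSize: Int
        private let slab: UnsafeMutableRawPointer
        private var freeList: [Int] // indices of free blocks

        init(blockSize: Int, blockCount: Int) {
            self.blockSize = blockSize
            self.slab = UnsafeMutableRawPointer.allocate(
                byteCount: blockSize * blockCount,
                alignment: MemoryLayout<UInt64>.alignment)
            self.freeList = Array((0..<blockCount).reversed())
        }

        deinit { slab.deallocate() }

        /// Returns a block, or nil when the pool is exhausted.
        func allocate() -> UnsafeMutableRawPointer? {
            guard let index = freeList.popLast() else { return nil }
            return slab + index * blockSize
        }

        func release(_ pointer: UnsafeMutableRawPointer) {
            freeList.append((pointer - slab) / blockSize)
        }
    }

    let pool = FixedBlockPool(blockSize: 256, blockCount: 1_024)
    if let block = pool.allocate() {
        block.storeBytes(of: 0xFEED, as: UInt64.self)
        pool.release(block)
    }
    ```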

    A more common approach, and one that is probably more indicative of Mac mini hosted solutions and what Mike is alluding to, would be to say that a particular app configuration runs optimally with a certain amount of memory available to the operating system, but will run, perhaps sub-optimally, and not go belly up and crap itself, if the memory load-out is less than optimal. In fact, this is the overriding assumption behind most every Mac (and Windows and Unix/Linux) app running on an operating system that has virtual memory management, process isolation, and paging, with RAM backed by (slower) secondary storage and with the process memory address space mapped independently of where the data actually resides (RAM or disk).

    It remains to be seen how apps perform on the M1 versus Intel processors with various amounts of memory, and where memory requirements are variable versus hard and fixed. You're right about the hard and fixed use cases, but I have a strong feeling that for the vast majority of apps that are tolerant of variability in the amount of physical RAM, which is almost all apps that are widely used, the Apple Silicon equipped Macs are going to run at comparable performance levels with less RAM, and at better performance levels with the same amount of RAM, as their Intel counterparts. I say this because Apple had total control over the use of ALL memory on its SoC and didn't have to defer to anyone else's design choices. Plus, when you do get hit with paging, it's not like the old days of trading off nanoseconds for milliseconds; you're paging to SSDs that are orders of magnitude faster than hard disks.

    Only testing will reveal the truth, but I'm putting my chips on Apple Silicon coming out ahead. 
  • Sonos criticizes Google, Amazon, and Apple at Senate antitrust hearing

    Pathetic.

    I could be wrong, but I don’t seem to recall Steve Jobs and Apple getting up in front of a congressional committee in the circa 2003-2007 time frame and whining like a baby about how Nokia, BlackBerry, and Motorola weren’t “playing fair” in the mobile phone market.

    Does anyone have a video of that having happened, or maybe a transcript of Apple’s whine fest in front of Congress? 

    Or perhaps I missed the part about Congress granting Apple some sort of special protections that allowed Apple to enter the mobile phone market with its own phone, one so “outrageously overpriced” that the giant mobile phone makers of the day could laugh in Steve Jobs’ face at his audacity in swinging and whiffing so blatantly in a market they so thoroughly dominated.

    Did I miss those news stories too?

    Congratulations Sonos, you’ve just earned yourself a Participation Trophy.

    Yay.
  • Windows on Apple Silicon is up to Microsoft, says Craig Federighi

    seanj said:
    Old timer that I am, I remember the days before the WinTel duopoly dominated the computer industry. We had machines running 68000, ARM, SPARC, Alpha, etc., and as a consequence we had year-on-year real innovation and advances; the competition that the free market promotes. 
    Whereas we’ve seen stagnation with Intel’s CISC x86 architecture for over a decade, a fact that Apple recognised. The irony of course is that things have come full circle, as Apple was one of the original investors in ARM back in the late 1980s.
    Hopefully exciting days ahead!
    ... and the first "working version" of Windows NT, which is present in Windows 10's DNA, ran on the Intel i860 - which is a RISC chip. Talk about full circles...

    I'm not losing any sleep over running Windows on Apple Silicon. If I were currently traveling and doing all of my work on a laptop/notebook with M1 Apple Silicon, then I'd care a little bit about Windows compatibility (for running it in VMware). But for desk-bound use, if I really need to run Windows I'll just buy a small form factor (SFF) Windows box, preferably with a contemporary AMD CPU, plug it into the second port on my desktop monitor, and get a Bluetooth keyboard and mouse that support multiple computers. In fact, I'm already using pieces (keyboard, mouse, and monitor) that support multiple computers, so all I really need is the SFF Windows box. If I want to live large I'll get a NUC, but there are many choices across a broad spectrum of prices, from HDMI stick PCs to low-cost NUC knock-offs. Easy peasy.
  • Apple to test hybrid in-store, work-from-home model for retail employees

    My dream is to buy only at the Apple Store!

    But in São Paulo, Brazil, the prices are always crazy at the Apple Store!

    For example, the Apple Watch Series 6 40mm blue:
    It's $555 from third parties, all legal and with a 1-year Apple warranty. 
    The same model is $1,050 at the Apple Store. Almost double.

    You must not be telling us the whole story. Apple follows all applicable laws and regulations when selling products in its stores. They collect all required taxes and tariffs and turn over all of those monies to the revenue authorities. They only bring products into the country through legal channels that are governed by import regulations of the country in question. There is absolutely no reason why Apple would charge nearly 2X the prevailing market price for the privilege of purchasing a product in their store - no way.

    The third parties you've mentioned must be using some mechanism to bypass or circumvent local requirements. This could be any number of things, such as obtaining products through gray market sources, not collecting required taxes (VAT), or some other scheme they have found a way to exploit. The VAT in Brazil can be 70% of the product price - and a $555 device plus 70% in taxes works out to roughly $944, which covers most of the gap to the $1,050 store price. Such a scheme would not be significantly different from schemes employed in other countries.

    For example, in the US buyers often purchase products online from sellers who have no obligation, and in fact no legal authority, to collect sales tax on sales in states where the seller does not have a legal presence, i.e., nexus. From a tax collection standpoint, buyers are still legally obligated to pay the sales tax equivalent, in the form of a use tax, to their state for products purchased out-of-state where the seller did not, and legally could not, collect sales tax. Do the vast majority of people remit this use tax when they file their yearly taxes? Hell no. Along the same lines, sellers who prominently proclaim "we don't collect sales tax for out-of-state orders!" in their advertising aren't doing you a favor or giving you a bonus. They cannot legally collect sales tax in states where they do not have nexus. All they are doing is subtly reminding you, perhaps, that it's totally up to YOU to decide whether you want to cheat your state out of the use tax on the items you purchase. It's all on the buyer; the seller is totally out of the picture.

    Apple always plays 100% by the rules. If third party sellers are selling Apple products at a 50% discount, those sellers are either sidestepping requirements or passing requirements on to the buyers and the buyers are choosing to ignore the law. Anything else would make zero sense at all.