anonconformist

About

Username
anonconformist
Joined
Visits
111
Last Active
Roles
member
Points
585
Badges
0
Posts
202
  • Apple Silicon will force industry to reconsider use of Intel chips, says ex-Apple exec

    mjtomlin said:
    mjtomlin said:
    razorpit said:
    Agree with this. Don't think Intel is going anywhere soon, but if you have stock I think now is a good time to sell. Intel is vulnerable right now.

    There's a lot of laziness and complacency out there right now. Apple Silicon is going to wake a few business units up at MS and Intel; at least it had better, for their sake.
    This isn't true at all. It doesn't solve the main reason why PC users don't buy Macs.

    I think you missed the point here. This really has nothing to do with Apple gaining sales or users, it has more to do with Apple awakening OEMs from their x86-64 induced comas. The writer opines that Apple Silicon will demonstrate just how powerful and efficient ARM-based computers can be while running a real traditional operating system.

    This, if anything, will cause a lot of OEMs to take note and wonder why Microsoft hasn't been able to do the same with Windows as Apple has done with macOS... and should light a fire under Microsoft's ass to get ARM-based Windows up to par. Apple's DTK, which runs the iPad Pro SoC (A12Z), is completely capable of translating and executing applications compiled for Intel-based Macs with a slight performance hit. That's a "mobile" ARM-based SoC running native x86-64 code!!! With very usable real world performance. "Why hasn't Microsoft been able to do the same with WindARM? Why is it such a clunky incompatible mess?", they'll ask.

    Also, Apple Silicon will essentially allow them to make Mac form factors and have specs that other OEMs will not be able to reproduce with WIntel-based systems.
    If Apple Silicon were executing x86-64 code directly at runtime, that would be emulation. Instead, the majority of the code is translated from x86-64 into ARM instructions at install time, which will definitely slow down installation: the DTK CPU runs nothing but ARM code natively. That said, the available information suggests Apple has Rosetta 2 performing rather well.

    Sorry, I didn’t mean to imply that Apple Silicon runs x86-64 code natively; I know it is translated. The point was to show what Apple is capable of doing, and to ask why Microsoft isn’t able to do the same.
    Well, an interesting question. I can't speak to Microsoft's strategic objectives, so that part is speculation, but I do know quite a lot about the technical bits. Windows has a huge established base of software written with various compilers from various vendors, including Microsoft's, plus some hand-coded assembly in a lot of applications. On average there's not an awful lot of assembly in any one application, as compilers these days do a darn good job, including Microsoft's. The fact of the matter is that Microsoft has had top-end compiler people for a very long time: software development tools have always been one of its biggest product lines.

    Microsoft has something Apple has never had in-house: an optimized managed runtime with a rather powerful SDK built on it, in the form of .NET, and now the open-source .NET Core. UWP applications are compiled down to .NET Native, but for all (or pretty much all, barring any that use hand-coded assembly) UWP applications, and for most .NET applications, moving to ARM is literally just a recompile away, since .NET applications are already CPU-independent except where they use interop to call native code in the OS or in externally-created DLLs.

    Does Microsoft have any project working on a much more efficient ARM-based CPU? I don't have insight into that; I'd have to ask around, and if they're behaving themselves they wouldn't tell, so it's not worth the risk or the time to concern myself with it. Microsoft is working on a quantum computing CPU design in its Research arm: I've interviewed in the past for contracts implementing design tools for that. While Microsoft has worked with Qualcomm, the fact of the matter is that Qualcomm hasn't put out an ARM CPU of great performance, so even though Microsoft has the technical talent to build a Rosetta-style tool, it wouldn't get great performance on that hardware. One thing I don't know (I haven't bought one, and don't care to investigate) is whether Microsoft does any translation of natively compiled code today. But where the input code is a managed language running on the CLR, there's no need to translate: it just does the normal thing .NET code does on any CPU.

    I speculate that Microsoft won't do a Rosetta largely for business reasons: even if they created one, what incentive would other companies have to spend any effort recompiling their applications (with some changes, because Windows on ARM isn't 100% identical)? Right now, Microsoft isn't even a large vendor of x86/x86-64 PCs that run Windows, and there are plenty of cheap vendors of them. Because Microsoft hasn't gone out of its way to take the Apple route of designing top-performance ARM CPUs in-house, the best it can hope for at this time is to be caught in the pack on ARM performance. Perhaps Microsoft keeps an ARM version of desktop Windows around (remember, Windows Phone was ARM, and Windows is platform-agnostic at the low level once the small Hardware Abstraction Layer has been built for a given CPU) to keep Intel honest and on its toes. As long as all the other PC vendors keep selling Intel machines and never dip their toes into ARM, users can always take the cheaper-for-speed route of Intel. Perhaps Apple's silicon will finally light a fire under the various CPU designers to up their game and help knock Intel off its 800-pound-gorilla perch.
  • Apple Silicon will force industry to reconsider use of Intel chips, says ex-Apple exec

    mjtomlin said:
    razorpit said:
    Agree with this. Don't think Intel is going anywhere soon, but if you have stock I think now is a good time to sell. Intel is vulnerable right now.

    There's a lot of laziness and complacency out there right now. Apple Silicon is going to wake a few business units up at MS and Intel; at least it had better, for their sake.
    This isn't true at all. It doesn't solve the main reason why PC users don't buy Macs.

    I think you missed the point here. This really has nothing to do with Apple gaining sales or users, it has more to do with Apple awakening OEMs from their x86-64 induced comas. The writer opines that Apple Silicon will demonstrate just how powerful and efficient ARM-based computers can be while running a real traditional operating system.

    This, if anything, will cause a lot of OEMs to take note and wonder why Microsoft hasn't been able to do the same with Windows as Apple has done with macOS... and should light a fire under Microsoft's ass to get ARM-based Windows up to par. Apple's DTK, which runs the iPad Pro SoC (A12Z), is completely capable of translating and executing applications compiled for Intel-based Macs with a slight performance hit. That's a "mobile" ARM-based SoC running native x86-64 code!!! With very usable real world performance. "Why hasn't Microsoft been able to do the same with WindARM? Why is it such a clunky incompatible mess?", they'll ask.

    Also, Apple Silicon will essentially allow them to make Mac form factors and have specs that other OEMs will not be able to reproduce with WIntel-based systems.
    If Apple Silicon were executing x86-64 code directly at runtime, that would be emulation. Instead, the majority of the code is translated from x86-64 into ARM instructions at install time, which will definitely slow down installation: the DTK CPU runs nothing but ARM code natively. That said, the available information suggests Apple has Rosetta 2 performing rather well.
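
    To make that distinction concrete, here is a minimal Swift sketch under my own assumptions: the `GuestOp` instruction set is a made-up toy, not x86-64, and this is not how Rosetta is actually implemented. It only contrasts interpreting every instruction at run time (emulation) with decoding once up front into host-native steps (install-time translation), which is why the translated version pays its decode cost exactly once.

    ```swift
    // A toy "guest" instruction set, used only to illustrate the idea.
    enum GuestOp {
        case load(Int)   // push a constant onto the stack
        case add         // pop two values, push their sum
        case mul         // pop two values, push their product
    }

    // Emulation: decode and dispatch every guest instruction each time it runs.
    func interpret(_ program: [GuestOp]) -> Int {
        var stack: [Int] = []
        for op in program {                          // decode cost paid on every run
            switch op {
            case .load(let value):
                stack.append(value)
            case .add:
                let b = stack.removeLast(); let a = stack.removeLast()
                stack.append(a + b)
            case .mul:
                let b = stack.removeLast(); let a = stack.removeLast()
                stack.append(a * b)
            }
        }
        return stack.removeLast()
    }

    // Install-time translation: pay the decode cost once, producing host-native
    // steps (modeled here as Swift closures) that can run repeatedly afterwards.
    typealias NativeStep = (inout [Int]) -> Void

    func translate(_ program: [GuestOp]) -> () -> Int {
        let steps = program.map { (op: GuestOp) -> NativeStep in
            switch op {
            case .load(let value):
                return { (stack: inout [Int]) in stack.append(value) }
            case .add:
                return { (stack: inout [Int]) in
                    let b = stack.removeLast(); let a = stack.removeLast()
                    stack.append(a + b)
                }
            case .mul:
                return { (stack: inout [Int]) in
                    let b = stack.removeLast(); let a = stack.removeLast()
                    stack.append(a * b)
                }
            }
        }
        return { () -> Int in
            var stack: [Int] = []
            for step in steps { step(&stack) }       // no decoding left to do here
            return stack.removeLast()
        }
    }

    let program: [GuestOp] = [.load(2), .load(3), .add, .load(4), .mul]   // (2 + 3) * 4
    let translated = translate(program)       // one-time translation cost (the "install")
    print(interpret(program), translated())   // both print 20
    ```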
  • Apple Silicon will force industry to reconsider use of Intel chips, says ex-Apple exec

    I say the industry will claim Apple is making a big mistake and those ARM processors will never equal the power of X86 processors although there is nothing stopping Apple from scaling up its ARM processors in core count while having a 5nm node that's more power-efficient and much denser for higher transistor count.  There is already an ARM Ampere Altra processor that has 80 cores but costs over $5000.  It's a beast of a processor for high-end use. See: https://amperecomputing.com/ampere-altra-industrys-first-80-core-server-processor-unveiled/
    So if anyone is going to BS about how no ARM processor can touch an X86 processor, they're lying.  I don't know about how Apple Silicon is going to handle GPU processing but I heard Apple is going to be building discrete ARM GPUs along with SoC ARM processors with integrated GPUs.  Amazon is using ARM-powered servers on AWS that are equal to their X86 counterparts and require far less cooling.  See: https://www.extremetech.com/computing/307498-amazon-launches-a-killer-arm-server-chip-with-the-graviton2

    There may be some advantage to having X86 on desktops that have huge amounts of cooling and high-wattage power supplies, but Apple Silicon is going to have a huge advantage when it comes to laptops due to lower TDP and longer battery life.  There are going to be plenty of naysayers claiming Apple doesn't know anything about high-end processors or gaming GPUs and such but any company with enough money can at least try to figure it out if the incentive is there.  In Apple's case, it's all about profits and they're fed up with paying Intel for sub-par processors.  Yeah, let the naysayers believe Apple will fail because that's what the Apple naysayers always believe when Apple brings something new to the table.

    Anyway, time will tell if Apple is making a huge mistake moving to Apple Silicon, but I don't think they are.  It's about time for ARM processors to shine for consumer use and possibly beyond.  I can't wait to get an Apple Silicon Mac later this year.  I think I'm done with Intel processor Macs but I'll wait and see how well Apple Silicon Macs perform.  I'll still keep my older Intel Macs so I'm not going to be backed into any corner.


    I'm a multi-platform software developer with decades of experience. Yes, Apple can throw lots of transistors at the problem and ship lots and lots of their custom cores, like the core counts you mentioned, but that makes sense only for a very limited set of use cases: the typical desktop application is completely unable to make good use of 64 cores (or more, or even 8), if it can use them at all. Huge core counts make great sense for embarrassingly-parallel workloads like raytracing and a few other things, where there's not much sharing of data between cores and the cores execute the same instructions for long stretches.
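
    As a small Swift sketch of the kind of work that does scale with cores (my own example; the `shade` function is a made-up stand-in for real per-pixel work), every iteration below is independent and touches a distinct array index, so Grand Central Dispatch can spread the rows across however many cores exist.

    ```swift
    import Dispatch
    import Foundation

    // Embarrassingly parallel: every "pixel" is computed independently, with no
    // shared mutable state between iterations, so extra cores actually help.
    let width = 1920, height = 1080
    var image = [Double](repeating: 0, count: width * height)

    // A made-up stand-in for per-pixel work (real raytracing would go here).
    func shade(x: Int, y: Int) -> Double {
        return sin(Double(x) * 0.01) * cos(Double(y) * 0.01)
    }

    image.withUnsafeMutableBufferPointer { buffer in
        let pixels = buffer.baseAddress!
        // concurrentPerform spreads the rows across the available cores.
        DispatchQueue.concurrentPerform(iterations: height) { y in
            for x in 0..<width {
                pixels[y * width + x] = shade(x: x, y: y)   // each write hits a distinct index
            }
        }
    }

    print(image[0], image[image.count - 1])
    ```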

    For this reason, Apple has had fewer cores in its A-series chips than typical Android smartphones and tablets, focusing instead on making them faster in single-threaded cases, because that's what most software is in practice: almost always a single thread doing the work, with minor I/O threads here and there, since there are too many dependencies: computation A has to finish before you even know which computation B to run next, and that kind of chain gets nothing from extra cores. For more on this, look up "Amdahl's Law." It's quite common that throwing more cores at a task makes it take LONGER!
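
    For a rough feel of Amdahl's Law, here's a tiny sketch that just evaluates the formula with an arbitrarily chosen parallel fraction: if a fraction p of a job can run in parallel, the best possible speedup on n cores is 1 / ((1 - p) + p/n), and the serial remainder caps the gain no matter how many cores you add.

    ```swift
    // Amdahl's Law: best-case speedup on n cores when a fraction p of the work
    // can run in parallel and the rest is inherently serial.
    func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
        return 1.0 / ((1.0 - p) + p / n)
    }

    // Even with 90% of the work parallelizable, 64 cores gain under 9x, and the
    // ceiling with infinitely many cores is only 1 / (1 - 0.9) = 10x.
    for cores in [2.0, 8.0, 64.0] {
        print(cores, amdahlSpeedup(parallelFraction: 0.9, cores: cores))
    }
    ```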

    That said, I've been watching Geekbench and other benchmark scores for A-series processors for a number of years, wondering when Apple would announce a switch: they've been competing with Intel processors within a phone's power and heat envelope, and it was only a matter of time before they pushed such CPUs (or derivatives) into laptop, desktop, and Mac Pro configurations, with power budgets above their mobile devices but still below Intel's processors.
  • LinkedIn blames bug for clipboard snooping discovered by iOS 14

    asdasd said:
    loopless said:
    Ascribing some sinister intent is paranoid. Some programmer probably thought it was a good idea to enable some capability, and the clipboard API has been public on iOS since forever. As long as the data is simply being checked, it's sort of absurd to assume that LinkedIn is somehow 'stealing' clipboard content for nefarious purposes. There are so many better ways to get data than some random clipboard text. No one ever thinks about this on desktop OSes like macOS and Windows, where applications are always checking the clipboard for pasteable content.
    I can’t think of any valid reason to check the clipboard at all. Checking for pastable content? Wait until they paste. 
    This means you haven’t thought through how applications actually work.

    If you have copy/paste support in your application, you’ll want the GUI to tell the user when each operation is possible. If no object is currently selected, the Copy menu item, button, or whatever control exists is shown as disabled until the user selects something.

    By contrast, whatever control the user sees for pasting (button, menu item, etc.) can’t accurately reflect whether pasting is possible without taking two things into account:

    1. Whether something is on the clipboard at all. If there is only ever one application visible on screen and it only ever interacts with itself, this can be tracked entirely locally, as long as the user never switches away from the application’s window. This assumes no background application can put something on the clipboard.

    2. The type of data on the clipboard. If you have a strictly text-based application with no graphics support whatsoever, there is no intelligent conversion from a picture on the clipboard into raw text, as one example. Clipboards in GUI OSes can hold more than one representation of the data at a time, but none of them may be in a format that makes sense for a given app, so it’s not enough to know that data is available for pasting; you also need to know whether it makes sense in the current context.

    Thus, it makes sense to check whether there’s data on the clipboard when transitioning to being the visible foreground app, or when displayed side by side with another application. But if your app is the only foreground app and there’s no reason to expect a background app to put data on the clipboard, constantly polling the clipboard makes no sense; checking each time the application becomes visible to the user makes perfect sense.
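
    A minimal UIKit sketch of that pattern, under my own assumptions (the `pasteButton` outlet, `pasteTapped` action, and `insertText` helper are hypothetical, not from any particular app): query the pasteboard only when the app returns to the foreground, and use the type-level check rather than reading the contents, so the data itself is only touched when the user actually pastes.

    ```swift
    import UIKit

    // A hypothetical text-only editor screen.
    final class EditorViewController: UIViewController {
        @IBOutlet private var pasteButton: UIButton!

        override func viewDidLoad() {
            super.viewDidLoad()
            // Re-check only when the app becomes active again; there's no need
            // to poll the pasteboard while we're the only foreground app.
            NotificationCenter.default.addObserver(
                self,
                selector: #selector(refreshPasteState),
                name: UIApplication.didBecomeActiveNotification,
                object: nil)
            refreshPasteState()
        }

        @objc private func refreshPasteState() {
            // hasStrings answers "is there text to paste?" without reading the
            // clipboard contents, which is all the enable/disable logic needs.
            pasteButton.isEnabled = UIPasteboard.general.hasStrings
        }

        @IBAction private func pasteTapped(_ sender: Any) {
            // Only on an explicit user action do we actually read the data.
            if let text = UIPasteboard.general.string {
                insertText(text)
            }
        }

        private func insertText(_ text: String) {
            // Placeholder for the app's real insertion logic.
            print("Pasting:", text)
        }
    }
    ```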
  • First Apple Silicon Developer Transition Kit benchmarks show Rosetta performance impact

    revenant said:
    seems highly unlikely that apple would show their hand and ship out the zippiest, or even a very zippy device. and I wonder, as these seem to have identifiers, might these developers lose their account as these are not supposed to be benchmarked?
    I was looking at that long string and thinking Apple has them all carefully recorded.

    I’d never post a benchmark like this unless I’d decided I didn’t care about ever working again, as that’s just asking to have your name blacklisted. And it’s not only that: it’s a matter of integrity, and once you’ve done something like this, you’ve clearly lost it.