Xavalon

About

Username
Xavalon
Joined
Visits
16
Last Active
Roles
member
Points
61
Badges
0
Posts
14
  • Mini LED 14-inch and 16-inch MacBook Pro production begins

    doggone said:
    entropys said:
    I would laugh if they don’t outclass Tiger Lake laptops, particularly the XPS 15 with RTX 3050s, but really they should aim at shutting down Razer and Asus with RTX 3070s.

    Might be hard to do an M2 if the next iPhone chip isn’t released yet. It is interesting that we haven’t heard any rumours about that SoC yet.
    Maybe it won't when comparing raw specs against a dedicated GPU. But really, who cares? It's just a pissing contest that has no bearing on the typical user. The point is that the M1 processor is already way more advanced than an x86 chip, with efficiencies across the board. A next generation of Macs that takes advantage of that will provide years of usage at performance way above the current Intel Macs, with the knowledge that Apple will continue to refine the OS to improve performance over time.

    Apple Silicon is already exceeding most of the competition with only its first-generation SoC. They are going to leave the others way behind with subsequent generations. Intel must be panicking.
    The competition didn't stand still. AMD's Ryzen 5000 mobile series, launched after the M1, has some impressive processors. For example, the Ryzen 5800U is also a 15-watt processor that is on par with the M1 in single-core benchmarks but faster in multicore benchmarks. And even Intel's 11th-gen series combined with a dedicated GPU has more performance than the M1. For the price of a MacBook Pro you can easily find an 11th-gen Intel laptop with an Nvidia 3050 or 3060, which is 4-5 times the current GPU performance of the M1 in GPU TFLOPS (rough numbers sketched after this post). Yes, the M1 is much more power-efficient, but why weigh that element so heavily for a desktop, NUC, all-in-one, or even a 15/16/17-inch laptop, where nobody expects 17 hours of battery life?

    The forthcoming 12th-generation NUC does come with a dedicated Intel GPU (between a 3050 and a 3060) with up to 16GB of GPU memory, supports up to 64GB of RAM, and has three M.2 NVMe SSD slots (two of them support speeds up to 7GB/s, about twice as fast as the M1 Macs). Why is such a mini-PC not a better option than the M1 Mac mini? It remains to be seen whether the M1X can beat such a configuration, especially on the GPU side. The world is not giving up on AMD or Intel yet. Just saying.

    Still, I am the Apple fanboy at home with everything Apple. I wanted to replace our "family" PC, primarily used by my wife and children, with a Mac mini, but my wife insisted that we buy a new Windows desktop. For now we keep the current NUC longer and wait for the 12th-gen NUC.
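    To put very rough numbers behind the GPU and SSD claims above: the TFLOPS and GB/s figures below are approximate public specs taken as assumptions, not measurements, so treat this as an illustrative back-of-the-envelope sketch only.

    # Back-of-the-envelope comparison; spec figures are assumed
    # approximations (they vary by TDP and clocks), not benchmarks.
    gpu_tflops = {
        "Apple M1 (8-core GPU)": 2.6,
        "Nvidia RTX 3050 Laptop GPU": 5.3,
        "Nvidia RTX 3060 Laptop GPU": 10.9,
    }
    ssd_gbps = {
        "M1 Mac internal SSD": 3.4,            # assumed sequential read
        "PCIe 4.0 NVMe (12th-gen NUC)": 7.0,   # assumed sequential read
    }

    m1 = gpu_tflops["Apple M1 (8-core GPU)"]
    for name, tflops in gpu_tflops.items():
        # Print each GPU's rough FP32 throughput and its ratio to the M1.
        print(f"{name}: {tflops:.1f} TFLOPS, about {tflops / m1:.1f}x the M1")

    ratio = ssd_gbps["PCIe 4.0 NVMe (12th-gen NUC)"] / ssd_gbps["M1 Mac internal SSD"]
    print(f"NVMe vs M1 SSD sequential read: about {ratio:.1f}x")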
  • Windows on Apple Silicon is up to Microsoft, says Craig Federighi

    chasm said:


    Further, I don't see how MS prevents the emulator companies from developing their own virtual environments to run Windows within Big Sur on M1. The Crossover people have literally already done this (though it's early days), and VMware and Parallels have both committed to it. What's more, even in an emulation environment, typical Windows apps are likely to run at least a bit faster on an M1 Mac than they do on all but premium native Intel hardware -- that's how much performance headroom the M1 has, and remember, it's not even the yet-to-be-announced pro-level M-class chip!

    I think it would be wonderful if Microsoft, Apple, and the emulation companies worked together once again to make the Windows experience on the Mac usually better than it is on actual Intel hardware -- there's no reason for MS to be married to Intel for life, and frankly Intel's decade-long stumble has really been holding back every company that relies on Intel chips. This is a golden opportunity not just for TSMC and AMD, but for new chipmakers to emerge now that it has been shown that huge performance increases and efficiencies are still possible -- once you ditch Intel!
    ARM has been around for decades; in fact, the architecture's development started in 1983 (!). ARM originally stood for Acorn RISC Machine: Acorn Computers started in 1983 with the development of their own CPU with a RISC-based instruction set. Acorn had several computers on the market in the '80s, and the Acorn Archimedes in 1987 was the first to use the self-developed CPU (the first ARM CPUs). Around 1990 Apple and Acorn started developing this CPU further. At the end of 1990 ARM Ltd was founded, with Apple and Acorn as equal shareholders. So Apple has been directly involved in ARM for over 30 years.

    Yes, ARM is dominant in the mobile space. But the fact that Apple created the M1 30 years later doesn't mean other CPU architectures are doomed. Every architecture has its pluses and minuses. The M1 is far from perfect; yes, it is perfect for the use cases of the average user. The fact that Apple doesn't sell the M1 to other manufacturers like HP, Dell, etc. means that those manufacturers will continue using what is available to them, either ARM, Intel, AMD, or other CPUs on the market, whatever fits the purpose. Yes, I believe the M1 is a wake-up call for Intel, but they will come back. In fact, Intel is about to outsource manufacturing of 7nm CPUs to TSMC.

    The M1 does not necessarily beat AMD x64 CPUs in performance, as its multicore performance is still behind the AMD Ryzen 4800/5000 series. AMD is well positioned to move to 5nm sooner rather than later and has even greater experience in CPU design than Apple (which, frankly, has a relatively small team focused on just one CPU design). Also, the M1 is not necessarily faster than the Intel 11th-gen series on 10nm. The decision to integrate everything into one SoC, with unified memory, brings limitations for some use cases such as gaming. We will see where Intel and AMD stand in the future against Apple. Apple has always been good at its software in combination with its hardware. That is no different with the M1 and macOS on ARM. This alone, and the fact that they did a truly amazing job with the M1, means that many users will be satisfied and that it meets their use cases.

    To end with a note: already in the '80s, the graphics, and even the memory, became part of the CPU. 30-40 years later we are coming back to that same principle.