Rayz2016

About

Banned
Username: Rayz2016
Joined:
Visits: 457
Last Active:
Roles: member
Points: 18,422
Badges: 2
Posts: 6,957
  • Apple pays Samsung estimated $950M for missing OLED purchase targets

    netrox said:
    I am not sure how you get penalized for not meeting the minimum purchase. Isn't it customary to buy ALL of them to avoid being penalized? Can't those extra screens be saved for repairs when needed? Apple could repair for a lot less and their loss would be minimized. 


    Apple gets penalised because Samsung will have spent millions on materials to build the screens. Samsung probably can't sell them on, because no one else can afford them or needs them. 
     Apple won't want them either: they'd need to store them, and they might decide to go for a better screen next year, in which case how would they unload them? 

     Sounds unfair, but it’s not. It may surprise you to learn that there is probably a clause in the contract that means Samsung would have to pay millions in penalty fees if they wanted to deliver the screens earlier than planned. 

     I used to develop supply chain software, and yes, companies will fine suppliers who deliver before they’re ready for it.
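    As a rough illustration of how such a take-or-pay clause works, the penalty is typically a function of the shortfall against the committed minimum. All figures and names below are hypothetical, not actual contract terms:

```python
# Hypothetical take-or-pay penalty calculation. The unit counts and
# per-unit rate are illustrative, chosen only so the result matches
# the estimated $950M figure in the headline.

def purchase_penalty(committed_units: int, purchased_units: int,
                     penalty_per_unit: float) -> float:
    """Penalty owed when purchases fall short of the committed minimum."""
    shortfall = max(0, committed_units - purchased_units)
    return shortfall * penalty_per_unit

# e.g. commit to 100M panels, buy only 80M, at a $47.50/panel penalty
owed = purchase_penalty(100_000_000, 80_000_000, 47.50)
print(f"${owed / 1e6:.0f}M")  # prints "$950M"
```

    Note the `max(0, …)`: buying more than the minimum incurs no penalty, which is why buying ALL of them, as netrox suggests, avoids the fee entirely.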
  • Apple Silicon MacBook Pro migration starting in late 2020, new model in late 2021 says Kuo...

    @Xed

    With all due respect, it is less "knowledge of business, supply chain and vertical integration" that gets bandied about by journalism graduates who have never worked a day in engineering, manufacturing, product management etc. than it is Apple boosterism. Simply put: do you folks honestly think that Apple will be the first company to make their own CPUs? Because if you want vertical integration, Samsung has that in spades. They make the CPUs, RAM, SSD, screens and cameras! And unlike Apple, Samsung actually MAKES these components in their own factories and foundries where "Apple Silicon" is actually made by TSMC (and was previously by Samsung). And before Samsung, IBM and Motorola used to make their own components too. 

    What makes this statement so hilarious (aside from obvious factual errors) is that it's very similar to a statement you were schooled on several months ago. Let's revisit it shall we?

    https://forums.appleinsider.com/discussion/comment/3214174/#Comment_3214174

    […] Seriously, I wish people will give this up. It isn't their area of expertise. CPU and SOC design isn't easy. If it was, everyone would do it. Instead there are only two CPU companies on the planet and have been for decades - Intel and AMD both of whom use the same base x86 design - and the number of SOC companies isn't that much bigger (and again they all start from the same base ARM Holdings design). Sun Microsystems, Motorola and IBM, who were all making CPUs or SOCs as recently as the 1990s? RIP. Also there is this whole "application compatibility" thing. Macs run on x86 just like Windows and Linux computers. Result: while you have to tweak for the different OSes and such, all "PC" applications are developed for the x86 instruction set. If Apple wants to use their ARM SOCs for MacBooks or develop a wholly new SOC/CPU, all those developers would have to port, rewrite or create from scratch their x86 applications or Apple would have to emulate x86 (more on this later). 

    That was you, wasn't it?

    The reason I bring this up is that you're displaying the same lack of knowledge that led you to believe that Apple didn't have the nous to create its own chips (when it had already been doing this for years) and that "all those developers would have to port, rewrite or create from scratch their x86 applications."

    Okay, let's begin:

    Simply put: do you folks honestly think that Apple will be the first company to make their own CPUs? Because if you want vertical integration, Samsung has that in spades. They make the CPUs, RAM, SSD, screens and cameras! And unlike Apple, Samsung actually MAKES these components in their own factories and foundries where "Apple Silicon" is actually made by TSMC (and was previously by Samsung). And before Samsung, IBM and Motorola used to make their own components too. 

    I see we've gone from "Apple will never come out with their own chips" to "do you folks honestly believe Apple will be the first company to make their own CPUs?" – which no one actually said.

    That's quite a switcharound, but moving on …

    Because if you want vertical integration, Samsung has that in spades. They make the CPUs, RAM, SSD, screens and cameras! And unlike Apple, Samsung actually MAKES these components in their own factories and foundries 
    Yeesss, you see what you've done here is mistake "vertical integration" for "manufacturing parts". Samsung doesn't have vertical integration because they're missing one of the key components: the operating system. The main reason Apple is doing this is so they can optimise how the system architecture interacts with the OS. Samsung can't do that, because they're using someone else's operating system running on someone else's chip design, executing apps written in someone else's programming language optimised for someone else's virtual machine. 
    Apple doesn't do manufacturing because there's no money in it for them. They'd be stuck with manufacturing facilities that they would need to retool every time they changed product designs. 

    Having said that though, Apple does spend billions on manufacturing processes and robotics. Many of the factories used for assembly are using automated machinery owned by Apple. They also are very proud of the robots they use to strip and recycle their products which allows them to reuse materials for new products.

    We get it: iOS is faster than Android and Apple Silicon is faster than Qualcomm (and Exynos, MediaTek and Kirin). But that doesn't translate everywhere. Allow me to say that I have long been a fan of RISC, which ARM is a subset of. I remember when Sun SPARC and Motorola 68xxx UNIX workstations and servers could crush anything that Wintel was capable of. I have also been keeping up with ARM-based servers, which some quarters have been hyping for years. Linus Torvalds claims that Mac switching to ARM will be the catalyst for ARM-based servers really taking off.
    At this point, it's worth noting that Apple's architecture doesn't use the ARM reference designs, only the instruction set. This is an architecture built from scratch, so there is little point comparing what they're doing (trying to do) with existing ARM reference chips. The question is, will Apple think there is enough money to be made by taking their chip design knowledge and applying it to the cloud server market?

    But please know this: not even Apple claims that their 5 nm A14 chip will outperform the 10 nm Intel i9 or even the i7. They merely claimed that the iPad Pro beat an unspecified MacBook (i9? i7? even i5?) on some internal tests. So keep these 3 things in mind.

    Well first, you keep one thing in mind: the iPad Pro already outperforms 80% of the desktop PCs in use today, which is weird considering it's running a chip that isn't optimised for desktop use and is already quite old.

    https://www.tomsguide.com/us/new-ipad-pro-benchmarks,news-28453.html

    Okay, carry on:

    1. The MacBook Pro runs tons of heavy duty performance software that the iPad Pro can't run at all rendering that test worthless for people with serious computing needs.
    2. The i9 isn't even Intel's most powerful chip. The Xeon, which goes in the Mac Pro, is.
    3. Intel won't be at 10 nm forever. AMD is at 7 nm, after all, and is expected to reach 5 nm as early as 2021.

    Okay, I don't know how many times Apple has to say this before it sinks in:

    The architecture used in the current test hardware is not the one that will be used in the first shipping devices. Apple has given folk a modified iPad Pro in a Mac mini case. This does not represent the ability and performance of the final design. It doesn't even have Thunderbolt, which we know Apple will be supporting going forward.


    What you aren't considering: "Pro" users whose computing needs tend to the ultra-high performance scale make up a tiny percentage of Mac sales. We already know that Mac is willing to give up the similarly tiny percentage of Windows (bootcamp and virtualization) users. The switch to ARM may mean that Mac is willing to give up workstation crowd too. (Because, er, making workstations will mean that Apple CAN'T put the same chips in workstations that they put in iPhones, ok? There are power/heating/expense constraints that smartphones have to work within. Make custom Apple Silicon to run in workstations? Yeah ... that's a worthwhile expense for the 250-500k Mac Pros that they sell a year. Not to mention it would drive up the cost.) 

    Again, your logic fails at the first hurdle:

     (Because, er, making workstations will mean that Apple CAN'T put the same chips in workstations that they put in iPhones, ok? There are power/heating/expense constraints that smartphones have to work within. Make custom Apple Silicon to run in workstations? Yeah ... that's a worthwhile expense for the 250-500k Mac Pros that they sell a year. Not to mention it would drive up the cost.) 


    Jesus Henry Christ on an ecoScooter … :-( 

    Okay, let's try again:

    Apple is not using the same chips in the phones and iPads that they're going to use in the desktops. They are designing a new line of chips optimised for desktops and workstations. They are not going to drop an iPhone chip into a desktop.

    The purpose of Mac switching to ARM may well be to increase convergence with iPad and iPhone users, not to match Intel i9s and Xeons on computing power.

    Well, given your track record so far, I think I'll just wait and see. Apple's obsession with AR means they're going to need some pretty hefty architectures going forward.

  • Apple Silicon MacBook Pro migration starting in late 2020, new model in late 2021 says Kuo...

    rain22 said:
    Things are going to get more expensive as Apple squeezes its customers through software - and will justify higher hardware prices soon after. 
    Apple users already pay 30% more for everything - software, ringtones, in-game purchases, subscriptions, books, hardware, cables, hotel rooms, you name it... 
    Once Apple has full control over its users - $700 coasters are going to look like a deal. 
    Put down the crack pipe, you've had quite enough. 
    You must be pretty new to the world - definitely Apple's history. 
    So what you're saying is that no other store (Google Play, Amazon, Xbox whatever) takes a 30% cut?
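    For what it's worth, the store-cut arithmetic is the same on every platform. A quick sketch with illustrative prices (the 30% figure is the commonly cited headline commission, not a claim about any one store's exact terms):

```python
# Illustrative store-commission arithmetic. The 30% default is the
# widely reported headline rate across app stores; the prices and the
# reduced 15% rate shown below are examples, not specific terms.

def developer_net(list_price: float, store_cut: float = 0.30) -> float:
    """Amount the developer receives after the store's commission."""
    return round(list_price * (1 - store_cut), 2)

print(developer_net(9.99))        # prints 6.99
print(developer_net(9.99, 0.15))  # at a reduced 15% rate: prints 8.49
```

    The point being: the cut is taken by the storefront, whichever storefront that is, so it isn't an "Apple tax" in any meaningful sense.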
  • Apple silicon Macs to support Thunderbolt despite shift to ARM

    4min33 said:
    melgross said:
    rcfa said:
    People forget that Apple and Intel developed TB TOGETHER. It’s not like a PROTOCOL is depending on a specific CPU 🤦🏻‍♂️
    See, this is interesting. Apple is saying that they developed it together. But shortly after the technology came out, Intel said that it wasn't true. They said that Apple came to them with the idea of a fast port, but that Intel did all the work, and that Apple had nothing to do with the development. So this statement is interesting.


    I'm not sure where Intel made this statement, perhaps you can reference it?  I can tell you this: Intel's original Thunderbolt was a fiber-optically based wet dream. Apple decided to couple with Intel post-transition to make a high-speed port based on copper to supersede FireWire and then DisplayPort, and both parties shared the awesome amount of work on the physical layer it took to make Thunderbolt happen. Apple, in the interest of unification and end-user simplicity, also then authored most of the USB-C phy specification, after never having been part of the USB coalition, by donating the IP around signal muxing (auto-polarity), legacy DP support, guest protocols, and power control (not exhaustive). If you download the USB-C spec there is a large number of Apple authors, where before there were none (although they did help update USB 3 to 3.2). Apple has been driving these interfaces in conjunction primarily with Intel for a long time; they know as much as anyone about the issues.
    Interesting. 

    In 2015, there was a lot of speculation that Apple was largely responsible for the USB-C spec. 

    https://www.cultofmac.com/321363/apple-patent-explains-how-usb-c-will-make-every-other-connector-obsolete/


    John Gruber reckoned that Apple wanted to keep its involvement low-key to further the spec's chances of adoption. 
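    The "auto-polarity" mentioned in the earlier post refers to USB-C orientation detection: the plug terminates one of the two CC (configuration channel) pins, and the host reads which one to learn which way up the connector was inserted. A toy model of that logic (the function and return-value names are mine, not from the spec):

```python
# Toy model of USB-C plug-orientation detection via the CC pins.
# In a real Type-C port, the source drives pull-ups on CC1/CC2 and the
# attached sink's pull-down appears on exactly one CC pin per
# orientation. The pin names follow USB-C convention; the code itself
# is purely illustrative.

def plug_orientation(cc1_terminated: bool, cc2_terminated: bool) -> str:
    """Infer plug orientation from which CC pin sees a termination."""
    if cc1_terminated and not cc2_terminated:
        return "unflipped"   # termination on CC1: normal orientation
    if cc2_terminated and not cc1_terminated:
        return "flipped"     # termination on CC2: plug inserted flipped
    if cc1_terminated and cc2_terminated:
        return "accessory"   # both terminated: debug/audio accessory modes
    return "unattached"      # neither terminated: nothing plugged in

print(plug_orientation(True, False))   # prints "unflipped"
print(plug_orientation(False, True))   # prints "flipped"
```

    Once orientation is known, the port controller muxes the high-speed lanes accordingly, which is how the reversible connector works without duplicating signal paths.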

  • Apple is still working on under-display optical Touch ID reader

    LeoMC said:
    Fingerprint scanner makes no sense...
    Maybe the patent is for an AI-related sensor that senses the presence and fires up the Face ID, unlocking the phone, even before one reaches for the device.
    That's the stuff Apple is (should be) thinking about, not resurrecting ancient features.
    The problem with Face ID is face masks. The Japanese have been wearing them in public for years to make sure they don't spread germs if they're out and about with a cold. Now that mask-wearing is becoming something of the norm, Face ID has a bit of a problem. 