thadec

About

Username
thadec
Joined
Visits
18
Last Active
Roles
member
Points
469
Badges
0
Posts
97
  • Mac Pro in danger after fumbled Apple Silicon launch

    dewme said:
    Just checked Apple’s leadership team profiles on https://www.apple.com/leadership/. Did I miss something? I don’t see Mark Gurman’s profile anywhere on that site. Where does he sit in Apple’s leadership team that decides the execution of Apple’s product strategy?
    Good grief. Three points here.
    1. Apple has released four Mac Pros in the entire line's existence: 2006, 2013, 2019 and 2023. No idea why anyone ever expected the shift to Apple Silicon to mean the line adopting some 2 or even 3 year refresh schedule, especially when you consider that the Mac Studio - and, according to some people, the 16" MacBook Pro - can replace the Mac Pro at a lower default price for a ton of users.

    2. Intel and AMD aren't done. The 5nm Threadripper PRO 7985WX that AMD will announce in September will max out at 64 cores. And in September 2025, Intel is going to announce a Core i9 - a regular desktop chip, not a workstation or server chip - with 40 cores. Apple knows what Intel and AMD are doing. (They also know that a ton of the Mac Pro customer base has since switched to Linux - before you snark "Linux has about 100 users," take a good look at the Mac Pro sales figures - and this is what they will be using.) Since this group doesn't care nearly as much about power efficiency as Apple fans think they do (or should), Apple knows that keeping up with Intel and AMD in this space will be a challenge.

    3. Why? I have been saying this on this board for some time, and this is why the "Apple bungled the Silicon switch" argument is wrong: Apple's main problem is that they have only one basic CPU core design. Give them credit: they get a lot out of it. They reuse old performance cores as efficiency cores, and for the performance cores they vary the number and the power draw, from 4 in a low power mode in a thin, inexpensive fanless device like the M1 iPad Air to 24 cores that they pump massive power into in the M2 Ultra Mac Pro. But still: they use an outdated version of this core as the efficiency cores and the modern version as the performance ones. Intel, for example? Does not have this problem.
    Intel has an actual efficiency core that is fundamentally different from their performance cores (1 thread instead of 2, and for now it doesn't support the AVX-512 ISA). They also have Core i3, Core i5, Core i7, Core i9 and Xeon cores, for 6 total counting the e-core. (They also had Pentium and Celeron, but starting with 13th gen they ditched them in favor of the faster e-core.) AMD? Same. Athlon/A-Series (which seems to have met the same fate as Intel's Pentium and Celeron), Ryzen 3, Ryzen 5, Ryzen 7, Ryzen 9, Threadripper, Epyc. The Core i9/Ryzen 9/Threadripper/Xeon/Epyc batch uses a ton of power - though notably not so much in the Core i9/Ryzen 9 laptop versions - but Apple simply cannot match the single core performance of Core i9/Ryzen 9 and above. And no, this isn't x86 Windows fanboy nonsense; Apple themselves have admitted the same. Yes, the 3nm M3 Ultra will beat anything that Intel and AMD have ... because Intel will be on 7nm (10nm for desktop/workstation/server) and because AMD, though on 5nm, hasn't adopted big.LITTLE on desktop yet (their big.LITTLE chips for laptops and handheld gaming consoles are just now coming out). But when Intel and Apple both get to 3nm in 2025?

    Yes, Apple could spend billions in R&D on developing cores to compete with Ryzen 9 and above. Or, to save time and money, they could license the Neoverse ARM server cores and customize them. But why? Remember: Apple only sells 25 million Macs in a good (actually great) year, with the vast majority being M2 Pro and below. Were Apple to develop a Mac Pro capable of beating Linux workstations running the AMD Threadripper Pro in single core and multicore performance, how many more people would actually buy it? Apple knows the answer to this question already, even if the many Apple fan sites out there aren't going to admit it. Consider that even were Apple to start selling 10 million workstations a year, you would see Nvidia, Qualcomm and the rest react with their own ARM workstation chips. Nvidia would release a scaled down version of their Grace server/cloud SOCs - which use Neoverse V2 cores and their excellent GPUs - for workstations the very day Apple provides evidence of a market for it. There is even an ARM Ubuntu workstation and server stack already in place to take advantage of it.

    Sorry guys, while I do buy and use Apple products, I am not an "Apple knows best" sort by any stretch of the imagination. But this is one instance where you are flat out wrong. Making a competitive Mac Pro requires making a competitive workstation CPU, and even if Apple could make one, it wouldn't benefit them financially or in terms of market position. So were Apple to abandon the Mac Pro down the line, big deal: Google and Intel both exited the PC market when they were faced with the same set of facts.
  • Intel just took the worst beating in earnings in over a decade

    All right folks. Time to get out of the kiddie pool delusions and join the adults in reality.
    1. Intel's decline in 2022 is due to the recession and the stuff going on with China, Russia and Ukraine. We know this because it impacted iPhone sales too, which were down 15% year-over-year.

    2. Even in one of the worst years in history, 286 million PCs were sold in 2022. Even in one of their best years in history, Apple accounted for only 23 to 28 million of those, depending on who you believe. Of the remaining 260 million or so PCs that did sell, Intel had 70% to AMD's 30%. Intel projects that this is going to rebound to 300 million next year. Analysts are railing (from their MacBooks and iPhones) that Intel is crazy to project this, but the 3 year replacement cycle for devices that were bought at the start of the pandemic ends in 2Q 2023, and the enterprise "refresh the hardware we bought during the pandemic" cycle will last through 2024.
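
    To put rough numbers on that split (a quick back-of-the-envelope sketch; the 286 million, 70/30 and 23-28 million figures are the estimates above, not measurements):

    ```python
    # Back-of-the-envelope math on the unit figures cited above.
    total_pcs = 286_000_000                      # PCs sold in 2022
    apple_estimates = (23_000_000, 28_000_000)   # Mac estimates vary by analyst

    for apple in apple_estimates:
        remaining = total_pcs - apple
        intel = remaining * 0.70                 # Intel's share of non-Apple PCs
        amd = remaining * 0.30                   # AMD's share
        print(f"Macs {apple / 1e6:.0f}M -> Intel ~{intel / 1e6:.0f}M, AMD ~{amd / 1e6:.0f}M")
    ```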

    3. Intel did take a big hit in servers ... but from AMD, not ARM. Despite all the hype that you have heard - 99% of it typed on MacBooks and iPad Pros since November 2020 - ARM has only 3% of the server business. Marvell left the ARM server business because they weren't making any money, leaving Ampere as the last major player there. (And Ampere's "major" means "fewer than 1000 global employees.") Granted, Nvidia will join Ampere in a few months. But AMD released their response to ARM servers in 2022. Intel's response won't come until 2024, but it will be mighty substantial: Sierra Forest, a Xeon with 334 efficiency cores on a 6nm process. It removes the sole advantage that ARM servers have over x86, which is better power efficiency per thread.

    4. Apple Silicon fans do their best to evade this, but even Apple acknowledges that Intel has superior single core performance. The M1 Ultra scored 1780. The M2 Pro? 1952. Fine, but the last gen Intel Core i9-12900K (1986) as well as the current gen Core i9-13900KS (2286), Core i7-13700K (2107), and Core i5-13600KF (2011) all clearly beat it. You should know that the last of those ships in a budget gaming desktop (16 GB RAM, 1 TB SSD, NVIDIA GeForce RTX 3060) that costs $1200. Yes, the 3nm M3 will come out this year. No, it will not beat the 10nm Core i9-13900KS. It won't beat the 14th gen Core i7 that will be out by the end of the year either.
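
    Lining those scores up (a sketch using the Geekbench figures quoted above; runs vary, so treat the deltas as approximate):

    ```python
    # Single-core ladder built from the Geekbench scores quoted above.
    scores = {
        "M1 Ultra": 1780,
        "M2 Pro": 1952,
        "Core i9-12900K": 1986,
        "Core i5-13600KF": 2011,
        "Core i7-13700K": 2107,
        "Core i9-13900KS": 2286,
    }
    baseline = scores["M2 Pro"]
    for chip, score in sorted(scores.items(), key=lambda kv: kv[1]):
        delta = score / baseline - 1             # relative to the M2 Pro
        print(f"{chip:>16}: {score} ({delta:+.1%} vs M2 Pro)")
    ```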

    5. This makes the "Combined they are aiming for 1.8x improvement in performance-per-watt. That only brings them to where Apple is now if they deliver" line wishful thinking. Intel's 10nm chips are already beating the M2 Pro. This means that when Intel reaches 3nm, they will be able to make 28W chips (the M2 Pro Mac Mini's TDP is around 26.5 W) with the same performance as what Apple offers, as well as 75W-125W chips that will crush it. How? Simple: where Apple currently has only 1 performance core design (used for smartphones/tablets, laptops and PCs), Intel has 5: Core i3, Core i5, Core i7, Core i9 and Xeon. (This is down from 7, as Intel ditched Celeron and Pentium.) Core i7 and Core i9 are bigger, meaning better performance but worse efficiency. Core i5 and Core i3? Smaller cores for better efficiency but worse performance. As Intel shrinks the die from 10nm to 3nm, they will keep the 125W base for Core i9 and Core i7 ... but reduce it for Core i5 and especially Core i3. The Intel Core i3-13100F, for example, has unofficial single core scores of about 1700, or 95% of the M1 Ultra. They only need to maintain that while significantly decreasing the TDP for certain Core i3 and Core i5 versions. Granted, this will occur in the laptop versions of the chips - for desktops their competition will be AMD's Ryzen 5 and Ryzen 7, not Apple - but that doesn't matter, because mini-PCs like the Intel NUC Extreme that are going to compete with the Mac Mini and Mac Studio will use laptop chips anyway. Wherever Apple is going to be in 2025 with 2nd gen TSMC N3E, Intel will be able to come up with a (likely) Core i5 to match it in 2026.
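
    To make the arithmetic behind that last step concrete (a naive sketch; the scores and wattages are the rough figures above, and single core score per TDP watt is a crude proxy for efficiency):

    ```python
    # What TDP would an i3-13100F-class part need to match the M2 Pro's
    # performance-per-watt? Quoted scores and package-power estimates,
    # not measurements; single-core runs don't draw full TDP anyway.
    m2_pro_score, m2_pro_watts = 1952, 26.5   # M2 Pro Mac Mini, approx package power
    i3_score, i3_watts = 1700, 58             # i3-13100F unofficial score, base TDP

    m2_ppw = m2_pro_score / m2_pro_watts      # ~73.7 points per watt
    target_watts = i3_score / m2_ppw          # TDP needed to match that efficiency
    print(f"M2 Pro efficiency: {m2_ppw:.1f} pts/W")
    print(f"i3-13100F today:   {i3_score / i3_watts:.1f} pts/W at {i3_watts} W")
    print(f"Matching Apple means holding ~{i3_score} at a ~{target_watts:.0f} W TDP")
    ```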

    So long as Intel is inside 200 million Windows and Linux PCs sold each year, they aren't going anywhere. 
  • M2 Ultra benchmarks show performance bump over M1 Ultra


    Double the Intel performance — ouch, indeed!

    Way to go, Apple. Keep up the good work.
    Someone should sell stickers that say "Apple Inside."
    When the comparisons aren't being made against a 4 year old chip made on a 9 year old process node that was retired 2 years ago:
    https://wccftech.com/apple-m2-ultra-soc-isnt-faster-than-amd-intel-last-year-desktop-cpus-50-slower-than-nvidia-rtx-4080/

    And even the Intel 13900K and AMD 7950X are 2022 chips. Also, neither is a workstation chip. A comparison with the AMD Threadripper 7000 and 7000 Pro that get released in September, which will be made on the same node as the M2 Ultra, won't be particularly favorable. The M3 Extreme - which I am betting Apple is going to launch on TSMC's 2nd gen 3nm process in 2025 - will be needed to compete, except that in 2025 the AMD Threadrippers made on TSMC's 1st gen 3nm process will be out, as will - and this is a worst case scenario for them - Intel's 5nm desktop and workstation chips.

    Going forward, Apple is likely going to de-emphasize direct comparisons between Apple Silicon and Intel - note that they have avoided mentioning AMD altogether - in favor of comparisons with previous generations of Apple Silicon. In a way that will be appropriate. The software that most people are going to run on the Mac Pro and Mac Studio is going to be so different from the software that most people are going to run on Threadripper and Xeon-W workstations and servers that direct comparisons will be impossible anyway.
  • Mac Studio may never get updated, because new Mac Pro is coming

    rob53 said:
    So why release the Mac Studio in the first place then? 
    To try and pacify Mac Pro users. If I remember correctly, the Mac Studio outperformed the current Intel Mac Pro for much less money. You could build a faster Mac Pro, but by the time you did, you could have purchased 2-3 Studios. Apple engineers couldn't come through with a chip (or chips) that would justify the Mac Pro product line - and still haven't. The Mac Studio was introduced in March 2022 (I thought it was older than that), and now it appears the Mac Pro might be released in spring, which would put it a year after the Studio. From other reports, the Mac Pro will use the same oversized box as the Intel Mac Pro, which I hope isn't the final Mac Pro product. Stack two Studios on top of each other and that should be enough room for a Mac Pro with 2-4x the power of the Studio. 

    Geekbench scores still put the Studio above the fastest Intel Mac Pro in single and multi-core benchmarks. The Metal benchmark still gives the Ultra a score just under 100K, with several faster AMD Radeon graphics cards, mainly used in the Mac Pro, ahead of it. 
    The problem is that the current Intel Mac Pro performs at about the level of what an AMD Ryzen 7 7700X currently does and what an Intel Core i7-14700K (coming 4Q this year) soon will. And that doesn't even get into what current Ryzen 9 and Threadripper chips (and their Intel counterparts) can do, or what they will do in 2024 when AMD hits TSMC 3nm and Intel hits what is equivalent to TSMC 5nm. 

    See, that is the thing. In 2020 you were comparing Apple Silicon to chips in $799 and $999 devices that had been released a year or two prior. So folks were impressed. Nobody is going to be impressed that a $6000 machine released in 2023 has a CPU that beats the Xeon W-3235 from 2Q 2019. By comparison, 2019 is when the A13 for the iPhone 11 came out, and you all know what the iPhone 15 and its A17 are going to do to the iPhone 11. To put it another way, beating the Xeon W-3235 is not going to prove that Apple is capable of making a competitive workstation with their own chips. You will be able to get Xeon W-3235 (and similar AMD Threadripper 3960X) workstations on eBay from people who will want to get the AMD Threadripper 7000 in a few months. I am going to restate this: nobody cares that more than a few AMD and Intel chips outperform the M2 (including the Pro and Max), because the Mac Mini starts at $599. That is why DigitalTrends is able to call the Mac Mini "the best mini-computer ever" in an article that acknowledges that there are faster machines available https://www.digitaltrends.com/computing/apple-mac-mini-review/ (and only gives as an example an Intel-based mini PC, knowing full well that the 6nm AMD Ryzen 9 6900HX mini PCs crush the 10nm Intel Core i9-12900K in performance, graphics and performance per watt). But for a machine that will cost 10 times as much as the Mac Mini, the fact that there are a number of faster options available that cost way less won't be nearly as easy to brush aside.

    It is going to be interesting to see how Apple deals with this issue. Maybe they will be able to get away with a first gen Mac Pro that isn't competitive - and yes, they do need to get it out the door this year in order to finally declare the transition from Intel "done," as they are already behind schedule and, as mentioned earlier, AMD and especially Intel workstation (and upper echelon desktop) chips are going to make huge leaps in performance between now and 2025 - but the 2nd gen and beyond are going to have to be. And as I said in the comment directly preceding this one, no, the Mac Pro isn't primarily for Hollywood people who don't care about having the very best performance. Hollywood shifted to Linux server farms for rendering, 3D animation etc. ages ago, and anything that Hollywood would do on a Mac Pro today would make more sense to shift to the cloud. Instead, Mac Pros are used mainly by the workstation crowd who prefer macOS to Linux and Windows workstations. But if the Apple Silicon Mac Pros aren't competitive, then those folks are going to be forced to ditch their preferences, and the only ones left will be those who simply can't migrate their workloads from macOS to Linux or Windows (or the cloud).
  • Apple's muted 2023 hardware launches to include Mac Pro with fixed memory

    My dream version of the Mac Pro: Modular Design.

    Create a series of bricks having the same length and width (a la Mac Studio, OWC Ministack STX, etc.) but with different heights. Allow these to be vertically stacked to customize the server of your dreams. I would then create:
    • CPU - essentially an upgraded Mac Studio
    • RAID array module based on M.2 SSDs
    • RAID array module based on HDDs
    • PCI expansion modules (for graphics cards, scientific packages, etc.)
    • Power backup module (lithium battery for space)
    The Mac Pro 'special sauce' would be the integration between all the components. I'd make the footprint larger than the Mac Studio's to support the 312mm PCI cards and appropriate cooling fans. Extra credit for allowing multiple CPU modules to work together.

    Were Apple to go this route, I believe they could capture the majority of revenue associated with server hardware.

    The Mac Pro is not a server. Its hardware is only suitable for a small server, and macOS is not a server OS. Right now, the highest configuration Mac Pro has a CPU with only 28 cores, while most current ARM server CPUs have up to 128 cores. The Nvidia Grace CPU that will launch in a few weeks will have up to 144 cores, and its 2nd gen version will have 196 cores. AMD's Epyc server CPUs go from 128 cores to 192 cores this year with the move to 5nm, and will increase again in 2025 when 3nm Epyc CPUs debut.

    The Mac Pro is a workstation, not a server. There have always been plenty of workstations more powerful than the Mac Pro that cost significantly less, and when Apple switches the Mac Pro from Intel to Apple Silicon, this will become even more true. AMD's Threadripper Pro 5995WX is about 2.6x faster than the M1 Ultra. And unlike Epyc, Threadripper is a workstation chip. This means that even if Apple does eventually do an M2 or M3 Extreme, Intel's and AMD's real competition in workstations - not servers! - will be each other, not Apple. 
  • Future Mac Pro may use Apple Silicon & PCI-E GPUs in parallel

    Yeah ... Intel and AMD CPUs have supported both integrated graphics (AMD's RDNA 3 integrated graphics on their Ryzen 7040 laptop chips are comparable to an Nvidia RTX 2050) and discrete graphics through PCIe or Thunderbolt (and now M.2 slots) for who knows how long. Intel's drivers will even let an Intel Arc discrete GPU and an Intel Iris Xe integrated GPU be seen by the system as a single GPU. (AMD considered this idea but abandoned it.) And no, it isn't an x86 thing. Lots of Linux ARM servers use discrete GPUs. MediaTek and Nvidia wanted to bring discrete GPU support to ARM PCs around 2021 but abandoned it, because neither Microsoft (who has an exclusive Windows on ARM deal with Qualcomm that is explicitly designed to lock out MediaTek for Qualcomm's sake and ChromeOS/Linux for Microsoft's sake) nor Google (who just isn't very smart when it comes to stuff like this) was interested.

    So there never has been any reason for Apple Silicon Macs not to support discrete graphics via M.2, PCIe or Thunderbolt other than Apple simply not wanting to - the same reason Apple locked Nvidia out of the Mac ecosystem and had people stuck with AMD GPU options only: purely because they wanted to. My guess is that Apple believed they were capable of creating integrated GPUs that were comparable with Nvidia Ampere and AMD Radeon Pro, especially in the workloads that most Mac Pro buyers use them for. Maybe they are, but the issue may be that it isn't cost-effective to do so for a Mac Pro line that will sell fewer than a million units a year.
  • Apple could spend $5B on servers to catch up in AI race

    danox said:
    Yep, Apple’s gonna buy Nvidia AI servers, when they have everything in house to build their own and learn more from the experience. Nvidia, Intel, and AMD servers are barn burners. I don’t think Apple will go down that path, not at that wattage, when they can build their own without the need for additional coolant fluid.

    https://www.youtube.com/watch?v=5dhuxRF2c_w&pp=ygUMb3B0aW11bSB0ZWNo (13:12 mark) - high wattage and MHz at all costs. Apple isn’t on that path. Long term, Apple appears to be building up from ground level with a different solution.

    https://www.notebookcheck.net/Disappointing-Core-i9-14900K-performance-watt-vs-Ryzen-7-7800X3D-showcased-as-Intel-CPU-appears-to-guzzle-energy.761067.0.html

    https://www.reddit.com/r/hardware/comments/17esez1/exclusive_nvidia_to_make_armbased_pc_chips_in/ Oops, Nvidia wants to make ARM chips. I think that cancels them out.
    Good grief. Let me tell you this: Apple is a consumer products company. Apple doesn't even have a workstation chip good enough to compete with the latest Intel Xeon W, which itself (by Intel's admission) can't compete with the latest 96 core, 192 thread AMD Threadripper workstation chip. So where on earth are they going to get a server class chip design? Before you answer: Apple Silicon doesn't have anywhere close to the single core performance of an x86 server or even a generic ARM Neoverse V2 or V1 server (maybe a Neoverse N1/N2). As for multicore, Nvidia Grace chips have 144 cores, and in 1Q 2024 Intel Sierra Forest will have 196 cores. And that is just CPU. Where on earth is Apple going to get the parallel processing power from? (AI/ML uses GPUs as parallel processors.) The M2 Ultra only provides raw performance - no, we aren't talking about the hardware accelerators that are just good for video, audio and photo processing - equivalent to a midrange Nvidia card that you can pick up off the shelf at a retail store. Comparing it to an Nvidia Hopper data center GPU, or to the competing products that AMD has and Intel is trying to make, is hilarious. And that is just the hardware. Where are the software tools going to come from? You know, like Nvidia CUDA, AMD ROCm or even Intel's DPC++?

    You are aware that Apple doesn't use their own chips for their data centers, right? They use Intel Xeons - with some AMD Epyc - just like everybody else, and those servers run Linux, not macOS or some Apple home-grown server OS, just like everybody else's. Apple loves to talk about how their data centers run on green energy, but the tech that they use in them? Not so much. Just as they don't like to talk about how much they rely on Amazon AWS (plus some Google Cloud Platform for redundancy and data analytics) to deliver iCloud and their other cloud products. It would demystify everything, I guess. 

    So just like Apple was unable to design a 5G modem as good as Qualcomm's - and they weren't even starting from scratch, because they bought Intel's 5G division way back in 2019; they just re-upped with Qualcomm through 2026, so we are talking about at least 7 years here, meaning that they will finally have 5G done for 2027, right about the time that the industry is going to start shifting to 6G (testing begins in 2028 and commercial release in 2030) - they won't be able to snap their fingers and create their own genAI server stack either.
  • Apple insists 8GB unified memory equals 16GB regular RAM

    DrDumDum said:
    thadec said:
    As someone who does a lot of virtualization (Linux and Windows virtual machines in VMware, Parallels, etc.), I can say that 8 GB of RAM on the latest, fastest Apple CPU definitely is not analogous to 16 GB of RAM on an 8th generation Core i3 from 2017.
    I read both posts... but didn't see anywhere that you actually owned a base M1/M2 machine to test your "theory".

    A lot of people here are still stuck in the "Intel" mindset. I get it, it's hard to shake. I do tech consulting for graphic designers, and over the last 15 years, if I had a nickel for every time I said "there's no such thing as too much RAM" I'd be a billionaire.

    Happily, I don't have to anymore. I moved 15% of my older basic Intel clients to M1 Minis with 8/512 setups... dual screens + fast SSD external working drives... all loving the speed. Those with higher demands went to 16 GB... many came from 32 GB RAM setups. No issues. Watch for refurbs. 16/512 M1 Minis were $700. M2 should be the same.
    First off, I own an M2 Mac Mini. 
    Second, what "theory"? Virtualization is when you use hypervisor software to run a virtual computer on top of your computer's hardware and OS. No matter what CPU, architecture, OS, manufacturer, etc. you are talking about, it is the same, because the requirements of the operating system that you are running don't change. So you are still going to need 4 GB of RAM to run a Windows 11 VM. You are still going to need 2 GB of RAM to run a Linux VM. Period. Otherwise, your virtual computer's OS will run out of resources just as it would if you were to install the OS on physical hardware that's insufficient to handle the specs. If you don't believe me, go to ServeTheHome.com. They promote low end - meaning slow - hardware that supports a lot of RAM for virtualization servers all the time. A lot of people buy old - meaning slow - machines off Newegg and eBay to use as virtualization servers because they have a ton of RAM. But hey, don't take my word for it. Install Windows on VirtualBox or VMware Player and configure it to use 2 GB of RAM instead of 4 GB because "RAM on a Mac is 'effectively' twice that on a PC." See if your Windows VM even boots up.
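
    The bookkeeping is simple enough to sketch (the 4 GB and 2 GB guest minimums are the figures above; the 3 GB host reserve for macOS itself is my own rough assumption):

    ```python
    # Will these guests fit in host RAM? Guest minimums per the OS vendors'
    # stated requirements; the host reserve for macOS + apps is a rough guess.
    GUEST_MIN_GB = {"Windows 11": 4, "Linux server": 2}
    HOST_RESERVE_GB = 3  # assumption: RAM kept free for macOS and apps

    def vm_budget(host_gb, guests):
        needed = HOST_RESERVE_GB + sum(GUEST_MIN_GB[g] for g in guests)
        verdict = "fits" if needed <= host_gb else "does not fit"
        print(f"{host_gb} GB host, {' + '.join(guests)}: needs ~{needed} GB -> {verdict}")

    vm_budget(8, ["Windows 11", "Linux server"])   # the 8 GB base config
    vm_budget(16, ["Windows 11", "Linux server"])  # the 16 GB config
    ```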

    Even better: get a photo editor, whether Krita, GIMP, Inkscape or whatever (they are all free), and go download one of those high resolution images from NASA: https://photojournal.jpl.nasa.gov/target/Earth
    Try to edit one of those bad boys with 8 GB of RAM. Don't worry! Apple's executives claim that it is just as good as doing so with 16 GB of RAM, so it will work out fine!
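
    For a sense of scale: the editor works on the decoded bitmap, not the compressed file on disk, so the RAM cost is easy to estimate (a sketch; the dimensions below are "Blue Marble" class, and the copy count for undo/edit buffers is my own assumption):

    ```python
    # Rough RAM cost of editing one high resolution NASA image. Editors
    # hold the decoded bitmap (plus undo/edit copies), not the compressed file.
    width, height = 21_600, 10_800   # "Blue Marble" class dimensions
    bytes_per_pixel = 4              # 8-bit RGBA; high-precision modes cost 2-4x more
    copies = 3                       # assumption: original plus undo/edit buffers

    one_copy_gb = width * height * bytes_per_pixel / 1024**3
    print(f"One decoded copy:   {one_copy_gb:.2f} GB")
    print(f"{copies} copies in flight: {one_copy_gb * copies:.2f} GB")
    ```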
  • United States Apple Watch import ban has begun with no resolution in sight

    Marvin said:
    gatorguy said:
    What's the objection to sitting down at the table with Masimo and accepting their offer for a negotiation? Even the richest and most powerful of all companies should recognize when it's time to cut their losses, the lesson taken. 
    My guess is spite.

    It makes sense for Apple to do things in-house because they can scale better without relying on 3rd party components.

    He also says Apple could have avoided the ban by making the watches/sensors in the US, so that's an option Apple could use.

    I wonder how companies like Fitbit are getting around it as they have blood oxygen sensors too.

    The Masimo CEO says he's open to negotiating with them. I expect Apple will appeal first.

    If Fitbit can get away with having blood oxygen sensors, Apple can too.
    Samsung signed a number of deals with Masimo right about the time that they added SpO2 to their smartwatches: https://investor.masimo.com/news/news-details/2020/Masimo-and-Samsung-Partner-to-Package-Masimo-SafetyNet-with-Select-Samsung-Phones-to-Speed-COVID-19-Response-Efforts/default.aspx

    Also Masimo already won a $466 million lawsuit against Philips over this same tech: https://www.reuters.com/article/us-philips-masimo-verdict-idINKCN0HQ54W20141002/ and unlike Apple, Philips is actually a medical device manufacturer. Unfortunately for Philips, they are based in the Netherlands and not the U.S. More on this later.

    There are tons of other companies that make smartwatches offering SpO2 monitoring: Garmin, Withings, Oura (which offers it in their "smart ring"), FitBit, Google, Realme, Amazfit and OnePlus. If Amazfit can offer SpO2 in a $150 watch, then licensing this tech would have been cheaper than raiding Masimo for talent even if Masimo hadn't sued. So there is nothing to "get around." Just pay for the tech that you use, like everyone else.

    My guess: Apple is used to getting away with "sherlocking" like this. After the Obama administration overrode the ITC import ban that Apple's appropriation of Samsung's wireless technology https://www.forbes.com/sites/connieguglielmo/2013/08/03/president-obama-vetoes-itc-ban-on-iphone-ipads-apple-happy-samsung-not/?sh=67f3cb488c3d had earned iPhones and iPads, Apple clearly thought that they could just keep doing it. The best part: the Obama administration didn't even object to the ITC's ruling against Apple itself! Instead, their rationale was that an import ban on iPhones and iPads would harm the U.S. economy. (Compare Samsung having to fork over hundreds of millions to Apple over "trade dress" while Apple got off scot-free for just taking Samsung's wireless tech.) A writer for The Verge - in the comments, not in the article - pointed out that the Biden administration would be way less likely to side with Apple over another American company like Masimo than over a foreign competitor like Samsung. Note how Apple lost their patent fight against Qualcomm - no claiming otherwise, as the "out of court settlement" between the two was on the terms that Qualcomm demanded in the first place - and Qualcomm is also an American company. 

    And as for "doing things in-house": I know that Apple is trying to integrate as many components as possible into the SOC, but they are either going to have to come up with their own non-infringing implementations of those components or license the designs from Broadcom and everybody else. 

  • Without irony, Microsoft CEO says Google unfairly dominates search

    Not defending Microsoft or Google here, but I just want to point out that Microsoft is not suing Google (because they would lose). The feds are. The feds called Microsoft as one of the many witnesses for the prosecution. They aren't trying to prove that Google did anything illegal or is a monopoly with respect to Microsoft. Instead, they are using Microsoft as a single data point in a much longer case to prove that Google has either done something illegal or is a monopoly. Microsoft really didn't want to have anything to do with this, because they knew that Google would mock them for their massive failures in mobile and search, as well as remind people of the stuff that Microsoft does to leverage Windows. 

