name99

About

Username: name99
Joined:
Visits: 3
Last Active:
Roles: member
Points: 32
Badges: 0
Posts: 10
  • A7: How Apple's custom 64-bit silicon embarrassed the industry

    dewme said:
    Apple is like a marathon runner trying to establish a new world record- they are racing to achieve a personal goal that they have set for themselves. The fact that there are other competitors in the race is incidental. 

    If Apple was only racing to beat the competition they could get by with only being marginally better than their closest rivals. 

    Competition is good, but for companies (and people) who are focused on excellence rather than simply winning, competition isn’t a good enough benchmark to get you where you need to be. 

    Let the Samsungs and tech pundits of the world focus on the excitement of what they perceive as a race, which is in reality a breathless quest to see who’s going to finish in second place.
    This is, I think, a more perceptive comment than most realize. 
    I've frequently talked about the difference between Technical and Financial companies. Technical companies don't fear the future, don't fear change, because they are confident that they can make and direct change, that whatever comes next will be to their benefit. Finance companies are terrified of change because all they feel confident in is the ability to keep doing what they're doing today. 

    Few companies, even those nominally in "Tech", are Technical companies, companies that aren't scared of change, scared of disrupting themselves. nVidia kinda sorta. AMD right now. Intel back in its glory days, from about the 386 to about Sandy Bridge.
    When engineers at Apple propose "OK, for the A15 our goal is another 25% performance improvement, and we figure let's add SVE2 to extend our wide vectors to integer support", management is saying "Right on! Make it so!". 
    When engineers at Intel propose "OK, 7nm is so dense, how about we put AVX512 and 4 cores everywhere", immediately the discussion turns into "well, how will that affect our Xeon sales? And what if it reduces the prices we can charge for mid-range? How about you cripple the AVX512 support by making the scatter/gather really slow? Or how about we give them lots of cores, which sounds good in the ads, but we don't give them enough DRAM controllers to actually run all those cores optimally?"
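
    To put rough numbers on that last point, here is a toy Python sketch; the channel count and per-channel bandwidth are illustrative assumptions, not actual Intel specifications:

        # Illustrative numbers only -- not actual Intel specifications.
        # Assume dual-channel DDR4-2666: ~21.3 GB/s per channel.
        channels = 2
        bw_per_channel = 21.3  # GB/s, an assumed figure
        total_bw = channels * bw_per_channel  # ~42.6 GB/s shared by all cores

        for cores in (4, 8, 16):
            print(f"{cores} cores -> {total_bw / cores:.1f} GB/s per core")
        # Doubling the core count without adding memory controllers halves
        # the bandwidth each core can sustain: the crippling described above.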

    And so it goes. The Apple engineers (at least for now, who knows how long it will last) have carte blanche to make the best product they can, year after year. The Intel engineers are not allowed to do ANYTHING that might make a product at level n of the 29-step marketing segmentation an acceptable, cheaper replacement for level n+1.
  • ARM to A4: How Apple changed the climate in mobile silicon

    "In retrospect, it's funny that outside observers were stating at the time that Apple wasn't moving quickly enough, that it should have opened the App Store a year earlier, that it couldn't afford to delay its macOS release by even a few months, or that it really should be scrambling to get Sun Java or Adobe Flash working on iPhone, or working on some other priority such as MMS picture messaging. "

    Thank goodness nowadays we have a better class of outside observers, with much more thoughtful insights as to what Apple should and should not be doing...

    Thanks for the article. I look forward to the rest of the series. 
    My one suggestion would be that I'd like to hear more of the PA Semi side of things. 
    Obviously CPU performance is the product of both frequency and IPC (ie a smarter design that can perform many instructions per cycle without wasting much time on dead cycles). Intrinsity was very much geared to ramping up frequency, but longer term Apple seems to have decided (correctly, IMHO) to concentrate on very wide, very high-IPC designs that don't strive for maximum frequency.
    You'd figure what's needed for this decision (the mindset/confidence to go down this path, and the necessary tools like simulators) must have come from somewhere. So did they come from PA Semi, or were the tools created pretty much from scratch inside Apple, while Srouji and the other engineers took a massive gamble on the very wide approach to mobile CPU design based not on experience but just on seeing the likely trends of Si going forward? (ie transistors are getting cheaper and cheaper, but not much faster)
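
    To make the frequency-times-IPC arithmetic concrete, here is a toy Python sketch; the IPC and clock figures are invented for illustration, not measurements of any real chip:

        # Performance ~ IPC * frequency. Both designs below use made-up
        # numbers, purely to illustrate the tradeoff.
        def perf(ipc, ghz):
            return ipc * ghz * 1e9  # instructions per second

        narrow_fast = perf(ipc=2.0, ghz=4.5)  # Intrinsity-style: chase frequency
        wide_slow = perf(ipc=4.0, ghz=2.5)    # wide, high-IPC design

        print(f"narrow/fast: {narrow_fast:.2e} instructions/s")
        print(f"wide/slow:   {wide_slow:.2e} instructions/s")
        # The wide design wins (1.00e10 vs 9.00e9) despite a much lower clock.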
  • A7: How Apple's custom 64-bit silicon embarrassed the industry

    melgross said:

    lkrupp said:
    As Mr. Dilger has pointed out on numerous occasions there is indeed a coordinated campaign by tech bloggers and competitors to spread FUD about Apple, its software, and its hardware. This continues, in part, because of Apple’s legendary secrecy and its reluctance to respond to the fake news published about it. We should have noticed by now that the spec monkeys have receded into the woodwork from whence they came because the superiority of Apple’s A chips is now indisputable.
    Some of this is indeed Apple’s fault. Despite Schiller’s talents in marketing, the top management team, from Jobs onwards, has been very poor at laying out Apple’s philosophy. So the main difference people read about, and this includes from writers in the industry, is that Android is “open” and that iOS is “closed”. Even though Apple has, from time to time, given some explanations as to how and why they do what they do, there has never been a complete, coherent statement from them as to why they do what they do overall. Their explanations tend to get lost in the noise. They really have needed to just come out and say exactly what they do as compared to their competition.

    So as for that open vs. closed debate that techies argue over, but which the general public not only doesn’t care about but isn’t even aware of: it has its basis in the beginnings of how Apple and Google needed (note that word, needed) to enter the cell industry.

    Apple was turned down by Verizon, because Jobs was so secretive that Verizon didn’t trust that Apple had a product they could sell. Jobs went to AT&T, which was number two, and struggling, and made a deal there. As we know, the iPhone became a big hit, but was limited to AT&T for, I think it was, three years. Apple, I know, was working on an App Store from the beginning, though it didn’t come out with it the first year, a very good marketing ploy. When it did, the second year, it had almost 14 million users starved for apps, which was a large number of smartphone users at the time, and the store became an instant success.

    But in order not to have problems with AT&T and other providers later, given that Apple curated the store from the beginning, they had to walk a fine line in what apps they allowed. So they didn’t allow apps that broke contractual obligations. One of the big ones back then was the fight over hot spots. Cell providers were charging a fairly high fee for that, and of course, a lot of people wanted the service but didn’t want to pay for it. Apple didn’t allow apps that went around people’s contracts, which angered a bunch of people.

    When Android came out (and I’m not going into the whole thing about that now; I’ve done that a few times already), Google didn’t curate the store at all. This was a strategy to get as many apps into the store as fast as possible in order to catch up with the App Store. A quirk in the law regarding corporate responsibility in publishing (which these stores do) is that if you oversee the store, and have some responsibility in determining what goes in and what doesn’t, you are liable for anything that breaks the law in any way, and can be sued by copyright holders, or anyone whose services something in the store is working around. There are laws about theft of services.

    Apple would be caught in that, so they didn’t allow anything that violated their rules about illegal activity, either civil or criminal, into the App Store. Google, claiming loudly that they didn’t look at anything in the store and that they therefore weren’t responsible for it, had lots of, let’s say, questionable apps. People were downloading apps to get free hotspots, for example.

    Out of this, Apple was said to be closed, and Android, open. The other reason was that in its eagerness to get Android quickly adopted, Google allowed skinning, which Apple doesn’t. They also allowed side loading of apps from other stores. That these things bypass security and usability didn’t seem to bother Android users, and this, along with being able, in some cases, to even buy and load software ROMs to bypass Google’s work, gained Android a great reputation in the pirate community, and a bad one for Apple.

    Even after Google needed to curate because of huge amounts of malware (99% of mobile malware is on the Android platform), and hot spots became free, they maintained their rep. Somehow, every Android problem was pushed to the various OEMs building the phones instead of Google itself, while Apple is just Apple.

    Anyway, now I’ve ranted enough.
    Some of your points are legitimate; this one ("FUD about Apple's performance is Apple's fault") is not. What I've seen is a consistent unwillingness to accept reality among people who don't want to accept reality, regardless of what Apple does.
    An especially obvious version of this is with regard to performance. We got people saying that Apple's performance wasn't that great. Then, when Geekbench numbers made that untenable, it was "well, the SPEC performance will suck". When that became untenable, I've now seen plenty of twits say with a straight face that SPEC is purely a marketing benchmark with zero real-world relevance.

    A different version of this has to do with the relationship between IPC, frequency, and performance. To me it seems absolutely obvious that Apple understood, the day they began their designs, that everything about the future (power, smaller but not faster transistors, wire delays starting to dominate transistor delays, 2.5D and then 3D) leaned towards chip designs that prioritized wide and smart over frequency. But if you don't want to believe this, you won't, regardless of what Apple does or doesn't say. If your entire self image is built on the idea that Intel (or maybe AMD) is king, and their kingship is justified by their frequency, you're not interested in arguments for why this is a losing proposition going forward.
    It has nothing to do with what Apple does or doesn't say publicly (let alone weird side tangents like App Store policies).
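
    For anyone who wants the arithmetic behind "wide and slow beats narrow and fast", here is a rough Python sketch under the textbook dynamic-power model (P proportional to C * V^2 * f), with the simplifying assumption that voltage scales linearly with frequency over the range of interest:

        # First-order dynamic power: P ~ C * V^2 * f, assuming V ~ f
        # (a textbook simplification, only plausible over a modest range).
        def power(c, f):
            v = f  # assumed linear voltage/frequency scaling
            return c * v**2 * f  # so P ~ C * f^3

        baseline = power(c=1.0, f=1.0)  # narrow core at full clock
        wide = power(c=2.0, f=0.5)      # twice the logic, half the clock

        print(f"wide/slow needs {wide / baseline:.0%} of baseline power")  # 25%
        # Same nominal throughput (2x width * 0.5x clock) at ~1/4 the power:
        # the arithmetic that favors IPC over GHz.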
  • A7: How Apple's custom 64-bit silicon embarrassed the industry


    melgross said:
    sflocal said:
    melgross said:
    Very typically, an article from DED about SoCs becomes a rant about Samsung, a rant that has nothing to do with the purported subject of the article.
    This article is not just about the SoCs, but more about how the entire industry was ill-prepared to compete with what Apple suddenly threw out there, being so caught off-guard that iKnockoffs like Samsung would put out lies about Apple's 64-bit announcement.

    It's a valid article.  DED hits it on all points.
    The article’s title, if you read it, is very specifically about the A7 vs other chips in the industry. Most of the article doesn’t even deal with that other than in a very brief and simplistic way. The rest is his usual rant about Samsung, and some others.

    Yes, I know that there are people here who only want to read goody articles about Apple, and so whatever he writes is going to get a thumbs up, and that AI likes this because it generates both positive and negative comments beyond what an objective article would generate. But that doesn’t mean that I, and some others, don’t understand what’s happening, even if you, and some others, don’t.
    The headline doesn't say what industry, but the article elaborates that it's the mobile computing industry. Who else is making CPUs in that industry? Qualcomm (mentioned), Samsung (mentioned) and ... ?

    So, Samsung _has_ to be in the article because it's a CPU competitor. To provide context, other aspects of Samsung's business (and business practices) were mentioned. Same for Nvidia, but oddly not for Qualcomm. I suppose two out of three ain't bad?

    I think your characterisation of the article being "his usual rant about Samsung" is a little off-target. Providing facts about mobile CPUs isn't a rant, reminding people of facts about a corporation's behaviour isn't a rant, and revealing a roadmap (that may be obvious in hindsight) isn't a rant. I also think there is very little basis for claiming that the article veers away from the premise in the headline, since the author (as usual) explains not just what happened but what the implications are - opinionated, yes, but hardly grounds for your claim.
    Well, just to be fair, Mediatek, Rockchip, Allwinner and Huawei are also all making mobile chips.
    (But, honestly, only Huawei is making chips that can even pretend to be in flagship class.)
    And nVidia, with Denver, was, as mentioned, still considered a serious possibility for quite a few years after the A7 shipped.

    At that time, of course, Intel was still peddling the fantasy that it was going to be a contender in mobile, so DED could/should probably have included them in his list of Apple competitor wannabes.
  • What history teaches about Apple's windows of opportunity for 2017

    arlor said:
    As somebody who's interested in the products, not the financials, and actually owns an old Mac Pro, I'm still allowed to want an up-to-date Mac Pro, right? Or should I just shut up about it, because I should care more about Apple's profitability?
    I'm going to say what Daniel won't (because he has to retain future credibility...)

    Let's look at the full picture here.
    - Are Macs PROFITABLE?
    Yes. Certainly as a class, probably even just in the Mac Pro category.

    - Are Macs STRATEGIC?
    Yes. Where else are developers (inside and outside Apple) going to write apps? Sure, in THEORY, they could do this on Linux or using Dev Studio, but get serious. Apple's whole philosophy is based on owning everything, from the CPUs to the dev tools to the UI. Giving up control of dev tools makes ZERO sense. 

    - Are Macs POPULAR?
    Sure they are. I expect every Apple technical employee and most of the non-tech employees own a Mac. They are white goods (meaning that you update them when they die, after seven to ten years, not every two years) but that's not the same thing as being unloved.

    OK, this means Apple has serious reasons to keep Macs going. So why does it seem otherwise? To me the answer is obvious:
    Apple is working on their NEXT "large screen+keyboard" platform and all the serious resources are going into that. Today's Mac is in a holding pattern, receiving the bare minimum of attention and no more.

    On the hardware side, Apple has basically reached the point where they no longer need Intel. If they're willing to give their existing cores an Intel power budget, they can exceed Intel performance. And Intel has been a LOUSY partner over the past few years, constantly limiting Apple's freedom of action. Instead of a competent deep security model, they gave us something so complicated that no-one understands it and most security experts don't trust it --- and of course with the usual "it's in some chips but not others". Their built-in GPU performance has constantly lagged, and thwarted Apple's attempts at a big-bang introduction of 4K H.265 content across the entire product line. Intel took way too long to roll out USB3 support in its chips (and then subsequent USB updates), and Intel's attitude to Thunderbolt makes no sense --- they claim to love it but seem to do everything they can to prevent it taking off in a big way.
    So Apple would be far better off ditching Intel in favor of its own hardware, under its own control. We're getting close to the point where that's possible. The cores are ready, and (my guess) the entire chip (ie not just the cores, but multiple cores [3 or 4] on a chip, multi-processing links to glue multiple chips together, more aggressive GPUs, on-board TB and USB, etc) probably already exists in various prototype forms and is being tested, both in Apple labs and perhaps even as part of Apple data centers.
     
    On the software side, likewise, MacOS has its strengths but, of course, is also showing its age. It's riddled with concepts at the OS level that made sense when it was introduced almost 20 years ago, but make rather less sense given the way we do things today. Obviously some iOS ideas have been retrofitted, but even iOS is ten years old, and iOS is solving a different problem. There are a variety of good ideas for how to design future OSs given the current priorities of mobility (between devices), many cores, and low power, and it would make sense to create a new OS built on these ideas. (For example the Barrelfish research OS starts with the idea of having many cores available connected by not necessarily coherent links, and builds a performant OS on these foundations using a shared-nothing model. This gives you much better scaling, more easily translates to a world of heterogeneous processors --- big and LITTLE cores and GPUs --- AND [most interesting] generalizes better to a personal compute cluster where you want your Macs, your iPhone, your iPad, your Watch, your Airpods etc all communicating reliably and rapidly with each other.)
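
    To illustrate the shared-nothing idea in miniature, here is a toy Python sketch of the multikernel style (emphatically not Barrelfish's actual API; a hypothetical model where each "core" owns private state and communicates only via explicit messages):

        # Toy shared-nothing model: the worker shares no memory with its
        # parent and talks only via explicit messages (multikernel style).
        from multiprocessing import Process, Queue

        def worker(inbox, outbox):
            state = {}  # private to this "core"; never shared
            for key, value in iter(inbox.get, None):  # None = shutdown
                state[key] = value
                outbox.put((key, len(state)))  # reply by message, not memory

        if __name__ == "__main__":
            inbox, outbox = Queue(), Queue()
            p = Process(target=worker, args=(inbox, outbox))
            p.start()
            inbox.put(("cpu0", "online"))
            inbox.put(("gpu0", "online"))
            print(outbox.get(), outbox.get())  # ('cpu0', 1) ('gpu0', 2)
            inbox.put(None)
            p.join()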

    So the way I see it, Mac looks like it's dead because, in a sense, it is. 90% of the "Mac" team, hardware and software, are working on Project Hexagon (or some other cool code name), creating the whole range of new Mac laptops, desktops, Pros, even (at least for Apple internal consumption) serious servers, together with a new OS that not only does what we want our Macs to do (ie fixes all the damn bugs we've been complaining about for the past three years) but also lays the foundations for the next twenty years of computing. There's only a skeleton team working on the Mac today, making minimal HW changes and the minimal OS changes necessary to track the new features in WatchOS and iOS.

    Use common sense, people.
    Apple --- you think??? --- knows they have more money than IBM, Intel, MS, VMWare, etc.
    They also know, more than the rest of us, how MacOS has been pushed to its limits, and how the future of computing includes many more wearables and other such devices, includes lotsa AI, new sensors, new UI modalities, includes lotsa cloud computing. They also know that there's lotsa money in enterprise, along with lots of pain. And that the Mac, in its current form (and, for that matter, iPhone soon), has pretty much reached maximum penetration.
    Finally, Apple has always been willing to throw out the past and "think different": Mac 1.0, Newton, OSX, iOS, Swift. Each time, Apple didn't just continue to patch the old way of doing things the way MS and, say, the Unix community do --- they rethought everything and threw out most of the past.
    Why wouldn't Apple put this all together to come up with a new compute paradigm --- new hardware (Apple designed from the ground up) running a new OS and using new frameworks?
    Of course there'll be backward compatibility, just like there was with the PPC transition, the MacOS->OSX transition, the Intel transition. That's so obvious it's not worth talking about. Likewise the supposed "oh it's essential to be able to run Windows apps" argument strikes me as absurd. You can buy PCs on a stick today --- plug one of those into your USB3.1 connector, run a virtualization app (that feeds the PC-on-a-stick a virtual keyboard, pointing device, and display) and voila, you have your PC solution.

    So that's my answer. Sometime in the near future (before 2020) Apple introduces the future of desktop/laptop computing...
