One Apple GPU, one giant leap in graphics for iPhone 8

Comments

  • Reply 41 of 54
    ph382 Posts: 43 member
    Server farms use GPUs for machine learning and for making inferences.  In fact, Google designed its own "Tensor Processing Unit" hardware to save power and money in that effort.  I believe that is another compelling reason for Apple to broaden its in-house abilities in chip design.  I would dearly love for Siri and handwriting recognition to be entirely local, instead of exposing my information to possible interception on its way to and from a server in North Carolina.  Hopefully, the Basic Neural Network Subroutines will have a development history like Metal's.
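    For anyone curious what BNNS looks like in practice, here is a minimal sketch of on-device inference with a single fully-connected layer, roughly as the API shipped in iOS 10 / macOS 10.12. The sizes, weights, and input are placeholders rather than a real model, and note that BNNS runs on the CPU; a GPU path would go through Metal Performance Shaders instead.

        // Minimal BNNS sketch: one fully-connected layer, placeholder data.
        #include <Accelerate/Accelerate.h>
        #include <stdio.h>

        int main(void) {
            enum { IN = 4, OUT = 2 };
            // Placeholder parameters; a real app would load trained values.
            float weights[IN * OUT] = { 0 };
            float bias[OUT] = { 0 };

            BNNSVectorDescriptor in_desc  = { .size = IN,  .data_type = BNNSDataTypeFloat32 };
            BNNSVectorDescriptor out_desc = { .size = OUT, .data_type = BNNSDataTypeFloat32 };

            BNNSFullyConnectedLayerParameters layer = {
                .in_size  = IN,
                .out_size = OUT,
                .weights  = { .data = weights, .data_type = BNNSDataTypeFloat32 },
                .bias     = { .data = bias,    .data_type = BNNSDataTypeFloat32 },
                .activation = { .function = BNNSActivationFunctionRectifiedLinear },
            };

            // Build the filter once, then apply it per input vector.
            BNNSFilter filter = BNNSFilterCreateFullyConnectedLayer(&in_desc, &out_desc,
                                                                    &layer, NULL);
            float input[IN] = { 1, 2, 3, 4 };
            float output[OUT];
            if (filter && BNNSFilterApply(filter, input, output) == 0)
                printf("output: %f %f\n", output[0], output[1]);
            BNNSFilterDestroy(filter);
            return 0;
        }

    With zeroed placeholder weights the output is trivially zero, of course; the point is only the shape of the API, which keeps the data on the device the whole time.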
    watto_cobra
  • Reply 42 of 54
    And what about the PowerPC fiasco?  What of it?  Last time I checked, Macs are selling pretty well right now regardless of how the PowerPC drama went.
    Apple almost went out of business because of it. Then there wouldn't have been any Macs for sale now, nor iPhones, nor iPads, nor iWatches. Those who fail to learn from history may not survive the next iteration, etc.

    From the article:
    If Apple had simply worked to adapt Helvetica, other companies could copy its approach, and consumers would have a harder time seeing an apparent difference.
    I'm afraid this is a bit of a stretch.

    Any graphic designer will confirm that 99 out of 100 people couldn't tell one sans serif font from another, if their lives depended on it.

    As for the possibility that Apple "could eventually seek to use its GPU on the Mac as well": this might result in Macs no longer working with other graphics card vendors and becoming a completely isolated system in terms of graphics hardware. I think most will not see this as progress.


    foggyhill said:
    Something people conveniently forget is that those ultra-repairable beasts of the past were, in 2017 money, MUCH MORE EXPENSIVE than whatever people are buying right now. My mother's 1967 washer-dryer set would cost $4,000 now... Not many people are spending that kind of money these days for this; same thing with TV sets, where people routinely paid the equivalent of 2,000 2007 dollars for a set.  My own 1987 computer was $12,000 in 2017 money, and it wasn't even top of the line.
    That's true; inflation is often overlooked. On the other hand, '2017' middle-class earnings are also relatively lower than 50 years ago. Our family farm made € 45.500 in a good year around 1967. At 3% average inflation (Netherlands), that would be € 200.000 in '2017 money'. Unfortunately, the real 2017 gross profit is not that much higher than in 1967. In fact, our income looks a lot like the Median Household Income in this graphic from Pew Research Center (not corrected for inflation).
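    For what it's worth, the compounding behind that € 200.000 figure checks out (taking the commenter's assumed 3% average rate over the 50 years from 1967 to 2017, written here with standard separators):

        45,500 × 1.03^50 ≈ 45,500 × 4.38 ≈ 199,500, i.e. about € 200.000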

    Of course, if we wanted to compare computer prices from 1987 to 2017, we would also have to take more factors into consideration, like the lack of competition and the negligible number of computers sold in 1987 compared to 2017. But for the middle class, a 2017 washer-dryer of $1,000 is probably about as expensive, relative to income, as a $4,000 washer-dryer was in 1967.
    edited April 2017
  • Reply 43 of 54
    maestro64 Posts: 5,043 member
    appex said:
    Apple should use standards in the market. Not only ports and connectors, but also unsoldered microprocessors, RAM, SSDs, GPUs, etc. Otherwise it may work in the short term, but not in the long run. Remember the PowerPC fiasco.

    Bad example to use. Do you think x86 from Intel was a standard when it started? The same goes for the 68K from Motorola. PowerPC was being used all over the place right before Apple dumped it. They dumped it for the same reason they design their own A-series CPUs and will go to their own GPU: they do not want to compete in the commodity, standard-chip business. Apple's requirements are one of many, and the chip makers only have time to spend on so many requirements; many times they conflict with one another, so you get a seriously compromised product. That is what happened with the PowerPC: Freescale was going in too many directions and was pulling Apple down. The same thing is happening at Intel; they are too busy chasing too many requirements, and their mobile processors fell on their faces compared to Apple's own chips.
    watto_cobra
  • Reply 44 of 54
    maestro64 Posts: 5,043 member
    lwio said:
    appex said:
    Apple should use standards in the market. Not only ports and connectors, but also unsoldered microprocessors, RAM, SSDs, GPUs, etc. Otherwise it may work in the short term, but not in the long run. Remember the PowerPC fiasco.
    Agreed for the pro market, but the consumer market rarely upgrades anything. An upgradable Mac Pro and MacBook Pro is ideal.

    When I worked in the computer industry, the big thing was always to provide an upgrade path. Why? Because everyone offered a path to upgrade. However, real-world data over many years and across various products with upgrade options showed that only 10% of people actually upgraded after they purchased the product. Almost everyone bought new. The only exception to this was memory, back when memory was really expensive: most computers had bare-bones memory installed, and most people then installed more. Beyond that, upgradability was a marketing selling point, not an actual profit center for a company. I can tell you companies spent lots of money putting upgradable features into computers only to find that the majority of users never took advantage of them. But the idea worked from a pure marketing standpoint, in that people were convinced a machine had to be upgradable to be worth buying. Apple broke this mold with cell phones; you still hear its competitors talking about how their phones have memory slots and Apple's do not, but Apple sells more.
    watto_cobra
  • Reply 45 of 54
    JayT said:
    I would bet that Apple is working on integrating Mixed Reality (MR) into the graphics pipeline. Cameras with depth fields, GPU with native support for MR... now that gets interesting.
    You are spot on here.

    And to be more specific, I think that the information supplied by the main camera will be transformed before being rendered on the screen. This transform will depend on the position of the iPhone owner's eyes, so that what is on screen will fit in with the "reality" outside the bezels, with the whole transformation taking place in real time on proprietary Apple silicon.
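    For illustration, here is one classic way to build exactly that kind of eye-dependent transform: the generalized (off-axis) perspective projection described by Robert Kooima, which takes a tracked eye position plus the physical corners of the display and yields a projection matrix that keeps rendered imagery registered with the world behind the screen. This is a generic sketch in C using Apple's simd library, not anything Apple has announced; all names and parameters are illustrative.

        // Off-axis projection from a tracked eye position (Kooima's method).
        #include <simd/simd.h>

        // pa, pb, pc: screen corners (lower-left, lower-right, upper-left);
        // pe: eye position. All in the same tracking coordinate space.
        simd_float4x4 eye_dependent_projection(simd_float3 pa, simd_float3 pb,
                                               simd_float3 pc, simd_float3 pe,
                                               float n, float f) {
            // Orthonormal basis of the screen plane.
            simd_float3 vr = simd_normalize(pb - pa);            // right
            simd_float3 vu = simd_normalize(pc - pa);            // up
            simd_float3 vn = simd_normalize(simd_cross(vr, vu)); // toward eye

            // Vectors from the eye to the screen corners.
            simd_float3 va = pa - pe, vb = pb - pe, vc = pc - pe;

            // Distance from the eye to the screen plane.
            float d = -simd_dot(va, vn);

            // Frustum extents on the near plane, shifted by the eye position.
            float l = simd_dot(vr, va) * n / d;
            float r = simd_dot(vr, vb) * n / d;
            float b = simd_dot(vu, va) * n / d;
            float t = simd_dot(vu, vc) * n / d;

            // Standard off-axis frustum (OpenGL-style clip space), column-major.
            simd_float4x4 P = simd_matrix(
                simd_make_float4(2*n/(r-l), 0, 0, 0),
                simd_make_float4(0, 2*n/(t-b), 0, 0),
                simd_make_float4((r+l)/(r-l), (t+b)/(t-b), -(f+n)/(f-n), -1),
                simd_make_float4(0, 0, -2*f*n/(f-n), 0));

            // Rotate the world into screen-aligned space, then move the eye
            // to the origin.
            simd_float4x4 M = simd_matrix(
                simd_make_float4(vr.x, vu.x, vn.x, 0),
                simd_make_float4(vr.y, vu.y, vn.y, 0),
                simd_make_float4(vr.z, vu.z, vn.z, 0),
                simd_make_float4(0, 0, 0, 1));
            simd_float4x4 T = matrix_identity_float4x4;
            T.columns[3] = simd_make_float4(-pe.x, -pe.y, -pe.z, 1);

            return simd_mul(P, simd_mul(M, T));
        }

    Feed it a fresh eye position every frame (from the front camera, say) and the on-screen imagery stays lined up with what is outside the bezels; doing that at 60fps is exactly where dedicated silicon would earn its keep.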
    watto_cobra
  • Reply 46 of 54
    StrangeDays Posts: 12,905 member

    copeland said:
    lwio said:
    appex said:
    Apple should use standards in the market. Not only ports and connectors, but also unsoldered microprocessors, RAM, SSDs, GPUs, etc. Otherwise it may work in the short term, but not in the long run. Remember the PowerPC fiasco.
    Agreed for the pro market, but the consumer market rarely upgrades anything. An upgradable Mac Pro and MacBook Pro is ideal.
    But that is not because upgrades are so difficult. It is that way because people have been told for two decades how dangerous and difficult these upgrades are.
    Now Apple (and the other companies are slowly following suit) uses this "knowledge" to tie you in further by soldering and gluing everything in, thus prohibiting you from doing any upgrades.
    You can't even repair these computers because of this. Now you have to buy a $300 extended warranty from Apple to replace a $50 part, because you are locked out of your own device.

    Is anybody counting this behavior in the carbon footprint of Apple: forcing you to trash a viable product that could be in good use for another 3 years with an upgrade, and forcing you to buy a completely new one, with all the emissions tied to its production? You can copy-paste that to the other companies, and possibly they are worse, but I would expect better from Apple. Especially for the desktops there are no technical reasons; it's just for the money.
    While it's true they're becoming more difficult to repair yourself, your implied conspiracy is poppycock. It isn't done to lock you out; it's done because this is how our devices get slimmer and smaller. The side effect is that they're more difficult to repair. This is not much different from how repairing your car engine is now far beyond the ability of most normal people. Or television repair, etc. These things can still be done, but in most cases they'll have to be done by a pro.
    watto_cobra
  • Reply 47 of 54
    StrangeDays Posts: 12,905 member

    "Beginning to pull ahead"??? I guess you are underestimating Apple's achievement here. Apple is already ahead of the rest of the industry, by about more than 1 year!!! What the benchmarks do NOT show is - the sustainable performance. If there are benchmarks created to show "sustainable performance", instead of just "peak performance", everyone will know how much Apple is ahead in the mobile SOCs.
    There are enough vain people around who simply have to have the latest bit of bling. I picked up a hardly used iPhone 7 last week for 40% under Apple's retail price. Lots of them are being traded in as people switch to the S8/S8+.

    1) I upgrade my phone annually and it has nothing to do w/ vanity. 

    2) Please cite your claim that pawn-shop iPhone 7s are being traded in for Samsung S8s -- where are you getting the motivations of the presumably anonymous sellers?
    Soli watto_cobra
  • Reply 48 of 54
    brakken Posts: 687 member
    appex said:
    Apple should use standards in the market. Not only ports and connectors, but also unsoldered microprocessors, RAM, SSDs, GPUs, etc. Otherwise it may work in the short term, but not in the long run. Remember the PowerPC fiasco.
    Err... define 'standard', please. Do you mean the 'standard' ports used, for example, by Samsung on its S4, or the S5, or the S6? Or the S3?

    Standards are not something that exist in a vacuum for all eternity, and as Microsoft proved in the late '80s, and definitely by '95, standards can be implemented in a surprisingly nasty series of ways. Look at Samsung: if they hadn't copied every element of Apple's iOS, iPhone, marketing, and packaging look-and-feel concepts, HTC would still be battling it out with Motorola, Samesung would still be a minor player making parts for others, and Android would still be a half-baked Windows Mobile 'killer'. Which, in the end, it did become.

    The justice system has certainly been doing a wonderful job of converting Apple's original work into standards, destroying Apple's right to protect its intellectual property. So, let's look at Apple sans innovation and just recreate the iPhone with standards - naturally, from 2007: no multitouch screen - you have to use a stylus; no 30-pin connector, so that means USB with no power charging. But wait! - it was Apple that created the new market standard of using USB on Macs, so no USB, either. A proprietary connector from someone else...? Ah, Sony! And the battery has to be removable, meaning a very thick and heavy piece of crap. There is no Bluetooth, so no accessories, and no Wi-Fi, as it was also Apple that created the market standard of building Wi-Fi into mobile devices.

    There are no apps because, in breaking with the network carriers, Apple broke all standards and created the best, most amazing retail application store in history. And, last but not least, iOS. The standards back then were Flash and Java. Apple actually ensured its Internet implementation was based on standards, and it was pushing Nvidia and AMD to use OpenGL, but why should Apple keep using 'standards' for its own internal implementations? That doesn't really make sense in light of what Apple has achieved and can achieve by developing its own tailor-made solutions.

    Apple is unique in the history of technology, and 'standards' can only retard Apple and promote its nasty, lying, betraying 'partners'. No. Apple has the vision and skill to create its own future! They owe Samsung, Microsoft, and Google nothing at all. Over the next three years, Apple will, yet again, change the face of technology, and in doing so, change the nature of our daily lives. Standards are essential for cross-platform compatibility, such as the Internet and document processing, email and messaging, but having to cater to competitors' deficiencies is not evolution and cannot result in innovation. I love tech innovation, so I say that I love Apple. Who knows - maybe the new Apple GX series will become the new standard, yes?
    randominternetperson watto_cobra
  • Reply 49 of 54
    GeorgeBMac Posts: 11,421 member
    tundraboy said:
    appex said:
    Apple should use standards in the market. Not only ports and connectors, but also unsoldered microprocessors, RAM, SSDs, GPUs, etc. Otherwise it may work in the short term, but not in the long run. Remember the PowerPC fiasco.
    You totally misunderstand the PowerPC fiasco. PowerPC was not a proprietary component; it was sourced from IBM. The lesson there is not that Apple shouldn't have made its own proprietary CPU (because it didn't in that case); it's that Apple shouldn't rely on a supplier to keep its product, and thus Apple's product too, on the cutting edge.
    You are correct that the PowerPC was not a "fiasco". But I think the case is more complex than what you describe:
    At the time, Apple was simply too small to delve into creating its own chips. And they had the choice of continuing to buy chips from Intel -- or buying a superior chip that produced nearly double the output and was backed by two of the largest, most stable companies in tech: Motorola and IBM. Yes, the latter two eventually pulled out of the market and abandoned their chip. But at the time nobody knew that would happen, and besides, it provided technical superiority to Apple for years...

    When judging technical decisions, you have to consider the state of things at that time: "knowing what we knew at the time...". One of Apple's strengths, demonstrated both by getting into the PowerPC and by getting back out of it, has been awareness of reality and the flexibility to respond to it.
    watto_cobra
  • Reply 50 of 54
    GeorgeBMac Posts: 11,421 member
    The debate over whether standards are good or bad is like arguing over whether weather is good or bad -- it is both.

    Standards provide efficiency for both manufacturers and users to take advantage of scale: having standards in Bluetooth, for instance, opens up the ability to connect a wide variety of peripherals (from a chest-strap heart-rate monitor to a whole house) to a computer without having to lock yourself into a single supplier.

    And yes, after a certain point, they also impede innovation.   Should we all still be using parallel cables to connect our printers?
  • Reply 51 of 54
    ...Just because Apple now has a font doesn't make them king of the GPU world now.... What they have done is good [??], but the competition is just as good [what competition?] - and more focused on a single product [copying Apple AND no explosive phones]. You can't tell me the company that can't even make a Pro MacBook is all of a sudden going to make a Pro GPU.
    [emphasis mine]

    LMAO! Comparing a MacBook Pro to a multiproduct GPU is like comparing oranges and Earth -- it is way off the mark. Ya might wanna think a bit more before posting #28. IMHO #27 crashed at takeoff!

    edited April 2017
  • Reply 52 of 54
    mcdave Posts: 1,927 member
    Nice piece; this has been weighing on my mind for a few weeks now, and I reckon Apple will finally use their integrated HW/SW advantage to good effect.

    For years, hardware development cycles have been glacial, with architectures developing over decades.  Software, on the other hand, takes days/months/years to evolve, but this is changing, with hardware evolution speeding up and software (mainly at the OS layer) slowing down and bedding in.  For decades we've been writing software for the available hardware, but Apple's control of all layers means they're the only ones who can bring the hardware to the software.  My thoughts were similar to DED's, and I'd expect to see a GPU for mobile & portables with custom silicon:

    1. Accelerating web tech (mainly JS Nitro)
    2. Accelerating macOS (CoreGraphics, CA etc.) frameworks
    3. Accelerating Metal and maybe a nod to OpenGL/CL on the way out (reviewers seem obsessed with game demo frame-rates)
    4. Introducing ray tracing to Metal rendering with little software rewriting; I'm assuming there's a reason they've restricted access that is too low-level.

    Unfortunately, these smart approaches won't run brute-force benchmarks for generic/interoperable technologies as quickly as other GPUs, and the TFLOP count is likely to be modest, so the bleaters won't see the benefit.  Benchmarks invariably ignore the first-party software users are actually likely to run.

    Although iOS will lead the charge, I still reckon we'll see some discrete Apple GPUs/SoCs in MacBooks/iMacs, especially as Touch Bars require their own SoCs in those machines anyway.
  • Reply 53 of 54
    I would like to see this in Apple's Mac Pro and MacBook Air laptop lineups. The current Intel-based Iris series or lesser just sucks and can't handle hardcore gaming, softcore video editing, etc...
    edited April 2017
  • Reply 54 of 54
    I'm starting to see the emergence of the mobile Silicon Graphics machines I pined for as a kid back in 1996. Apple has always had more or less this design philosophy, but going further and further into their own homegrown silicon is really starting to show hints of the former supergiant, where the hardware was almost entirely built in-house: CPU, GPU, RAM, inter-system connects, OS, etc. Where R&D pretty much had a blank check, and the resulting machines were amazing.
    I can't wait to see this new GPU.