zimmie

About

Username
zimmie
Joined
Visits
172
Last Active
Roles
member
Points
2,737
Badges
1
Posts
651
  • AMD to unveil Radeon RX 6000 GPU family on Oct. 28

    tht said:
    I'm betting that the A14 GPUs will have hardware support for raytracing. So, the iPhone will be the first Apple device to have hardware raytracing support, not Macs with AMD GPUs. ;)

    Anyways, anyone want to clarify AMD GPU codenames? Both the chip and graphics API support? Getting confusing out there. Navi, Navi 14, Navi 12, RDNA, RDNA2, so on and so forth. 
    AMD's internal codenames for things are pretty rough, yeah. They have a set of codenames for the instruction family, a different set of codenames for the chip family, and a third set of codenames for the individual chips.

    The Graphics Core Next (GCN) 4 instruction family is made up of the Polaris chip family, and the individual chips take their codenames from the "Arctic Islands" family.

    Polaris 10 - RX 470, RX 480
    Polaris 11 - RX 460
    Polaris 12 - RX 540, RX 550

    Polaris 20 - RX 570, RX 580
    Polaris 21 - RX 560
    Polaris 22 - RX Vega M GH, RX Vega M L

    Polaris 30 - RX 590

    After that came the GCN 5 instruction set and the Vega chip family:

    Vega 10 - RX Vega 56, RX Vega 64
    Vega 12 - Pro Vega 16, Pro Vega 20
    Vega 20 - Pro Vega II, Radeon VII

    After GCN 5 came the RDNA 1 instruction set and the Navi chip family:

    Navi 10 - RX 5600, RX 5700
    Navi 14 - RX 5300, RX 5500



    Yes, it's all gratuitously confusing. The higher the number within a given chip family, the lower the performance.
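
    If it helps, here's the same mapping written out as a rough lookup table. This is just a sketch in Python, and the product lists are only as complete as the lists above:

        # Rough sketch of the naming layers described above:
        # instruction set family -> individual chip -> shipping products.
        AMD_GPUS = {
            "Polaris 10": {"isa": "GCN 4", "products": ["RX 470", "RX 480"]},
            "Polaris 11": {"isa": "GCN 4", "products": ["RX 460"]},
            "Polaris 12": {"isa": "GCN 4", "products": ["RX 540", "RX 550"]},
            "Polaris 20": {"isa": "GCN 4", "products": ["RX 570", "RX 580"]},
            "Polaris 21": {"isa": "GCN 4", "products": ["RX 560"]},
            "Polaris 22": {"isa": "GCN 4", "products": ["RX Vega M GH", "RX Vega M L"]},
            "Polaris 30": {"isa": "GCN 4", "products": ["RX 590"]},
            "Vega 10": {"isa": "GCN 5", "products": ["RX Vega 56", "RX Vega 64"]},
            "Vega 12": {"isa": "GCN 5", "products": ["Pro Vega 16", "Pro Vega 20"]},
            "Vega 20": {"isa": "GCN 5", "products": ["Pro Vega II", "Radeon VII"]},
            "Navi 10": {"isa": "RDNA 1", "products": ["RX 5600", "RX 5700"]},
            "Navi 14": {"isa": "RDNA 1", "products": ["RX 5300", "RX 5500"]},
        }

        def chip_for_product(product):
            """Look up which chip (and instruction set) a retail card uses."""
            for chip, info in AMD_GPUS.items():
                if product in info["products"]:
                    return chip, info["isa"]
            return None

        print(chip_for_product("RX 580"))  # -> ('Polaris 20', 'GCN 4')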
  • Mass production of Apple Silicon's A14X processor to start in Q4 2020

    zimmie said:
    entropys said:
    I was sorta hoping The ASi SOC would be a different chip to the iPad’s.

    humongous even. A power unconstrained monster in comparison.
    "A14X" here is probably a placeholder. There's little reason for an iPad and the entry-level Macs to use different processors, though. It just needs some PCIe lanes or an integrated Thunderbolt controller to be suitable for Macs. Make one chip, and just don't use the Thunderbolt on the iPads (or do use it instead of just USB 3 over USB-C).

    revenant said:
    is it possible to have two chips that run together and the OS sees and uses it as one chip? 
    Sort of. Multiple cores on a single chip have a bus which connects them. This lets them share some data about what they are doing with the other cores on the same chip, so the OS knows which cores have available time for work. This bus can be extended out past the physical chip to let you use many separate chips. For example, the 2006-2012 Mac Pro models had sockets for two processors (yes, some of the 2009-2012 models only had a single socket on their drawer, but the same motherboard could take a two-socket drawer).

    What you can't do is connect two processors together and present them to the OS as a single, faster core. Having more processors means you can do more things at the same speed, not do one thing faster. Fortunately, a lot of processor-intensive work can be split into chunks which can be worked on in parallel.

    aderutter said:
    I wouldn’t expect a Mac to have the same chip as an iPad due to the comments Apple have said about a family of SOCs for the new Macs.
    I guess the MacBook Air would be okay with an A14X but I would expect the Macs to have more cores, especially the MacBook Pro rumoured for next weeks event. But having two A14X SOCs would be nice, and efficient from a manufacturing perspective :)
    I don’t really want a MBP that is only performance equivalent to the current Intel MBP, I want something that is far ahead in more than just heat and battery.
    The A12X already keeps pace with current high-end laptop chips from Intel. A13 cores are faster and more efficient. A14 cores will be faster and more efficient still. With laptop cooling, I expect the laptop A14 to be able to perform extremely well for the bottom of Apple's range (MacBook Air, 2 TB3 MBP).
    I understand where you're coming from but during the WWDC keynote, it was stated that Apple was creating a family of SoC's specifically for the Macs
    Sure, but there’s no reason not to use a low-end Mac SoC in an iPad. The significant differences are the touch controller (currently external), the cell modem (currently external), the M coprocessor, the level of peripherals one would want (that is, PCIe lanes for Thunderbolt), and the power envelope. May as well make one part and shut off the stuff you don’t need. Chips stable at extremely high performance get binned as Mac parts; chips which fail the Mac test but still work get binned as iPad parts and run a bit slower.

    That would make the chip “specifically made for Macs”, but still usable in iPads.

    Like how net neutrality said telcos couldn’t extort Netflix for more money under threat of deprioritization, so the telcos just slowed everything down and “let” companies pay for “fast lanes”. Technically fits within the promise.

    Plus everything above the low end would definitely need a Mac-specific chip for a 15W+ power envelope.
  • Mass production of Apple Silicon's A14X processor to start in Q4 2020

    entropys said:
    I was sorta hoping The ASi SOC would be a different chip to the iPad’s.

    humongous even. A power unconstrained monster in comparison.
    "A14X" here is probably a placeholder. There's little reason for an iPad and the entry-level Macs to use different processors, though. It just needs some PCIe lanes or an integrated Thunderbolt controller to be suitable for Macs. Make one chip, and just don't use the Thunderbolt on the iPads (or do use it instead of just USB 3 over USB-C).

    revenant said:
    is it possible to have two chips that run together and the OS sees and uses it as one chip? 
    Sort of. Multiple cores on a single chip have a bus which connects them. This lets them share some data about what they are doing with the other cores on the same chip, so the OS knows which cores have available time for work. This bus can be extended out past the physical chip to let you use many separate chips. For example, the 2006-2012 Mac Pro models had sockets for two processors (yes, some of the 2009-2012 models only had a single socket on their drawer, but the same motherboard could take a two-socket drawer).

    What you can't do is connect two processors together and present them to the OS as a single, faster core. Having more processors means you can do more things at the same speed, not do one thing faster. Fortunately, a lot of processor-intensive work can be split into chunks which can be worked on in parallel.
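
    A toy illustration of that split-into-chunks idea, in generic Python rather than anything macOS-specific: each worker process takes its own chunk, so more cores finish more chunks per second, but no individual chunk finishes any faster.

        # Toy example: split CPU-bound work across cores. More cores means
        # more chunks in flight at once, not a faster single chunk.
        from concurrent.futures import ProcessPoolExecutor

        def busy_work(chunk):
            # Stand-in for the real per-chunk computation.
            return sum(i * i for i in chunk)

        if __name__ == "__main__":
            data = list(range(1_000_000))
            chunk_size = 100_000
            chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

            with ProcessPoolExecutor() as pool:  # roughly one worker per core by default
                results = list(pool.map(busy_work, chunks))

            print(sum(results))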

    aderutter said:
    I wouldn’t expect a Mac to have the same chip as an iPad due to the comments Apple have said about a family of SOCs for the new Macs.
    I guess the MacBook Air would be okay with an A14X but I would expect the Macs to have more cores, especially the MacBook Pro rumoured for next weeks event. But having two A14X SOCs would be nice, and efficient from a manufacturing perspective :)
    I don’t really want a MBP that is only performance equivalent to the current Intel MBP, I want something that is far ahead in more than just heat and battery.
    The A12X already keeps pace with current high-end laptop chips from Intel. A13 cores are faster and more efficient. A14 cores will be faster and more efficient still. With laptop cooling, I expect the laptop A14 to be able to perform extremely well for the bottom of Apple's range (MacBook Air, 2 TB3 MBP).
  • Apple is reinventing eye tracking technology to bring it to 'Apple Glass'

    mcdave said:
    hface119 said:
    tjwolf said:
    Oh, come on!  You give reading a book as an example of why eye tracking would be useful in AR???  That's just about the dumbest application of AR there is - how is reading *augmenting* reality?
    In the example of a blank notebook (should the user wish to have a physical representation of the book), the words could be imposed on the blank pages. I didn't think that was that hard to imagine...
    It’s still a bad example. Adjusting artefact resolution based on the gaze to reduce power consumption would be better.  I’m pretty sure NASA & Samsung showed why using gaze to control is a bad idea.
    Ding ding ding! While gaze-based control is probably part of it, gaze tracking is most useful for foveated rendering.

    As for the supposed difficulty in lowering the requirements for the gaze tracking, Canon had gaze-based autofocus point selection in the EOS-3 back in 1998. It worked incredibly well using only a low-end camera autofocus processor for the gaze tracking (the actual autofocus used a much nicer processor). This patent is more or less how it worked. Since all of our eyes have slightly different shapes, it requires training for each individual user (and each individual eye if the photographer wants to use both), where you look at a series of flashing elements in the viewfinder and the gaze tracking system records what the reflections look like in that position.

    It's actually a lot like Face ID, now that I think about it. The system effectively builds a model of the surface of your eye to tell where the cornea is pointing.
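
    To make the foveated-rendering point concrete: the renderer picks a resolution (or shading rate) per screen region based on how far that region is from the tracked gaze point. A minimal sketch, with cutoff angles and scale factors invented purely for illustration (not Apple's numbers):

        # Illustrative only: choose a render-resolution scale for a screen tile
        # from its angular distance to the gaze point. Thresholds are made up.
        import math

        def render_scale(gaze_deg, tile_deg):
            """gaze_deg and tile_deg are (x, y) angular positions in degrees."""
            eccentricity = math.hypot(tile_deg[0] - gaze_deg[0],
                                      tile_deg[1] - gaze_deg[1])
            if eccentricity < 5:       # fovea: full resolution
                return 1.0
            if eccentricity < 20:      # near periphery: half resolution
                return 0.5
            return 0.25                # far periphery: quarter resolution

        print(render_scale((0.0, 0.0), (12.0, 3.0)))  # -> 0.5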
  • First Apple silicon Macs likely to be MacBook rebirth, iMac with custom GPU

    Rayz2016 said:
    Rayz2016 said:
    An in-house GPU eh?

    This is where the bun fight starts. 
    I would imagine that the MacBook would continue to use the SoC style GPU just like the iPad. It will be interesting to see what Apple does for higher end hardware like the MBP or iMac line. Will it still be integrated, or will they do a discrete version of the GPU?
    I have a question, as a GPU layperson. 

    If they’re using their own tech for GPUs then what is the advantage of a discrete GPU?  

    The big advantage is disaggregation. For example, if you put the GPU in a physically separate chip, you can cool it separately. You can also design and update it separately, but that's mostly important for integrators who don't make their own chips.

    This advantage doesn't matter as much in a laptop as it does in a tower like the Mac Pro.

    Rayz2016 said:
    Rayz2016 said:
    An in-house GPU eh?

    This is where the bun fight starts. 
    I would imagine that the MacBook would continue to use the SoC style GPU just like the iPad. It will be interesting to see what Apple does for higher end hardware like the MBP or iMac line. Will it still be integrated, or will they do a discrete version of the GPU?
    I have a question, as a GPU layperson. 

    If they’re using their own tech for GPUs then what is the advantage of a discrete GPU?  

    Most likely Apple won’t be able to compete on speed and features when it comes to high end GPUs such as the Radeon or Nvidia ones, and they’ll offer a replacement for what we consider built-in graphics GPUs (such as the Intel ones).

    Perhaps the MacBook Pro Apple SoC version will contain an Apple integrated GPU plus a discrete ATI GPU, similar to laptops as of today but with Intel GPUs.
    As Apple advances GPU design they will more likely replace the mid to high-end with their own ones as well. 
    That seems odd to say, since the A13's existing GPU gets about half the performance of the Xbox One S, and Macs have much better cooling than iPhones. GPU performance doesn't quite scale linearly, but it's close.