Apple investigating RISC-V instruction set architecture, job listing shows

Posted:
in General Discussion edited September 2021
A job listing posted to Apple's website this week reveals the company is researching RISC-V instruction set architecture solutions, suggesting future in-house chip designs might implement the open-source technology.

A14


The posting for a "RISC-V High Performance Programmer" was published to Apple's corporate jobs website on Thursday and seeks a programmer experienced with RISC-V architectures. An ideal candidate also has working knowledge of the NEON microarchitecture in ARM CPU cores.

Programmers hired for the task will join Apple's Vector and Numerics Group, which is responsible for "designing, enhancing and improving various embedded subsystems running on iOS, macOS, watchOS and tvOS."

As noted by Tom's Hardware, which spotted the post earlier today, it appears that Apple is already deploying RISC-V, at least internally. The company is not known to have incorporated the ISA into a shipping product.

"You will work in a SW and HW cross functional team which is implementing innovative RISC-V solutions and state of the art routines," the listing reads. "This is to support the necessary computation for such things as machine learning, vision algorithms, signal and video processing."

RISC-V is a relatively new, open instruction set architecture that has so far been used mostly in low-power and embedded applications, though the architecture is expanding rapidly with input from collaborative partners.
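To make the listing's pairing of NEON experience with RISC-V work a little more concrete, here is a minimal, hypothetical sketch of the same scaled vector-add (saxpy-style) loop written once with Arm NEON intrinsics and once with the RISC-V Vector (RVV) C intrinsics. The function names are illustrative only, and the exact RVV intrinsic spellings vary between toolchain versions, so treat this as a sketch of the two SIMD models rather than anything taken from the job posting itself.

```c
#include <stddef.h>

#if defined(__ARM_NEON)
#include <arm_neon.h>

/* y[i] += a * x[i] with NEON: fixed 128-bit registers, 4 floats per step. */
void saxpy_neon(size_t n, float a, const float *x, float *y) {
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        float32x4_t vx = vld1q_f32(x + i);
        float32x4_t vy = vld1q_f32(y + i);
        vst1q_f32(y + i, vmlaq_n_f32(vy, vx, a));  /* vy + vx * a */
    }
    for (; i < n; i++)                             /* scalar tail */
        y[i] += a * x[i];
}
#endif

#if defined(__riscv_vector)
#include <riscv_vector.h>

/* Same loop with RVV intrinsics: the core reports how many elements it can
 * process per pass (vl), so there is no fixed lane count and no tail loop.
 * Intrinsic names follow the current RVV C intrinsics spec and may differ
 * on older toolchains. */
void saxpy_rvv(size_t n, float a, const float *x, float *y) {
    while (n > 0) {
        size_t vl = __riscv_vsetvl_e32m8(n);
        vfloat32m8_t vx = __riscv_vle32_v_f32m8(x, vl);
        vfloat32m8_t vy = __riscv_vle32_v_f32m8(y, vl);
        vy = __riscv_vfmacc_vf_f32m8(vy, a, vx, vl);   /* vy += a * vx */
        __riscv_vse32_v_f32m8(y, vy, vl);
        n -= vl; x += vl; y += vl;
    }
}
#endif
```

The structural difference is the point of comparison: the NEON path assumes fixed 128-bit registers and needs a scalar tail loop, while the RVV path asks the hardware for a workable vector length on each pass.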

How Apple intends to integrate RISC-V is unknown, though the company could effectively avoid licensing fees for ARM vector cores by adopting the architecture. All current Apple Silicon designs, including A- and M-Series SoCs as well as the S-Series and other system-in-package designs, utilize custom ARM cores.

RISC-V integration is unlikely to overtake ARM, at least in the near-term. Apple's investment in customized ARM-based processor technology is substantial and has produced a balanced platform that is unrivaled in its efficiency and performance.

Read on AppleInsider

Comments

  • Reply 1 of 18
    Interesting. Apart from the licensing freedom, this could make for some interesting new computing devices.
    watto_cobra
  • Reply 2 of 18
    lkrupp Posts: 10,557 member
    The unending quest for total and absolute control of the product, without being dependent on third parties any more than necessary. It’s part of the reason for Apple’s success and popularity.
    tmay killroy patchythepirate seanj watto_cobra jony0
  • Reply 3 of 18
    I'm sure ARM sells the architecture and innovation to competitors.

    Now that Apple has cut the bonds with Intel, we know that Intel would sell Apple's custom chips (like the one in the original MacBook Air) to competitors a year after release.

    Apple had a very heavy hand in that and many other Intel chip designs.
    watto_cobra
  • Reply 4 of 18
    RISC-V processors are already used in a multitude of specialised cases: Seagate is testing them for storage solutions designed to "enable massive parallel computational" workloads; Nvidia uses RISC-V cores in their GPUs, likely handling some IO ("and beyond"); Western Digital is moving to RISC-V controllers for their hard drives.

    Don't take my words as gospel, but one of the advantages of RISC-V (as an ISA) over ARM (and x86) is that it doesn't carry baggage from ages ago. Not that information theory and computation theory have advanced so far that they're from a different universe, but some things can be done more efficiently with modern approaches. Benchmarks made by RISC-V developers highlight some specifics where their platform has an advantage over the others, but in general the difference is really, really small, or comes out to something like a +/- 5% difference for a specific compilation with a specific compiler (like GCC). Note the minus.

    The other advantage is that it's really "modular (the base core is really small) and easy to customise", with the option to create custom instructions for ASIC purposes (for example, for handling a high amount of IO in a GPU). Even though it's free and open source, the same problem comes around as with other ISAs: if you want something done, you need to hire people or buy the service if you can't do it yourself.
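    As a rough, hedged illustration of that modularity: RISC-V compilers describe the chosen base ISA and optional extensions through predefined macros, so code can adapt to whatever subset a given core implements. The macro names below are the ones GCC and Clang commonly define for RISC-V targets and are an assumption about the toolchain, not something from the comment above.

    ```c
    #include <stdio.h>

    /* Print which parts of the modular RISC-V ISA this build targets.
     * The __riscv_* macros are predefined by GCC/Clang according to the
     * -march string used at compile time (e.g. rv64imac vs. rv64gcv). */
    int main(void) {
    #if defined(__riscv)
        printf("RISC-V target, %d-bit base integer ISA\n", __riscv_xlen);
    #if defined(__riscv_mul)
        printf("  M: integer multiply/divide\n");
    #endif
    #if defined(__riscv_atomic)
        printf("  A: atomic instructions\n");
    #endif
    #if defined(__riscv_flen)
        printf("  F/D: hardware floating point, FLEN=%d\n", __riscv_flen);
    #endif
    #if defined(__riscv_compressed)
        printf("  C: compressed 16-bit instructions\n");
    #endif
    #if defined(__riscv_vector)
        printf("  V: vector extension\n");
    #endif
    #else
        printf("not a RISC-V target\n");
    #endif
        return 0;
    }
    ```

    Rebuilding the same file with a different -march string (say rv64imac versus rv64gcv) flips which branches compile in, which is the practical face of the small-base-plus-extensions design.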

    I'd see RISC-V becoming a huge thing in accelerators and in some specific controllers (power, USB, modem, etc.) within the next five years. Even though there are already some machines running Linux on RISC-V, they are low performance and have a really long uphill climb to become even remotely mainstream.
    hcrefugee viclauyyc killroy rundhvid patchythepirate watto_cobra jony0
  • Reply 5 of 18
    gatorguy Posts: 24,213 member
    I would not be shocked if Apple uses this to join up with Open Titan development.
    Ofer jony0
  • Reply 6 of 18
    A reaction to NVIDIA buying ARM?
    command_f
  • Reply 7 of 18
    A reaction to NVIDIA buying ARM?
    At least a defensive move while it is unclear who will own ARM. While the ARM architecture remains available to all, there are benefits to using it but a 'hostile' (or just misguided) new owner could try to increase licensing costs.
    watto_cobra
  • Reply 8 of 18
    mpantone Posts: 2,040 member
    I'm sure ARM sells the architecture and innovation to competitors.

    Arm has no rights to Apple's intellectual property. Apple is an Arm licensee.

    I think today Apple doesn't even use Arm's reference designs for its CPU cores. Apple's CPU designs have been proprietary for many years. They're really only using the Arm ISA.
    Now that Apple has cut the bonds with Intel, we know that Intel would sell Apple's custom chips (like the one in the original MacBook Air) to competitors a year after release.

    Nah, without a doubt, there is still an active contract in place between Apple and Intel since Apple is still selling Intel-powered devices. Apple likely will continue to do so for several more years, most likely keeping an Intel-powered Mac Pro available for the professional customers that rely on Intel hardware for their established workflow.
    Apple had a very heavy hand in that and many other Intel chip designs.
    Previous Apple-Intel contracts seem to indicate exclusivity for certain chips. The logical conclusion is that exclusivity is in perpetuity for the component's life cycle. We have not seen any old Apple-only Intel CPUs being widely marketed. There's little demand for an ancient CPU.
    FileMakerFeller patchythepirate seanj watto_cobra jony0
  • Reply 9 of 18
    mpantone Posts: 2,040 member
    A reaction to NVIDIA buying ARM?
    It's possible, but adding a couple of RISC-V engineers isn't going to eliminate Arm license fees regardless of who owns Arm. Right now Softbank is pocketing those fees.

    Note that Apple did not publicly object to Nvidia's acquisition. If they really didn't want the Nvidia-Arm deal to go through, they probably would have vocally pushed the FTC and international regulators to block the merger.

    A couple of job requisitions in a small group isn't going to halt all Arm silicon development at Apple.

    A more likely scenario right now is that Apple is trying to keep options open looking toward the future. Remember that RISC-V does not currently have performance that comes close to what Apple has implemented with their recent A-series and M-series SoCs.
    edited September 2021 patchythepirate seanj watto_cobra
  • Reply 10 of 18
    melgross Posts: 33,510 member
    nadriel said:
    RISC-V processors are already used in a multitude of specialised cases: Seagate is testing them for storage solutions designed to "enable massive parallel computational" workloads; Nvidia uses RISC-V cores in their GPUs, likely handling some IO ("and beyond"); Western Digital is moving to RISC-V controllers for their hard drives.

    Don't take my words as gospel, but one of the advantages of RISC-V (as an ISA) over ARM (and x86) is that it doesn't carry baggage from ages ago. Not that information theory and computation theory have advanced so far that they're from a different universe, but some things can be done more efficiently with modern approaches. Benchmarks made by RISC-V developers highlight some specifics where their platform has an advantage over the others, but in general the difference is really, really small, or comes out to something like a +/- 5% difference for a specific compilation with a specific compiler (like GCC). Note the minus.

    The other advantage is that it's really "modular (the base core is really small) and easy to customise", with the option to create custom instructions for ASIC purposes (for example, for handling a high amount of IO in a GPU). Even though it's free and open source, the same problem comes around as with other ISAs: if you want something done, you need to hire people or buy the service if you can't do it yourself.

    I'd see RISC-V becoming a huge thing in accelerators and in some specific controllers (power, USB, modem, etc.) within the next five years. Even though there are already some machines running Linux on RISC-V, they are low performance and have a really long uphill climb to become even remotely mainstream.
     Right now, RISC-V is years behind even the simplest ARM general purpose CPU. It’s being used for specialized tasks. It’s designed to be a controller for other sub units, such as a neural engine, or a machine learning processor, or optical analytics.

    It will take years to be useful, if it ever is, for a low-end phone or similar device. A major reason it’s so efficient is that its base instruction set has only about 50 instructions.

    I can see Apple being interested because it would be irresponsible to look the other way. Notice that the engineer they’re looking for also needs experience with ARM. I suspect that Apple is looking to see if some aspects of this can be used inside their ARM-based chips, but not to replace them.
    edited September 2021 fastasleep patchythepirate watto_cobra jony0
  • Reply 11 of 18
    melgross Posts: 33,510 member
    command_f said:
    A reaction to NVIDIA buying ARM?
    At least a defensive move while it is unclear who will own ARM. While the ARM architecture remains available to all, there are benefits to using it but a 'hostile' (or just misguided) new owner could try to increase licensing costs.
    As one of the founders of ARM, Apple is believed to have a license to the ARM architecture in perpetuity. How that affects their costs, I can’t say, but it’s also believed that Apple has a long-term contract covering licensing costs.
    patchythepirate seanj watto_cobra jony0
  • Reply 12 of 18
    How do you figure? FreeBSD, Linux, LLVM, GCC, golang, (O)KL4 and most recently OpenBSD run on RISC-V. Generally speaking, code branches in FreeBSD and OpenBSD are *years ahead* of consumer computing offered by Apple which still borrows heavily from those upstream libre/open source software operating systems (e.g. LibreSSL, OpenSSH).

    In 2020, Micro Magic demonstrated a 5GHz RISC-V CPU consuming just 1W of power. Do you know of *any* ARM CPU at that clockspeed sipping power so frugally? I do not. ONiO.zero also announced a 24MHz RISC-V CPU some time ago which uses energy harvesting techniques so that it has no active power draw. Sure, there are some ARM CPU implementations which are similar to that, but the licensing fees, particularly with ARM IP caught up in a murky SoftBank/NVidia mess involving litigation, don't exactly make it seem as if the architecture is primed for the future. As it is, ARM is really a 1980s-vintage CPU ISA with some 64-bit grafts, not entirely dissimilar to the AMD64 extensions for the 1970s x86 Intel CPU ISA.

    Since 2010, RISC-V has already been taught in undergraduate programs in schools of merit, meaning that the future workforce will be ready to jump on RISC-V in the commercial sector. MIT's Xv6 (https://pdos.csail.mit.edu/6.828/2012/xv6.html) is essentially a "Lions' Commentary on UNIX" without the apocrypha due to the AT&T vs BSDi lawsuit which caused so many headaches in the 1990s.

    Maybe you're thinking: "but RISC-V doesn't even implement an FPU!" without acknowledging the reality that most FPU designs are 1980s-vintage IEEE 754, which in academia is losing favor compared with posit/unum alternatives.

    Sure, RISC-V isn't ubiquitous, yet, but I think that is just a matter of time. The HiFive Unmatched which began shipping earlier this year uses U740 64-bit RISC-V CPUs in a 28nm process from TSMC. However, TSMC and SiFive have already demonstrated a 5nm iteration of their next gen RISC-V CPU as a proof of concept, and we already know that TSMC is spinning up a 2nm fab, so I think it follows that RISC-V will be ready for that whenever the wafers are primed for lithography.

    Others already mentioned NVidia's roadmap for migrating from 32-bit ARM cores in their GPUs to 64-bit RISC-V designs. Moreover, the RISC-V spec already has provisions for 128-bit addressing. Even from an assembly perspective, the ISA is refreshingly, intentionally simple, designed without decades of vendor-proprietary cruft tacked on as afterthoughts to differentiate from, or reach feature parity with, "competitors" (e.g. MMX, NEON, SSE, etc.); it has vector math as a default, not as an afterthought. When reading Andrew Waterman and David A. Patterson's The RISC-V Reader: An Open Architecture Atlas, I was in awe at how much information was presented in so few pages. I think not since K&R have I read a technical text which seemed so condensed and so useful.

    You can run a full blown BSD or Linux distro on RISC-V systems *today*, I think that is already immediately useful. The BeagleBoard.org® Foundation has begun shipping preliminary samples of the BeagleV to developers who are looking for a RISC-V Raspberry Pi sort of alternative and its estimated street price is going to be $149 to $199 depending upon equipped RAM. At least in my experience with a BeagleBone Black in years past, their efforts were significantly better than any Raspberry Pi I ever used. That's just the low end too. For people who are using FPGAs for higher end research, Olof Kindgren's SERV RISC-V implementation has already demonstrated being able to spin up > 5000 cores on some higher end FPGA development boards. For organizations which tape out silicon (or at least design it even if they may be fabless, such as AMD, NVidia and Apple), RISC-V seems extremely promising on a number of levels.

    I've applied to jobs at Apple in the past without much luck; even when I had internal recommendations, the furthest I ever got was a phone screening, but I will probably toss my résumé at this requisition. Honestly, while I did end up buying an M1 Apple Silicon Mac last year, I was pretty disheartened and had hoped that they would already be iterating their own RISC-V designs rather than simply presenting yet another renamed variant of an ARM CPU. I was also programming on MC68K NeXT machines before most people had ever heard of Objective-C, and at least circa 1994 the best-performing systems running NeXTSTEP were some 486DX2 systems running at 66MHz at my college, though they still seemed pokey and inefficient compared with MIPS-based Silicon Graphics workstations. Certainly Commodore (RIP) made substantially more efficient and versatile use of MC68K hardware than Apple ever figured out.

    I don't see this as being a bad move by Apple by any stretch of the imagination. Intel has already been rumored to be working on their own RISC-V designs, and supposedly made an offer to buy SiFive earlier this year (despite having been an early investor in the company), though that seems not to have come to fruition. Apple and others can continue to iterate on ARM, but it is pretty clear that it's kind of like hot-rodding an old jalopy at this point. RISC-V is where the puck is heading, rather than where it has been, to mangle a quote Steve Jobs borrowed from Wayne Gretzky.

    I mean, I guess you could use a RISC-V as a microcontroller, inasmuch as the SiFive HiFive1 (rev B, though ostensibly the discontinued initial release as well) is more or less a drop-in replacement for an Arduino. But as someone who was doing PIC-level coding in the 1980s, there are more efficient and smaller-package alternatives to Arduinos too; PIC logic never went away, it just got denser and more powerful with increased ASIC integration, and these days there are even video games such as Shenzhen I/O which are more advanced logic tools than SPICE systems from a few decades ago (it honestly wouldn't surprise me if the developer of Shenzhen I/O is using a video game as a way to crowdsource cheaper IC designs for strange consumer products that the players will never realize they had a hand in directly). Realistically though, RISC-V hardware can run real operating systems *today*. macOS and Windows haven't joined the party yet, and I think we can already guess Microsoft will be last to finish as usual.

    melgross said:

     Right now, RISC-V is years behind even the simplest ARM general purpose CPU. It’s being used for specialized tasks. It’s designed to be a controller for other sub units, such as a neural engine, or a machine learning processor, or optical analytics.

    It will take years to be useful, if it ever is, for a low-end phone or similar device. A major reason it’s so efficient is that its base instruction set has only about 50 instructions.

    I can see Apple being interested because it would be irresponsible to look the other way. Notice that the engineer they’re looking for also needs experience with ARM. I suspect that Apple is looking to see if some aspects of this can be used inside their ARM-based chips, but not to replace them.

    edited September 2021 nadriel FileMakerFeller rundhvid patchythepirate watto_cobra jony0
  • Reply 13 of 18
    gatorguy Posts: 24,213 member
    How do you figure? FreeBSD, Linux, LLVM, GCC, golang, (O)KL4 and most recently OpenBSD run on RISC-V. Generally speaking, code branches in FreeBSD and OpenBSD are *years ahead* of consumer computing offered by Apple which still borrows heavily from those upstream libre/open source software operating systems (e.g. LibreSSL, OpenSSH).

    Excellent detailed post from an obviously knowledgeable person. That's why we love AppleInsider.
    FileMakerFeller rundhvid jony0
  • Reply 14 of 18
    mattinoz Posts: 2,316 member
    mpantone said:
    A reaction to NVIDIA buying ARM?
    It's possible, but adding a couple of RISC-V engineers isn't going to eliminate Arm license fees regardless of who owns Arm. Right now Softbank is pocketing those fees.

    Note that Apple did not publicly object to Nvidia's acquisition. If they really didn't want the Nvidia-Arm deal to go through, they probably would have vocally pushed the FTC and international regulators to block the merger.

    A couple of job requisitions in a small group isn't going to halt all Arm silicon development at Apple.

    A more likely scenario right now is that Apple is trying to keep options open looking toward the future. Remember that RISC-V does not currently have performance that comes close to what Apple has implemented with their recent A-series and M-series SoCs.
    Is it even that grand a plan?
    Does it even need to be Apple custom silicon for them to need the engineers? If Apple can tap new suppliers with better offers for all the other controllers in the device, then there is an advantage to them.
    watto_cobra
  • Reply 15 of 18
    robaba Posts: 228 member
    gatorguy said:
    I would not be shocked if Apple uses this to join up with Open Titan development.
    Boy, I would be.  Apple hasn’t really enjoyed working with open source consortiums in the past.  They always want more access than Apple is comfortable providing.
    williamlondon watto_cobra
  • Reply 16 of 18
    gatorguy Posts: 24,213 member
    robaba said:
    gatorguy said:
    I would not be shocked if Apple uses this to join up with Open Titan development.
    Boy, I would be.  Apple hasn’t really enjoyed working with open source consortiums in the past.  They always want more access than Apple is comfortable providing.
    In just the last three years Apple has joined:

    -The open-source Cloud Native Computing Foundation
    -The Academy Software Foundation
    -The open-source Data Transfer Project
    -The open-source FIDO Alliance
    -open-source Matter for smart homes
    -The open-source Alliance for Open Media

    Open Titan can be next. :)


    FileMakerFeller muthuk_vanalingam jony0
  • Reply 17 of 18
    jido Posts: 125 member
    How do you figure? FreeBSD, Linux, LLVM, GCC, golang, (O)KL4 and most recently OpenBSD run on RISC-V. Generally speaking, code branches in FreeBSD and OpenBSD are *years ahead* of consumer computing offered by Apple which still borrows heavily from those upstream libre/open source software operating systems (e.g. LibreSSL, OpenSSH).

    In 2020, Micro Magic demonstrated a 5GHz RISC-V CPU consuming just 1W of power. Do you know of *any* ARM CPU at that clockspeed sipping power so frugally? I do not.
    A "demonstration" does not mean a viable product. Where are your super-efficient RISC-V processors now?
    ONiO.zero also announced a 24MHz RISC-V CPU some time ago which uses energy harvesting techniques so that it has no active power draw. Sure, there are some ARM CPU implementations which are similar to that, but the licensing fees, particularly with ARM IP caught up in a murky SoftBank/NVidia mess involving litigation, don't exactly make it seem as if the architecture is primed for the future. As it is, ARM is really a 1980s-vintage CPU ISA with some 64-bit grafts, not entirely dissimilar to the AMD64 extensions for the 1970s x86 Intel CPU ISA.
    The ARM 64-bit extensions replace the "vintage" parts that you deride. In reality, both instruction sets are useful, but not necessarily in the same chip.
    Since 2010, RISC-V has already been taught in undergraduate programs in schools of merit, meaning that the future workforce will be ready to jump on RISC-V in the commercial sector. MIT's Xv6 (https://pdos.csail.mit.edu/6.828/2012/xv6.html) is essentially a "Lions' Commentary on UNIX" without the apocrypha due to the AT&T vs BSDi lawsuit which caused so many headaches in the 1990s.
    I would hope that the educational institutions that developed RISC-V also teach it. Yes, it does have merit; that doesn't mean it has won any battle in the market yet.
    Maybe you're thinking: "but RISC-V doesn't even implement an FPU!" without acknowledging the reality that most FPU designs are 1980s-vintage IEEE 754, which in academia is losing favor compared with posit/unum alternatives.
    Posit/unum hardware implementations are still a subject of research. It would be nice if RISC-V chose to have it, but I think some kind of IEEE 754 support would be a higher priority, and RISC-V already has that support as far as I know (the F, D, and Q extensions).
    Sure, RISC-V isn't ubiquitous, yet, but I think that is just a matter of time. The HiFive Unmatched which began shipping earlier this year uses U740 64-bit RISC-V CPUs in a 28nm process from TSMC. However, TSMC and SiFive have already demonstrated a 5nm iteration of their next gen RISC-V CPU as a proof of concept, and we already know that TSMC is spinning up a 2nm fab, so I think it follows that RISC-V will be ready for that whenever the wafers are primed for lithography.

    Others already mentioned NVidia's roadmap for migrating from 32-bit ARM cores in their GPUs to 64-bit RISC-V designs. Moreover, the RISC-V spec already has provisions for 128-bit addressing. Even from an assembly perspective, the ISA is refreshingly, intentionally simple, designed without decades of vendor-proprietary cruft tacked on as afterthoughts to differentiate from, or reach feature parity with, "competitors" (e.g. MMX, NEON, SSE, etc.); it has vector math as a default, not as an afterthought. When reading Andrew Waterman and David A. Patterson's The RISC-V Reader: An Open Architecture Atlas, I was in awe at how much information was presented in so few pages. I think not since K&R have I read a technical text which seemed so condensed and so useful.

    You can run a full blown BSD or Linux distro on RISC-V systems *today*, I think that is already immediately useful.
    The RISC-V ISA is strongly inspired by MIPS, which certainly helps with porting operating systems and other software to it.
    The BeagleBoard.org® Foundation has begun shipping preliminary samples of the BeagleV to developers who are looking for a RISC-V Raspberry Pi sort of alternative and its estimated street price is going to be $149 to $199 depending upon equipped RAM. At least in my experience with a BeagleBone Black in years past, their efforts were significantly better than any Raspberry Pi I ever used. That's just the low end too. For people who are using FPGAs for higher end research, Olof Kindgren's SERV RISC-V implementation has already demonstrated being able to spin up > 5000 cores on some higher end FPGA development boards. For organizations which tape out silicon (or at least design it even if they may be fabless, such as AMD, NVidia and Apple), RISC-V seems extremely promising on a number of levels.

    I've applied to jobs at Apple in the past without much luck; even when I had internal recommendations, the furthest I ever got was a phone screening, but I will probably toss my résumé at this requisition. Honestly, while I did end up buying an M1 Apple Silicon Mac last year, I was pretty disheartened and had hoped that they would already be iterating their own RISC-V designs rather than simply presenting yet another renamed variant of an ARM CPU.
    That's your own prejudice. The ARM CPUs produced by Apple are of their own design, incomparable to other ARM CPUs in the market and at least as interesting as any new RISC-V design you could come up with.
    I was also programming on MC68K NeXT machines before most people had ever heard of Objective-C, and at least circa 1994 the best-performing systems running NeXTSTEP were some 486DX2 systems running at 66MHz at my college, though they still seemed pokey and inefficient compared with MIPS-based Silicon Graphics workstations. Certainly Commodore (RIP) made substantially more efficient and versatile use of MC68K hardware than Apple ever figured out.

    I don't see this as being a bad move by Apple by any stretch of the imagination. Intel has already been rumored to be working on their own RISC-V designs, and supposedly made an offer to buy SiFive earlier this year (despite having been an early investor in the company), though that seems not to have come to fruition. Apple and others can continue to iterate on ARM, but it is pretty clear that it's kind of like hot-rodding an old jalopy at this point. RISC-V is where the puck is heading, rather than where it has been, to mangle a quote Steve Jobs borrowed from Wayne Gretzky.
    I will say RISC-V is a worthy alternative to ARM, especially if you are looking for a non-proprietary design and you can wait a few years for it to mature. But I don't see the point of Apple throwing away their work on the ARM instruction set when they have been so successful with it. You can call it an old jalopy once it stagnates and stops showing improvements in processing power and efficiency, which has not happened yet.
    I mean, I guess you could use a RISC-V as a microcontroller, inasmuch as the SiFive HiFive1 (rev B, though ostensibly the discontinued initial release as well) is more or less a drop-in replacement for an Arduino. But as someone who was doing PIC-level coding in the 1980s, there are more efficient and smaller-package alternatives to Arduinos too; PIC logic never went away, it just got denser and more powerful with increased ASIC integration, and these days there are even video games such as Shenzhen I/O which are more advanced logic tools than SPICE systems from a few decades ago (it honestly wouldn't surprise me if the developer of Shenzhen I/O is using a video game as a way to crowdsource cheaper IC designs for strange consumer products that the players will never realize they had a hand in directly). Realistically though, RISC-V hardware can run real operating systems *today*. macOS and Windows haven't joined the party yet, and I think we can already guess Microsoft will be last to finish as usual.

    melgross said:

     Right now, RISC-V is years behind even the simplest ARM general purpose CPU. It’s being used for specialized tasks. It’s designed to be a controller for other sub units, such as a neural engine, or a machine learning processor, or optical analytics.

    It will take years to be useful, if it ever is, for a low-end phone or similar device. A major reason it’s so efficient is that its base instruction set has only about 50 instructions.

    I can see Apple being interested because it would be irresponsible to look the other way. Notice that the engineer they’re looking for also needs experience with ARM. I suspect that Apple is looking to see if some aspects of this can be used inside their ARM-based chips, but not to replace them.

