After Silicon, Diamond?

Posted:
in General Discussion edited January 2014
See this fascinating article in Wired (attr: MacSurfer).



The upshot is that synthetic diamond is here, it's incredibly pure, it can be manufactured in wafers, and it's $5 per carat. One of the pioneers is responsible for gallium arsenide chips.



Most of the article is concerned with the impact on the loathsome De Beers monopoly, but at the end he talks semiconductors, and the news is good: diamond has been successfully doped with boron to get both of the states necessary. It's ready for use in chipmaking. Someone just has to use it. Japan is pledging $6M a year to the idea, which is a start. Intel says it takes them 10 years to evaluate a new material, and they're not straying from silicon.



Sounds like an opportunity to me. Diamond tolerates far higher temperatures than silicon (and conducts heat away far better), so it will be able to withstand heat densities that silicon never could. And so the march forward can continue. It'll be interesting to see who takes the first step. Certainly, anyone who's invested billions and billions of dollars in silicon-based fabrication is understandably cautious about moving away from that; but one company's risk is another's opportunity.

Comments

  • Reply 1 of 8
wrong robot Posts: 3,907 member
    Yeah, I read about this last night, really awesome stuff.



Although it is certainly a ways away, at least De Beers will be put out. (Well, they might be.)
  • Reply 2 of 8
outsider Posts: 6,008 member
I read it last night too (links from Slashdot). How would you go about making a processor from a diamond wafer? I can't imagine making an 8- or 12-inch wafer of diamond and etching it and all. The technology must be in the works, however, if it's being considered. Cool.
  • Reply 3 of 8
snoopy Posts: 1,901 member
    Think about it. The whole reason it is even being talked about is heat. We already have CPUs that run in the 80 Watt ball park. Is a 160 Watt or 200 Watt processor the answer? Yes, let's have duals and quads, or better yet, a rack full of blades with these babies cooking away. A better answer is to find ways to reduce power and heat, not new ways to cope with even more.
  • Reply 4 of 8
amorph Posts: 7,112 member
    Quote:

    Originally posted by snoopy

    Think about it. The whole reason it is even being talked about is heat. We already have CPUs that run in the 80 Watt ball park. Is a 160 Watt or 200 Watt processor the answer? Yes, let's have duals and quads, or better yet, a rack full of blades with these babies cooking away. A better answer is to find ways to reduce power and heat, not new ways to cope with even more.



    Actually, the issue is heat density, which is dissipation per unit of surface area (effectively, since they're so thin) - and by that measure, CPUs are already approaching reactor cores! So it's possible for a very small 60W CPU to be too hot to be fabbed out of silicon, and that's where diamond comes in.



The way you make a diamond wafer is detailed in the above-linked article: condensation of pure carbon out of a plasma onto a flat surface. The Apollo process already makes wafers by default, from which ornamental diamonds are cut. They're steadily increasing the size of the wafers. They have a ways to go - if I remember right, they're currently at 10mm - but their wafers are basically square, which would be a win from a process efficiency point of view (given that most fabricated products are square or rectangular).
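The heat-density point above can be made concrete with a rough calculation. A short sketch (all figures here are illustrative ballpark assumptions, not numbers from the article):

```python
# Back-of-envelope comparison of CPU heat flux against other heat sources.
# Every figure below is an assumed, illustrative ballpark value.

def heat_flux_w_per_cm2(power_watts: float, area_cm2: float) -> float:
    """Heat dissipated per unit of die surface area, in W/cm^2."""
    return power_watts / area_cm2

# Assumed example: an ~80 W CPU dissipating through a ~1.3 cm^2 die
cpu_flux = heat_flux_w_per_cm2(80.0, 1.3)

# Commonly quoted rough comparison points (assumptions):
hot_plate_flux = 10.0      # W/cm^2, a kitchen hot plate
reactor_core_flux = 100.0  # W/cm^2, fission reactor fuel surface

print(f"CPU die:      ~{cpu_flux:.0f} W/cm^2")
print(f"Hot plate:    ~{hot_plate_flux:.0f} W/cm^2")
print(f"Reactor core: ~{reactor_core_flux:.0f} W/cm^2")
```

On these assumed numbers the CPU lands around 60 W/cm^2 - well past a hot plate and within reach of the reactor-core figure, which is why total wattage alone understates the cooling problem.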
  • Reply 5 of 8
leonis Posts: 3,427 member
Diamond... I am afraid the CPU will be more expensive
  • Reply 6 of 8
amorph Posts: 7,112 member
    Quote:

    Originally posted by Leonis

Diamond... I am afraid the CPU will be more expensive



    The expense of diamonds is currently 100% due to supply constraints imposed by De Beers. If there were a free market, you'd be astonished by how cheap they would become.



    Currently, even without any economies of scale to speak of, Apollo's plasma-deposited diamonds cost $5 per carat. That's probably still too expensive to replace silicon right now, but relative to the artificially high cost of jewelry diamonds it's nothing at all.
  • Reply 7 of 8
    Does this mean that SGI will get a chance to re-brand themselves yet again?



    DGI - Diamond Graphics, Inc.



    I don't know that it has the same ring to it...
  • Reply 8 of 8
snoopy Posts: 1,901 member
    Quote:

    Originally posted by Amorph

    Actually, the issue is heat density, which is dissipation per unit of surface area (effectively, since they're so thin) - and by that measure, CPUs are already approaching reactor cores! . . .







Actually, I read the article several days ago, but I was reading fast and missed that one. It's a very funny overstatement. I cannot remember the maximum junction temperature for silicon, but I believe it is somewhere between 100 and 200 degrees Celsius. It's true that the chip is very thin, but it is attached to something much more massive with good thermal conductivity to carry heat away. However, it would be interesting to see a graph of chip area vs. power, with a curve showing the limit for silicon. I'd bet it has a long way to go.



    I'm guessing that methods to reduce power (and therefore heat) will be easier to accomplish than making a high performance CPU with diamond transistors. Interesting article, however.