• peeonyou [he/him]@hexbear.net
    1 month ago

    I’m a little surprised by this… I was at a thrift store the other day and found a book from the 90s on telecommunications where they mentioned photonic chips designed by AT&T and IBM back in the mid-90s. They were talking about photonic switches and whatnot but also that there were chip designs as well. How come it has taken this long to actually produce such chips I wonder?

    • ☆ Yσɠƚԋσʂ ☆@lemmygrad.mlOP
      1 month ago

      I think there are a number of factors here. It’s often very hard to go from a proof of concept in the lab to actual mass production. This is especially true for companies that want to see a clear path to profit. If something can’t be made to work within a certain budget, it’s probably going to be abandoned. Whatever money is allocated toward new tech also has to compete with scaling up existing tech. So, if a company is already making good money off silicon chips, there’s little incentive to divert huge amounts of funds toward making alternative substrates work. Shareholders want to see quarterly profits, after all. And this is why we constantly see these kinds of discoveries published in research but never actually commercialized.

      What’s different now is that the US is actively trying to choke off China’s chip access, and China is investing in chip development at the state level in response. The logic becomes different because a moonshot project using an alternative substrate has the potential to leapfrog existing tech entirely. Profitability is no longer the primary driving force here; the state can pour effectively unlimited funds into commercializing this tech until something works.