As a reminder, current estimates are that quantum cracking of a single 2048-bit RSA key would require a computer with 20 million qubits running in superposition for about eight hours. For context, quantum computers maxed out at 433 qubits in 2022 and 1,000 qubits last year. (A qubit is the basic unit of quantum computing, analogous to the binary bit in classical computing. Qubit counts in gate-based quantum computers and in quantum annealers aren't directly comparable.) So even when quantum computing matures sufficiently to break vulnerable algorithms, it could take decades or longer before the majority of keys are cracked.

The upshot of this latest episode is that while quantum computing will almost undoubtedly topple many of the most widely used forms of encryption in use today, that calamitous event won't happen anytime soon. It's important that industries and researchers move swiftly to devise quantum-resistant algorithms and implement them widely. At the same time, people should take steps not to get steamrolled by the PQC hype train.

  • Mike1576218@lemmy.ml · 13 hours ago

    If qubit counts double every year, we're at 20 million in 15 years. Migrating cryptography takes a very long time on some systems. If we're at ~20,000 in 5 years, we'd better have usable post-quantum crypto in place to start mitigations.

    But I'm not convinced yet that we'll have those numbers by then. Especially error-free qubits…

    • humblebun · 11 hours ago

      If qubit counts double every year

      And then we need to increase coherence time, which is 50 ms for the current 433-qubit chip. Error correction might work, but it might not.

      • WolfLink · 7 hours ago

        Error correction does fix that problem, but at the cost of increasing the number of qubits needed by a factor of roughly 10 to 100.

        • humblebun · 2 hours ago

          But who guarantees that error correction will overcome the decoherence introduced by this number of qubits? It's not a trivial question, and nobody can answer it for certain.

          • WolfLink · 2 hours ago

            I mean the known theory of quantum error correction already guarantees that as long as your physical qubits are of sufficient quality, you can overcome decoherence by trading quantity for quality.

            It's true that we're not yet at the point where we can mass-produce qubits of sufficient quality, but claiming that EC is not known to work is a weird way to phrase it, at best.

            • humblebun · 1 hour ago

              It was shown this year for what, 47 qubits? How can you be certain this will hold for millions and billions?
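
The doubling arithmetic in the thread above checks out. A minimal sketch, assuming the article's ~1,000-qubit starting point and the commenter's annual-doubling rate (the doubling rate is the commenter's assumption, not an established trend):

```python
import math

# Assumptions from the thread: ~1,000 physical qubits today (per the
# article), doubling every year, 20 million needed to crack RSA-2048.
start_qubits = 1_000
target_qubits = 20_000_000

# Number of annual doublings needed to reach the target.
doublings = math.ceil(math.log2(target_qubits / start_qubits))
print(doublings)  # -> 15, matching the "20 million in 15 years" estimate

# Where the same trend would put us in 5 years.
print(start_qubits * 2**5)  # -> 32000, near the ~20,000 checkpoint figure
```

The exponential makes the checkpoint meaningful: if hardware is far below ~20,000–32,000 qubits five years in, the doubling assumption has already failed.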
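
The 10x–100x error-correction overhead mentioned in the thread translates into physical-qubit counts like this (the 6,000-logical-qubit figure is an illustrative round number, not a value from the thread):

```python
# Illustrative only: suppose an algorithm needs 6,000 logical qubits
# (a hypothetical figure chosen for the example).
logical_qubits = 6_000

# The 10x-100x physical-per-logical overhead range stated in the thread.
for overhead in (10, 100):
    physical = logical_qubits * overhead
    print(f"{overhead}x overhead -> {physical:,} physical qubits")
```

Even the low end of the range multiplies hardware requirements by an order of magnitude, which is why overhead estimates dominate timelines as much as raw qubit counts do.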