A newly discovered trade-off in the way time-keeping devices operate on a fundamental level could set a hard limit on the performance of large-scale quantum computers, according to researchers from the Vienna University of Technology.

  • Em Adespoton@lemmy.ca
    1 year ago

    This is a really interesting point; I tried flipping it on its head and the reasoning became even more obvious:

    My thought was: “surely we can take advantage of relativistic effects to keep time at a slower pace locally, while the computation takes a short enough time in the reference frame.” But in that case there is a very obvious floor we’re working against: absolute zero. Making one thing go relatively faster means making the other go comparatively slower, and 0 K is as slow as you can get. If subatomic particles have no movement, there is literally nothing to measure.
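    For the relativistic piece, here’s the standard special-relativity time-dilation relation I’m leaning on (textbook physics, not something from the article):

    ```latex
    % Time dilation: a clock moving at speed v relative to an observer
    % ticks slower by the Lorentz factor gamma.
    %   \Delta t  = interval measured in the clock's own (local) frame
    %   \Delta t' = the same interval measured in the observer's frame
    \Delta t' = \gamma \, \Delta t,
    \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
    ```

    So you can make the local clock tick arbitrarily slowly relative to the outside frame, but only by pushing v toward c, which costs ever more energy.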

    As a result, there is a hard bound on timekeeping precision no matter how you try to finesse things, with the energy required for each marginal improvement growing without bound as that floor is approached.
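    One way to see why the cost blows up near that floor (my own back-of-the-envelope, assuming an ideal Carnot refrigerator, which is the best case thermodynamics allows):

    ```latex
    % Minimum work needed to pump heat Q_c out of a system at temperature T_c
    % into an environment at temperature T_h (Carnot limit):
    W_{\min} = Q_c \cdot \frac{T_h - T_c}{T_c}
    % As T_c \to 0 the required work diverges, which is why you can
    % approach absolute zero but never actually reach it.
    ```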

    To get around this, we’d have to come up with a different way to do error correction and read out results, and I’m not sure there is one.