• JohnDClay
    6 months ago

    I’m surprised it’s only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.

    • 31337
      6 months ago

      Same. I think I’ve read that a single GPT-4 instance runs on a 128-GPU cluster, and ChatGPT can still take something like 30 s to finish a long response. An H100 GPU has a TDP of 700 W. Hard to believe that uses only 10x more energy than a search that takes milliseconds.
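      Plugging in the figures from the comment above gives a rough upper bound (assuming full utilization of all 128 GPUs for the whole 30 s, and ignoring that the cluster batches many users’ requests at once, so real per-request energy would be lower):

```python
# Back-of-envelope estimate using the comment's claimed figures
# (128 GPUs, 700 W TDP each, 30 s per long response -- all assumptions).
gpus = 128
tdp_watts = 700
seconds = 30

joules = gpus * tdp_watts * seconds   # 2,688,000 J
watt_hours = joules / 3600            # ~747 Wh if nothing is shared
print(f"{watt_hours:.1f} Wh")         # -> 746.7 Wh
```

      If those figures were right and the cluster served one request at a time, that would be hundreds of Wh per response versus well under 1 Wh commonly estimated for a web search, i.e. far more than 10x. Batching across concurrent users is presumably what closes most of that gap.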