Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

  • @[email protected]
    14 months ago

    As our technology advances, that technology has to be simulated as well.

    Everything is made up of atoms/photons/etc. If every particle is tracked through all of its interactions, it doesn’t matter how those particles are arranged; the memory cost is always the same.

    • @[email protected]
      14 months ago

      Atoms and photons wouldn’t actually exist; they would be generated whenever we measure things at that level.

      Obviously, there are many ways to interpret what kind of simulation it would be. A full simulation from the Big Bang is fun, but it doesn’t make for good conversation since it would be indistinguishable from reality.

      I was thinking more of a video-game-like simulation, where the sim doesn’t render things it doesn’t need to.
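      In programming terms, "don’t render what isn’t needed" is lazy evaluation: expensive detail is only computed when something actually observes it. A minimal sketch of the idea (the `Region` class and its fields are invented for illustration):

```python
import functools
import random

class Region:
    """A patch of a simulated world. Cheap coarse state is always kept;
    atomic-level detail is only computed the first time it is observed."""

    def __init__(self, seed: int):
        self.seed = seed
        self.coarse_state = "approximate bulk physics"  # always simulated

    @functools.cached_property
    def atomic_detail(self):
        # Expensive fine-grained state, generated on demand and then cached
        rng = random.Random(self.seed)
        return [rng.random() for _ in range(1000)]  # stand-in for particle states

region = Region(seed=42)
# Before any observation, no atomic-level state exists at all.
assert "atomic_detail" not in region.__dict__
detail = region.atomic_detail  # an "observation" forces the computation
assert "atomic_detail" in region.__dict__
assert region.atomic_detail is detail  # later observations reuse the cache
```

      `functools.cached_property` does the bookkeeping here: nothing is computed until first access, and repeated access returns the same cached result, which is exactly the "generated whenever we measure" behavior described above.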

      • @[email protected]
        14 months ago

        where the sim doesn’t render things it doesn’t need to.

        That can’t work unless it’s a simulation made personally for you.

        • @[email protected]
          14 months ago

          I don’t follow. If there are others, it would render for them just as much as for me. I’m saying it wouldn’t need to render at an atomic level except for the few who are actively measuring at that level.

          • @[email protected]
            14 months ago

            Everything interacting is “measuring” at that level. If quantum-level effects weren’t being calculated correctly for you all the time, the LEDs in your smartphone would flicker. All those microscopic effects cause the macroscopic effects we observe.

            • @[email protected]
              4 months ago

              If it were a simulation, there would be no need to go that far. We simulate physics without simulating the individual atoms.

              None of it would be real; the microscopic effects would just be approximated, unless a precise measurement tool were used, in which case they would be properly simulated.

              We wouldn’t know the difference.

              • @[email protected]
                14 months ago

                If it were a simulation, there would be no need to go that far

                But you already said you have to go that far whenever someone is doing something where they could notice microscopic effects.

                So it’s not so much a simulation as a mind-reading AI that continuously reads every sentient mind in the entire universe, so it knows whether they are performing a microscopic observation that needs the fine-grained result, or whether an approximation can be returned.

                • @[email protected]
                  4 months ago

                  What I’m saying is that there would be no need to go that far at all times. It’s the equivalent of a game rendering distant objects in detail only when you use a scope. Why render everything at all times if it isn’t being used and does not affect the experience? It would increase the overhead by an insane amount for little to no gain.

                  This is also just a thought exercise.
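                  The scope analogy is how level-of-detail (LOD) rendering actually works in games: the detail tier is picked from distance, and a scope effectively shrinks the distance. A toy sketch (the thresholds and the scope factor are made-up numbers, purely illustrative):

```python
# Level-of-detail sketch: pick a detail tier from distance, where using
# a scope makes a far object count as a near one.

def detail_level(distance_m: float, scoped: bool) -> str:
    # A scope magnifies, so the object is treated as 50x closer
    # (0.02 is an arbitrary illustrative factor).
    effective = distance_m * 0.02 if scoped else distance_m
    if effective < 50:
        return "high"
    if effective < 500:
        return "medium"
    return "low"

assert detail_level(1000, scoped=False) == "low"   # far away, unscoped: cheap
assert detail_level(1000, scoped=True) == "high"   # same object through a scope
```

                  The point of the analogy holds either way: the expensive path only runs when something in the scene demands it.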

                  • @[email protected]
                    4 months ago

                    Why render everything at all times if it isn’t being used and does not affect the experience?

                    But how does the simulation software know when it needs to calculate that detail? If you are the only person in the simulation, it’s obvious, because everything is rendered from your perspective. But if there’s more than one person in the universe, an AI program has to look at the mental state of everyone in the universe to make sure they aren’t doing something where they could perceive the difference.

                    Am I microwaving a glass of water to make tea, or am I curious about that YouTube video where I saw how you can use a microwave to measure the speed of light? Or did I just get distracted and not follow through with the measurement? Only something constantly monitoring my thoughts can know. And it has to do that for everyone, everywhere, in the entire universe.
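                    The objection can be phrased as a cost: even deciding whether the cheap path is safe requires inspecting every mind on every tick, so the monitoring itself scales with the population. A sketch of that per-tick check (the `tick` function and the `measuring_microscopic` flag are hypothetical):

```python
def tick(minds):
    """One simulation step: the simulator must inspect every mind to decide
    whether any of them is about to make a fine-grained measurement."""
    checks = 0
    needs_full_detail = False
    for mind in minds:
        checks += 1  # unavoidable per-mind inspection, paid every tick
        if mind.get("measuring_microscopic", False):
            # In general you can't stop early: you still need to know
            # which regions each remaining observer is looking at.
            needs_full_detail = True
    return checks, needs_full_detail

population = [{"measuring_microscopic": False} for _ in range(10_000)]
population[1234]["measuring_microscopic"] = True
checks, full = tick(population)
assert checks == 10_000  # the monitoring cost is paid for everyone, every tick
assert full is True
```

                    Whatever the approximation saves on physics, this bookkeeping is a fixed per-observer cost the optimization can never eliminate.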