A user on Reddit (ed: u/Yoraxx) posted on the Starfield subreddit that there is a problem in Starfield when running the game on an AMD Radeon GPU. The issue is simple: the game just won’t render the star of a solar system when you are on the dayside of a moon or any other planetary body. The problem only occurs on AMD Radeon GPUs, and owners of both Radeon RX 7000 and RX 6000 series cards are reporting the same thing.

The dayside of any planetary body or moon needs a light source to be lit up. That source is the star: on any non-AMD GPU you will see the star/sun in the skybox, illuminating the surface below. With AMD cards, however, the star/sun simply isn’t rendered, while the planet/moon remains lit with no visible light source in the sky.

Original Reddit post by u/Yoraxx
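To make it clearer how a surface can stay lit while the sun itself is missing, here is a minimal sketch (hypothetical C++, not Starfield’s actual engine code): surface shading typically only needs the light’s direction, while the visible sun disc in the skybox is a separate draw. If that separate draw misbehaves on one vendor’s driver, the lighting math is unaffected.

```cpp
// Minimal sketch (hypothetical, not Starfield's engine code) of why a planet
// surface can stay lit while the sun itself is missing from the sky:
// surface shading only uses the sun's *direction*, while the visible sun disc
// is a separate draw in the skybox pass. If that draw fails on one vendor's
// driver, the lighting still works.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Surface lighting: simple Lambert term driven by a directional light.
float lambert(Vec3 surfaceNormal, Vec3 dirToSun) {
    return std::fmax(0.0f, dot(surfaceNormal, dirToSun));
}

// Skybox pass: drawing the visible sun disc is an entirely separate step.
// The bool stands in for "this code path works on this driver".
bool drawSunDisc(bool driverPathOk) {
    return driverPathOk;
}

int main() {
    Vec3 up{0.0f, 1.0f, 0.0f};      // surface normal at local noon
    Vec3 sunDir{0.0f, 1.0f, 0.0f};  // direction toward the star

    float surfaceLight = lambert(up, sunDir);               // 1.0 regardless
    bool sunVisible = drawSunDisc(/*driverPathOk=*/false);  // the failing draw

    std::printf("surface brightness: %.1f, sun visible in sky: %s\n",
                surfaceLight, sunVisible ? "yes" : "no");
    // Prints: surface brightness: 1.0, sun visible in sky: no  (the reported bug)
}
```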

  • macniel

    Waaaaait… it was a bug and not gross incompetence?

    • e-ratic

      “Bethesda’s Bug”, when you can’t tell if something isn’t working correctly or if it’s just not implemented at all.

    • geosoco (OP)

      I don’t think we know.

      Makes me wonder if the dev team is on a much-needed vacation or if they only run Nvidia GPUs. lol

      • Hildegarde

        The game runs better on AMD, and Bethesda partnered with AMD in some way for this PC release.

          • @booly

            All GPUs perform equally well at ray tracing when there are no rays to trace

        • geosoco (OP)

          That really just means AMD gave them a lot of money, and they just made sure FSR2 worked. lol

          • @Naz

            I’ve got a 7900 XTX on Ultra, and FSR2 does literally nothing, which is hilarious.

            100% resolution scale, 128 FPS.

            75% resolution scale … 128 FPS.

            50% resolution scale, looking like underwater potatoes … 128 FPS.

            I don’t know how it’s possible to make an engine this way. It seems CPU-bound, and I’m lucky that I upgraded my CPU not too long ago: I’m outperforming my friend who has an RTX 4090 in literally all scenes, indoor, ship, and outdoor/planet.

            He struggles to break 70 FPS at 1080p Ultra, while I’m doing 4K Ultra.

            • Xperr7

              I have noticed it provides better anti-aliasing than the forced TAA (once I forced that off)

            • geosoco (OP)

              Some of the benchmarks definitely pointed out that it was CPU-bound in many areas (e.g. the cities).

              I think the HUB one mentioned that some of the forested planets were much more GPU-bound and better for testing.

              I’m on a TV so I’m capped at 60 fps, but I do see a pretty substantial power usage difference between FSR at 75% and FSR at 100% on my 7900 XT. (ed: see the frame-time sketch after the thread.)

    • @hoshikarakitaridia

      If it’s down to very specific chipsets, that sounds like an unforeseeable bug.

      • @hoshikarakitaridia

        Correction: someone pointed out they are literally interfacing with the graphics drivers the wrong way, so it’s still on their devs. (ed: see the sketch after the thread.)

    • @[email protected]

      I had no idea it was a problem on Radeon GPUs. I saw a few people complaining about not seeing the stars, but I didn’t have a clue what they were talking about since it was always fine for my Nvidia card.
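
On the FSR2 discussion above: a rough way to see why an upscaler can do “literally nothing” for frame rate in CPU-bound scenes is that the frame time is set by whichever of the CPU or GPU finishes last. The sketch below uses made-up timings (the 7.8 ms CPU cost is picked only so the numbers line up with the 128 FPS quoted above), not measurements.

```cpp
// Rough model (illustrative numbers, not measurements) of a CPU-bound frame:
// lowering the render scale shrinks GPU work, but the frame rate is still
// capped by the CPU, so FPS does not move. The GPU simply idles more, which
// also fits the lower power draw at 75% render scale mentioned above.
#include <algorithm>
#include <cstdio>

int main() {
    const float cpuMs = 7.8f;                     // assumed per-frame CPU cost
    const float gpuMs[] = {6.0f, 4.0f, 2.5f};     // assumed GPU cost per render scale
    const char* scale[] = {"100%", "75%", "50%"};

    for (int i = 0; i < 3; ++i) {
        float frameMs = std::max(cpuMs, gpuMs[i]);   // the bottleneck sets the pace
        std::printf("render scale %s -> %.0f FPS (GPU busy %.1f ms of %.1f ms)\n",
                    scale[i], 1000.0f / frameMs, gpuMs[i], frameMs);
    }
    // All three render scales land on the same ~128 FPS.
}
```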
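
On the “interfacing the graphics drivers the wrong way” comment: the usual failure mode is a renderer relying on behavior the graphics API spec leaves undefined, which one vendor’s driver happens to tolerate and another does not. The sketch below is a hypothetical analogy in plain C++ (host allocations stand in for GPU buffers); it is not the actual Starfield code, and the specific root cause of this bug isn’t confirmed here.

```cpp
// Hypothetical analogy, not Starfield's code: D3D12/Vulkan do not guarantee
// the contents of freshly allocated GPU memory. If a renderer reads a value it
// never wrote, driver A may happen to hand back zeros (sun draws fine) while
// driver B leaves older data behind (sun gets culled). Host allocations stand
// in for GPU buffers so the sketch actually runs.
#include <cstdio>
#include <cstdlib>
#include <cstring>

// "Driver A": the allocation happens to come back zero-filled.
float* allocParamsDriverA() {
    return static_cast<float*>(std::calloc(4, sizeof(float)));
}

// "Driver B": the allocation still holds leftover data from earlier use.
float* allocParamsDriverB() {
    float* p = static_cast<float*>(std::malloc(4 * sizeof(float)));
    const float leftovers[4] = {-1.0f, 9e9f, 0.0f, 3.5f};
    std::memcpy(p, leftovers, sizeof(leftovers));
    return p;
}

// The renderer skips the sun when this "occlusion" value says it is blocked.
// The bug in the analogy: the renderer never writes params[0] itself, so the
// outcome depends entirely on what the allocation happened to contain.
bool sunDrawn(const float* params) {
    return params[0] >= 0.0f;
}

int main() {
    float* a = allocParamsDriverA();
    float* b = allocParamsDriverB();
    std::printf("driver A: sun drawn = %s\n", sunDrawn(a) ? "yes" : "no");
    std::printf("driver B: sun drawn = %s\n", sunDrawn(b) ? "yes" : "no");
    std::free(a);
    std::free(b);
    // driver A: yes, driver B: no -- same game code, different outcome.
}
```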