• Illecors@lemmy.cafe · 1 year ago

    I’m glad they found a way, but at the same time - what the hell? Why is it OK for game devs of this magnitude to have a hardcoded hardware list? Look for feature support, not a string that is easy to manipulate outside your control!
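
    A minimal sketch of the difference in Vulkan terms (the API calls are real; the helper functions and gating logic are a hypothetical illustration, not the game’s actual code):

        #include <vulkan/vulkan.h>
        #include <cstring>
        #include <vector>

        // Fragile: gate a feature on the reported device name string.
        // Breaks as soon as a translation layer or a config file spoofs the name.
        bool isIntelByName(VkPhysicalDevice dev) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(dev, &props);
            return std::strstr(props.deviceName, "Intel") != nullptr;
        }

        // Robust: ask the driver whether the capability actually exists.
        bool supportsExtension(VkPhysicalDevice dev, const char* name) {
            uint32_t count = 0;
            vkEnumerateDeviceExtensionProperties(dev, nullptr, &count, nullptr);
            std::vector<VkExtensionProperties> exts(count);
            vkEnumerateDeviceExtensionProperties(dev, nullptr, &count, exts.data());
            for (const auto& e : exts)
                if (std::strcmp(e.extensionName, name) == 0) return true;
            return false;
        }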

    • Chewy@discuss.tchncs.de · 1 year ago

      The problem in this case is that they automatically trigger XeSS, which isn’t bad in itself (unless it can’t be deactivated, which is what this sounds like).

      The GPU does support XeSS, but it crashes on Linux. If they just added a toggle or command-line flag to disable the feature, changing the vendorId wouldn’t be necessary.
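
      Something like the following would be enough (a sketch; the env var and flag names are made up, just to show the shape of an opt-out):

          #include <cstdlib>
          #include <cstring>

          // Hypothetical opt-out check, run once before the upscaler is initialised.
          bool xessAllowed(int argc, char** argv) {
              if (std::getenv("GAME_DISABLE_XESS") != nullptr)    // hypothetical env var
                  return false;
              for (int i = 1; i < argc; ++i)
                  if (std::strcmp(argv[i], "--no-xess") == 0)     // hypothetical launch flag
                      return false;
              return true;
          }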

          • Hairyblue@kbin.social · edited · 1 year ago

            I am playing Baldur’s Gate 3 (can’t say enough great things about this game) and it has a toggle in the game settings for upscaling options. DLSS runs great on my Linux PC. I thought I heard Larian say they’re trying to get XeSS working too.

            You can enable either Nvidia’s DLSS or AMD’s FSR via the settings menu.

            • VerifiablyMrWonka@kbin.social · edited · 1 year ago

              I imagine Larian cares. Especially since they’re pushing Steam Deck support.

              The reason this is a “supported platform” issue is that the developers of Hogwarts Legacy know their supported platforms support XeSS, so any work beyond “just turn it on” is additional work for no gain.

    • teawrecks@sopuli.xyz · 1 year ago

      I would bet money that Intel’s dev rel team worked closely with Avalanche to add XeSS support to sell more Intel GPUs.

      Most likely the Hogwarts devs said, “sure, do whatever you want on your own hardware, just don’t you dare break anything on any other platform while we’re trying to ship”. The easiest way to green-light this and know nothing else would be affected would be to hard-code everything behind Intel’s vendor ID.
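
      In Vulkan terms that gate can be a single comparison (a sketch, not the game’s actual code; 0x8086 is Intel’s PCI vendor ID):

          #include <vulkan/vulkan.h>

          // Only auto-enable XeSS when the driver reports Intel's vendor ID,
          // so the AMD (0x1002) and Nvidia (0x10DE) code paths are provably untouched.
          bool shouldAutoEnableXeSS(VkPhysicalDevice dev) {
              VkPhysicalDeviceProperties props;
              vkGetPhysicalDeviceProperties(dev, &props);
              return props.vendorID == 0x8086;
          }

      Which would also explain why spoofing the vendorId is enough to switch the behaviour off again.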

      So this probably isn’t a case of Intel working around a game dev’s code, it’s probably a case of Intel working around its own code.

    • zurohki@aussie.zone · 1 year ago

      IIRC, with an Nvidia card DXVK will spoof an AMD card in a lot of games, because otherwise the game will try to interact with the Windows Nvidia drivers, which aren’t there.

      • addie@feddit.uk · 1 year ago

        You remember correctly. From the DXVK conf file:

        # Report Nvidia GPUs as AMD GPUs by default. This is enabled by default
        # to work around issues with NVAPI, but may cause issues in some games.
        #
        # Supported values: True, False
        
        # dxgi.nvapiHack = True
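
        The same file can also override the reported IDs outright, which is presumably the knob the article’s workaround turns. The option names are real DXVK options; the comment text and values here are illustrative (1002 is AMD’s PCI vendor ID):

        # Override the PCI vendor and device IDs reported to the
        # application. Useful for games that change behaviour
        # depending on the vendor.
        #
        # Supported values: Any four-digit hex number.

        # dxgi.customVendorId = 1002
        # dxgi.customDeviceId = 0000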
        
      • BaconIsAVeg@lemmy.ml · 1 year ago

        “interact with the Windows Nvidia drivers which aren’t there”

        Funny story. I was trying to get ray tracing working under Wine for a few days and finally found the solution (needed to download the nvlibs zip from GitHub and run the installer).

        A couple of weeks later I went back into Wine and it was broken. After another three days of struggling, I decided to redownload nvlibs and run the installer, when I noticed that it only symlinks the needed libraries into the WINEPREFIX. Me, being the resource miser I am, had removed the folder from ~/Downloads when I thought I was done with it …