So now we’re being forced to accept poorer performance and locked into buying more expensive hardware.

I’ll have to pass on this kind of crap and vote with my wallet while ray tracing still isn’t there for the majority of PC gamers.

  • WereCat@lemmy.world · ↑2 · 4 hours ago

    I don’t care much about RT, but the reason they’ve decided to make the RT implementation mandatory is actually good, even revolutionary, and I can’t wait to see it in action. Using RT for pixel-perfect hitboxes? Sign me up!

    “And now, it has been revealed that the game will use ray tracing (RT) not only to enhance visuals but also to offer key gameplay improvements, such as better hit detection and the ability to distinguish materials in a bid to make the game more immersive.”
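    For anyone curious what “RT for hit detection” might look like mechanically, here’s a toy sketch (my own illustration, not id’s actual implementation): a hit test becomes a ray cast against the target’s real triangles instead of a coarse hitbox, using the standard Möller–Trumbore intersection test.

```python
# Hypothetical sketch: "pixel-perfect" hit detection as a ray cast against
# actual scene triangles (Moller-Trumbore), instead of a coarse hitbox.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the hit distance t along the ray, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:            # ray parallel to the triangle's plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv           # barycentric coordinate u
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv   # barycentric coordinate v
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv          # distance along the ray
    return t if t > eps else None
```

    Because the hit reports the exact triangle, the renderer could also look up that triangle’s material, which is presumably how the “distinguish materials” part of the quote would work.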

  • TyrianMollusk@lemm.ee · ↑2 · 5 hours ago

    However, the reality is that most gamers are now using gear that has some ray-tracing capability.

    Sure, plenty, and I’m still going to hard-pass any idiot game that forces raytracing or upscaling. Find something actually useful to do with the power available, instead of something that worthless and computationally wasteful, or don’t and run at lower power. That’s more valuable than raytracing.

  • Hazzard@lemm.ee · ↑3 · edited · 6 hours ago

    I think a lot is being made of this headline, honestly. Indiana Jones did the same thing using the same engine… and runs well on a broad variety of hardware, including AMD cards with no dedicated RT accelerators. And that’s not an experience designed with high framerate competitive action in mind.

    I also literally booted Doom Eternal for the first time in a while today, enabled raytracing, and played at 120FPS at 4K native on a 7900XTX, all settings on High. Id knows how to frigging optimize a title, and you can bet their raytracing implementation will be substantially better optimized than the RT we’re used to seeing. So long as you don’t run it with Path Tracing (a forward-looking feature, like Crysis back in the day), I fully expect you’ll still be able to get high framerates and incredible visuals.

    Wait for the Digital Foundry tests before buying if you’re uncertain, absolutely, but I really don’t see any reason to be concerned with the way idTech 8 has been shaping up.

  • merthyr1831@lemmy.ml · ↑8 ↓1 · 11 hours ago

    ray tracing is the AI of pc gaming. A bunch of hardware made “obsolete” just so a few nerds can get marginally “better” lighting.

    • chiliedogg@lemmy.world · ↑4 ↓1 · 10 hours ago

      It’s to make development easier.

      With ray-tracing it becomes much easier to light environments in-game. You don’t have to have devs adding artificial light sources or painting environments as if they’re lit.

      • kugmo · ↑1 · 4 hours ago

        Unfortunately, most developers can’t make a game with raytraced lighting.

  • lud@lemm.ee · ↑17 ↓9 · edited · 24 hours ago

    Ray tracing really is the future. Instead of doing a bunch of tricks to make things look good, all lighting is simply simulated using ray tracing.

    I remember when Metro Exodus Enhanced Edition came out and they explained that, when they remade the game for RTX only, they could remove a lot of workarounds like invisible lights and just light the scenes with actual light from bulbs or the sun, and it looked great.

    IMO we have to move to the newer technology someday. Whether that’s today or not, I don’t know. But it really is the future. Historically, it also isn’t unusual at all that someone had to get new hardware to play new games; it’s just that things stagnated for a while.

    Edit: pretty sure it was this video from Digital Foundry: https://youtu.be/NbpZCSf4_Yk?t=1376

      • TachyonTele@lemm.ee · ↑2 · 7 hours ago

        I agree with you. But if AAA studios develop it so it’s easier to implement across the board, I’m all for it.

    • Nik282000@lemmy.ca · ↑8 ↓1 · 11 hours ago

      Game engines don’t have to simulate sound pressure waves bouncing off surfaces to get good audio. They don’t have to simulate all the atoms in objects to get good physics. There’s no reason to have to simulate photons to get good lighting. This is a way to lower dev costs and increase spending on the consumer side, I would not be surprised if Nvidia was incentivizing publishers to use ray tracing.

      • lud@lemm.ee · ↑1 · 5 hours ago

        The Digital Foundry video explains better than I can why the dev cost is lower with ray tracing only.

  • Artyom@lemm.ee · ↑22 ↓2 · 1 day ago

    I once watched a 20-minute video on how, to compute trajectories for the rocket launcher in the original Doom, they did some of the most advanced math I have seen in any context, all to avoid doing any division, which was computationally expensive. How the mighty have fallen.
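    The general shape of that kind of trick, sketched in Python rather than period-accurate C (the names and table size here are my own invention, not Doom’s actual code): a per-pixel divide gets replaced by a precomputed reciprocal lookup plus one multiply and one shift in 16.16 fixed point.

```python
FRACBITS = 16  # 16.16 fixed point, the format Doom-era engines favored

# Precompute reciprocals once, so the hot loop never divides.
recip = [0] + [(1 << FRACBITS) // d for d in range(1, 1024)]

def fast_div(x, d):
    """Approximate integer x // d using one table lookup, multiply, and shift."""
    return (x * recip[d]) >> FRACBITS
```

    For example, `fast_div(1000, 8)` gives 125, same as `1000 // 8`; the truncated reciprocal introduces a small error for non-power-of-two divisors, which was acceptable for screen-space math.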

    • Nik282000@lemmy.ca · ↑3 ↓1 · 11 hours ago

      Game development is about maximizing revenue while minimizing development costs. There won’t be many more Mysts, Dooms, Quakes or Half Life 2s in the gaming future. Get ready for “Generative AI” stories/levels and ever increasing hardware feature set requirements.

    • Fredselfish@lemmy.world · ↑5 · 19 hours ago

      Yep, the original Doom was something else, and the way John Carmack built his engines was something to be in awe of.

  • germtm.@lemmy.world · ↑85 ↓7 · 2 days ago

    welcome to the death of game optimization.

    it especially sucks for Doom: The Dark Ages since both Doom 2016 and Eternal were considered very optimized for their times of release.

    • Jimmycakes@lemmy.world · ↑1 ↓2 · 13 hours ago

      Not sure what games you play but they haven’t been optimized for jack shit in the last 10 years.

    • tekato@lemmy.world · ↑13 ↓7 · edited · 1 day ago

      How does requiring a GPU feature translate into bad optimization?

        • tekato@lemmy.world · ↑9 ↓4 · 1 day ago

          That link explains nothing, it just tells you what people are using. Why does a game requiring a GPU feature mean, by your own words, the death of optimization?

          • Black616Angel@discuss.tchncs.de · ↑15 ↓4 · edited · 1 day ago

            I’ll chime in for the other commenter.

            Having ray tracing be “a minimum requirement” is batshit insane. Just make it an option and don’t require it for everyone.

            Ray tracing is not that widely available, so you shouldn’t just force it onto your whole player base.

            And while this might not sound like an optimization thing, it really looks like they couldn’t be bothered to develop their game with and without the ray tracing features.

            Edit: looking more into the numbers, they are all insane.

            • 8 cores with 16 threads as minimum?
            • 16GB RAM?
            • and a 100GB SSD again?

            I don’t really play AAA titles nowadays, but this is awful and far from optimized. Doom 2016 needed half of that on every single metric!

            • kungfuratte@feddit.org · ↑10 ↓1 · 1 day ago

              Doom 2016 needed half of that for every single metric!

              On the other hand that was almost a decade ago.

              • umbrella@lemmy.ml · ↑2 · 12 hours ago

                To be fair, graphics haven’t advanced as much in the last decade as in the one before it, certainly not enough to justify higher and higher requirements.

                • Nik282000@lemmy.ca · ↑3 · 10 hours ago

                  Titanfall 2 is 9 years old and looks pretty damned good compared to recent releases. The biggest jump I have seen in graphics was HL:Alyx and a lot of that was VR letting you press your eyeballs up against the windows to see the sweaty hand prints.

                  There is plenty of room for publishers to improve without leaning on new hardware features.

              • Black616Angel@discuss.tchncs.de · ↑7 ↓4 · 1 day ago

                Yes, but the tech has not advanced that much since then. Also, the game probably doesn’t look twice as good.

                Tbh, the game needing 8 cores is the most outrageous item on the list, but the ray tracing is a close second, since that could easily be toggleable.

                • SkyeStarfall@lemmy.blahaj.zone · ↑4 ↓1 · 1 day ago

                  Depending on how a game is made, no, ray tracing may not be “easily toggleable”.

                  You wouldn’t complain about games requiring DirectX 12, or DirectX 11, 10, whatever, in the past, so why complain about ray tracing? Modern games require modern GPU features; that is nothing new.

            • Jimmycakes@lemmy.world · ↑1 ↓4 · 13 hours ago

              Nobody buying $80 games doesn’t have all this shit. You guys are living in a bubble. Everyone has rtx these days. The ones that don’t weren’t going to buy Doom anyway.

    • Aurix@lemmy.world · ↑16 ↓1 · 2 days ago

      I especially fail to see the value in driving up obsolescence. Look how the Final Fantasy XIV art team, or the Tyranny RPG, expressed so much through comparatively ancient PS3-era engines. And for shooters we have so much polished visual fidelity, with physics, high-resolution textures and dynamic lighting, to create anything you want. From Prey (2017) back to Prey (2006), I think both look amazing.

      • woelkchen@lemmy.world · ↑8 · 2 days ago

        I especially fail to see the value to drive up obsolescence.

        “Can’t afford an upgrade to a high-end PC? Buy one of our Xboxes.” --Microsoft

    • RedWeasel@lemmy.world · ↑4 ↓1 · 1 day ago

      Not sure why you are so worried. Based on those requirements, a Radeon RX 6800 should be able to do Ultra 4K @ 60. The main differentiator seems to be VRAM.

  • TommySoda@lemmy.world · ↑59 ↓2 · edited · 2 days ago

    My GPU can do ray tracing, and that’s usually the first thing I turn off because it absolutely destroys performance for minimal effect. I think ray tracing is cool and all, but I don’t really care when it makes most games run like shit. I thought Elden Ring was poorly optimized until I turned it off, and then BAM, 120fps no problem.

    Honestly if it has to be enabled, as much as I love the Doom games, this’ll be a pass for me. Smooth combat doesn’t mean shit when it stutters every 2 seconds.

    • Dagnet@lemmy.world · ↑7 ↓1 · 1 day ago

      Tbf, Elden Ring is still poorly optimized. It’s been 3 years since it released and I still get cutscene stutters and no ultrawide support.

  • Jomn@jlai.lu · ↑4 ↓1 · edited · 1 day ago

    It won’t be over this game, which I don’t care about, but given the recent trend of requiring ray tracing, I’ll probably soon have a good reason to build a new computer (you served me well, 1070 Ti)!

    Or maybe I’ll stick to only playing games that work on a Steam Deck. I might not game enough anymore to justify the cost of new hardware. Especially when I look at the current AAA games (which would probably be the ones requiring RT), none of them really make me want to buy them anymore.

  • TastyWheat@lemmy.world · ↑21 ↓12 · 1 day ago

    id software moving PC specs forward? You’re KIDDING. Outrage, I tell you!

    Wait, didn’t Quake require a CPU with a math coprocessor?

    Hang on, Quake 3 required a 3D accelerator… Outrageous!

    Raytracing hardware has been around for the better part of a decade now. It’s time.

    • FooBarrington@lemmy.world · ↑8 ↓1 · 1 day ago

      The same id managed to keep the hardware requirements for Eternal amazingly low. I have a pretty beefy PC, yet I’d prefer to play without Ray Tracing, as that usually keeps the framerate from reaching a stable 120. It sucks they are taking away that option.

  • droporain@lemmynsfw.com · ↑13 ↓3 · 2 days ago

    Doom eternal was fun up until you had to jump around like a drunken spiderman. Turned that shit off so fast, never buying a doom game again.

    • deranger · ↑12 ↓1 · 2 days ago

      The constant required acrobatics turned me off as well. I loved Doom 2016, but Eternal wasn’t nearly as good IMO. I also didn’t care for the low max ammo / punching dudes for ammo or weak spots. Eternal is the first Doom I haven’t finished.

      • neon_nova@lemmy.dbzer0.com · ↑3 ↓1 · 2 days ago

        Exactly the same for me. The jumping-around part was annoying, but the low ammo was the worst. I tried to play that game a few times but couldn’t do it.

        But the real question I have for you is did you play the doom mobile phone rpg?

        If not, give it a go. Someone reverse engineered it and it’s fun and short.

        https://arstechnica.com/gaming/2022/06/dumbphone-exclusive-doom-rpg-has-been-reverse-engineered-by-fans/

        • FooBarrington@lemmy.world · ↑2 · 24 hours ago

          I didn’t like the low ammo at first either, but I’ve come to really enjoy it. Since your chainsaw regenerates fuel you can easily get more ammo ~once per minute, and between that you’ll have to decide how to allocate your ammo based on enemy danger & weak spots.

          It forces you into a more aggressive and fast playstyle, but that’s the goal of the whole game IMO :)

          • jacksilver@lemmy.world · ↑2 · 14 hours ago

            Yeah, I hated it at first, but after playing a level or two it clicked. I’m not sure it’s better than 2016, but it did find its own rhythm.

            The only thing I hated was the one miniboss-like enemy that basically required you to use a specific gun to beat.

        • deranger · ↑2 · 1 day ago

          I think I remember seeing that as a top selling game in the app store when I had my Motorola SLVR. I’ll check that out, thanks for the link.

    • LuxSpark@lemmy.cafe · ↑6 ↓1 · edited · 2 days ago

      Yep, the platformy stuff was a big turn-off. I got a refund. The Wolfenstein games are better.

  • Kraiden@kbin.earth · ↑25 ↓18 · 2 days ago

    So, a couple of things:

    The first RTX cards were the 20 series, which came out in 2018

    There was a time when volumetric lighting was also optional

    There was a time when GRAPHICS CARDS were optional.

    Games requiring RT hardware isn’t even new: the Avatar game did it, and so did the Indiana Jones game.

    Shit moves on. Did you expect your 1060 card from 2016 to last indefinitely? How long did you expect developers to support 2 different lighting systems?

    There is so much to be angry about these days, but not this. This was inevitable. If you MUST be angry about it, at least be angry at the right devs

    • woelkchen@lemmy.world · ↑14 ↓1 · 2 days ago

      Shit moves on. Did you expect your 1060 card from 2016 to last indefinitely? How long did you expect developers to support 2 different lighting systems?

      This as well as Indiana Jones are both Microsoft first-party games, which are also being released on the Xbox Series S. They are already supporting two different lighting systems because of that.

      It’s not unreasonable for paying customers to expect Microsoft to just ship the same performance profiles on their PC games.

      What Microsoft is doing is not a technological move. It’s a desperate move to sell more Xboxes.

      • Kraiden@kbin.earth · ↑6 ↓9 · 2 days ago

        I can’t comment too deeply on consoles. I have no real experience with them, but at a very shallow level, the Series S was released in 2020, and Google suggests that it supports ray tracing.

        So my point stands. Stop expecting your 10 year old hardware to run new games indefinitely

        • woelkchen@lemmy.world · ↑4 ↓4 · 1 day ago

          I can’t comment too deeply on consoles.

          Then don’t.

          Google suggests that it supports ray tracing.

          The Series S has raytracing support on paper (just as the Steam Deck does), but the GPU is way too underpowered to actually use raytracing for anything.

          So my point stands.

          No, it doesn’t.

          Stop expecting your 10 year old hardware to run new games indefinitely

          Nobody said that. What can be expected is for Microsoft to ship the same performance profiles they enable for one of their platforms (Series S) on their other 1st party platform (Windows).

          • Kraiden@kbin.earth · ↑2 · 15 hours ago

            I apologize for my last comment, I was drunk when I wrote it. I’d rather not put that kind of negativity into the world.

            I do still disagree with you though.

            On paper or not, the system supports it, which means they are very likely NOT supporting two lighting systems, which means that, yes, my point still stands. The Series S is only 5 years old. The minimum system requirements are for 7-year-old hardware.

            EVERYTHING else is a matter of optimization, which no one here can comment on until the game is out. You just cannot know it will perform badly before release.

            As evidence of this, I will again point to the Indiana Jones game, which (a) is ray traced, (b) runs on the Series S, and (c) runs at 60fps (although admittedly it’s apparently blurry).

            • woelkchen@lemmy.world · ↑3 ↓2 · 22 hours ago

              check the community you’re posting in… aaaand then fuck right off…

              Check the rulebook of the community you’re posting in.

    • inclementimmigrant@lemmy.world (OP) · ↑9 ↓6 · edited · 2 days ago

      I grew up gaming on a 386, back when the great divide was CGA, EGA, VGA, and SVGA. That was the first big graphics card war for me, and the beginning of the whole “Oceania has always been at war with Eastasia” cycle; then came the gulf between floppy disks and hard drives, then CD-ROMs, and then the 3D-accelerated games. PC gaming has always had points of contention between the whales and the have-nots, but game studios had a good track record of providing plenty of support to the gamers who weren’t whales.

      When you have to have a $1,000+ video card to run ray tracing without tanking your frame rate into completely unplayable territory, then ray tracing is not ready for prime time, and it’s a complete shit move on the developers’ part to mandate it while it’s still untenable for most gamers.

      Nope, this is just complete bullshit to me: forcing gamers into a sub-par experience and into buying more expensive hardware for no other reason than to satisfy a publisher’s ego by marketing their game as “next gen”.

      Edit: How sad is it that ray tracing hardware first shipped in 2018 and we still can’t run it well on a rig with moderate hardware.

      • Zanz@lemmy.ml · ↑2 · edited · 7 hours ago

        Ray tracing can be cheaper than rasterization for equivalent lighting quality. Depending on how they use it, a ray-tracing-only renderer could be great, a bit like how megatextures worked: people got upset about the GPU memory requirements for megatextures, but they were a huge performance gain if you met the minimum. Ray tracing as an effect on top of rasterized lighting is a big hit to performance, and that’s the only thing we have now.

        Native real-time ray tracing shipped something like 20 years ago with the ATI X1000 series, but no one wanted to risk making a ray-tracing-only game, so it never took off. RTX is based on using ray tracing as an effect on top of rasterized graphics.

        No current game uses ray tracing only; Indiana Jones uses it for mandatory effects, not as the primary render method.

      • Kraiden@kbin.earth · ↑11 ↓7 · 2 days ago

        forcing gamers to have a sub par experience

        The game isn’t even out yet and you’re already commenting on performance! As someone else pointed out, the modern Doom games have a reputation for being extremely well optimised, so let’s wait and see how it actually performs on a 20-series card.

        As for needing a card over $1,000, that’s just ridiculous. You can get a 4060 NEW for under $500, and again, the minimum here is a 2060.

        Re: supporting old hardware: again, the minimum is 7-year-old hardware. I was also around in the 386 era, and to say that devs of that time supported hardware for longer is, at best, wildly exaggerated.