Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display’s refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.
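As a rough conceptual sketch of that idea, rather than any vendor's actual implementation, the difference between a fixed-refresh display and a VRR display can be modeled by when a finished frame actually reaches the screen. The frame times, refresh rates, and function names below are illustrative assumptions, not real driver code:

```python
import math

# Conceptual sketch only: models when a finished frame actually appears on
# screen. Not Nvidia's implementation; all numbers here are illustrative.

def fixed_refresh_display(frame_ready, refresh_hz=60.0):
    """Fixed refresh with vsync: each frame waits for the next tick of a rigid
    60 Hz clock, so late frames add stutter and uneven latency."""
    period = 1.0 / refresh_hz
    return [math.ceil(t / period) * period for t in frame_ready]

def vrr_display(frame_ready, max_hz=144.0):
    """VRR: the display refreshes as soon as each frame is ready, limited only
    by the panel's fastest supported rate (assumed 144 Hz here)."""
    min_period, shown, last = 1.0 / max_hz, [], 0.0
    for t in frame_ready:
        last = max(t, last + min_period)
        shown.append(last)
    return shown

frames = [0.010, 0.022, 0.041, 0.049, 0.080]  # uneven frame completion times (s)
print([round(x, 4) for x in fixed_refresh_display(frames)])  # snapped to 60 Hz ticks
print([round(x, 4) for x in vrr_display(frames)])            # tracks the frames themselves
```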

The issue for Nvidia is that G-Sync isn’t what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside displays, increasing costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync’s most important features without requiring extra hardware. Nvidia more or less acknowledged in 2019 that the free-to-use, cheap-to-implement VRR technologies had won when it announced its “G-Sync Compatible” certification tier for FreeSync monitors. G-Sync Compatible screens now vastly outnumber G-Sync and G-Sync Ultimate screens.

  • conciselyverbose · 4 months ago (edited)
    There absolutely was a legitimate reason. The hardware available at the time was not capable of processing the signals. They didn’t use FPGAs on a whim; they used them because FPGAs were necessary to handle the signals properly.

    And you just haven’t followed the tech if you think they were indistinguishable. G-Sync has supported a much wider range of frame times over its entire lifespan.
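One concrete consequence of a narrower supported refresh range is low-frame-rate handling: when a game's frame rate falls below the panel's minimum refresh rate, the display or driver has to repeat each frame so the refresh interval stays in range. The sketch below is a hedged illustration of that frame-multiplication idea, not Nvidia's or AMD's actual logic; the 30 Hz/144 Hz limits and the refresh_interval helper are made-up examples.

```python
# Illustrative sketch of low-frame-rate compensation (frame repetition), not
# any vendor's actual algorithm. Panel limits below are made-up example values.

def refresh_interval(frame_interval_s, panel_min_hz=30.0, panel_max_hz=144.0):
    """Pick a refresh interval for one game frame.

    If the frame arrives slower than the panel's minimum refresh rate allows,
    repeat (multiply) it so each individual refresh stays inside the panel's
    supported range. Returns (interval_per_refresh, repeats)."""
    min_interval = 1.0 / panel_max_hz
    max_interval = 1.0 / panel_min_hz
    interval = max(frame_interval_s, min_interval)
    repeats = 1
    while interval > max_interval:
        repeats += 1
        interval = frame_interval_s / repeats  # show the same frame several times
    return interval, repeats

# A 20 fps frame (50 ms) is below the 30 Hz floor, so it is shown twice at 40 Hz.
print(refresh_interval(0.050))  # -> (0.025, 2)
```

In practice, this kind of multiplication only works cleanly when the panel's maximum refresh rate is at least roughly double its minimum, which is part of why the width of a display's supported range matters.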