I’ve been on Wayland for the past two years exclusively (Nvidia).

I thought it was okay for the most part, but then I had to switch to an X session recently. The general experience felt about the same, but out of curiosity I played a couple of games and they ran much better. Steam doesn’t go nuts either.

It made me think that maybe people aren’t actually adopting it as aggressively as the constant coverage in the community suggests, and that maybe I should just go back.

  • azvasKvklenko · 21 points · 8 months ago

    That’s because their mid-2000s setup with a single 1024×768 screen works just fine with compositing disabled and 24-bit color over a VGA connector.

    I had to switch to Wayland the moment I tried to run simple 4K@60 on my old RX570: Xorg just refused to set the mode, or produced colorful vomit when forced to, no matter what. Wayland (just like Windows) simply worked.

    Was it perfectly ready back then? Heck no. Is it ready now? Maybe not for everyone, but it’s getting there, and time is showing that the missing parts on the Wayland side are fixable.

    Criticism is valid to some degree, though. Certain assumptions were made from the very beginning, and the creators of the base protocol cared less about real-world desktop use than about the security model, so some of those issues take a long time to solve. Development is slow and there are always gaps here and there, but I’ve watched this long enough (17 years) to know that, to some degree, it’s like that with the entire ecosystem, let alone Xorg, which no programmer wants to touch anymore for anything but a simple bugfix or security patch.