• mindbleach · 6 months ago

    Please detach resolution from bitrate. Please. For the love of god. If I can stream 4K that’s been squeezed to hell, I should be able to get 720p that is fucking flawless.
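    To put numbers on it - a rough sketch, where the 8 Mbps figure is a made-up round number for illustration, not any service's actual ladder:

    ```python
    # Rough bits-per-pixel comparison: the same bitrate spent on 4K60 vs 720p60.
    # The 8 Mbps bitrate is an assumption for illustration only.

    def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
        """Average compressed bits available per pixel per frame."""
        return bitrate_bps / (width * height * fps)

    bitrate = 8_000_000  # 8 Mbps, assumed

    for name, (w, h) in {"4K": (3840, 2160), "720p": (1280, 720)}.items():
        bpp = bits_per_pixel(bitrate, w, h, 60)
        print(f"{name:>5}: {bpp:.3f} bits per pixel per frame at 60 fps")

    # 4K gets ~0.016 bits per pixel (squeezed to hell); 720p gets ~0.145
    # from the exact same bandwidth - nine times the budget per pixel.
    ```

    Same pipe, nine times the budget per pixel at 720p. That is the detachment I'm asking for.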

    Twitch makes this problem painfully evident during retro-game marathons. There are times I’ve had to watch Game Boy games at 1080p just to get 60 Hz. The machine’s resolution is 160x144 in four shades of green.
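    Back-of-the-envelope on how silly that is - assuming a roughly 6 Mbps 1080p60 stream, which is my ballpark guess rather than anything official:

    ```python
    # How much of a 1080p60 stream is actually carrying Game Boy pixels.
    # The 6 Mbps bitrate is an assumed figure for illustration.

    gb_w, gb_h = 160, 144            # Game Boy LCD resolution
    stream_w, stream_h = 1920, 1080
    fps = 60
    bitrate = 6_000_000              # assumed

    source_pixels = gb_w * gb_h                  # 23,040
    stream_pixels = stream_w * stream_h          # 2,073,600
    print(f"Upscale factor: {stream_pixels / source_pixels:.0f}x")          # ~90x

    bits_per_source_pixel = bitrate / (source_pixels * fps)
    print(f"Bits per source pixel per frame: {bits_per_source_pixel:.1f}")  # ~4.3

    # Each 2-bit Game Boy pixel gets a few bits of compressed budget per frame,
    # but only after being smeared across roughly 90 stream pixels.
    ```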

    Frankly I’m hoping for some Quite Okay Imaging-style do-over. DCT-based codecs are obscenely efficient - but like rasterized 3D games, they are a deep pile of hacks to cover up blatant shortcomings. Competing concepts with higher potential don’t have thirty-plus years of gradual optimization. Yet I don’t think they’d need it, if all they had to deliver were merely noisy images at these modern resolutions.
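    If the DCT part sounds abstract, here's the core trick in toy form - plain numpy, a synthetic 8x8 block, and an arbitrary quantizer step, nothing resembling a real encoder:

    ```python
    import numpy as np

    # Toy 8x8 DCT block transform: the core of JPEG/MPEG-style coding, minus
    # every hack layered on top. The block contents and quantizer step are
    # arbitrary, chosen only to show energy compaction.

    N = 8
    k = np.arange(N)
    # Orthonormal DCT-II basis matrix.
    basis = np.sqrt(2 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
    basis[0, :] = np.sqrt(1 / N)

    def dct2(block):    # forward 2-D DCT
        return basis @ block @ basis.T

    def idct2(coeffs):  # inverse 2-D DCT
        return basis.T @ coeffs @ basis

    rng = np.random.default_rng(0)
    # A smooth gradient plus mild noise, standing in for real image content.
    x, y = np.meshgrid(np.arange(N), np.arange(N))
    block = 16 * x + 8 * y + rng.normal(0, 2, (N, N))

    coeffs = dct2(block)
    step = 24                          # arbitrary quantization step
    quantized = np.round(coeffs / step)

    print("nonzero coefficients:", int(np.count_nonzero(quantized)), "of", N * N)
    recon = idct2(quantized * step)
    print("max reconstruction error:", float(np.abs(recon - block).max()))
    ```

    Almost everything lands in a handful of coefficients; nearly all of the rest of a real codec is the pile of hacks on top of that.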

    Artifacts that look like film grain or analog snow are what your brain deals with all the time as a consequence of how your eyeballs work - especially if accepting that noise is what lets every frame be unique and independent. That’s obviously desirable for low-latency nonsense like livestreaming, or the attractive nuisance money-pit that is game streaming. But it’s also a great step toward avoiding temporal artifacts. Some of the hackiest parts of DCT codecs are in motion compensation - detecting movement between frames, and then not letting someone drag the hallway with them, and not letting confetti turn the whole frame into mush.
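    For reference, the un-hacky core of motion compensation is just a block search - here as a minimal sketch with exhaustive SAD matching and made-up frame data, which real encoders then bury under heuristics:

    ```python
    import numpy as np

    # Minimal block-matching motion estimation: for one block in the current
    # frame, find the best-matching block in the previous frame by sum of
    # absolute differences (SAD) over a small search window.

    def best_motion_vector(prev, cur, bx, by, block=16, search=8):
        target = cur[by:by + block, bx:bx + block].astype(np.int32)
        best, best_sad = (0, 0), np.inf
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                y, x = by + dy, bx + dx
                if y < 0 or x < 0 or y + block > prev.shape[0] or x + block > prev.shape[1]:
                    continue
                cand = prev[y:y + block, x:x + block].astype(np.int32)
                sad = int(np.abs(cand - target).sum())
                if sad < best_sad:
                    best_sad, best = sad, (dx, dy)
        return best, best_sad

    # Synthetic frames: a bright square that moves 3 pixels right, 1 down.
    prev = np.zeros((64, 64), dtype=np.uint8)
    prev[20:36, 20:36] = 200
    cur = np.zeros_like(prev)
    cur[21:37, 23:39] = 200

    mv, sad = best_motion_vector(prev, cur, bx=23, by=21)
    print("motion vector:", mv, "SAD:", sad)   # expect (-3, -1): where the block came from
    ```

    The hallway-dragging and confetti-mush are what happens when that search guesses wrong and the codec has to paper over it.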