• @[email protected]
    8 days ago

    While giving 4k paid customer with 720p resolution because we don’t have “your” recommended devices. Good strategy.

    • geekwithsoul
      8 days ago

      Wouldn’t be surprised if that’s the studios’ requirement - every big streamer I know of requires certain platforms for HD and higher streams because of the copy protection required.

  • @[email protected]
    8 days ago

    Reducing the amount of data you need to send is an obvious factor for a service that sends a lot of data. Not much of a bet at all.

    • @[email protected]
      8 days ago

      That’s the obvious part. At this point Netflix is looking at drastic transmission costs in the coming decade. Video is obviously taxing and requires huge amounts of data, but Atmos is no slouch either. The gamble is in how customers receive the news and how it impacts playback.

      Audio sync issues, subtitle playback, and artifacting on anything over 1080p will all cause customer dissatisfaction. Using a new way to save data is a great idea, almost literally a no-brainer, but does a technical solution always work out of the gate?

    • HubertManne
      8 days ago

      My first thought was that they should always be looking to improve this.

  • @[email protected]
    8 days ago

    “without compromising on visual fidelity”

    But it does compromise. Netflix has the worst banding issues in low-light scenes of any of the streaming services I’ve tried. It’s hard not to notice and it’s very annoying.
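That staircase look in dark scenes comes from coarse quantization of smooth gradients. A toy illustration (not Netflix’s actual pipeline — the quantization step here is invented): quantize a smooth dark-luma ramp aggressively and count how few flat bands survive.

```python
# Toy illustration of banding: a smooth dark gradient, quantized
# coarsely, collapses into a handful of flat bands - the "staircase"
# visible in low-light scenes that are starved of bits.

def quantize(levels, step):
    """Snap each luma value to the nearest multiple of `step`."""
    return [round(v / step) * step for v in levels]

# Dark gradient: luma values 16..48, the range where banding is worst
gradient = list(range(16, 49))
banded = quantize(gradient, 8)   # aggressive quantization step
print(sorted(set(banded)))       # only a few distinct bands survive
```

Real codecs band for a subtler reason (the encoder spends too few bits on near-flat regions), but the visual effect is the same: many input shades mapping onto a few output levels.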

    • @[email protected]
      edited · 8 days ago

      With any tech that allows the same quality with less data, there will always be someone pushing to cut quality to save even more data.

  • just another dev
    8 days ago

    As someone who worked on a couple of video encoding / streaming services, this was an amazingly interesting read. Some personal highlights:

    • Custom encoding settings per show, per episode, and even per scene.
    • They created a short film specifically to cater to hard-to-encode scenes.
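The per-scene idea can be sketched in a few lines. This is a hypothetical mock-up, not Netflix’s implementation: the shot boundaries and complexity scores are made up, and would really come from a scene-detection and analysis pass.

```python
# Hypothetical sketch of per-shot encoding settings. Complexity scores
# (0..1) are assumed to come from an earlier analysis pass; here they
# are invented for illustration.

def crf_for_shot(complexity: float, base_crf: int = 23) -> int:
    """Pick a constant rate factor per shot: complex shots (grain,
    confetti, water) get more bits (lower CRF); flat shots get fewer."""
    complexity = max(0.0, min(1.0, complexity))  # clamp defensively
    # Map complexity 0..1 onto CRF base+4 .. base-4
    return base_crf + 4 - round(8 * complexity)

# Made-up shots: (start_seconds, end_seconds, complexity)
shots = [(0, 12, 0.15), (12, 20, 0.9), (20, 45, 0.5)]
settings = [(s, e, crf_for_shot(c)) for s, e, c in shots]
print(settings)  # each shot gets its own quality target
```

The payoff is that a flat dialogue scene no longer has to be encoded with the bit budget the confetti scene needs.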
  • @[email protected]
    8 days ago

    Even if they do get the VBR encoding perfect, you’ll still get people on bad connections who only hit a buffer underrun when a dude shows up in a sparkly suit.
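That failure mode is easy to simulate. A toy model (all numbers invented): a client downloads at a fixed rate while VBR segment sizes vary, and a single high-bitrate segment — the sparkly-suit scene — drains the playout buffer past zero.

```python
# Toy VBR buffer simulation (numbers invented). Each segment holds
# `seg_seconds` of video; downloading it costs bits / download_bps
# seconds. If the buffer goes negative, playback stalls: underrun.

def buffer_levels(segment_bits, download_bps, seg_seconds=4, start_buffer=2.0):
    """Return buffered seconds after each segment is consumed."""
    buf = start_buffer
    levels = []
    for bits in segment_bits:
        download_time = bits / download_bps   # time spent fetching
        buf += seg_seconds - download_time    # playtime gained minus fetch cost
        levels.append(round(buf, 2))
    return levels

# 5 Mbps link; mostly 4 Mbps segments, one 12 Mbps spike mid-stream
segs = [4e6 * 4, 4e6 * 4, 12e6 * 4, 4e6 * 4]
print(buffer_levels(segs, 5e6))  # negative value = stall
```

On average the stream fits the link fine; it’s the one spike that causes the stall, which is exactly why encoders cap peak bitrate (VBV constraints) rather than trusting the average.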

  • AutoTL;DR (bot)
    8 days ago

    This is the best summary I could come up with:


    And while the rest of the world marveled at all those celebrities and their glitzy outfits sparkling in a sea of flashing cameras, Aaron’s mind immediately started to analyze all the associated visual challenges Netflix’s encoding tech would have to tackle.

    The company’s content delivery servers would automatically choose the best version for each viewer based on their device and broadband speeds and adjust the streaming quality on the fly to account for network slow-downs.
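That selection step amounts to walking a bitrate ladder. A hedged sketch — the ladder values and margin below are invented, not Netflix’s actual renditions:

```python
# Hypothetical ABR ladder (bitrates invented for illustration). The
# player picks the highest-bitrate rendition that fits under measured
# throughput, with a safety margin, and re-evaluates as the network
# speeds up or slows down.

LADDER = [  # (bitrate_bps, label), lowest first
    (235_000, "320p"),
    (1_050_000, "480p"),
    (3_000_000, "720p"),
    (5_800_000, "1080p"),
    (16_000_000, "4K"),
]

def pick_rendition(throughput_bps: float, margin: float = 0.8) -> str:
    """Highest rendition whose bitrate fits within margin * throughput."""
    budget = throughput_bps * margin
    best = LADDER[0][1]  # always fall back to the lowest rung
    for bitrate, label in LADDER:
        if bitrate <= budget:
            best = label
    return best

print(pick_rendition(25e6))  # fast link
print(pick_rendition(4e6))   # mid link drops down the ladder
```

Re-running this per segment is what makes the quality "adjust on the fly" during network slow-downs.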

    “We had to run subjective tests and redo that work specifically for HDR.” This eventually allowed Netflix to encode HDR titles with per-shot-specific settings as well, which the company finally did last year.

    Meridian looks like a film noir crime story, complete with shots in a dusty office with a fan in the background, a cloudy beach scene with glistening water, and a dark dream sequence that’s full of contrasts.

    The film has since been used by the Fraunhofer Institute and others to evaluate codecs, and its release has been hailed by the Creative Commons foundation as a prime example of “a spirit of cooperation that creates better technical standards.”

    In other words: how many times can Netflix re-encode its entire catalog with yet another novel encoding strategy, or new codec, before those efforts are poised to hit a wall and won’t make much of a difference anymore?


    The original article contains 2,686 words, the summary contains 223 words. Saved 92%. I’m a bot and I’m open source!

  • @mindbleach
    8 days ago

    Please detach resolution from bitrate. Please. For the love of god. If I can stream 4K that’s been squeezed to hell, I should be able to get 720p that is fucking flawless.

    Twitch makes this problem painfully evident during retro-game marathons. There are times I’ve had to watch Game Boy games at 1080p just to get 60 Hz. The machine’s resolution is 160x144 in four shades of green.

    Frankly I’m hoping for some Quite Okay Imaging style do-over. DCT-based codecs are obscenely efficient - but like rasterized 3D games, they are a deep pile of hacks to cover up blatant shortcomings. Competing concepts with higher potential don’t have thirty-plus years of gradual optimization. Yet I don’t think they’d need it, if we could get merely noisy images, at these modern resolutions.

    Artifacts that look like film grain or analog snow are what your brain deals with all the time as a consequence of how your eyeballs work. Especially if that lets every frame be unique and independent. That’s obviously desirable for low-latency nonsense like livestreaming, or the attractive nuisance money-pit that is game streaming. But it’s also a great step toward avoiding temporal artifacts. Some of the hackiest parts of DCT codecs are in motion compensation - detecting movement between frames, and then not letting someone drag the hallway with them, and not letting confetti turn the whole frame into mush.