Title almost says it all. OLED monitors are getting more and more affordable, but they're almost out of the picture when I'm buying a monitor because of static toolbars and HUD elements. I don't understand why monitors "burn in": when I shine my LED flashlight or some LED Christmas lights, they don't just keep emitting the same light after I turn them off. I know it's a dumb comparison, but still, what actually happens?

The other thing I don't understand is that I've never seen any signs of burn-in on anyone's phone. Alright, technically that's a lie: I did see some on a work phone (or two) that had only a chat app open, seemingly for ages, and the name bar was slightly burned in, or something like that; as you'd guess, I also didn't interact with that phone much. But as I said above, "but still": I've had my phone for a while now, and so have my family and friends, some of us even doomscroll, and I've never seen any signs of burn-in on any (actually used) phone.

So, I can watch my wallpaper all day, but as long as I open my browser every three hours or so and press F11 twice, I'm safe? And if I'm away, I just let the screensaver save my screen? In that case, why would anyone ever worry about burn-in; you'd almost have to cause it intentionally. But if it's really that dangerous, say I immerse myself in a YouTube video that has the YouTuber's profile picture in the bottom right corner (does YouTube still do that?), and it was hbomberguy's, am I just done, toasted, burnt in?

  • UnRelatedBurner (OP) · 6 months ago

    Thanks, that makes sense. But why don't monitors have an "emergency" protocol to let the LEDs rest for a while, if we know the maximum stress they can handle?

    So instead of the screen burning in, I'd get a pop-up telling me to do something, or the monitor would lower the brightness in that area, or something like that.
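    The rough sketch below is what I mean: track how much each pixel has been lit and dim any over-stressed area before it's too late. The wear budget and dim factor are numbers I made up, and as far as I know real panels don't expose per-pixel wear like this; it's just the idea.

    ```python
    # Hypothetical sketch of the "emergency protocol" idea: accumulate per-pixel
    # wear while an image is shown, then dim pixels that have exceeded a wear
    # budget. All thresholds are made up; real panel firmware doesn't work this way.
    import numpy as np

    WEAR_BUDGET_HOURS = 500.0  # assumed max cumulative full-brightness time per pixel
    DIM_FACTOR = 0.7           # how much to dim an over-stressed pixel

    def update_wear(wear_hours: np.ndarray, frame: np.ndarray, dt_hours: float) -> np.ndarray:
        """Accumulate wear in proportion to each pixel's brightness (0.0-1.0)."""
        return wear_hours + frame * dt_hours

    def apply_protection(frame: np.ndarray, wear_hours: np.ndarray) -> np.ndarray:
        """Dim pixels whose cumulative wear exceeds the budget."""
        protected = frame.copy()
        protected[wear_hours > WEAR_BUDGET_HOURS] *= DIM_FACTOR
        return protected

    # Toy usage: a 4x4 "screen" showing one static bright pixel (like a HUD element).
    wear = np.zeros((4, 4))
    frame = np.zeros((4, 4))
    frame[0, 0] = 1.0  # static white pixel, e.g. a toolbar icon

    for _ in range(600):  # simulate 600 hours of the same image on screen
        wear = update_wear(wear, frame, dt_hours=1.0)

    print(apply_protection(frame, wear))  # the static pixel gets dimmed to 0.7
    ```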

    • ilinamorato@lemmy.world · 5 months ago

      I have five guesses:

      1. That would require more diagnostics than an LED on a monitor is able to provide at a reasonable cost.
      2. If you're leaving the monitor on in a situation where burn-in is likely, you're probably not at the monitor when it matters.
      3. Monitors are a mission-critical piece of hardware, meaning that them turning themselves off (or even just turning off certain pixels) randomly is not a great idea.
      4. It's probably the OS's job to decide when to turn off the monitor, since the OS has the context to know what's important and what isn't, and how long it's been since you interacted with the device.
      5. It's in the monitor manufacturer's best interest for your monitor to get burn-in, so that you have to replace it more often.

      The actual answer is probably a combination of several of these, but that's my best guess.

      Honestly, setting a screen timeout (or even a screensaver!) is the solution here, so this problem was more or less solved back in the early '80s.
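      In sketch form, all a timeout really does is the check below. The five-minute threshold is arbitrary, and a real OS watches input events rather than being handed an idle count, so treat it purely as an illustration.

      ```python
      # Toy illustration of a screen timeout: once the user has been idle long
      # enough, stop lighting up static pixels. The threshold is arbitrary.
      IDLE_TIMEOUT_SECONDS = 5 * 60

      def display_state(idle_seconds: float) -> str:
          """Decide what the panel should do based on time since the last input event."""
          return "blank the panel" if idle_seconds >= IDLE_TIMEOUT_SECONDS else "show the desktop"

      for idle in (30, 299, 300, 3600):
          print(f"{idle:>4}s idle -> {display_state(idle)}")
      ```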