• RightHandOfIkaros@lemmy.world

    People did care, which is why people who played games competitively continued to use CRT monitors well into the crappy LCD days.

    Heck, some people still use CRTs. There’s not much wrong with them other than being big and heavy, not being able to display 4K, and typically being only 4:3.

    • Julian@lemm.ee

      Idk if it’s just me but I have pretty good hearing, so I can hear the high pitch tone CRTs make and it drives me crazy.

      • RightHandOfIkaros@lemmy.world

        This only happens with TVs or very low-quality monitors. The flyback transformer vibrates at a frequency of ~15.7 kHz, which is audible to the human ear. However, most PC CRT monitors have a flyback transformer that vibrates at ~32 kHz, which is beyond the human hearing range. So if you are hearing the high-frequency noise some CRTs make, it is most likely not coming from a PC monitor.

        It’s a sound that’s part of the experience, and your brain tunes it out pretty quickly after repeated exposure. If the TV is playing sound such as game audio or music, it becomes almost undetectable, unless there is a problem with the flyback transformer circuit that causes the volume to be higher than it’s supposed to be.

        • systemglitch@lemmy.world

          There is not one CRT I ever encountered that I couldn’t hear. So I’m having trouble believing your information.

          I could tune it out most of the time, but it was always there.

          • RightHandOfIkaros@lemmy.world

            https://en.m.wikipedia.org/wiki/Flyback_transformer

            Under “Operation and Usage”:

            In television sets, this high frequency is about 15 kilohertz (15.625 kHz for PAL, 15.734 kHz for NTSC), and vibrations from the transformer core caused by magnetostriction can often be heard as a high-pitched whine. In CRT-based computer displays, the frequency can vary over a wide range, from about 30 kHz to 150 kHz.

            If you are hearing the sound, it’s either a TV or a very low-quality monitor. Human hearing in perfect lab conditions only goes up to about 28 kHz, and anything higher can’t be heard by the human ear.

            Either that or you’re a mutant with super ears and the US military will definitely be looking for you to experiment on.
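
            A rough way to sanity-check those numbers: the whine frequency is just the horizontal line rate, i.e. total scan lines per frame times the refresh rate. A quick sketch (the line counts are standard-ish figures, the blanking handling is approximate, and the 20 kHz ceiling is the commonly cited adult limit, which varies per person):

            ```python
            # Rough sketch: flyback/line-rate whine frequency = total scan lines per frame
            # (including blanking) times the vertical refresh rate. Timings are approximate.

            HEARING_LIMIT_HZ = 20_000  # commonly cited adult upper limit; varies per person

            modes = [
                # (label, total lines per frame incl. blanking, full-frame refresh in Hz)
                ("NTSC TV (interlaced)", 525, 59.94 / 2),
                ("PAL TV (interlaced)", 625, 50 / 2),
                ("VGA 640x480 @ 60 Hz", 525, 60),
                ("PC 1024x768 @ 85 Hz", 808, 85),
            ]

            for label, total_lines, refresh_hz in modes:
                line_rate_hz = total_lines * refresh_hz
                verdict = "audible" if line_rate_hz <= HEARING_LIMIT_HZ else "above typical hearing range"
                print(f"{label:<22} ~{line_rate_hz / 1000:6.2f} kHz ({verdict})")
            ```

            The TV modes land around 15.6-15.7 kHz, while even plain VGA is already up around 31.5 kHz, which matches the TV-vs-PC-monitor split described above.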

            • errer@lemmy.world

              I’ll defend this guy: there can easily be a subharmonic at half the flyback frequency that is audible. It’s lower amplitude, so less loud, but I could believe someone being able to hear that.

              • RightHandOfIkaros@lemmy.world

                Yes, as I previously stated, if there is a problem with the flyback transformer circuit, the frequency or volume of the noise it generates can shift or increase.

                Though again, PC monitors never made an audible noise unless they were low quality and used the cheaper 15.7 kHz transformer in their construction.

                Other noises associated with CRTs include the degaussing noise, which usually happens once after turning on the CRT or after pressing the degauss button, and the sound of old IDE hard disks spinning nearby, which also make a constant high-frequency noise.

                • errer@lemmy.world

                  Not sure you follow: even if the primary frequency is out of range, a subharmonic (half the frequency, a quarter of the frequency, etc.) can exist simultaneously with the primary.

            • systemglitch@lemmy.world

              On a side note, I can also hear when a capacitor is going bad on an LCD when other people around me can’t hear it.

              It could be something else in the CRTs I’m hearing, but I can definitely tell one is on without seeing it. It’s been like this since the 70s for me.

              I can also smell and taste things other people can’t, so something is a little different in my brain somehow.

              My partner and daughter tell me I have super powers lol. Guess who gets to smell meat for rot? Not them! Bleh.

              • Cypher@lemmy.world

                You’re not alone, I can hear CRTs, bad capacitors and an array of other electrical appliances.

                I can still hear the high frequencies that only teenagers and young kids are meant to be able to hear.

          • deltapi@lemmy.world

            I could hear them too, when I was younger. I lost that frequency range of my hearing in my mid-to-late 20’s, which I’ve read is normal.

        • Julian@lemm.ee

          Oh neat, thanks for the explanation! That makes sense as most of my crt exposure for the past 10 years has been classroom TVs and museum exhibits.

      • nadiaraven@lemmy.world

        eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee

        (me too)

      • SpaceCowboy@lemmy.ca

        For me it was the refresh. If a CRT was at 60Hz, I could see it flashing when I wasn’t looking directly at it. I had to have it set to at least 75 Hz (>80 Hz preferably) or it would give me a headache.

    • SpikesOtherDog@ani.social

      You beat me to the punch.

      We were absolutely considering output delay and hoarding our CRT monitors.

      Some of us were also initially concerned about input delay from early USB, until we were shown that while it is slower, the difference was unnoticeable.

    • sanosuke001@lemmynsfw.com

      I bought a Sun Microsystems 24" widescreen CRT for $400 on eBay back in 2003-ish, IIRC. It was 100 lbs and delivered on a pallet, lol. There’s a reason why they didn’t get very big and were mostly 4:3. 1920x1200 and like 30" deep! But they did exist!

      • RightHandOfIkaros@lemmy.world

        I know it. I have two myself: a 4.5" 1981 JVC UHF/VHF radio/TV, and a 27" Sylvania with a DVD/VHS combo unit built into it.

        They even made a curved ultrawide CRT once, surprisingly. Though it cost a fortune when it came out.

    • SpaceCowboy@lemmy.ca

      Yeah, right?

      The fact that we know about this decades later is because people actually did care about it.

      When LCDs (then later LEDs) improved this concern kinda faded away. Which makes sense.

      • HackerJoe

        eBay? If you can get an IBM P77 or Sony G220 (they are the same) in good working condition, you should be golden. Those are awesome. They go up to 170 Hz, and 75 Hz at 1600x1200. They can even do 2048x1536, although that would be out of spec and only 60 Hz (barely usable but fucking impressive).
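
        Rough sketch of why those resolution/refresh pairs go together: a CRT’s real ceiling is its maximum horizontal scan rate, and each mode needs roughly visible lines × refresh rate. The 5% blanking overhead is a rough assumption, and the 640x480 line is an assumed low-res mode for the 170 Hz figure:

        ```python
        # Sketch: required horizontal scan rate ~= visible lines * (1 + blanking) * refresh.
        # The 5% vertical blanking overhead is a rough assumption; real timings vary.

        BLANKING_OVERHEAD = 0.05

        modes = [
            ("1600x1200 @ 75 Hz", 1200, 75),
            ("2048x1536 @ 60 Hz", 1536, 60),
            ("640x480 @ 170 Hz", 480, 170),
        ]

        for label, visible_lines, refresh_hz in modes:
            hsync_khz = visible_lines * (1 + BLANKING_OVERHEAD) * refresh_hz / 1000
            print(f"{label:<18} needs ~{hsync_khz:5.1f} kHz horizontal scan")
        ```

        All three land near the same horizontal-scan ceiling, which is why pushing the resolution up forces the refresh rate down.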

        • RightHandOfIkaros@lemmy.world

          You will 100% overpay if you get one on eBay. The best approach is to ask businesses, schools, or local news stations if they have old CRTs they’d be willing to sell to you. News stations preferably, since they usually had very high-quality BVMs.

    • InFerNo@lemmy.ml

      It even took on some weird proportions, where “pro” gamers set their game to display 4:3 on their widescreen LCDs.

      Habits die hard.

  • Bytemeister@lemmy.world

    I remember CRTs being washed out, heavy, power-hungry, loud, hot, susceptible to burn-in and magnetic fields… The screen has to have a curve, so above ~16" you get weird distortions. You needed a really heavy and sturdy desk to keep them from wobbling. Someone is romanticizing an era that no one liked. I remember the LCD adoption being very quick and near universal as far as tech advancements go.

    • Kit@lemmy.blahaj.zone

      As someone who still uses a CRT for specific purposes, I feel that you’re misremembering the switch from CRT to LCD. At the time, LCDs were blurry and less vibrant than CRTs. Technical advancements have solved this over time.

      Late-model CRTs were even flat, to eliminate the distortion you’re describing.

        • Soggytoast@lemm.ee

          They’re under a pretty high vacuum inside, so the flat glass has to be thicker to be strong enough

        • Hadriscus@lemm.ee

          Yeah, my parents had a Trinitron; that thing weighed a whole cattle herd. The magnetic field started failing in its later years, so one corner was forever distorted. It was an issue playing Halo because I couldn’t read the motion tracker (lower left).

      • rothaine@lemm.ee

        Resolution took a step back as well, IIRC. The last CRT I had could do 1200 vertical pixels, but it felt like years before we saw more than 768 or 1080 on flat-panel displays.

      • sugar_in_your_tea

        Sure, but they were thin, flat, and good enough. The desk space savings alone were worth it.

        I remember massive projection screens that took up half of a room. People flocked to wall-mounted screens even though the picture was worse.

    • ILikeBoobies@lemmy.ca

      There was always pushback in esports.

      The Smash community uses CRTs today because of how much pushback there was (and still is).

  • cordlesslamp@lemmy.today

    Can someone please explain why a CRT is 0 blur and 0 latency when it literally draws each pixel one by one, using an electron beam running across the screen line by line?

    • TexasDrunk@lemmy.world

      The guy inside it drawing them is insanely fast at his job. That’s also why they were so bulky, to fit the guy who does the drawing.

    • B0rax@feddit.de

      Because it is analog. There are no buffers or anything in between. Your PC sends the image data in analog through VGA, pixel by pixel. These pixels are projected instantly, in the requested color, on the screen.
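
      To put a number on “pixel by pixel”, here’s a rough sketch of how briefly each pixel’s analog voltage is on the wire. The 25% blanking allowance is an assumption for illustration; exact VESA timings differ per mode:

      ```python
      # Sketch: approximate time per pixel on an analog VGA link.
      # pixel clock ~= width * height * refresh * blanking factor (assumed ~1.25)

      BLANKING_FACTOR = 1.25  # rough allowance for horizontal + vertical blanking

      for width, height, refresh_hz in [(640, 480, 60), (1024, 768, 85), (1600, 1200, 75)]:
          pixel_clock_hz = width * height * refresh_hz * BLANKING_FACTOR
          ns_per_pixel = 1e9 / pixel_clock_hz
          print(f"{width}x{height} @ {refresh_hz} Hz: ~{pixel_clock_hz / 1e6:5.1f} MHz "
                f"pixel clock, ~{ns_per_pixel:4.1f} ns per pixel")
      ```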

      • accideath@lemmy.world

        And no motion blur, because the image is not persistent. LCDs have to change their current image to the new one; the old image stays until it’s replaced. CRTs draw their image line by line, and only the last few lines are actually on screen at any time. It just happens so fast that, to the human eye, the image looks complete. Although CRTs usually do have noticeable flicker, while LCDs usually do not.
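
        A rough sketch of that “only the last few lines are on screen” point. The ~1 ms figure for how long the glow lingers is purely an assumption for illustration; real phosphor decay varies a lot:

        ```python
        # Sketch: lines still glowing behind the beam ~= glow duration * line rate.
        # The glow duration is an assumed ballpark, not a measured value.

        GLOW_DURATION_S = 0.001  # assume the trace has largely faded after ~1 ms

        for label, line_rate_hz, total_lines in [
            ("NTSC TV (15.7 kHz line rate)", 15_734, 525),
            ("PC monitor (68.7 kHz line rate)", 68_700, 808),
        ]:
            lines_glowing = GLOW_DURATION_S * line_rate_hz
            print(f"{label}: ~{lines_glowing:.0f} of {total_lines} lines visibly lit at once "
                  f"({lines_glowing / total_lines:.0%} of the frame)")
        ```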

        • ByteJunk@lemmy.world

          Thanks for the explanation.

          OP’s point is a weird flex though, like pointing out that a bicycle never runs out of gas…

      • frezik@midwest.social

        Of course there are buffers. Once RAM got cheap enough to have a buffer representing the whole screen, everyone did that. That was in the late 80s/early 90s.

        There’s some really bad misconceptions about how latency works on screens.

        • HackerJoe

          Those are on the graphics adapter. Not in the CRT.
          You can update the framebuffer faster than the CRT can draw. That’s when you get tearing. Same VSync then as now.

        • __dev@lemmy.world

          CRTs (apart from some exceptions) did not have a display buffer. The analog display signal is used to directly control the output of each electron gun in the CRT, without any digital processing happening in-between. The computer on the other end however does have display buffers, just like they do now; however eliminating extra buffers (like those used by modern monitors) does reduce latency.

          • frezik@midwest.social

            Doesn’t matter. Having a buffer means either the buffer must be full before drawing, or you get screen tearing. It wasn’t like racing the beam.

      • Hagdos@lemmy.world

        That makes 0 latency in the monitor, but how much latency is there in the drivers that convert a digital image to analogue signals? Isn’t the latency just moved to the PC side?

        • fmstrat@lemmy.nowsci.com

          I warn you before you dive in, this is a rabbit hole. Some key points (not exact, but to keep things layman-friendly): you don’t see in digital, digital is “code”. You see in analog, even on an LCD (think of sound vs video, it’s the same thing). Digital-only signals lacked contrast, brightness, color, basically all adjustments. So the signal went back and forth, adding even more latency.

          Maybe think of it like a TV’s game mode, where all the adjustments are turned off to speed up the digital-to-analog conversion.

          Or like compressed video (digital) vs uncompressed video (analog), where the compression means you can send more data, but latency is added because it is compressed and decompressed at each end.

        • cynar@lemmy.world

          When one of your times is in milliseconds, while the other requires awareness of relativistic effects, you might as well call it instant.

          The propagation speed in copper is about 2/3 c. With analogue monitors, that signal was effectively amplified and thrown straight at the screen. The phosphor coating is the slowest part; it takes 0.25-0.5 ms to respond fully.

          By comparison, at the time, “gaming” LCD screens were advertising 23 ms response times.
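
          A quick sketch of those scales, using the figures above plus an assumed 2 m cable:

          ```python
          # Sketch: cable propagation vs. phosphor response vs. early "gaming" LCD response.
          # Cable length is an assumed 2 m; the other figures are the ones quoted above.

          C_M_PER_S = 299_792_458
          CABLE_M = 2.0                       # assumption
          propagation_s = CABLE_M / (2 / 3 * C_M_PER_S)
          phosphor_s = 0.5e-3                 # upper end of the 0.25-0.5 ms quoted above
          early_gaming_lcd_s = 23e-3          # advertised response time quoted above

          print(f"cable propagation : {propagation_s * 1e9:7.1f} ns")
          print(f"phosphor response : {phosphor_s * 1e6:7.0f} us")
          print(f"early gaming LCD  : {early_gaming_lcd_s * 1e3:7.0f} ms")
          ```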

    • frezik@midwest.social

      They don’t have zero latency. It’s a misconception.

      The industry-standard way to measure screen lag is from the middle of the screen. Let’s say you have a 60 Hz display and hit the mouse button to shoot the very moment it’s about to draw the next frame, and the game manages to process the data before the draw starts. The beam would start to draw, and when it gets to the middle of the screen, we take our measurement. That will take 1 / 60 / 2 = 8.3 ms.

      Some CRTs could do 90 Hz, or even higher, but those were really expensive (edit: while keeping a high resolution, anyway). Modern LCDs can do better than any of them, but it took a long time to get there.
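
      The same mid-screen arithmetic for a few refresh rates, as a quick sketch:

      ```python
      # Sketch: mid-screen lag for a top-to-bottom scanned display is half a refresh
      # interval: 1 / rate / 2.

      for rate_hz in (60, 75, 90, 120, 144, 240):
          print(f"{rate_hz:>3} Hz: {1 / rate_hz / 2 * 1000:5.2f} ms to mid-screen")
      ```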

      • Björn Tantau@swg-empire.de

        Actually, 60 Hz was too low to comfortably use a CRT. I think it started to work well at 75 Hz, better at 80 or 85. I don’t know if I ever had a 90 Hz one, especially at a resolution above 1280x960. But if you valued your eyes you never went down to 60.

        No idea why 60 Hz on an LCD works better, though.

        • DefederateLemmyMl@feddit.nl

          No idea why 60 Hz on an LCD works better, though.

          Because LCD pixels are constantly lit up by a backlight. They don’t start to dim in between refresh cycles. They may take some time to change from one state to another, but that is perceived as ghosting, not flickering.

          On a CRT the phosphor dots are periodically lit up (or “refreshed”) by an electron beam, and then start to dim afterwards. So the lower the refresh rate, the more time they have to dim in between strobes. At low refresh rates this is perceived as flickering. At higher refresh rates, the dots don’t have enough time to noticeably dim, so this is perceived as a more stable image. 60 Hz happens to be the refresh rate at which this flicker effect becomes quite noticeable to the human eye.
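
          As a small sketch of that “more time to dim between strobes” point: treat the phosphor as an exponential decay that gets re-struck every refresh, and look at how much of each period the spot spends below 10% of peak brightness. The 2 ms decay constant and the 10% threshold are assumptions for illustration only:

          ```python
          import math

          # Sketch: fraction of each refresh period a phosphor spot spends "dark"
          # (below an arbitrary 10% of peak), assuming simple exponential decay.

          TAU_S = 0.002          # assumed phosphor decay constant (illustrative)
          DARK_THRESHOLD = 0.10  # call the spot "dark" below 10% of peak brightness

          time_to_dark_s = -TAU_S * math.log(DARK_THRESHOLD)

          for refresh_hz in (60, 75, 85, 100):
              period_s = 1 / refresh_hz
              dark_fraction = max(0.0, period_s - time_to_dark_s) / period_s
              print(f"CRT @ {refresh_hz:>3} Hz: spot is dark ~{dark_fraction:.0%} of each period")

          print("LCD backlight: lit the whole time, so no flicker (ghosting instead)")
          ```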

        • frezik@midwest.social

          60 Hz is what any NTSC TV would have had for consoles. Plenty of older computers, too. Lots of people gamed that way well into the 2000s.

          Incidentally, if you do the same calculation above for PAL (50 Hz), you end up at 10 ms, or about 2 ms more lag than NTSC. Many modern LCDs can have response times <2 ms (which is on top of the console’s internal framerate matched to NTSC or PAL). The implication for retro consoles is that the lag difference between NTSC CRTs and modern LCDs is about the same as the difference between NTSC and PAL CRTs.
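
          The same arithmetic, spelled out as a sketch (the 2 ms LCD figure is the one quoted in the comment):

          ```python
          # Sketch: mid-screen lag is half a refresh interval; compare NTSC vs PAL CRTs
          # with a fast modern LCD at 60 Hz.

          ntsc_ms = 1 / 60 / 2 * 1000   # ~8.3 ms
          pal_ms = 1 / 50 / 2 * 1000    # 10.0 ms
          lcd_response_ms = 2.0         # "<2 ms" panel response, per the comment above

          print(f"NTSC CRT, mid-screen      : {ntsc_ms:.1f} ms")
          print(f"PAL CRT, mid-screen       : {pal_ms:.1f} ms (+{pal_ms - ntsc_ms:.1f} vs NTSC)")
          print(f"60 Hz LCD with fast panel : {ntsc_ms + lcd_response_ms:.1f} ms (+{lcd_response_ms:.1f} vs NTSC CRT)")
          ```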

    • myplacedk@lemmy.world

      Because it draws those “pixels” as the signal reaches the monitor. When half of a frame is transmitted to a CRT monitor, it’s basically half way done making it visible.

      An LCD monitor needs to wait for the entire frame to arrive, before it can be processed and then made visible.

      Sometimes the monitor will wait for several frames to arrive before it processes them. This enables some temporal processing. When you put a monitor in gaming mode, it disables (some of) this.

        • Lojcs@lemm.ee

          No? AFAIK vsync prevents the GPU from sending half-drawn frames to the monitor, not the monitor from displaying them. The tearing happens in the GPU buffer. Edit: read the edit.

          Though I’m not sure how valid the part about latency is. In the worst-case scenario (the transfer of a frame taking the whole previous frame time), the latency of an LCD can only be double that of a CRT at the same refresh rate, which 120+ Hz already compensates for. And as for the inherent latency of the screen, most gaming LCD monitors have less than 5 ms of input lag, while a CRT on average takes half the frame time to display a pixel, so 8 ms.

          Edit: thought this over again. On a CRT those two happen simultaneously, so the total latency is 8 ms + pixel response time (which I don’t know the value of). On LCDs, the transfer time should be (video stream bandwidth / cable bandwidth) * frame time. And that runs in sequence with the time to display it, which is frame time / 2 + pixel response time. Which could exceed the CRT’s latency.

          BUT I took the input lag number from my monitor’s rtings page, and looking into how they get it, it seems it includes both the transfer time and frame time / 2 and it’s somehow still below 5 ms? That’s weird to me, since for that the transfer either needs to happen within <1 ms (impossible) or the entire premise was wrong and LCDs do start drawing before the entire frame reaches them.

          Although pretty sure that’s still not the cause of tearing, which happens due to a frame being progressively rendered and written to the buffer, not because it’s progressively transferred or displayed
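
          For what it’s worth, here is the worst-case model from this comment written out as a sketch. Every number (link utilisation, pixel response times) is an assumption for illustration, not a measurement:

          ```python
          # Sketch of the worst-case model above: the LCD receives the whole frame first,
          # then scans it out, while the CRT draws as the signal arrives.

          def crt_latency_ms(refresh_hz, phosphor_ms=0.5):
              # transfer and drawing happen simultaneously; measured at mid-screen
              return 1000 / refresh_hz / 2 + phosphor_ms

          def lcd_worst_case_ms(refresh_hz, link_utilisation=0.8, response_ms=3.0):
              # transfer the whole frame, then scan out to mid-screen, then pixel response
              frame_ms = 1000 / refresh_hz
              return link_utilisation * frame_ms + frame_ms / 2 + response_ms

          for hz in (60, 120):
              print(f"{hz:>3} Hz: CRT ~{crt_latency_ms(hz):4.1f} ms | "
                    f"LCD worst-case ~{lcd_worst_case_ms(hz):4.1f} ms")
          ```

          As the edit above concludes, measured figures come in far lower than this worst case, which points to real monitors starting to scan out before the whole frame has arrived.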

    • ShortFuse@lemmy.world

      The transmission is still the same, with the exception of things like VRR and DSC. We still send a VBLANK signal, which is the electronic signal telling a CRT to move back up to the top of the screen. We haven’t changed the way things are sent: it’s still top to bottom, left to right. VSync and HSync are still used, though they make less obvious sense on LCDs. Digital displays translate this.

      Because LCDs convert these signals, we call the time it takes to do the conversion “draw time”, but this isn’t as important today. What matters now is the time it takes for a pixel to change from one color to another (response time). Because a CRT fires electrons at phosphors that fade quickly, the previous frame essentially vanishes on its own. LCDs don’t do this.

      Conversely, OLEDs are plenty fast, but can’t reproduce the same pixel response without inserting a blank frame (Black Frame Insertion), which sacrifices brightness and is slowly being removed.

      Still, most “lag” comes from transmission time. It takes 1/60th of a second to transmit a full frame at 60 Hz. Divide that by 2 to get the “average” lag, and CRTs would measure at 8.3333 ms. LCDs were happy to get to 10 ms.

      Now we can do 120 Hz, which is way more important: even if CRTs are faster, you can get the whole image out in half the time, which “averages” to 4.1666 ms, making even a “4 ms” slow LCD on PC better than a console running at 60 Hz on a CRT.

      And while CRTs could reach high resolutions, they were limited by their HSync speed, which usually meant lower resolution, because the beam can only sweep so fast horizontally.

      Today that translates to: an OLED is best for emulating any console that ran at 60 Hz, with as good or better pixel response if you are willing to do BFI. The main reason the competitive Melee community still uses CRTs is mostly pricing, second to FUD.
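
      A quick sketch of the 60 Hz vs 120 Hz point, with the panel response times as assumed parameters:

      ```python
      # Sketch: average scan-out lag is half a refresh interval; add panel response
      # for the LCD/OLED cases.

      def avg_lag_ms(refresh_hz, response_ms=0.0):
          return 1000 / refresh_hz / 2 + response_ms

      print(f"CRT  @  60 Hz            : {avg_lag_ms(60):.1f} ms")
      print(f"LCD  @ 120 Hz, 4 ms resp : {avg_lag_ms(120, 4.0):.1f} ms")
      print(f"OLED @ 120 Hz, ~0 ms resp: {avg_lag_ms(120, 0.1):.1f} ms")
      ```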

    • mindbleach

      Those pixels appear basically as soon as the signal arrives at the back of the monitor, and they’re gone within a dozen scanlines. Watch slow-motion video of a CRT and you’ll see there’s only a narrow band that’s bright at any given moment.

    • Socsa

      The motion blur thing is complete nonsense. It’s never been a benefit of CRT and reveals this greentext to be fake and gay.

  • Björn Tantau@swg-empire.de

    First rule at our LAN parties: You carry your own monitor.

    We’d help each other out with carrying equipment and snacks and setting everything up. But that big ass bulky CRT, carry it yourself!

    • Inktvip@lemm.ee

      Not necessarily if you’re the one walking in with the DC++ server. Getting that thing up and running was suddenly priority #1 for the entire floor.

  • Psythik@lemmy.world

    Hell, modern displays are just now starting to catch up to CRTs in the input lag and motion blur department.

    It was brutal putting up with those shitty LCDs for two whole decades, especially being stuck with 60 Hz and sub-1080p resolutions when my CRT was displaying a 1600x1200 picture at 85 Hz in the 90s! It wasn’t until I got a 4K 120 Hz OLED with VRR and HDR a couple of years ago that I finally stopped missing CRTs, ’cause I finally felt like I had something superior.

    Twenty fucking years of waiting for something to surpass the good old CRT. Unbelievable.

    • Heavybell@lemmy.world

      LCDs came in just in time for me to be attending LAN parties in uni. Got sick of lugging my CRT up the stairs once a week pretty quickly and was glad when I managed to get my hands on an LCD. I can’t even remember if I noticed the downgrade, I was so thrilled with the portability.

    • Aux@lemmy.world

      If input lag is the only measure for you, OK. But LCDs surpassed CRTs in pretty much every other metric at least a decade ago.

      • Psythik@lemmy.world

        Not just input lag (I mean I literally mentioned other things too but you obviously didn’t read my entire comment) but also contrast ratio, brightness in LUX, color volume and accuracy, response time, viewing angle, displaying non-native resolutions clearly, flicker, stutter… Should I go on?

        All things that LCDs struggled on and still struggle on. OLED fixes most of these issues, and is the only display tech that I’d consider superior to a CRT.

    • ZILtoid1991@lemmy.world

      Most people didn’t own a CRT capable of 1600x1200@85Hz; most were barely any better in the resolution department than your average “cube” LCD (one of which I’m currently using beside my main 32" QHD display). I have owned a gargantuan beast like that with a Trinitron tube. I could run it at 120 Hz at 1024x768, and at higher resolutions without much flicker, but it had issues with the PCBs cracking, so it was replaced with a much more mediocre and smaller CRT with much lower refresh rates.

      • Psythik@lemmy.world

        In an OLED? They weren’t affordable 10 years ago.

        A 10-year-old LCD is not good. The resolution and refresh rate are irrelevant if it’s not an OLED, which, as I said, is the only display tech good enough to replace a CRT.

        • Wilzax@lemmy.world

          Not an OLED, in an IPS LCD. You’re asserting that OLED is the only tech good enough (which is not true; QLED displays are also starting to get good enough to surpass OLED, they’re just more expensive), but the response time of IPS displays frequently got under 10 ms as long ago as 2014, and that’s fast enough to be imperceptible to humans. Any other drawbacks of IPS compared to OLED were far worse with CRTs.

          And they don’t make that annoying high-pitched shriek.

          • Hadriscus@lemm.ee

            I have an Asus ProArt 23" from twelve years ago that’s great in terms of color (contrast and response time, not so much), but it produces a high-pitched sound at full brightness. I wondered if that was due to the panel tech itself.

            • Wilzax@lemmy.world

              I have never heard of an LCD making a high pitched noise like that, I think your monitor may be haunted

              • Hadriscus@lemm.ee

                It’s likely. It’s not even a faulty unit, I returned it and the next one did the same thing. Better call a hardware exorcist

    • fallingcats@discuss.tchncs.de

      Just goes to show many gamers do not in fact know what “input” lag is. I’ve seen the response time a monitor adds called input lag way too many times. And that mostly doesn’t include the delay a (wireless) input device might add, or the GPU (with multiple frames in flight) for that matter.

      • Vardøgor@mander.xyz

        seems pretty pedantic. the context is monitors, and it’s lag from what’s inputted to what you see. plus especially with TVs, input lag is almost always because of response times.

      • PieMePlenty@lemmy.world

        Let’s see if I get this right: input lag is the time it takes from when you make an input (move your mouse) to when you see it happen on screen. So even the speed of light is at play here - when the monitor finally displays it, the light still has to travel to your eyes - and your brain still has to process that input!
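
        Right, and the monitor is only one link in that chain. As a sketch, with every figure below being a made-up illustrative number rather than a measurement:

        ```python
        # Sketch: end-to-end "input lag" as a sum of stages. All numbers are
        # illustrative assumptions, not measurements of any real setup.

        chain_ms = {
            "USB mouse polling (avg)": 0.5,
            "game simulation + render (60 fps, avg)": 12.0,
            "GPU frame queue": 8.0,
            "display scan-out to mid-screen (60 Hz)": 8.3,
            "panel response": 4.0,
        }

        for stage, ms in chain_ms.items():
            print(f"{stage:<40} {ms:5.1f} ms")
        print(f"{'total (photons reach your eyes)':<40} {sum(chain_ms.values()):5.1f} ms")
        ```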

      • Hadriscus@lemm.ee

        Once I tried playing Halo or Battlefield on a friend’s Xbox with a wireless controller on a very large TV. I couldn’t tell which of these (the controller, the TV, or my friend) caused the delay, but whatever I commanded happened on the screen like 70 ms later. It was literally unplayable.

        • Rev3rze@feddit.nl

          My guess would be the TV wasn’t in “game mode”. Which is to say, it was doing a lot of post-processing on the image to make it look nicer, which costs extra time and delays the video stream a little.

  • mindbleach

    CRTs perfectly demonstrate engineering versus design. All of their technical features are nearly ideal - but they’re heavy as shit, turn a kilowatt straight into heat, and take an enormous footprint for a tiny window. I am typing this on a 55" display that’s probably too close. My first PC had a 15" monitor that was about 19" across, and I thought the square-ass 24" TV in the living room was enormous. They only felt big because they stuck out three feet from the nearest wall!

    • JackbyDev@programming.dev

      The heavy part truly cannot be overstated. I recently got a tiny CRT, not even a cubic foot in size. It’s about the same weight as my friend’s massive OLED TV. Of course, OLED is particularly light, but still. It’s insane!

      • mindbleach

        And it’s a vacuum tube. How does nothing weigh this much?!

        Plasma screens weren’t much better, at first. I had a 30" one circa 2006, maybe three inches thick, and you’d swear it was solid metal. A decade later we bought a couple of 32" LCD TVs, then a few more because they were so cheap, and the later ones weighed next to nothing. Nowadays - well, I walked this 55" up and down a flight of stairs by myself, and the only hard parts were finding somewhere to grab and not bonking any walls.

        • don@lemm.ee

          The vacuum itself might not weigh anything, but the glass strong enough to resist the implosion the vacuum would cause has to be pretty thick, which is where the weight is

          • mindbleach

            And that scales nonlinearly with volume, so smaller monitors are even denser than big monitors.

  • FrostyCaveman@lemm.ee

    That pic reminds me of something. Anyone else remember when “flatscreen” was the cool marketing hype for monitors/TVs?

    • TSG_Asmodeus (he, him)@lemmy.world

      Anyone else remember when “flatscreen” was the cool marketing hype for monitors/TVs?

      We got to move these refrigerators, we got to move these colour TV’s.

    • ZombiFrancis

      Those flatscreen CRTs were pretty great for their time though. Maybe/probably rose tinted glasses but man I remember them being plain better monitors overall.

      • daltotron@lemmy.world

        They probably were, in terms of viewing angles at the time of release, and probably were better if you had a technician who could come and adjust them, or could adjust them at the store before they were sold. But I think the flatscreen CRTs had a much higher tendency toward image warping over time.

        • ZombiFrancis

          Technician? More like some early 2000s teenager sweating bullets as they fiddle with settings and knobs they barely understand.

          I took that sucker to LAN parties and always had to recalibrate after bumping it up and down stairs. I actually had that damned thing in use through 2013.

  • Etterra@lemmy.world

    It was a dark day for gamers when the competitive things crawled out of their sports holes.

  • figaro@lemdro.id

    It’s OK, if anyone wants them back, the Smash Bros. Melee community has them all in the back of their cars.

    • tal@lemmy.today

      I mean, I have some nostalgia moments, but while I think OP’s got a point that the LCD monitors that replaced CRTs were in many ways significantly worse technically at the time, I also think that in pretty much all respects, current LCD/LED displays beat CRTs.

      Looking at OP’s benefits:

      0 motion blur

      CRT phosphors didn’t just immediately go dark. They were better than some LCDs at the time, yeah, which were very slow and had enormous mouse-pointer trails. But if you’ve ever watched a flashing cursor on a CRT actually fade out, you know that there was some response time.

      https://en.wikipedia.org/wiki/Comparison_of_CRT,_LCD,_plasma,_and_OLED_displays

      Response time: 0.01 ms[14] to less than 1 μs,[15] but limited by phosphor decay time (around 5 ms)[16]

      0 input lag

      That’s not really a function of the display technology. Yeah, a traditional analog CRT television with nothing else involved just spews the signal straight to the screen, but you can stick processing in there too, as cable boxes did. The real problem was “smart” TVs adding stuff like image processing that involved buffering some video.

      At the time that people started getting LCDs, a lot of them were just awful in many respects compared to CRTs.

      • As one moved around, the color you saw on many types of LCDs shifted dramatically.

      • There was very slow response time; moving a cursor around on some LCD displays would leave a trail, as it sluggishly updated. Looked kind of like some e-ink displays do today.

      • Contrast wasn’t great; blacks were really murky grays.

      • Early LCDs couldn’t do full 24-bit color depth, and dealt with it by dithering, which was a real step back in quality.

      • Pixels could get stuck.

      But those have mostly been dealt with.

      CRTs had a lot of problems too, and LED/LCD displays really address those:

      • They were heavy. This wasn’t so bad early on, but as CRTs grew, they really started to suck to work with. I remember straining a muscle in my back getting a >200lb television up a flight of stairs.

      • They were blurry. That can be a benefit, in that some software, like games, had graphics optimized for them, letting the blur “blend” pixels together, and so old emulators often have some form of emulation of CRT video artifacts. But in a world where software can be designed around a crisp, sharp display, I’d rather have the sharpness. The blurriness also wasn’t always even, especially on flat-screen CRTs; it tended to be worse in the corners. And you could only get the resolution and refresh rate so high, and the higher you went, the blurrier things were.

      • There were scanlines; brightness wasn’t even.

      • You could get color fringing.

      • Sony Trinitrons (rather nice late CRT computer displays) had a faint horizontal line across the screen where a wire was placed to stabilize the aperture grille.

      • They didn’t deal so well with wider aspect ratios (well, if you wanted a flat display, anyway). For movies and such, we’re better off with wider displays.

      • Analog signalling meant that as cables got longer, the image got blurrier.

      • They used more electricity and generated more heat than LED/LCD displays.

      • Kushan@lemmy.world

        It’s also worth pointing out that OLEDs solve many of the drawbacks of LCDs, particularly around latency and response times.

        We just don’t talk about burn in.

        • AnyOldName3@lemmy.world

          Current-generation OLEDs aren’t worse than late-generation CRTs for burn-in, they’re just worse than LCDs.

      • ZILtoid1991@lemmy.world

        Also, a lot of CRT nostalgia comes from whatever display a particular person had. With a lot of my CRTs, the screens were sharp enough for a more pixelated look, and required me to turn off scanline effects in emulators, as they just made everything look quite bad instead. Except for one really bad monitor I owned, because the previous owner lied that it could do 1024x768@75Hz (it was an old VGA monitor and didn’t like that resolution).

      • vithigar@lemmy.ca

        My Trinitron monitor actually had two of those stabilizing wires. They were very thin, much thinner than even a single scan line, but you could definitely notice them on an all white background.

        • tal@lemmy.today

          Apparently the dividing line was 15 inches:

          https://en.wikipedia.org/wiki/Trinitron

          Screens 15" and below have one wire located about two thirds of the way down the screen, while monitors greater than 15" have 2 wires at the one-third and two-thirds positions.

      • Socsa

        Yeah “zero motion blur” tells me OP has literally never seen a CRT and is just repeating something he heard his grandpa say.

  • jaybone@lemmy.world

    Guy on the left has this maniacal smile. Like he just got an A on the midterm at Clown College for Advanced Villainy.