• jaybone@lemmy.world
    4 months ago

    Guy on the left has this maniacal smile. Like he just got an A on the midterm at Clown College for Advanced Villainy.

  • Björn Tantau@swg-empire.de
    4 months ago

    First rule at our LAN parties: You carry your own monitor.

    We’d help each other out with carrying equipment and snacks and setting everything up. But that big ass bulky CRT, carry it yourself!

    • Inktvip@lemm.ee
      4 months ago

      Not necessarily if you’re the one walking in with the DC++ server. Getting that thing up and running was suddenly priority #1 for the entire floor.

    • tal@lemmy.today
      4 months ago

      I mean, I have my nostalgia moments, and OP’s got a point that the LCD monitors which replaced CRTs were in many ways significantly worse technically at the time. But I think that in pretty much all aspects, current LCD/LED displays beat CRTs.

      Looking at OP’s benefits:

      0 motion blur

      CRT phosphors didn’t go dark instantly. They were better than some LCDs of the era, sure, which were very slow and left enormous mouse-pointer trails. But if you’ve ever watched a blinking cursor on a CRT fade out rather than simply switch off, you know there was some response time.

      https://en.wikipedia.org/wiki/Comparison_of_CRT,_LCD,_plasma,_and_OLED_displays

      Response time: 0.01 ms[14] to less than 1 μs,[15] but limited by phosphor decay time (around 5 ms)[16]

      0 input lag

      That’s not really a function of the display technology. Yeah, a traditional analog CRT television with nothing else involved just spews the signal straight to the screen, but you can stick processing in there too, as cable boxes did. The real problem was “smart” TVs adding stuff like image processing that involved buffering some video.

      At the time that people started getting LCDs, a lot of them were just awful in many respects compared to CRTs.

      • As you moved around, the color you saw on many types of LCD panels shifted dramatically (poor viewing angles).

      • Response times were very slow; moving a cursor around on some LCD displays would leave a trail as the panel sluggishly updated. It looked kind of like some e-ink displays do today.

      • Contrast wasn’t great; blacks were really murky grays.

      • Early LCDs couldn’t do full 24-bit color depth, and dealt with it by dithering, which was a real step back in quality.

      • Pixels could get stuck.

      But those have mostly been dealt with.

      CRTs had a lot of problems too, and LED/LCD displays really address those:

      • They were heavy. This wasn’t so bad early on, but as CRTs grew, they really started to suck to work with. I remember straining a muscle in my back getting a >200lb television up a flight of stairs.

      • They were blurry. That could be a benefit, in that some software, like games, had graphics designed around the blur “blending” pixels together, which is why old emulators often offer some form of CRT-artifact emulation. But in a world where software can be designed around a crisp, sharp display, I’d rather have the sharpness. The blurriness also wasn’t even, especially on flat-screen CRTs; it tended to be worse in the corners. And you could only push the resolution and refresh rate so high.

      • There were scanlines; brightness wasn’t even.

      • You could get color fringing.

      • Sony Trinitrons (rather nice late CRT computer displays) had a faint horizontal line crossing the screen where a thin wire was placed to stabilize the aperture grille.

      • They didn’t deal so well with wider aspect ratios (well, if you wanted a flat display, anyway). For movies and such, we’re better off with wider displays.

      • Analog signalling meant that as cables got longer, the image got blurrier.

      • Socsa@sh.itjust.works
        4 months ago

        Yeah “zero motion blur” tells me OP has literally never seen a CRT and is just repeating something he heard his grandpa say.

      • ZILtoid1991@lemmy.world
        4 months ago

        Also, a lot of CRT nostalgia comes down to whatever display a particular person had. Most of my CRTs were sharp enough to give a more pixelated look, and they required me to turn off scanline effects in emulators because those just made everything look worse. The exception was one really bad monitor I owned, because the previous owner lied about it being able to do 1024x768@75Hz (it was an old VGA monitor and didn’t like that resolution).

      • vithigar@lemmy.ca
        4 months ago

        My Trinitron monitor actually had two of those stabilizing wires. They were very thin, much thinner than even a single scan line, but you could definitely notice them on an all white background.

      • Kushan@lemmy.world
        4 months ago

        It’s also worth pointing out that OLEDs solve many of the drawbacks of LCDs, particularly around latency and response times.

        We just don’t talk about burn in.

        • AnyOldName3@lemmy.world
          4 months ago

          Current-generation OLEDs aren’t worse than late-generation CRTs for burn-in; they’re just worse than LCDs.

    • fallingcats@discuss.tchncs.de
      4 months ago

      Just goes to show that many gamers don’t in fact know what input lag is. I’ve seen the response time a monitor adds called “input lag” way too many times. And that mostly doesn’t include the delay a (wireless) input device might add, or the GPU (with multiple frames in flight), for that matter.
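
      To illustrate, here’s a rough Python sketch of an input-lag budget. Every number in it is a made-up, purely hypothetical figure; the point is just that the panel’s response time is only one term in the sum:

          # Hypothetical end-to-end input-lag budget. All numbers are illustrative
          # guesses, not measurements of any real hardware.
          latency_ms = {
              "wireless mouse/controller polling": 4.0,          # input device
              "game logic + frames in flight on the GPU": 25.0,  # CPU/GPU pipeline
              "TV/monitor processing + scan-out": 10.0,          # display electronics
              "panel response time (pixel transition)": 2.0,     # the spec on the box
          }

          total = sum(latency_ms.values())
          for stage, ms in latency_ms.items():
              print(f"{stage}: {ms:.1f} ms")
          print(f"total input lag: {total:.1f} ms")

      The number printed on the monitor’s box is usually the smallest term in that sum.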

      • Hadriscus@lemm.ee
        4 months ago

        Once I tried playing Halo or Battlefield on a friend’s Xbox with a wireless controller on a very large TV. I couldn’t tell which of these (the controller, the TV, or my friend) caused the delay, but whatever I commanded happened on the screen, like, 70 ms later. It was literally unplayable.

        • Rev3rze@feddit.nl
          4 months ago

          My guess would be the TV wasn’t in ‘game mode’. Which is to say, it was doing a lot of post-processing to make the image look nicer, but that costs extra time and delays the video stream a little.

      • Vardøgor@mander.xyz
        4 months ago

        seems pretty pedantic. the context is monitors, and it’s lag from what’s inputted to what you see. plus especially with TVs, input lag is almost always because of response times.

      • PieMePlenty@lemmy.world
        4 months ago

        Let’s see if I get this right: input lag is the time it takes from when you make an input (move your mouse) to when you see it happen on screen. So even the speed of light is at play here; when the monitor finally displays it, the light still has to travel to your eyes, and your brain still has to process that input!
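
        As a rough back-of-the-envelope check (assuming you sit about 0.6 m from the screen, which is my assumption, not anything from the thread), the light-travel part is vanishingly small next to everything else in the chain:

            # How long does light take to reach your eyes, compared to one 60 Hz
            # frame and a ballpark human visual reaction time?
            SPEED_OF_LIGHT = 299_792_458   # m/s
            viewing_distance_m = 0.6       # assumed typical desk setup

            light_travel_ms = viewing_distance_m / SPEED_OF_LIGHT * 1000
            frame_time_ms = 1000 / 60      # one frame at 60 Hz
            reaction_time_ms = 200         # rough human visual reaction time

            print(f"light travel:    {light_travel_ms:.6f} ms")  # ~0.000002 ms
            print(f"one 60 Hz frame: {frame_time_ms:.3f} ms")    # ~16.667 ms
            print(f"human reaction:  {reaction_time_ms} ms")

        So the speed of light is technically in the chain, but the frame time, any display processing, and your own reaction time dominate by several orders of magnitude.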

  • figaro@lemdro.id
    4 months ago

    It’s ok; if anyone wants them back, the Smash Bros. Melee community has them all in the back of their cars.

      • Liz@midwest.social
        4 months ago

        Sure, and in fact some developers used the fuzziness to their advantage, which can make certain games look weird when you display them on anything modern. But my point was more that some people in here are acting like every part of the CRT experience was better than flatscreens.

    • r00ty@kbin.life
      4 months ago

      Are you sure it was the CRT technology? Because bear in mind, colour CRTs had to focus the beam so accurately that it only hit the specific “pixel” for the colour being lit at that time. What there was, was blur from bad focus settings, age, and phosphor persistence (which is still a thing with LCDs to an extent).

      What DID cause blur was the act of merging the image, the colour, and the synchronisation into a composite signal. All the mainstream systems (PAL, SECAM and NTSC) caused some blurring. Games on 80s/90s consoles generally used this to their advantage, and you can see the dithering effects clearly in emulators of systems from that period. Very specifically, the colour signal sharing spectrum with the luminance signal led to a softening of the image that appeared as blur. Most consoles of the time only output an RF signal for a TV or, if you were lucky, a composite output.
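
      To put rough numbers on that overlap, here’s a small Python sketch using the standard NTSC constants (PAL is analogous with different figures); the point is just that the colour subcarrier sits inside the luminance band, so separating the two inevitably softens fine detail:

          # NTSC composite video: the chroma subcarrier lives inside the luma band,
          # so the TV has to filter one out of the other, blurring detail near
          # that frequency.
          line_rate_hz = 4_500_000 / 286          # horizontal line rate ~= 15734.27 Hz
          subcarrier_hz = 455 / 2 * line_rate_hz  # colour subcarrier ~= 3.579545 MHz
          luma_bandwidth_hz = 4.2e6               # broadcast luma extends to ~4.2 MHz

          print(f"chroma subcarrier: {subcarrier_hz / 1e6:.6f} MHz")
          print(f"luma bandwidth:    {luma_bandwidth_hz / 1e6:.1f} MHz")
          print("chroma sits inside the luma band:", subcarrier_hz < luma_bandwidth_hz)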

      Good computer monitors (not TVs) of the time were extremely crisp when fed a suitable RGB signal.

  • Etterra@lemmy.world
    4 months ago

    It was a dark day for gamers when the competitive things crawled out of their sports holes.

  • Bytemeister@lemmy.world
    4 months ago

    I remember CRTs being washed out, heavy, power hungry, loud, hot, and susceptible to burn-in and magnetic fields… The screen had to have a curve, so over ~16" you’d get weird distortions. You needed a really heavy, sturdy desk to keep them from wobbling. Someone is romanticizing an era that no one liked. I remember LCD adoption being very quick and near universal as far as tech advancements go.

    • ILikeBoobies@lemmy.ca
      4 months ago

      There was always pushback in esports

      Smash uses CRTs today because of how much pushback there was/is

      • WldFyre@lemm.ee
        4 months ago

        Melee uses CRTs because it’s an old ass game lol

        Ultimate is not played on CRTs

    • Kit@lemmy.blahaj.zone
      4 months ago

      As someone who still uses a CRT for specific purposes, I feel that you’re misremembering the switchover from CRT to LCD. At the time, LCDs were blurry and less vibrant than CRTs. Technical advancements have solved this over time.

      Late model CRTs were even flat to eliminate the distortion you’re describing.

      • rothaine@lemm.ee
        4 months ago

        Resolution took a step back as well, IIRC. The last CRT I had could do 1200 vertical pixels, but I feel like it was years before we saw greater than 768 or 1080 on flat screen displays.

      • sugar_in_your_tea@sh.itjust.works
        4 months ago

        Sure, but they were thin, flat, and good enough. The desk space savings alone were worth it.

        I remember massive projection screens that took up half of a room. People flocked to wall mounted screens even though the picture was worse.

        • Hadriscus@lemm.ee
          4 months ago

          Yeah, my parents had a Trinitron; that thing weighed a whole cattle herd. The magnetic field started failing in its later years, so one corner was forever distorted. It was an issue playing Halo because I couldn’t read the motion tracker (lower left).

        • Soggytoast@lemm.ee
          4 months ago

          They’re under a pretty high vacuum inside, so the flat glass has to be thicker to be strong enough

  • RememberTheApollo_@lemmy.world
    4 months ago

    I had a 20-odd inch CRT with the flat tube. Best CRT I ever had, last one I had before going to LCD. Still miss that thing, the picture was great! Weighed a ton, though.

  • FrostyCaveman@lemm.ee
    4 months ago

    That pic reminds me of something. Anyone else remember when “flatscreen” was the cool marketing hype for monitors/TVs?

    • ZombiFrancis@sh.itjust.works
      4 months ago

      Those flatscreen CRTs were pretty great for their time, though. Maybe/probably rose-tinted glasses, but man, I remember them being plain better monitors overall.

      • daltotron@lemmy.world
        4 months ago

        They probably were in terms of viewing angles at the time of release, and probably better still if you had a technician who could come and adjust it, or could adjust it at the store before it was sold. But I think the flatscreen CRTs had a much higher tendency toward image warping over time.

        • ZombiFrancis@sh.itjust.works
          4 months ago

          Technician? More like some early 2000s teenager sweating bullets as they fiddle with settings and knobs they barely understand.

          I took that sucker to LAN parties and always had to recalibrate after bumping it up and down stairs. I actually had that damned thing in use through 2013.

    • TSG_Asmodeus (he, him)@lemmy.world
      4 months ago

      Anyone else remember when “flatscreen” was the cool marketing hype for monitors/TVs?

      We got to move these refrigerators, we got to move these colour TVs.