• BmeBenji@lemm.ee · 10 months ago

    4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.

    • Zink@programming.dev · 10 months ago

      Yeah. Once games are rendering 120 fps at a native 6K, downscaled to an amazing-looking 4K picture, then maybe you could convince me it was time to get an 8K TV.
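
      Back-of-the-envelope on what that would take, assuming “6K” means 1.5x 4K on each axis, i.e. 5760x3240 (my assumption, since “6K” isn’t a fixed standard):

      ```python
      # Rough pixel-throughput comparison; the 5760x3240 "6K" figure is an assumption.
      uhd_4k = 3840 * 2160       # pixels per frame at 4K
      six_k = 5760 * 3240        # pixels per frame at the assumed 6K

      print(six_k / uhd_4k)                  # 2.25 -> 2.25x the pixels of a 4K frame
      print(six_k * 120 / (uhd_4k * 60))     # 4.5  -> 4.5x the pixel rate of 4K at 60fps
      ```

      So that’s roughly 4.5x the shading throughput of a 4K60 target before 8K even enters the picture.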

      Honestly, most people sit far enough from the TV that 1080p is already good enough.
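
      Rough numbers, if you’re curious. Just a sketch, where the 65" screen, the 2.5 m couch distance, and the ~60 pixels-per-degree acuity figure are all assumptions:

      ```python
      import math

      # Pixels per degree at an assumed couch setup: 65" 16:9 TV viewed from ~2.5 m.
      diag_m = 65 * 0.0254                       # diagonal in metres
      width_m = diag_m * 16 / math.hypot(16, 9)  # width of a 16:9 panel
      distance_m = 2.5

      fov_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
      for name, cols in [("1080p", 1920), ("4K", 3840)]:
          print(name, round(cols / fov_deg), "pixels per degree")
      # 1080p lands right around the ~60 ppd often quoted for 20/20 vision;
      # 4K is about double that, which is why it's hard to tell from the couch.
      ```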

      • frezik@midwest.social · 10 months ago

        I find 4K is nice on computer monitors because you can shut off anti-aliasing entirely without leaving jagged edges behind. 1440p isn’t quite enough to get there.
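
        For reference, the density difference at a 27" panel (the 27" size is just my assumption for a typical desktop monitor):

        ```python
        import math

        # Pixel density (PPI) for an assumed 27" 16:9 monitor.
        def ppi(cols, rows, diag_inches):
            return math.hypot(cols, rows) / diag_inches

        print(round(ppi(2560, 1440, 27)))  # ~109 PPI at 1440p
        print(round(ppi(3840, 2160, 27)))  # ~163 PPI at 4K
        ```

        That extra ~50% of linear density at desk distance is about where the jaggies stop jumping out at me without AA.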

        Also, there are some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.
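
        The rough idea is that a 240-line console frame gets a 9x9 block of panel pixels per source pixel at 2160p, which leaves room to carve out scanline gaps and phosphor-ish sub-stripes. A toy sketch of that (not any real emulator’s shader, just the shape of the trick):

        ```python
        import numpy as np

        # Toy CRT-style mask: 240p frame on a 2160-line panel = 9x integer scale.
        SRC_H, SRC_W, SCALE = 240, 320, 9

        frame = np.random.rand(SRC_H, SRC_W, 3)            # stand-in for an emulated frame
        big = frame.repeat(SCALE, axis=0).repeat(SCALE, axis=1)

        # Darken the last two of every 9 output rows to fake the gap between scanlines.
        row_mask = np.ones(SCALE)
        row_mask[-2:] = 0.3
        big *= np.tile(row_mask, SRC_H)[:, None, None]

        # Split each 9-pixel-wide block into three sub-stripes, each favouring R, G or B.
        col_mask = np.full((SCALE, 3), 0.35)
        for c in range(3):
            col_mask[c * 3:(c + 1) * 3, c] = 1.0
        big *= np.tile(col_mask, (SRC_W, 1))[None, :, :]

        print(big.shape)  # (2160, 2880, 3) -- plenty of pixels left to play with
        ```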

        • Zink@programming.dev · 10 months ago

          Oh yeah, I have read some very cool things about emulators being able to simulate the individual phosphors at 4K resolution. I have always been a sucker for clean, crisp pixels (that’s what I was trying to achieve on the shitty old CRT I had for my SNES), so I haven’t jumped into the latest on CRT shaders myself.

        • Holzkohlen@feddit.de · 10 months ago

          But anti-aliasing costs far less performance than rendering at 4K does. And you need to mess about with scaling on a 4K monitor, which is always a pain. 1440p for life, IMHO
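
          The scaling annoyance in numbers (assuming the 150% desktop scale most people seem to land on for 4K):

          ```python
          # Logical workspace of a 4K panel at an assumed 150% fractional scale.
          native = (3840, 2160)
          scale = 1.5
          print(tuple(int(px / scale) for px in native))
          # (2560, 1440) -- the same usable space as a native 1440p screen, just sharper,
          # and only as long as every app copes with fractional scaling.
          ```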