• dindonmasker@sh.itjust.works · 6 months ago

    I personally want more physics simulations. I always loved 2D falling sand games where everything reacted with everything else, and after a long time without games with those mechanics I found Noita and I can’t stop playing it, as much for the game loop as for the game’s falling sand engine.

  • Endorkend@kbin.social · 6 months ago

    Thing is, looking at some games, with Horizon and Elden Ring being prime examples, we can have great games with great graphics.

    You don’t really want better games with worse graphics; you want better games that don’t use great graphics as an excuse for bad gameplay.

  • buzz86us@lemmy.world · 6 months ago

    Yeah, graphics are nice to have, but sometimes I want to game on a small and light laptop. I don’t need revolutionary HD quality all the time.

    • DrSteveBrule@mander.xyz · 6 months ago

      Art Style > Graphics. Kingdom Hearts (2002) looks wildly better than GTA: San Andreas (2004) and Fallout 3 (2008).

      • PraiseTheSoup@lemm.ee · 6 months ago

        Fallout 3 looks like dog shit, man. It has since day 1. It’s one of my favorite games and I have 100% on it, but it has never looked good.

          • 𝓔𝓶𝓶𝓲𝓮@lemm.ee · 6 months ago

            FO76 looks amazing on max settings with Nvidia upscaling. It still has ugly elements, but overall I’ve taken so many screenshots that the only other game I’ve screenshotted this much is modded Skyrim.

            I’ll actually link one later to demonstrate it.

            • Aux@lemmy.world · 6 months ago

              Idk, I’m playing FO76 on ultra at 4K right now and it looks like shit. Not much different from Skyrim. Compare it to something like Forza Horizon 5 and it’s not even funny how bad FO76 looks.

    • HauntedCupcake@lemmy.world · 6 months ago

      I mean this with the greatest respect, and I’m not making a judgement on the gameplay.

      But there’s a whole spectrum between Roblox and the latest Quadruple A™ release, and all of it counts as “worse graphics”.

    • PenisWenisGenius@lemmynsfw.com · 6 months ago

      As inflation continues to outpace wages, surely more people will start preferring this. $1000 for a GPU is a joke. If I ever develop an indie game, my target system is going to be something like a 1.6 GHz Core i3 and a garden-variety OpenGL-capable graphics card.

      • Aux@lemmy.world · 6 months ago

        There are constant high-end GPU shortages; $1,000 is too cheap.

        • PenisWenisGenius@lemmynsfw.com · 6 months ago

          Idk, maybe we need to figure out how to get by with basic laptop OpenGL graphics. An Intel HD 4000 would have been a groundbreaking graphics card in 2005, but today you can barely run a Unity project with one. More serious effort needs to go into optimization and efficiency, I think, and if that means everything has to have 2005-era graphics (which aren’t even that bad), then that’s what has to be done.

          Making your own game engine, or using an open source 3D engine and filling in the rest, is too much work for most indie devs, but as enshittification continues this will eventually stop being the case. SuperTuxKart was made this way and it can run on a potato.

          • Aux@lemmy.world · 6 months ago

            No one wants to play potato games, and that’s evident from the shortage of high-end GPUs. People want better graphics, and people have the money for GPUs. If you check the Steam stats, the top 15 cards are all 3060, 4060, 3070, 4070, and 3080 variants. Steam has 132M monthly active users, and 2% of them have a 3080; that’s over 2.6M people with a high-end card.

            Only 0.2% of users are on an Intel HD 4000. When you combine all the mid and high-end GPU users, it becomes obvious that there’s absolutely no point in making games for the Intel HD 4000.

  • mindbleach@sh.itjust.works · 6 months ago

    Chasing photorealism has been unsustainable since before MW2 came out. You could see where that line was headed. The answer has always been procedural artwork - not randomized, just rule-based. Even if an entire desert gets away with four textures for sand, those shouldn’t be hand-drawn and manually-approved bitmaps. They should not be fixed-resolution. Let the machine generate them at whatever level of detail you need. Define what it’s supposed to look like.

    This is how that “Doom 3 on a floppy disk” game, .kkrieger, worked. It weighs 96 KB. It doesn’t look like Descent. It has oodles of textures and smooth models. Blowing a few megabytes on that kind of content is a lot easier than cramming things down and a lot cheaper than mastering five hundred compressed six-channel bitmaps. Even if every rivet on a metal panel was drawn by hand with a circle tool, ship that tool, so that no matter how closely the player looks, those rivets stay circular.

    You can draw rust and have it be less shiny because that’s how rust is defined - and have that same smear of rust look a little bit different every time it appears, tiled across a whole battleship. Every bullet ding and cement crack can become utterly unremarkable by being completely unique and razor-sharp at macro-lens distances. You don’t hire a thousand artists to manage one tree each, you hire a handful of maniacs who can define: wood. Sapling, tree, log, plank, chair, wood. Hand that to a dozen artists and watch them crank out a whole bespoke forest in an afternoon.
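
    To make that concrete, here is a minimal sketch (plain Python, and everything in it - the rust rule, the constants, the seeds - is hypothetical, not how .kkrieger or any shipping engine actually does it) of a rule-based texture: a “rust amount” defined as a few octaves of deterministic noise, evaluated at whatever resolution the renderer asks for, so one definition yields both a tiny distant patch and a razor-sharp close-up, and a different seed yields a different but equally plausible smear.

        import math, hashlib

        def hash01(ix, iy, seed):
            """Deterministic pseudo-random value in [0, 1) for a lattice point."""
            h = hashlib.blake2b(f"{seed}:{ix}:{iy}".encode(), digest_size=8).digest()
            return int.from_bytes(h, "big") / 2**64

        def value_noise(x, y, seed):
            """Smoothly interpolated lattice noise - the basic building block."""
            ix, iy = math.floor(x), math.floor(y)
            fx, fy = x - ix, y - iy
            sx, sy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)  # smoothstep
            n00, n10 = hash01(ix, iy, seed), hash01(ix + 1, iy, seed)
            n01, n11 = hash01(ix, iy + 1, seed), hash01(ix + 1, iy + 1, seed)
            top = n00 + (n10 - n00) * sx
            bot = n01 + (n11 - n01) * sx
            return top + (bot - top) * sy

        def rust_amount(u, v, patch_seed):
            """The 'definition' of rust: a few octaves of noise, thresholded so it clumps."""
            total, weight = 0.0, 0.0
            for octave in range(4):
                freq, amp = 8 * 2**octave, 1 / 2**octave
                total += amp * value_noise(u * freq, v * freq, patch_seed + octave)
                weight += amp
            return max(0.0, min(1.0, (total / weight - 0.45) * 4))

        def bake(resolution, patch_seed):
            """Rasterize the same rule at whatever resolution is needed."""
            return [[rust_amount(px / resolution, py / resolution, patch_seed)
                     for px in range(resolution)]
                    for py in range(resolution)]

        thumb = bake(64, patch_seed=7)      # distant low-detail patch
        closeup = bake(512, patch_seed=7)   # the same rust, sharper, no stored bitmap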

    • Kazumara@discuss.tchncs.de · 6 months ago

      You propose an interesting approach. I just wonder how the individual streaks of different rust interact with typical graphics pipelines. You can certainly ship a generator, but for rasterizing the image the texture still has to be generated and shipped off to GPU memory to be used in shaders. Won’t you blow through VRAM limits or shader cache limits by having no texture reuse anywhere?

      • mindbleach@sh.itjust.works · 6 months ago

        Any game with texture pop-in is already handling more data than it has room for. “Rage” famously had unique textures across the entire world… and infamously streamed them from DVD, with the dumbest logic for loading and unloading. You could wait for everything to load, turn around, and it would all be blurry again.

        Anyway if you’re rendering ten zillion copies of something way out in the distance, those can all be the same. It will not matter whether they’re high-res or unique when they’re eight pixels across. As Nvidia said: if you’re not cheating, you’re just not trying.
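
        Rough numbers on why distant copies are cheap (the crate size, FOV, and texture size below are made-up illustration values, not from any particular game): pick a texture level from the projected on-screen size, and anything a handful of pixels across only ever touches the smallest levels, no matter how detailed its source definition is.

            import math

            def projected_px(object_size_m, distance_m, vertical_fov_deg, screen_h_px):
                """Approximate on-screen height of an object, in pixels."""
                angular = 2 * math.atan(object_size_m / (2 * distance_m))  # radians
                return screen_h_px * angular / math.radians(vertical_fov_deg)

            def mip_level(base_texture_px, on_screen_px):
                """Pick the mip whose resolution roughly matches the projected size."""
                max_level = int(math.log2(base_texture_px))
                if on_screen_px <= 1:
                    return max_level  # the 1x1 mip, as cheap as it gets
                level = math.floor(math.log2(base_texture_px / on_screen_px))
                return max(0, min(max_level, level))

            px = projected_px(2.0, 300.0, 60.0, 1080)  # a 2 m crate, 300 m away: ~7 px tall
            print(round(px), mip_level(4096, px))      # -> 7 9: nine mips down from a 4096 texture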

    • icesentry@lemmy.ca · 6 months ago

      How do you think modern games are made? Procedural generation is used all over the place to create materials and entire landscapes.

      • mindbleach@sh.itjust.works · 6 months ago

        But it never ships clientside.

        These tools have been grudgingly adopted, but only to make ‘let’s hire ten thousand artists for a decade!’ accomplish some ridiculous goal, as measured in archaic compressed textures and static models. The closest we came was “tessellation” as a buzzword for cranking polycount in post. And it somehow fucked up both visuals and performance. Nowadays Unreal 5 brags about its ability to render zillion-polygon Mudbox meshes at sensible framerates, rather than letting artists do pseudo-NURBS shit on models that don’t have a polycount. And no bespoke game seems ready to scale to 32K, or zoom in on a square inch of carpet without seeing texels, even though we’ve had this tech for umpteen years and a texture atlas is not novel.

        Budgets keep going up and dev cycles keep getting longer and it’s never because making A Game is getting any harder.

  • FluffyPotato@lemm.ee · 6 months ago

    Personally I’d prefer if games used more stylized graphics, like pixel art or hand-drawn stuff. That’s not worse graphical quality but better, imho, while not needing a supercomputer to run. Spiritfarer is still one of the prettiest games I have played, and it runs on the Switch.

    Going with stylized graphics instead of trying to do photorealism also makes the game age way more gracefully. Bastion, for example, still looks amazing, while there’s a reason Oblivion NPCs are a meme.

  • Th4tGuyII@kbin.social · 6 months ago

    Honestly, I have to agree with the article - while you could say graphics have improved in the last decade, it’s nowhere near as big a jump as the decade before that.

    I’d easily argue that the average AAA game from a decade ago looks just as good on a 1080p/1440p display as the average AAA game today - and I’d bet the difference wouldn’t be that noticeable at 4K either.

    And what do we gain for that diminishing return on graphics?
    Singleplayer games are being made smaller, or as vapid “open worlds”, and cost more because more resources go to the design teams rather than to the rest of the game.
    Meanwhile, multiplayer games get less frequent and smaller updates, padded out with aggressive micro-transactions.

    I hate that “realistic” graphics have become such an over-hyped selling point that they’re consuming AAA gaming in its entirety.

    I would love for AAA games to go back to being reasonably priced with plainer looking graphics, so that resources can actually be put into making them more than just glorified tech demos.

    • Shawdow194@kbin.social · 6 months ago

      Well, it’s a scaling effect with diminishing returns.

      To the human eye, 480p vs 1080p is a significant jump, but 4K vs 8K is hard to tell apart.
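
      A quick back-of-envelope check (the screen size, viewing distance, and the ~60 pixels-per-degree figure for 20/20 acuity are rough assumptions, not hard numbers):

          import math

          def pixels_per_degree(h_res, screen_width_m, distance_m):
              """Horizontal pixels per degree of visual angle at a given viewing distance."""
              h_fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
              return h_res / h_fov_deg

          # A 65" 16:9 TV (~1.44 m wide) viewed from 2.5 m; ~60 ppd is roughly 20/20 acuity.
          for name, h_res in [("480p", 854), ("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
              print(name, round(pixels_per_degree(h_res, 1.44, 2.5)))
          # -> about 27, 60, 120, 240: the 480p-to-1080p jump lands right at the acuity
          #    limit, while 4K and 8K add detail finer than the eye resolves from the couch.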

      I think focusing on new technologies such as AI upscaling, world generation, or VR is a better use of developers’ time, and it pushes the industry back into the innovative space it’s supposed to be in.

    • Aux@lemmy.world · 6 months ago

      I don’t have a 1080p monitor, but most games look like shit at 4K. Bumping texture resolution is not enough for 4K; you also need better geometry and much longer draw distances. If it’s not an Unreal 5 game with its virtually infinite geometry detailing, then it most likely looks like shit.

  • Odious@lemmy.world · 6 months ago

    I’d love to upvote this more than once. What’s the point of all those super high quality graphics if the core gameplay hasn’t advanced in the slightest? 🙄

    • gravitas_deficiency@sh.itjust.works · 6 months ago

      AAA studios

      Best I can do is predatory monetization and half-baked DLC. Also, the EULA now prohibits you from making unflattering comparisons to that one game Larian made.

  • azvasKvklenko@sh.itjust.works · 6 months ago

    I’ve got pretty similar thoughts. I wasn’t into gaming all that much until relatively recently, when I built my first gaming PC at the beginning of the pandemic. Thanks to that, I’m not only in the market for bleeding edge AAA titles, but also discovering three decades’ worth of PC games. My observation is that games got worse over time. They’re also a lot more expensive to make, because everything must be visually impressive, which usually ends up with poor performance and bugs, requiring high-end hardware just to get the game to run. Quite often games are broken and unoptimized at launch, and they follow that generic formula: watch a cinematic, hold a button, watch some more, here’s your little tutorial fight, now more cutscenes and a crappy puzzle. It really makes me feel that if game developers were more limited by hardware constraints and unable to feed flashy graphics to legions of normie players, they wouldn’t have any way to make games attractive other than better mechanics and level design.

    Meanwhile Nintendo continues to release bangers for their ancient potato console.

    • Aux@lemmy.world · 6 months ago

      Game development got more expensive because people want more complex games. No one wants to play a shooter with loading screens; everyone wants to play an open world game. Even if you tone down the graphics, that kind of development will still be a lot more expensive.