• JakJak98@lemmy.world
      6 days ago

      I feel like bloom depends on how intense it is, and if it makes sense to reasonably play the game.

      Like, if it’s the sun, yeah, bloom is OK.

      If it’s anything else? Pass.

  • Yaarmehearty@lemmy.ml
    6 days ago

    The preference against DOF is fine. However, I’m looking at my f/0.95 and f/1.4 lenses and wondering why it’s kind of prized in photography for some genres and hated in games?

    • ne0phyte@feddit.org
      6 days ago

      It is unnatural. In real life, focus follows wherever you’re looking. Having it fixed to the mouse/center of the screen instead of following my eyes feels so wrong to me.

      I bet with good eye tracking it would feel different.

      • Yaarmehearty@lemmy.ml
        6 days ago

        That makes sense, if you can’t dynamically control what is in focus then it’s taking a lot of control away from the player.

        I can also see why a dev would want to use it for a fixed angle cutscene to create subject separation and pull attention in the scene though.

  • Psythik@lemm.ee
    7 days ago

    Hating on hair quality is a new one for me. I can understand turning off ray tracing if you have a low-end GPU, but hair quality? It’s been at least a decade since I last heard people complaining that their GPU couldn’t handle HairWorks. Does any game even still use it?

  • ShortFuse@lemmy.world
    7 days ago

    Bad effects are bad.

    I used to hate film grain, and then did the research to implement it myself, digging up old research papers on how it works at a scientific level. I ended up implementing a custom film grain in Starfield Luma and RenoDX. I actually like it now; it has a level of “je ne sais quoi” that clicks in my brain and feels like film.

    The gist is that everyone just does additive random noise, which raises the black floor and dirties the image. Real film grain is perceptual: it acts like cracks in the “dots” that compose an image. It’s not something to be “scanned” or overlaid (which gives a dirty-screen effect).
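    The difference can be sketched in a toy example (Python/NumPy; the function names and strength values here are made up for illustration, and real film grain models are far more involved):

```python
import numpy as np

rng = np.random.default_rng(0)

def additive_grain(img, strength=0.05):
    # The common approach: add random noise on top of everything.
    # Pure black picks up noise too, so the black floor rises and
    # the image looks dirty.
    noise = rng.uniform(0.0, 1.0, img.shape)
    return np.clip(img + noise * strength, 0.0, 1.0)

def perceptual_grain(img, strength=0.25):
    # Grain modulates the signal that is already there, like cracks
    # in the "dots" that compose the image: zero signal stays zero.
    noise = rng.uniform(-1.0, 1.0, img.shape)
    return np.clip(img * (1.0 + noise * strength), 0.0, 1.0)

black = np.zeros((4, 4))
print(additive_grain(black).max())    # > 0: black floor raised
print(perceptual_grain(black).max())  # 0.0: black stays black
```

    A real implementation would also work per-channel and account for grain size, but even this toy version shows why the additive approach dirties shadows while the perceptual one doesn’t.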

    Related: motion blur is how we see things in real life. Our eyes have a certain level of blur/“shutter speed”, and games without it can have a soap opera effect. I’ve only seen per-object motion blur look decent; fullscreen motion blur is just weird, IMO.

    • dustyData@lemmy.world
      7 days ago

      On motion blur: our eyes’ motion blur and a camera’s shutter-speed motion blur are not the same. Eyes don’t have a shutter speed. Whatever smearing we see is the result of relaxed processing on the brain’s side. Under adrenaline, with heavy focus, our motion blur disappears as our brain goes full power trying to keep us alive. If you are sleep-deprived and physically tired, then everything is blurred, even with little motion from the head or eyes.

      Over 99% of eye movement (e.g. saccadic eye movement) is ignored by the brain and won’t produce a blurred impression. It’s more common to notice fast vehicular movement, like when sitting in a car, as having some blur, but even that can easily be overcome by focused attention and compensatory eye tracking or ocular stabilization. In the end, most of these graphical effects emulate camera behavior rather than natural experience, and thus are perceived as more artificial than the same games without the effects. When our brain sees motion blur it thinks movie theater, not natural everyday vision.

      • Aux@feddit.uk
        7 days ago

        Eyes do have a “shutter speed”, but the effect is usually filtered out by the brain and you need very specific circumstances to notice motion blur induced by this.

        • dustyData@lemmy.world
          6 days ago

          No, they don’t, as there is no shutter in a continuous, parallel neural stream. But if you have any research paper that says so, go ahead and share.

          • Aux@feddit.uk
            6 days ago

            It has nothing to do with a neural stream, it’s basic physics.

            • dustyData@lemmy.world
              6 days ago

              Explain, don’t just antagonize. I bet you don’t understand the basic physics either. I’m open to learning new things: what is the eye’s shutter speed? Sustain your claim with sources.

              • Aux@feddit.uk
                6 days ago

                I put “shutter speed” in quotes for a reason. To gather the required amount of light, the sensor must be exposed to it for a specific amount of time. When it’s dark, the time increases. It doesn’t matter if it’s a camera or your eye.

                • dustyData@lemmy.world
                  5 days ago

                  That’s sensitivity, not shutter speed. Eyes do not require time for exposure, but a certain quantity or intensity of light. This sensitivity is variable, but not in a time-based way. Notice that you don’t see blurrier in darker conditions, unlike a camera; you do see in duller colors, as a result of higher engagement of rods instead of cones (rods are more sensitive but less dense in the fovea, and not sensitive to color). A camera, meanwhile, stays just as colorful but becomes more prone to motion blur. This is because the brain does not take individual frames of time to process a single, still, particular image. The brain analyses the signals from the eye continuously, dynamically, and in parallel from each individual sensor, cone or rod.

                  In other words, eyes still don’t have even a figurative shutter speed, because eyes don’t work exactly like a camera.

      • ZILtoid1991@lemmy.world
        7 days ago

        Yeah, if you see motion blur in real life, that usually means something bad, yet game devs are not using it for those purposes.

    • ShortFuse@lemmy.world
      7 days ago

      Most “film grain” is just additive noise akin to digital camera noise. I’ve modded a bunch of games for HDR (I’m the RenoDX creator) and I strip it from almost every game because it’s unbearable. I have a custom film grain that mimics real film; at low levels it’s imperceptible and acts as a dithering tool to improve gradients (removing banding). For some games that emulate a film look, the (proper) film grain sometimes lends to the look.

      • kautau@lemmy.world
        7 days ago

        Agreed. It fits very well in very specific places, but everywhere else it’s just noise.

    • JokeDeity@lemm.ee
      7 days ago

      Worst fucking AA ever created and it blows my mind when it’s the default in a game.

  • Artyom@lemm.ee
    8 days ago

    Step 1. Turn on ray tracing

    Step 2. Check some forum or ProtonDB and discover that the ray tracing/DX12 path is garbage and gets like 10 frames

    Step 3. Switch back to DX11, disable ray tracing

    Step 4. Play the game

    • frezik@midwest.social
      7 days ago

      Best use of ray tracing I’ve seen is to make old games look good, like Quake II or Portal or Minecraft. Newer games are “I see the reflection in the puddle just under the car when I put them side by side” and I just can’t bring myself to care.

    • ElectroLisa@lemmy.blahaj.zone
      8 days ago

    If I know a game I’m about to play runs on Unreal Engine, I’m passing a -dx11 flag immediately. It disables a lot of useless Unreal features like Nanite.

        • sp3ctr4l@lemmy.zip
          7 days ago

          Nanite + Lumen run like garbage on anything other than super high end hardware.

          It is also very difficult to tweak and optimize.

          Nanite isn’t as unperformant as Lumen, but it’s basically just a time saver for game devs, and it’s very easy for a less skilled game dev to think they are using it correctly… and actually not be.

          But, Nanite + Lumen have also become basically the default for AAA games down to shitty asset flips… because they’re easier to use from a dev standpoint.

      • boletus@sh.itjust.works
        8 days ago

        Nanite doesn’t affect any of the post-processing stuff or the smeary look. I don’t like that games rely on it, but modern UE5 games author their assets for Nanite. All it affects is model quality and LODs.

        Lumen and other real time GI stuff is what forces them to use temporal anti aliasing and other blurring effects, that’s where the slop is.

      • ShinkanTrain@lemmy.ml
        8 days ago

        Then you get to enjoy the worst LODs known to man, because they were only made as a fallback.

  • DaddleDew@lemmy.world
    8 days ago

    Has the person who invented the depth of field effect for a video game ever even PLAYED a game before?

    • alaphic@lemmy.world
      8 days ago

      Well, not exactly, but they were described to him once by an elderly man with severe cataracts and that was deemed more than sufficient by corporate.

    • 11111one11111@lemmy.world
      8 days ago

      What is the depth of field option? When it’s on what happens vs when it’s off?

      Side question: why the fuck does everything in IT reuse fucking names? Depth of field means how far from the character it’ll render the environment, right? So if the above option only has an on or off option, then it is affecting something other than the actual depth of field, right? So why the fuck would the name of it be depth of fucking field??? I see this shit all the time as I learn more and more about software-related shit.

      • JackbyDev@programming.dev
        7 days ago

        Put your finger in front of your face. Focus on it. Background blurry? That’s depth of field. Now look at the background and notice your finger get blurry.

      • Zozano@aussie.zone
        8 days ago

        Depth of field is basically how your character’s eyes are unfocused on everything they aren’t directly looking at.

        If there are two boxes 20 meters apart, one of them will be blurry while you’re aiming at the other.

        • sp3ctr4l@lemmy.zip
          7 days ago

          Your example is great at illustrating how DoF is often widely exaggerated in implementation, giving the player the experience of having very severe astigmatism, far beyond the real world DoF experienced by the average… eyeball haver.

      • DaddleDew@lemmy.world
        8 days ago

        When it’s on, whatever the playable character looks at will be in focus, and everything else at different distances will be blurry, as would be the case in real life if your eyes were the playable character’s eyes. The problem is that the player’s eyes are NOT the playable character’s eyes. Players have the ability to look around elsewhere on the screen, and the vast majority of them use it all the time in order to play the game. But with that stupid feature on, everything else is blurry, and the only way to get something in focus is to swing the playable character’s view around to it. It constantly feels like something is wrong with your eyes and you can’t see shit.

        • Zozano@aussie.zone
          8 days ago

          It’s like motion blur. Your eyes already do that, you don’t need it to be simulated…

          • SitD@lemy.lol
            8 days ago

            to be fair you need it for 24fps movies. however, on 144Hz monitors it’s entirely pointless indeed

            • Zozano@aussie.zone
              7 days ago

              My dad showed me the Avatar game on PS4. The default settings have EXTREME motion blur: just by turning the camera, the world becomes a mess of indecipherable colors. It’s sickening.

              Turning it off changed the game completely.

          • Buddahriffic@lemmy.world
            8 days ago

            For depth of field, our eyes don’t automatically do that for a rendered image. It’s a 2D image when we look at it: all pixels are the same distance away, and all are in focus at the same time. The effect it simulates is what you get when you look at something in the distance and put your finger near your eye; the finger is blurry (unless you focus on it, in which case the distant objects become blurry).

            Even VR doesn’t get it automatically.

            It can feel unnatural because we normally control it unconsciously (or consciously if we want to and know how to control those eye muscles at will).

          • FooBarrington@lemmy.world
            8 days ago

            No, your eyes can’t do it on a screen. The effect is physically caused by the different distances of two objects, but the screen is always the same distance from you.

            • Zozano@aussie.zone
              8 days ago

              Yes, but you still get the blurry effect outside of the spot on the screen you’re focused on.

              • FooBarrington@lemmy.world
                7 days ago

                Not in the same way. Our eyes have lower resolution away from the center, but that’s not what’s causing DoF effects. You’re still missing the actual DoF.

                If the effect was only caused by your eye, the depth wouldn’t matter, but it clearly does.

                • Zozano@aussie.zone
                  7 days ago

                  Yeah I get it, I’m just saying it’s unnecessary. If I need to see what’s going on in the background, then my eyes should be able to focus on it.

                  There are very few scenarios where DoF would be appropriate (like playing a character who lost their glasses).

                  Chromatic aberration is similar: it feels appropriate for Cyberpunk, since the main character gets eye implants and it fits the cyberpunk theme.

      • StitchIsABitch@lemmy.world
        8 days ago

        In this context it just refers to a post processing effect that blurs certain objects based on their distance to the camera. Honestly it is one of the less bad ones imo, as it can be well done and is sometimes necessary to pull off a certain look.
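        That per-distance blur can be sketched as a toy circle-of-confusion calculation (Python; the formula and numbers are simplified stand-ins for illustration, not any engine’s actual implementation):

```python
def blur_radius(depth, focus_depth, aperture=2.0, max_radius=8.0):
    """Toy depth-of-field: blur grows with distance from the focal
    plane. Real implementations use a thin-lens formula and a proper
    gather/scatter blur pass; this only maps depth -> blur radius."""
    coc = aperture * abs(depth - focus_depth) / depth
    return min(coc, max_radius)

print(blur_radius(10.0, 10.0))  # 0.0 -> object at the focal plane stays sharp
print(blur_radius(40.0, 10.0))  # 1.5 -> farther from focus = blurrier
```

        The post-process pass then blurs each pixel by its computed radius, which is why everything away from the chosen focal plane smears regardless of where your actual eyes are looking.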

      • tehmics@lemmy.world
        8 days ago

        No.

        Depth of field is when background/foreground objects get blurred depending on where you’re looking, to simulate eyes focusing on something.

        You’re thinking of draw distance, which is where objects far away aren’t rendered. Or possibly level of detail (LoD) where distant objects will be changed to a lower detailed model as they get further away.
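        The draw-distance and LOD logic can be sketched in a few lines (Python; the distance thresholds are invented for illustration):

```python
def pick_lod(distance, thresholds=(20.0, 60.0, 150.0), draw_distance=400.0):
    """Toy LOD/draw-distance selection: nearer objects get
    higher-detail models (0 = full detail), and anything past the
    draw distance is not rendered at all."""
    if distance > draw_distance:
        return None  # culled: beyond draw distance
    for lod, limit in enumerate(thresholds):
        if distance <= limit:
            return lod
    return len(thresholds)  # lowest-detail fallback model

print(pick_lod(10.0))   # 0    -> full-detail model
print(pick_lod(100.0))  # 2    -> lower-detail model
print(pick_lod(500.0))  # None -> not rendered (past draw distance)
```

        Depth of field, by contrast, is purely a post-process blur: every object is still rendered at full detail, then smeared based on its distance from the focal plane.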

    • shneancy@lemmy.world
      7 days ago

      it works great for games that have little to no combat, or combat that’s mostly melee and up to like 3v1. or if it’s a very slight DOF that just gently blurs things far away

      idk what deranged individual plays FPS games with heavy DOF though

    • taiyang@lemmy.world
      8 days ago

      I mean, it works in… hmmm… RPGs, maybe?

      When I was a kid there was an effect in FF8 where the background blurred out in Balamb Garden and it made the place feel bigger. A 2D painted background blur, haha.

      Then someone was like, let’s do that in the twenty-first century and ruined everything. When you’ve got draw distance, why blur?

      • DaddleDew@lemmy.world
        8 days ago

        Yes, it makes sense in a game where the designer already knows where the important action is and controls the camera to focus on it. It does not work, however, in a game where the action could be anywhere and the camera doesn’t necessarily focus on it.

        • taiyang@lemmy.world
          8 days ago

          Yup, or if they’re covering up hardware deficiency, like Nintendo sometimes does. And even then, they generally prefer to just make everything a little fuzzy, like BotW.

  • Onionguy@lemm.ee
    8 days ago

    *Taps temple* Auto-disable ray tracing if your GPU is too old to support it ( ͡° ͜ʖ ͡°)