i need some motion blur on, otherwise i get motion sickness.
Wait, I’ve been turning it off to prevent motion sickness. 🤔
My friend is the same way as you haha.
i like lens flare, it’s pretty
I like lens flare for a bit if I’m just enjoying the scenery or whatever. If I’m actually playing the game though, turn that shit off so I can actually see
You are supposed to not see
But what about Bloom?
I feel like bloom depends on how intense it is, and if it makes sense to reasonably play the game.
Like, if it’s the sun, yeah, bloom is OK.
If it’s anything else? Pass.
The preference against DOF is fine. However, I’m looking at my f/0.95 and f/1.4 lenses and wondering why it’s kind of prized in photography for some genres and hated in games?
Different mediums. Different perception. Games are a different kind of immersion.
It is unnatural. In real life, focus follows wherever you’re looking. Having it fixed to the mouse/center of the screen instead of what my eyes are doing feels so wrong to me.
I bet with good eye tracking it would feel different.
That makes sense, if you can’t dynamically control what is in focus then it’s taking a lot of control away from the player.
I can also see why a dev would want to use it for a fixed angle cutscene to create subject separation and pull attention in the scene though.
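The photographic side of this (why an f/0.95 lens gives such shallow focus) falls out of the standard thin-lens circle-of-confusion formula. A rough sketch — this is textbook optics, not anything from the thread, and the example numbers are invented:

```python
def coc_mm(focal_mm: float, f_number: float, focus_mm: float, subject_mm: float) -> float:
    """Approximate circle-of-confusion diameter (mm) on the sensor for a
    subject at subject_mm when the lens is focused at focus_mm (thin-lens model)."""
    aperture = focal_mm / f_number  # physical aperture diameter in mm
    return (aperture * focal_mm * abs(subject_mm - focus_mm)
            / (subject_mm * (focus_mm - focal_mm)))

# A 50mm lens focused at 2m, with the background at 4m:
wide_open = coc_mm(50, 0.95, 2000, 4000)  # f/0.95: ~0.67mm blur circle, very soft
stopped_down = coc_mm(50, 8, 2000, 4000)  # f/8: ~0.08mm, far closer to sharp
```

The same formula shows why stopping down kills the effect: blur scales directly with aperture diameter.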
The title should be “anon can’t afford rtx5090”.
Don’t forget TAA!
Worst fucking AA ever created and it blows my mind when it’s the default in a game.
I’d add Denuvo to that list. Easily a 10-20% impact.
Unfortunately that’s not a setting most of us can just disable.
/c/crackwatch@lemmy.dbzer0.com sure you can
Step 1. Turn on ray tracing
Step 2. Check some forum or protondb and discover that the ray tracing/DX12 is garbage and gets like 10 frames
Step 3. Switch back to DX11, disable ray tracing
Step 4. Play the game
True, I’ve had very few games worth the fps hit
If I know a game I’m about to play runs on Unreal Engine, I’m passing a -dx11 flag immediately. It removes a lot of useless Unreal features like Nanite
Then you get to enjoy the worst LODs known to man, because they were only made as a fallback
what’s wrong with nanite?
Nanite + Lumen run like garbage on anything other than super high end hardware.
It is also very difficult to tweak and optimize.
Nanite isn’t as unperformant as Lumen, but it’s basically just a time saver for game devs, and it’s very easy for a less skilled dev to think they’re using it correctly… and actually not be.
But Nanite + Lumen have also become basically the default for everything from AAA games down to shitty asset flips… because they’re easier to use from a dev standpoint.
Nanite doesn’t affect any of the post-processing stuff or the smeary look. I don’t like that games rely on it, but modern UE5 games author their assets for Nanite. All it affects is model quality and LODs.
Lumen and other real-time GI stuff is what forces them to use temporal anti-aliasing and other blurring effects; that’s where the slop is.
I don’t even check anymore lol.
The slideshow Control experience does look stellar for a bit
Best use of ray tracing I’ve seen is to make old games look good, like Quake II or Portal or Minecraft. Newer games are “I see the reflection in the puddle just under the car when I put them side by side” and I just can’t bring myself to care.
If only I could just turn off the chromatic aberration in my eyeglasses.
What anti-aliasing do your glasses use?
I’m on the -4.25 setting, but I may be due for a new prescription, as reality is getting blurry again.
You can get ones with less chromatic aberration, but it’ll cost you.
Has the person who invented the depth of field effect for a video game ever even PLAYED a game before?
Well, not exactly, but they were described to him once by an elderly man with severe cataracts and that was deemed more than sufficient by corporate.
What is the depth of field option? When it’s on what happens vs when it’s off?
Side question, why the fuck does everything in IT reuse fucking names? Depth of field means how far from character it’ll render the environment, right? So if the above option only has an on or off option then it is affecting something other than the actual depth of field, right? So why the fuck would the name of it be depth of fucking field??? I see this shit all the time as I learn more and more about software related shit.
Put your finger in front of your face. Focus on it. Background blurry? That’s depth of field. Now look at the background and notice your finger get blurry.
Depth of field is basically how your character’s eyes are unfocused on everything they aren’t directly looking at.
If there are two boxes, 20 meters apart, one of them will be blurry, while aiming at the other.
Your example is great at illustrating how DoF is often wildly exaggerated in implementation, giving the player the experience of having very severe astigmatism, far beyond the real-world DoF experienced by the average… eyeball haver.
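The two-boxes example above behaves like a toy depth-based blur pass. A minimal sketch — all constants here are invented for illustration, not from any real engine:

```python
def dof_blur_radius(depth_m: float, focus_m: float,
                    px_per_m: float = 0.4, max_px: float = 8.0) -> float:
    """Toy depth-of-field: blur kernel radius grows with distance from the
    focal plane, clamped to a maximum (px_per_m and max_px are made up)."""
    return min(max_px, abs(depth_m - focus_m) * px_per_m)

# Aiming at a box 10m away; the second box sits 20m behind it:
in_focus = dof_blur_radius(10.0, 10.0)  # 0.0 px, sharp
behind = dof_blur_radius(30.0, 10.0)    # clamped to 8.0 px, heavily blurred
```

Cranking `px_per_m` up is exactly the exaggeration being complained about: real eyes at these distances would produce far less blur.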
https://en.wikipedia.org/wiki/Depth_of_field
It’s not “IT” naming. It’s physics. Probably a century or few old. That’s what they’re trying to emulate to make things look more photographic/cinematic.
Same with almost all the other options listed.
When it’s on, whatever the playable character looks at will be in focus, and everything at other distances will be blurry, as would be the case in real life if your eyes were the character’s eyes. The problem is that the player’s eyes are NOT the character’s eyes. Players can look elsewhere on the screen, and the vast majority of them do it all the time in order to play the game. But with that stupid feature on, everything else is blurry, and the only way to get it in focus is to drag the character’s view over to it so the game refocuses. It constantly feels like something is wrong with your eyes and you can’t see shit.
It’s like motion blur. Your eyes already do that, you don’t need it to be simulated…
to be fair you need it for 24fps movies. however, on 144Hz monitors it’s entirely pointless indeed
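The 24fps-vs-144Hz point can be ballparked. A film camera with a 180° shutter exposes each frame for half the frame time, so a one-second pan smears each frame noticeably, while at 144Hz the step between frames is already small. A sketch with illustrative numbers:

```python
def smear_px(screen_px: int, pan_seconds: float, exposure_s: float) -> float:
    """Pixels an edge moves during one exposure (or one frame interval)
    for a pan that crosses the whole screen in pan_seconds."""
    return screen_px / pan_seconds * exposure_s

# One-second pan across a 1920px-wide image:
film_blur = smear_px(1920, 1.0, 1 / 48)   # 24fps, 180° shutter: ~40 px of smear
hfr_step = smear_px(1920, 1.0, 1 / 144)   # 144Hz frame step: ~13 px
```

At 40 px of built-in smear per frame, 24fps film needs that blur to read as motion; at a ~13 px step, the display is doing most of the work already.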
My Dad showed me the Avatar game on PS4. The default settings have EXTREME motion blur, just by turning the camera; the world becomes a mess of indecipherable colors, it’s sickening.
Turning it off changed the game completely.
For depth of field, our eyes don’t automatically do that for a rendered image. It’s a 2d image when we look at it and all pixels are the same distance and all are in focus at the same time. It’s the effect you get when you look at something in the distance and put your finger near your eye; it’s blurry (unless you focus on it, in which case the distant objects become blurry).
Even VR doesn’t get it automatically.
It can feel unnatural because we normally control it unconsciously (or consciously if we want to and know how to control those eye muscles at will).
No, your eyes can’t do it on a screen. The effect is physically caused by the different distances of two objects, but the screen is always the same distance from you.
You don’t know what focusing on things is?
Yes, but you still get the blurry effect outside of the spot on the screen you’re focused on.
Not in the same way. Our eyes have lower resolution away from the center, but that’s not what’s causing DoF effects. You’re still missing the actual DoF.
If the effect was only caused by your eye, the depth wouldn’t matter, but it clearly does.
Yeah I get it, I’m just saying it’s unnecessary. If I need to see what’s going on in the background, then my eyes should be able to focus on it.
There are very few scenarios where DoF would be appropriate (like playing a character who lost their glasses).
Like chromatic aberration, which feels appropriate for Cyberpunk, since the main character gets eye implants and fits the cyberpunk theme.
In this context it just refers to a post processing effect that blurs certain objects based on their distance to the camera. Honestly it is one of the less bad ones imo, as it can be well done and is sometimes necessary to pull off a certain look.
No.
Depth of field is when background/foreground objects get blurred depending on where you’re looking, to simulate eyes focusing on something.
You’re thinking of draw distance, which is where objects far away aren’t rendered. Or possibly level of detail (LoD) where distant objects will be changed to a lower detailed model as they get further away.
Gotcha. Thanks🍻
it works great for games that have little to no combat, or combat that’s mostly melee and up to like 3v1. or if it’s a very slight DOF that just gently blurs things far away
idk what deranged individual plays FPS games with heavy DOF though
Yeah, especially games with any amount of sniping. Instantly crippling yourself.
I mean, it works in… hmmm… RPGs, maybe?
When I was a kid there was an effect in FF8 where the background blurred out in Balamb Garden and it made the place feel bigger. A 2D painted background blur, haha.
Then someone was like, let’s do that in the twenty-first century and ruined everything. When you’ve got draw distance, why blur?
It works for the WiiU games where Nintendo used it for tilt shifts. That’s pretty much it
Yes, it makes sense in a game where the designer already knows where the important action is and controls the camera to focus on it. It does not work in a game where the action could be anywhere and the camera doesn’t necessarily focus on it.
Yup, or if they’re covering up hardware deficiency, like Nintendo sometimes does. And even then, they generally prefer to just make everything a little fuzzy, like BotW.
the problem with dilf is that you need to put the subject of your life in the middle
The main problem with these is giving people control of these properties without them knowing how cameras work in real life.
The problem is that I am not playing as a camera, so why the hell would I want my in-game vision to emulate one?
Sometimes it does look better, but I’d argue it’s on the developer to pick the right moments to use it, just like a photographer would. Handing it to the players is the wrong way to go about it; their control over it isn’t nearly as good, even before considering their knowledge of it.
Shadows: Off
Polygons: Low
Idle Animation: Off
Draw distance: Low

Does your PC even have a dedicated GPU? At this point you might as well give up on PC gaming and buy a console.
Alt: F4
Launch: Balatro

I think my PC can run the C64 demake of Balatro in an emulator
PS3-> everything is sepia filtered and bloomed until nearly unplayable.
I will say that a well executed motion blur is just a chef’s kiss type deal, but it’s hard to get right and easy to fuck up
Early HDR games were rough. I look back at Zelda Twilight Princess screenshots, and while I really like that game, I almost squint looking at it because it’s so bloomed out.
PS3-> everything is sepia filtered and bloomed until nearly unplayable.
That’s just games from that period. It’s not exclusive to PS3.
The number of times I’ve broken this one out…
After having lived through it, if I never play a gritty brown bloom game again, it’ll be too soon.
Man, VGCats. Deep, deep, deep cut
I think of that comic every time I see a gritty brown game. I don’t see bloom as much any more, though.
I think maybe that’s part of why The Last Of Us grabbed everyone so hard; it was a gritty, green game. STALKER 2 is brown AF, though. Thank God they skipped the whole bloom fad.
Personally I use motion blur in every racing game I can but nothing else. It helps with the sense of speed and smoothness.
I like DoF as it actually has a purpose in framing a subject. The rest are just lazy attempts at making the game “look better” by just slopping on more and more effects.
Current ray tracing sucks because it’s all fake AI bullshit.
Ray tracing is not related to AI. Why do you think it’s fake AI bullshit? It’s tracing rays in the same fashion that blender or Maya would. I think you may be confusing this with DLSS?
“real time raytracing” as is advertised by hardware vendors and implemented in games today is primarily faked by AI de-noising. Even the most powerful cards can’t fire anywhere near enough rays to fully raytrace a scene in realtime, so instead they just fire a very low number of rays, and use denoising to clean up the noisy result. That’s why, if you look closely, you’ll notice that reflections can look weird, and blurry/smeary (especially on weaker cards). It’s because the majority of those pixels are predicted by machine learning, not actually sampled from the real scene data.
Blender, Maya, and other film raytracers have always used some form of denoising (before machine-learning denoising there were other algorithms), but in films it’s applied after casting thousands of rays per pixel. In a game today, scenes render at around 1 ray per pixel, and with DLSS it’s probably even less, since the internal render resolution is 2-4x smaller than the final image.
As a technologist, I’ll readily admit these are cool applications of machine learning, but as a #gamer4lyfe, I hate how they look in actual games. Until gpus can hit thousands (or maybe just hundreds) of rays per pixel in real time, I’ll continue to call it “fake AI bullshit” rather than “real time raytracing”
also, here’s an informative video for anyone curious: https://youtu.be/6O2B9BZiZjQ
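The rays-per-pixel point above is just Monte Carlo statistics: noise in a pixel estimate shrinks like 1/√N, which is why ~1 ray per pixel needs heavy denoising while film renderers averaging thousands of rays barely do. A toy sketch, using Gaussian noise as a stand-in for real radiance samples:

```python
import random
import statistics

def pixel_estimate(true_radiance: float, n_rays: int, rng: random.Random) -> float:
    """Average n_rays noisy samples of a pixel (toy model: each ray returns
    the true radiance plus unit Gaussian noise)."""
    return sum(rng.gauss(true_radiance, 1.0) for _ in range(n_rays)) / n_rays

rng = random.Random(0)
# Spread of the estimates across many renders of the same pixel:
err_1 = statistics.pstdev(pixel_estimate(0.5, 1, rng) for _ in range(1000))
err_256 = statistics.pstdev(pixel_estimate(0.5, 256, rng) for _ in range(1000))
# err_1 is roughly 16x larger than err_256, since sqrt(256) = 16
```

To cut the noise in half you need 4x the rays, which is why denoising is cheaper than brute force at realtime budgets.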
The only game with raytracing I’ve seen actually look better with RT on is Cyberpunk 2077. It’s the only game I’ve seen that has fully raytraced reflections on surfaces. Everything else just does shadows, and there’s basically no visual difference with it on or off; it just makes the game run slower.
But those reflections in CP are amazing as fuck. Seeing things reflect in real time off a rained on road is sick.
I agree with this. That makes it even more jarring to me that mirrors inside of safehouses don’t work until you specifically interact with them. It seems so out of place in a game that has all of these cool raytraced reflections except for a mirror you directly look into.
I just don’t see them as mirrors. They are video screens with a camera in them. ;)
It’s also connected to a performance feature. They can load lower resolution textures for faraway objects. You can do this without the blurring effect of DoF, but it’s less jarring if you can blur it.
The cost of DoF rendering far outweighs the memory savings of using reduced texture sizes, especially on older hardware where memory would be at a premium
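For a sense of the memory side of that trade-off: a full mip chain only adds about a third on top of the base texture, so distance-based texture LOD can save at most the base level’s cost. A quick sketch (assuming an RGBA8, power-of-two texture; numbers are illustrative):

```python
def mip_chain_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Total bytes for a texture plus its full mip chain down to 1x1,
    halving each dimension per level."""
    total = 0
    while True:
        total += width * height * bytes_per_pixel
        if width == 1 and height == 1:
            return total
        width, height = max(1, width // 2), max(1, height // 2)

base = 4096 * 4096 * 4              # 64 MiB base level
full = mip_chain_bytes(4096, 4096)  # ~85 MiB total: the chain adds under 1/3
```

Streaming only the lower mips for distant objects can therefore reclaim most of that 64 MiB per texture, which is the saving a DoF blur pass would have to beat.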
Now… in fairness…
Chromatic aberration and lens flares, whether you do or don’t appreciate how they look (imo they arguably make sense in, say, CP77, since you have robot eyes)…
… they at least usually don’t nuke your performance.
Motion blur, DoF and ray tracing almost always do.
Hairworks? Seems to be a complete roll of the dice between the specific game and your hardware.
Motion blur and depth of field have almost no impact on performance. Same with anisotropic filtering, and I cannot understand why AF isn’t just defaulted to max, since even back in the golden age of gaming it had no real performance impact on any system.
You either haven’t been playing PC games very long, or aren’t that old, or have only ever played on fairly high end hardware.
Anisotropic filtering?
Yes, that… hasn’t been challenging for an affordable PC to run at 8x or 16x for about a decade. It doesn’t cause much framerate drop-off at all now, and didn’t until you go all the way back to the mid-90s, when ‘GPUs’ were fairly uncommon.
But that just isn’t true for motion blur and DoF, especially going back further than 10 years.
Even right now, running CP77 on my Steam Deck, AF level has basically no impact on my framerate, whereas motion blur and DoF have a noticeable impact.
Go back even further, and a whole lot of motion blur/DoF algorithms were very poorly implemented in a lot of games. Nowadays we pretty much only get the versions of those that were not ruinously inefficient.
Try running something like Arma 2 with a mid or low range PC with motion blur on vs off. You could get maybe 5 to 10 more fps having it off… and thats a big deal when you’re maxing out at 30 to 40ish fps.
(Of course now we also get ghosting and smearing from framegen algos that ironically somewhat resemble some forms of motion blur.)
I am 40 and have been gaming on PC my entire life.
Try running something like Arma 2 with a mid or low range PC with motion blur on vs off. You could get maybe 5 to 10 more fps having it off… and thats a big deal when you’re maxing out at 30 to 40ish fps.
Arma is a horrible example, since it is so poorly optimized, you actually get a higher frame rate maxing everything out compared to running everything on low. lol
If you’re 40 and have been PC gaming your whole life, then I’m going with you’ve had fairly high end hardware, and are just misremembering.
Arma 2 is unoptimized in general… but largely that’s because it basically uses a massive pagefile analog on your HDD to handle its huge environments in-engine. It’s too much to jam through 32-bit OSes and RAM.
When SSDs came out, that turned out to be the main thing that’ll boost your FPS in older Arma games, because they have much, much faster read/write speeds.
… But, their motion blur is still unoptimized and very unperformant.
As for setting everything to high and getting higher FPS… that’s largely a myth.
There are a few post-processing settings that work that way, and that’s because in those instances the ‘ultra’ settings actually are different algorithms/methods that are both less expensive and visually superior.
It is still the case that if you set texture and model quality to low, and grass/tree/whatever draw distances very short, you’ll get more frames than with those things maxed out.
I love it when the hair bugs out and covers the whole distance from 0 0 0 to 23944 39393 39