MHLoppy
Currently studying CS and some other stuff. Best known for previously being top 50 (OCE) in LoL, expert RoN modder, and creator of RoN:EE’s community patch (CBP). He/him.
(header photo by Brian Maffitt)
- 347 Posts
- 417 Comments
MHLoppy@fedia.io to Technology@lemmy.world • ClockBench: Even the best AI models can't reliably read the clock • 4 points · 8 days ago
Really wish they published the whole dataset. They don’t specify on the page or in the paper what the full set was like, and the GitHub repo only has one of the easy-to-read ones. If >=10% of the set consists of clock faces designed not to be readable, then fair enough.
MHLoppy@fedia.io to Technology@lemmy.world • ClockBench: Even the best AI models can't reliably read the clock • 141 points · 9 days ago
The human-level accuracy is less than 90%!?
the kbin.social domain expires in a few days, on 10 September
Looks like someone might’ve bought the domain?
It doesn’t count when you have to change the headline
The Rules
Posts must be:
- Links to news stories from…
- …credible sources, with…
- …their original headlines, that…
- …would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”
Poor proton-Earth, barely mentioned at all :'(
I honestly didn’t know (or probably more accurately, didn’t remember) that raw damage dealt gave points lol
Similar to the older Meet the Heavy Sniper: YouTube
I actually thought it looked pretty good because of Return to Monkey Island (which has never been bundled before, and whose all-time-low price is ~half the bundle’s cost). The average OpenCritic score (if that’s what you meant?) for the games excluding the Destiny 2 bundle is high 70s, and the average Steam review score is 83% positive. Seems fine to me, but ymmv depending on personal preference I guess.
MHLoppy@fedia.io to Technology@lemmy.world • WhatsApp's new AI feature lets you rephrase and adjust the tone of your messages | TechCrunch • 5 points · 26 days ago
Anecdotally, quite a lot of users vote “selfishly” and don’t care that downvoting reduces visibility.
The “all” and “local” feeds also fall victim to people voting as if these are their own personal curated feeds. And I hate it 🫠
“We’re going to collect as much data about you as we can to sell to advertisers”
That’s a rather pessimistic interpretation of a privacy policy that starts with this:
The spirit of the policy remains the same: we aren’t here to exploit you or your info. We just want to bring you great new videos and creators to enjoy, and the systems we build to do that will sometimes require stuff like cookies.
and which in section 10 (Notice for Nevada Residents) says:
We do not “sell” personal information to third parties for monetary consideration [as defined in Nevada law] […] Nevada law defines “sale” to mean the exchange of certain types of personal information for monetary consideration to another person. We do not currently sell personal information as defined in the Nevada law.
So yes, I suppose they may be selling personal information by some other definition (I don’t know the Nevada law in question). But it feels extremely aggressive to label it a “shithole” that “collect[s] as much data about you as we can to sell to advertisers” based on the text of the privacy policy as provided.
I guess perspective here depends on your anchoring point. I’m anchoring mostly on the existing platform (YouTube), and Nebula’s policy here looks better (subjectively much better) than what runs as normal in big tech. If your anchor is your local PeerTube instance with a privacy policy that wasn’t written by lawyers, I can see how you’d not be a fan.
However, beyond being in legalese, I’m not sure what part of it you find so bad as to describe it as a shithole. Even compared to, e.g., lemmy.world’s privacy policy, Nebula’s looks “good enough” to me. They collect slightly more device information than I wish they did and are more open to having/using advertising partners than I had expected (from what I know of the service as someone who has never actually used it), but that’s like… pretty tame compared to what most of the big platforms have.
Nebula is a shithole, just have a glance at their privacy policy.
It looks pretty run of the mill to me?
I’m definitely glad that FSR isn’t bound to a brand or model, but DLSS just does so much better.
Not sure if you haven’t kept up with the current-gen AMD cards, but FSR 4 was released with the current RX 9000 series and is roughly halfway between DLSS 3 and DLSS 4 in overall image quality (i.e., it’s good, but has some specific strengths and weaknesses compared to DLSS), and it doesn’t run on older-gen GPUs. With FSR 4, AMD gave up on the hardware-agnostic upscaling approach – I guess because the quality just isn’t there – and worked with Sony on this new approach that uses their own hardware “AI cores” the same way Nvidia uses the equivalent cores for DLSS.
TechSpot / Hardware Unboxed did some tests (on Windows, where DirectStorage is available so this will alter some of the results compared to your own context) on this recently: https://www.techspot.com/article/3023-ssd-gaming-comparison-load-times/ (video form: YouTube)
In their results (which, again, may not map 1:1 to your own environment given OS differences etc.), there was some difference when moving from a SATA SSD to a “slow” (by current standards) PCIe gen 3 NVMe SSD, but a pretty negligible difference beyond that in gaming contexts when moving from that to other, newer/faster NVMe SSDs.
If I were to hazard a guess for your specific setup (assuming you’re currently loading mostly from a SATA SSD), it sounds like you might eke out a small loading-speed improvement with either a RAID0 (or similar) SATA SSD setup or by moving to an NVMe drive, but the gains are probably only going to be meaningful if you’re able to use DirectStorage (or a “Linux’d” version of it) somehow. Indiana Jones and the Great Circle was the only game within the tested samples that saw meaningful improvements without using DirectStorage when moving to something faster than a single SATA SSD.
MHLoppy@fedia.io to Australia@aussie.zone • Quiz: can you pick a Victorian from a Queenslander? How our accents change from state to state • 0 points · 1 month ago
Apparently I’m hopeless, they all just sound “Australian” to me. I didn’t even know there was much state-by-state difference lol
Can access fine (with reduced functionality) on my end with JS disabled - maybe you have something else tripping it up or something?
MHLoppy@fedia.io to Ask Lemmy@lemmy.world • Your Lemmy/Reddit/Fetlife/etc profile is tied to your real name and browsable. How screwed are you? • 10 points · 1 month ago
I used to believe in this (and to a degree still do), but the idea of giving a bigger attack surface to unreasonable people (who seem to have become increasingly common in the last 10 years) – people who will do insane things like SWAT you, doxx your personal details (like your home address), or even just follow you around online to harass you – has made me have second thoughts about the tradeoffs involved in this approach 🫠
MHLoppy@fedia.io to PC Gaming@lemmy.ca • Best SSD for Gaming: PCIe 5.0 vs 4.0 vs 3.0 vs SATA vs HDD Comparison • 5 points · 1 month ago
I honestly don’t think any of this matters anymore. SSD in general is just good for gaming
As the article shows, in some games it does matter enough that you can probably “feel” the difference between a SATA and an NVMe SSD. There’s no need to guess or speculate here; the article has several measurements with differences that I’d consider non-trivial between SATA and NVMe SSD speeds:
- 10 seconds difference (~50%) in first load time of Assassin’s Creed Shadows
- 10 seconds difference (~33%) in first load time of Black Myth Wukong
- 3 seconds difference (>50%) in quick travel load time in Indiana Jones and the Great Circle
- 15 seconds difference (~33%) in first load time of Kingdom Come Deliverance II
- 10 seconds difference (~50%) in load into game time of The Last of Us Part II
- 8 seconds difference (~70%) in first load time of Spider-Man 2
- 5 seconds difference (~80%) in first load time of Ratchet & Clank: Rift Apart
In several of these, the SATA SSD performs similarly to the hard drives, not to the NVMe SSDs. Of course, there are also many results where just having an SSD – regardless of type – seems to be enough. Is it enough to justify upgrading an existing SATA SSD just for performance reasons? For most people, probably not – but it’s worth knowing what difference there can be in real-world situations. It’s certainly nice to save a few minutes of cumulative load time every week if you play these types of games regularly though. (And for those who already have NVMe SSDs, yeah, even in the above cases there seems to be only a trivial difference.)
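As a rough back-of-envelope sketch of how that adds up (the per-load savings, loads per session, and sessions per week below are assumptions for illustration, not numbers from the article):

```python
# Back-of-envelope estimate of cumulative weekly load-time savings when
# moving from a SATA SSD to an NVMe SSD. All inputs are assumed example
# values, not measurements from the TechSpot article.
seconds_saved_per_load = 10  # in line with several of the deltas listed above
loads_per_session = 3        # e.g., one first load plus a couple of fast travels
sessions_per_week = 5

weekly_savings = seconds_saved_per_load * loads_per_session * sessions_per_week
print(f"~{weekly_savings} s saved per week (~{weekly_savings / 60:.1f} minutes)")
```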
I assume there will be some non-zero number of new releases making good use of DirectStorage, so for anyone who tends to play new releases it may come to matter more and more, though it of course depends on what a person plays.
I actually think this video is doing a pretty bad job of summarizing the practical-comparison part of the paper.
If you go here you can get a GitHub link which in turn has a OneDrive link with a dataset of images and textures which they used. (This doesn’t include some of the images shown in the paper – not sure why, and I don’t really want to dig into it because spending an hour writing one comment is already a suspicious use of my time as it is.)
Using the example with an explicit file size mentioned in the video which I’ll re-encode with Paint.NET trying to match the ~160KB file size:
Hadriscus has the right idea suggesting that JPEG is the wrong comparison, but this type of low-detail image at low bit rates is actually where AVIF, rather than JPEG XL, shines. The latter (for this specific image) looks a lot worse at the above settings, and WebP is generally just worse than AVIF or JPEG XL for compression efficiency since it’s much older. This type of image is also where I’d guess this compression / reconstruction technique does comparatively well.
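For anyone who wants to reproduce the size-matching step without clicking through Paint.NET dialogs, here’s a minimal scriptable stand-in for that manual process: binary-searching an encoder’s quality setting until the output lands just under the target size. It assumes stock Pillow (so JPEG and WebP only – AVIF and JPEG XL need extra plugins) and a placeholder input.png:

```python
# Binary-search an encoder's quality setting to land at or just under a
# target file size (~160 KB here). "input.png" is a placeholder filename.
import io

from PIL import Image

def encode_to_target(img, fmt, target_bytes):
    lo, hi, best = 1, 100, None
    while lo <= hi:
        q = (lo + hi) // 2
        buf = io.BytesIO()
        img.save(buf, format=fmt, quality=q)
        if buf.tell() <= target_bytes:
            best = (q, buf.tell())  # fits under the target; try higher quality
            lo = q + 1
        else:
            hi = q - 1
    return best

img = Image.open("input.png").convert("RGB")
for fmt in ("JPEG", "WEBP"):
    result = encode_to_target(img, fmt, 160 * 1024)
    if result:
        q, size = result
        print(f"{fmt}: quality={q} -> {size / 1024:.1f} KB")
```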
But honestly, the technique as described by the paper doesn’t seem to be trying to directly compete against JPEG which is another reason I don’t like that the video put a spotlight on that comparison; quoting the paper:
Most image compression formats (with AVIF being a possible exception) aren’t tailored for “ultra-low bitrates”. Nevertheless, here’s another comparison with the flamingo photo in the dataset where I’ll try to match the 0.061 bpp low-side bit rate target (if I’ve got my math right that’s 255,860.544 bits):
(Ideally I would now compare this image at some of the other, higher bpp targets but I am le tired.)
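For anyone checking the bpp arithmetic above: the bit budget is just bpp × width × height. A quick sketch (the 2048×2048 dimensions are a placeholder, not the flamingo photo’s actual resolution):

```python
# bits-per-pixel target -> bit/byte budget. Dimensions are placeholder
# values, not the actual resolution of the flamingo photo.
def bpp_budget(bpp, width, height):
    bits = bpp * width * height
    return bits, bits / 8  # (bits, bytes)

bits, nbytes = bpp_budget(0.061, 2048, 2048)
print(f"{bits:,.3f} bits ≈ {nbytes / 1024:.1f} KiB")  # 255,852.544 bits ≈ 31.2 KiB
```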
It looks like interesting research for low bit rate / low bpp compression techniques and is probably also more exciting for anyone in the “AI compression” scene, but I’m not convinced about “Intel Just Changed Computer Graphics Forever!” as the video title.
As an aside, every image in the supplied dataset looks weird to me (even the ones marked as photos), as though it were AI-generated or AI-enhanced or something - not sure if the authors are trying to pull a fast one or if misuse of generative AI has eroded my ability to discern reality 🤔
edit: to save you from JPEG XL hell, here’s the JPEG XL image which you probably can’t view, but losslessly re-encoded to a PNG: https://files.catbox.moe/8ar1px.png