• Lizardking27@lemmy.world · 3 months ago

    Ugh. Can I just say how much I fucking HATE how every single fucking product on the market today is a cheap, broken, barely functional piece of shit.

    I swear to God the number of times I have to FIX something BRAND NEW that I JUST PAID FOR is absolutely ridiculous.

    I knew I should’ve been an engineer. How easy must it be to sit around and make shit that doesn’t work?

    Fucking despicable. Do better or die, manufacturers.

    • 31337@sh.itjust.works · 3 months ago

      I’ve put together 2 computers in the last couple of years, one Intel (12th gen, fortunately) and one AMD. Both had stability issues, and I had to mess with the BIOS settings to get them stable. I actually had to underclock the RAM on the AMD build (probably something to do with maxing out the RAM capacity, but I still shouldn’t need to underclock, IMO). I think I’m going to get workstation-grade components the next time I need to build a computer.

      • Allonzee@lemmy.world · 3 months ago

        Capitalism: “Growth or die!”

        Earth: I mean… If that’s how it’s gotta be, you little assholes🤷👋🔥

        It’s kind of gallows-hilarious that while the world’s religions worship ridiculous campfire ghost stories, we really do have a creator: a remarkable macro-organism mother consisting of millions of species, with a story of hosting life going back 3.8 billion years, most of it living in homeostasis with its ecosystem.

        But our actual creator, Earth, not some ridiculous work of lazy fiction, we literally choose to treat like property to loot, rape, and pillage thoughtlessly, and we continue, eyes wide open, to act as a cancer upon her. We as a species are so fucking weird, and not the good kind.

      • volodya_ilich@lemm.ee · 3 months ago

        Not really, and I say this as a communist myself. Capitalism just requires extracting the maximum profit from the capital invested; sometimes that leads to what you said, sometimes it leads to the opposite (e.g. barely any difference between a 1st-gen i5 and an 8th-gen i5).

    • Buddahriffic@lemmy.world · 3 months ago

      It’s not easy to make shit that doesn’t work if you care about what you’re doing. I bet there are angry debates between engineers and business majors behind many of these enshittifications.

      Though for these Intel ones, the debates might have been less angry and more “are you sure these risks are worth taking?”, because they probably felt they had to push the chips to the extreme to compete. The angry conversations probably happened 5-10 years ago, before AMD brought the pressure, back when Intel was happy to assume it had no competition and didn’t have to improve much to keep making a killing. At this point it’s just a scramble to make up for those decisions and catch up, which their recent massive layoffs won’t help with.

    • Doombot1@lemmy.one · 3 months ago

      Most of the time the product itself comes out of engineering just fine and then gets torn up and/or ruined by the business side of the company. That said, sometimes people do make mistakes; to my mind it’s more about how the company handles them (oftentimes poorly). One of the products my team worked on a few years ago required us to spin up our own ASIC. We spun one up (in the neighborhood of $20-30 million USD), and a few months later we found a critical flaw in it. So we spun up a second ASIC, again spending $20-30M, and when we were nearly ready to release the product, we discovered a bad flaw in that one too. The product worked for the most part, but of course not always, since the bug would sometimes get hit. My company did the right thing and never released it, though.

      • /home/pineapplelover@lemm.ee · 3 months ago

        It’s almost never the engineers’ fault. That NASA spacecraft that exploded came down to bureaucracy and pushing the mission forward.

    • InputZero@lemmy.ml · 3 months ago

      So this doesn’t apply to the Intel situation, but a good lesson to learn is that the bleeding edge cuts both ways: anyone buying the absolute latest technology is going to hit some friction with usability at first. It should never amount to broken hardware like the Intel CPUs, but buggy drivers for a few weeks or months are kinda normal. There’s no way of knowing what will happen when a brand-new product is released. The producer must do their due diligence and test for anything catastrophic, but weird things happen in the wild that no one can predict. Like I said at the top, this doesn’t apply to Intel’s situation, because that was a catastrophic failure; but if you’re ever on the bleeding edge, assume you’re eventually going to get cut.

    • black0ut@pawb.social · 3 months ago

      Afaik it wasn’t a temperature problem, it was voltage related. Obviously cooler temps help, but you would probably still be vulnerable to this.

  • w2tpmf@lemmy.world · 3 months ago

    This keeps getting slightly misrepresented.

    There is no fix for CPUs that are already damaged.

    There is a fix now to prevent it from happening to a good CPU.

    • exanime@lemmy.world · 3 months ago

      But isn’t the fix basically underclocking those CPUs?

      Meaning the “solution” (not even out yet) is crippling those units before the flaw cripples them?

      • w2tpmf@lemmy.world · 3 months ago

        That was the first “Intel Baseline Profile” they rolled out to mobo manufacturers earlier in the year. They’ve rolled out a new fix now.

      • Kazumara@discuss.tchncs.de · 3 months ago

        They said the cause was a bug in the microcode making the CPU request unsafe voltages:

        Our analysis of returned processors confirms that the elevated operating voltage is stemming from a microcode algorithm resulting in incorrect voltage requests to the processor.

        If the buggy voltage behaviour contributed to higher boosts, then the fix will cost some performance. But if the clocks were steered separately from the voltage, and the boost clock is still reached without the overly high voltage, it might be performance neutral.

        I think we’ll know for sure soon; multiple reviewers have announced they’re planning to test the impact.
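
        If you want to check which microcode revision your machine is actually running before and after a BIOS update, here’s a minimal sketch for x86 Linux (the “microcode” field in /proc/cpuinfo reports the currently loaded revision; compare it against your board vendor’s release notes):

        ```python
        # Minimal sketch: print the CPU model and the microcode revision
        # currently loaded, as reported by the Linux kernel.
        def cpu_microcode_info(path="/proc/cpuinfo"):
            info = {}
            with open(path) as f:
                for line in f:
                    key, sep, value = line.partition(":")
                    if not sep:
                        continue
                    key, value = key.strip(), value.strip()
                    # Fields repeat once per logical core; keep the first hit.
                    if key in ("model name", "microcode") and key not in info:
                        info[key] = value
            return info

        if __name__ == "__main__":
            for key, value in cpu_microcode_info().items():
                print(f"{key}: {value}")
        ```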

  • angrystego@lemmy.world · 3 months ago

    I thought the point would be a depressed and self-deprecating “I’m something of an Intel CPU myself”.

    • scrion@lemmy.world · 3 months ago

      For years, Intel’s compiler, their math library MKL, and their profiler VTune really only worked well with Intel’s own CPUs. There was in fact code that decreased performance if it detected a non-Intel CPU:

      https://www.agner.org/optimize/blog/read.php?i=49&v=f

      That later became part of a larger lawsuit, but since Intel was discriminating not against AMD specifically but against all non-Intel CPUs, the result of the lawsuit was underwhelming. In fact, it’s still a problem today:

      https://www.extremetech.com/computing/302650-how-to-bypass-matlab-cripple-amd-ryzen-threadripper-cpus

      https://medium.com/codex/fixing-intel-compilers-unfair-cpu-dispatcher-part-1-2-4a4a367c8919

      Given that MKL is a widely used library, people also suffer from this indirectly if they buy an AMD CPU and use software that links against it.
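
      The articles above describe an environment-variable workaround that reportedly worked on MKL versions up to 2020.0 (it was removed afterwards). A minimal sketch, assuming a NumPy build that actually links against MKL; the variable has to be set before MKL is loaded:

      ```python
      import os

      # Workaround described in the linked articles: force MKL's AVX2
      # code path on non-Intel CPUs. Reportedly honored only by
      # MKL <= 2020.0, and it must be set before MKL is loaded.
      os.environ["MKL_DEBUG_CPU_TYPE"] = "5"

      import time
      import numpy as np  # only affected if this build links against MKL

      # Rough matmul timing, to compare runs with and without the override.
      a = np.random.rand(2000, 2000)
      b = np.random.rand(2000, 2000)
      start = time.perf_counter()
      a @ b
      print(f"2000x2000 matmul: {time.perf_counter() - start:.3f} s")
      ```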

      As someone working in low-level optimization, that was/is a shitty situation. I still bought an AMD CPU after the latest fiasco a couple of weeks ago.

    • Scrubbles@poptalk.scrubbles.tech · 3 months ago

      Honestly, even with GPUs now too. I was forced to team green for a few years because AMD was so far behind. Now though, unless you absolutely need a 4090 for some reason, you can get basically the same performance from AMD for 70% of the cost.

      • Cyborganism@lemmy.ca · 3 months ago

        I haven’t really been paying much attention to the latest GPU news, but can AMD cards do ray tracing and DLSS and all that jazz that comes with RTX cards?

        • natebluehooves@pawb.social · 3 months ago

          DLSS is off the table, but you CAN ray trace. That being said, I do not see the value of RT myself. It has the greatest performance impact of any graphical setting and often looks only marginally better than baked-in lighting.

          • Cyborganism@lemmy.ca · 3 months ago

            It depends greatly on the game. I’ve seen a huge difference in games like Control, where the game itself was built to feature that… well… feature! You can see it in the quality of the lighting and the reflections. You also get better illumination in darker areas thanks to bounced lighting. It’s much more natural looking.

          • linkhidalgogato@lemmy.ml · 3 months ago

            DLSS is a brand name; both AMD and Intel have their own versions of the same thing, and they’re only a little worse, if at all.

        • vithigar@lemmy.ca · 3 months ago

          There is analogous functionality for most of it, though it’s generally not quite as good across the board.

          FSR is AMD’s answer to DLSS, but the quality isn’t quite as good. However, the implementation is hardware agnostic, so everyone can use it, which is pretty nice. Even Nvidia users with older GPUs like a 1080, who are locked out of DLSS, can still use FSR in supported games. If you have an AMD card, you also get the option in the driver settings of enabling it globally for every game, whether it has built-in support or not.

          Ray tracing is present and works just fine, though performance is about a generation behind. It’s perfectly usable if you keep your expectations in line with that, especially in well-optimized games like DOOM Eternal, or with light ray tracing like in Guardians of the Galaxy. Fully path-traced lighting like in Cyberpunk 2077 is completely off the table, though.

          Obviously AMD has hardware video encoders. People like to point out that their visual quality is lower than Nvidia’s, but I always found them perfectly serviceable. AMD’s background recording is also built directly into the driver suite; no need to install anything extra.

          While they do have their own GPU-powered microphone noise removal, a la RTX Voice, AMD does lack the full set of tools found in Nvidia Broadcast, e.g. video background removal and whatnot. There is also no equivalent to RTX HDR.

          Finally, if you’ve an interest in locally running LLMs or diffusion models, they’re more of a pain to get working well on AMD, as the majority of implementations are CUDA based.
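
          That said, a fair amount of PyTorch-based code does run unchanged, because the ROCm build of PyTorch exposes AMD GPUs through the regular torch.cuda API. A minimal sketch, assuming a ROCm build of PyTorch and a supported AMD GPU:

          ```python
          import torch

          # On ROCm builds, AMD GPUs show up through the torch.cuda API,
          # and torch.version.hip is a string instead of None.
          if torch.cuda.is_available():
              backend = "ROCm/HIP" if torch.version.hip else "CUDA"
              print(f"{backend} device: {torch.cuda.get_device_name(0)}")
              x = torch.randn(1024, 1024, device="cuda")  # "cuda" maps to the AMD GPU here
              print((x @ x).sum().item())
          else:
              print("No GPU visible to this PyTorch build")
          ```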

        • Scrubbles@poptalk.scrubbles.tech · 3 months ago

          Yes, but by different names. They use FSR, which is basically the same thing; I haven’t noticed a difference in quality. Ray tracing too, just not branded as RTX.

      • anivia@lemmy.ml · 3 months ago

        I disagree. Processing power may be similar, but Nvidia still outperforms with ray tracing and, more importantly, DLSS.

        What’s the point of having the same processing power when Nvidia still gets more than double the FPS in any game that supports DLSS?

        • reliv3@lemmy.world · 3 months ago

          FSR exists, and FSR 3 actually looks very good compared with DLSS. These arguments about ray tracing and DLSS are getting weaker and weaker.

          There are still strong arguments for Nvidia GPUs in the prosumer market, due to the use of CUDA with some software suites, but for gaming, Nvidia is just overcharging because they still hold the mindshare.

        • Scrubbles@poptalk.scrubbles.tech · 3 months ago

          I had the 3090 and then the 6900 XTX. The differences were minimal, if even noticeable. Ray tracing is about a generation behind from Nvidia to AMD, but they’re catching up.

          As the other commenter said, FSR is about the same as DLSS. For me, I actually got a better frame rate with FSR in Cyberpunk and Satisfactory than I did with DLSS!

      • sparkle@lemm.ee · 3 months ago

        Are you just posting this under every comment? This isn’t even a fraction as bad as the Intel CPU issue. Something tells me you have Intel hardware…

  • kamen@lemmy.world · 3 months ago

    Don’t be a fan of one or the other, just get what’s more appropriate at the time of buying.

  • arefx@lemmy.ml · 3 months ago

    Ryzen gang

    My 7800x3d is incredible, I won’t be going back to Intel any time soon.

      • felsiq@lemmy.zip · 3 months ago

        To put this into context, the Zen 5 X3D chips aren’t out yet, so this isn’t really an apples-to-apples comparison between generations. Also, Zen 5 was heavily optimized for efficiency rather than speed; last I saw, they’re only about 5% faster than Zen 4 (X series, not X3D, ofc), but they do that at Zen 3 TDPs, which is crazy impressive. I’m not disagreeing with you about the 7800X3D; I love that chip, it’s def a good one. I just don’t want people to get the wrong idea about Zen 5.

      • SuperIce@lemmy.world · 3 months ago

        Not sure how much longer I’ll be using the 5950X, tbh. We’ve reached a point where mobile processors (like the AI 370) have faster multicore than the 5950X without gulping down boatloads of power.

    • r00ty@kbin.life · 3 months ago

      Also on the 7800X3D. I think I switched at just the right time; I’d been on Intel ever since the Athlon XP, and the next buy would have been 13th/14th gen.

      • arefx@lemmy.ml · 3 months ago

        I’m not that worried about it affecting me lol, I’d be more concerned about my Intel CPU dying, especially since it’s been around for decades.

      • Xanis@lemmy.world · 3 months ago

        tl;dr: The flaw can give a hacker access to your computer only if they have already bypassed most of the computer’s security.

        This means continue not going to sketchy sites.

        Continue not downloading that obviously malicious attachment.

        Continue not being a dumbass.

        Proceed as normal.

        Because if a hacker got that deep, your system is already fucked.

        • Blue_Morpho@lemmy.world · 3 months ago

          It’s more serious than normal because if your PC ever gets owned, a wipe and reinstall will not remove the exploit.

          “Nissim sums up that worst-case scenario in more practical terms: “You basically have to throw your computer away.””

    • Rakonat@lemmy.world · 3 months ago

      Me, who bought an AMD CPU and GPU last year for my new rig, ’cause fuck the massive markup for marginal improvement over last-gen stats.

    • zaphodb2002@sh.itjust.works · 3 months ago

      I loved my FX CPU, but I lived in a desert, and in the summer the heat coming off that thing would make my room 100°F or more. It was the first machine I built a custom water loop for. That didn’t help with the heat in the room, but it did stop the machine from shutting down randomly, so I could continue to sit in the sweltering heat in my underpants and play video games until dawn. Better times.

      • rotopenguin@infosec.pub · 3 months ago

        You might want to go through the trouble of extending that radiator loop all the way out through a window.

      • Bytemeister@lemmy.world · 3 months ago

        I had the FX-8350 Black Edition, and that thing would keep my room at 70°F… in the winter… with a window open.

        Summer gaming was BSOD city. I miss it so much.

      • helpmepickaname@lemmy.world · 3 months ago

        Of course it didn’t help the heat in the room; the heat from the CPU still has to go somewhere. Better coolers aren’t for the room, they’re for the CPU. In fact, a better cooler can make the room hotter, because it removes heat from the CPU at a higher rate and dumps it into the room.
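
        To put a number on it: in steady state, essentially all of the package power ends up as heat in the room, no matter how good the cooler is. For example, assuming a 125 W package power purely for illustration:

        ```latex
        % Steady state: heat flowing into the room equals CPU package power.
        \dot{Q}_{\text{room}} \approx P_{\text{CPU}}
        % A 125 W CPU gaming for 4 hours dumps
        E = P\,t = 125\,\mathrm{W} \times 4\,\mathrm{h} = 0.5\,\mathrm{kWh} \approx 1.8\,\mathrm{MJ}
        % into the room, regardless of which cooler moved it there.
        ```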

    • punkfungus@sh.itjust.works · 3 months ago

      This isn’t the first time such a vulnerability has been found; have you forgotten Spectre/Meltdown? Though this is arguably not nearly as impactful as those, because it requires physical access to the machine.

      Your fervour in trying to paint this as a problem equivalent to Intel’s 13th and 14th gen defects, and your implication that everyone else is a fanboy, is just telling on yourself, mate. Normal people don’t go to bat like that for massive corpos, only Kool-Aid drinkers.

    • g0nz0li0@lemmy.world · 3 months ago

      I’m not up to speed on the discovery you linked. It appears to be a vulnerability that can’t be exploited remotely? If so, how is this the same as Intel chips causing widespread system instability?

    • linkhidalgogato@lemmy.ml · 3 months ago

      Not gonna lie, you look a lot like a fanboy yourself. You’re giving off “my beloved Intel looks so bad here that I can’t directly say it’s better, so I’ll just both-sides it with some dumb thing” energy.

  • gmtom@lemmy.world · 3 months ago

    Can we talk about how utterly useless that default cooler is? For a relatively high-end gaming CPU, it really shouldn’t be legal to ship it with something so useless.