• floofloof@lemmy.ca · 3 months ago

    Intel has not halted sales or clawed back any inventory. It will not do a recall, period.

    Buy AMD. Got it!

      • mox@lemmy.sdf.org (OP) · 3 months ago

        RISC-V isn’t there yet, but it’s moving in the right direction. A completely open architecture is something many of us have wanted for ages. It’s worth keeping an eye on.

      • Dudewitbow@lemmy.zip · 3 months ago

        ARM is well positioned to take a big share of the server market from Intel. Amazon is already heavily committed to making its Graviton ARM CPUs its main CPUs, and AWS alone holds the lion's share of the server market.

        For consumers, ARM adoption depends entirely on the operating systems and application compatibility getting ironed out.

        • icydefiance@lemm.ee · 3 months ago

          Yeah, I manage the infrastructure for almost 150 WordPress sites, and I moved them all to ARM servers a while ago, because they’re 10% or 20% cheaper on AWS.

          Websites are rarely bottlenecked by the CPU, so that power efficiency is very significant.
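          For what it's worth, a migration like that is mostly a matter of rebuilding images/binaries for arm64; a tiny sanity check I'd put at the top of a deploy script (just a sketch, assuming a typical Linux host with the standard uname utility):

          ```shell
          #!/bin/sh
          # Fail fast if the deploy target isn't the architecture the builds were made for.
          arch="$(uname -m)"
          case "$arch" in
            aarch64|arm64) echo "ARM host detected ($arch): use the arm64 builds" ;;
            x86_64)        echo "x86 host detected ($arch): use the amd64 builds" ;;
            *)             echo "unexpected architecture: $arch" >&2; exit 1 ;;
          esac
          ```

          Not strictly necessary on Graviton instances you provisioned yourself, but cheap insurance in mixed fleets.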

          • tal@lemmy.today · 3 months ago

            I really think that most people who think that they want ARM machines are wrong, at least given the state of things in 2024. Like, maybe you use Linux…but do you want to run x86 Windows binary-only games? Even if you can get 'em running, you’ve lost the power efficiency. What’s hardware support like? Do you want to be able to buy other components? If you like stuff like that Framework laptop, which seems popular on here, an SoC is heading in the opposite direction of that – an all-in-one, non-expandable manufacturer-specified system.

            But yours is a legit application. A non-CPU-constrained datacenter application running open-source software compiled against ARM, where someone else has validated that the hardware is all good for the OS.

            I would not go ARM for a desktop or laptop as things stand, though.

            • batshit@lemmings.world · 3 months ago

              If you didn’t want to game on your laptop, would an ARM device not be better for office work? Considering they’re quiet and their battery lasts forever.

              • frezik@midwest.social · 3 months ago

                ARM chips aren’t better at power efficiency than x86 above 10 or 15 W or so. Apple is getting a lot out of them because of TSMC’s 3nm process; even the upcoming AMD 9000 series will only be on TSMC 4nm.

                ARM is great for having more than one competent company in the market, though.

                • batshit@lemmings.world · 3 months ago

                  ARM chips aren’t better at power efficiency than x86 above 10 or 15 W or so.

                  Do you have a source for that? It seems a bit hard to believe.

        • sugar_in_your_tea@sh.itjust.works · 3 months ago

          Linux works great on ARM; I just want something similar to most mini-ITX boards (4x SATA, 2x mini-PCIe, and RAM slots), and I’ll convert my DIY NAS to ARM. But there just isn’t anything between RAM-limited SBCs and datacenter ARM boards.

            • conciselyverbose@sh.itjust.works · 3 months ago

              Servers being slow is usually fine. They’re already at way lower clocks than consumer chips because almost all that matters is power efficiency.

            • sugar_in_your_tea@sh.itjust.works · 3 months ago

              Eh, it looks like ARM laptops are coming along. I give it a year or so for the process to be smooth.

              For servers, AWS Graviton seems to be pretty solid. I honestly don’t need top performance and could probably get away with a Quartz64 SBC; I just don’t want to worry about RAM and would really like 16GB. I just need to serve a dozen or so Docker containers with really low load, and I want to do that with as little power as I can get away with, for minimum noise. It doesn’t need to transcode or anything.

              • CancerMancer@sh.itjust.works · 3 months ago

                Man so many SBCs come so close to what you’re looking for but no one has that level of I/O. I was just looking at the ZimaBlade / ZimaBoard and they don’t quite get there either: 2 x SATA and a PCIe 2.0 x4. ZimaBlade has Thunderbolt 4, maybe you can squeeze a few more drives in there with a separate power supply? Seems mildly annoying but on the other hand, their SBCs only draw like 10 watts.

                Not sure what your application is but if you’re open to clustering them that could be an option.

                • sugar_in_your_tea@sh.itjust.works · 3 months ago

                  Here are my actual requirements:

                  • 2 boot drives in mirror - m.2 or SATA is fine
                  • 4 NAS HDD drives - will be SATA, but could use PCIe expansion; currently have 2 8TB 3.5" HDDs, want flexibility to add 2x more
                  • minimum CPU performance - my Phenom II x4 was fine, so not a high bar, but even the Phenom II x4 has better single-core performance than the ZimaBlade

                  Services:

                  • I/O heavy - Jellyfin (no live transcoding), Collabora (and NextCloud/ownCloud), samba, etc
                  • CPU heavy - CI/CD for Rust projects (relatively infrequent and not a hard req), gaming servers (Minecraft for now), speech processing (maybe? Looking to build Alexa alt)
                  • others - Actual Budget, Vaultwarden, Home Assistant

                  The ZimaBlade is probably good enough (would need to figure out SATA power), I’ll have to look at some performance numbers. I’m a little worried since it seems to be worse than my old Phenom II x4, which was the old CPU for this machine. I’m currently using my old Ryzen 1700, but I’d be fine downgrading a bit if it meant significantly lower power usage. I’d really like to put this under my bed, and it needs to be very quiet to do that.
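                  On the container side, most of the services you list publish multi-arch images these days, so an ARM board should mostly just work; a minimal sketch of the low-load portion (image names are the commonly used public ones, but the volume paths here are hypothetical, so adjust before use):

                  ```yaml
                  services:
                    jellyfin:
                      image: jellyfin/jellyfin            # multi-arch; no live transcoding needed per above
                      volumes:
                        - /srv/media:/media:ro            # hypothetical media path
                      restart: unless-stopped
                    vaultwarden:
                      image: vaultwarden/server
                      volumes:
                        - /srv/vaultwarden:/data
                      restart: unless-stopped
                    homeassistant:
                      image: ghcr.io/home-assistant/home-assistant:stable
                      network_mode: host                  # HA device discovery generally wants host networking
                      restart: unless-stopped
                  ```

                  The CPU-heavy stuff (Rust CI, Minecraft) is where a weak SBC would hurt; the services above are the easy part.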

      • lath@lemmy.world · 3 months ago

        Yet they do it all the time, when a higher-spec CPU is fabricated with physical defects and is then shipped as a lower-spec variant.

        • tal@lemmy.today · 3 months ago

          Nobody objects to binning, because people know what they’re getting and the part functions within the specified parameters.

    • grue@lemmy.world · 3 months ago

      I’ve been buying AMD for – holy shit – 25 years now, and have never once regretted it. I don’t consider myself a fanboi; I just (a) prefer having the best performance-per-dollar rather than best performance outright, and (b) like rooting for the underdog.

      But if Intel keeps fucking up like this, I might have to switch on grounds of (b)!

      (Realistically I’d be more likely to switch to ARM or even RISC-V, though. Even if Intel became an underdog, my memory of their anti-competitive and anti-consumer bad behavior remains long.)

      • SoleInvictus@lemmy.blahaj.zone · 3 months ago

        Same here. I hate Intel so much, I won’t even work there, despite it being my current industry and having been headhunted by their recruiter. It was so satisfying to tell them to go pound sand.

      • Damage@slrpnk.net · 3 months ago

        I’ve been on AMD and ATi since the Athlon 64 days on the desktop.

        Laptops are always Intel, simply because that’s what I can find, even though I scour the market extensively every time.

        • Krauerking@lemy.lol · 3 months ago

          Honestly, I was and am an AMD fan, but if you went back a few years you would not have wanted an AMD laptop. I had one and it was truly awful.

          Battery issues. Low processing power. App crashes and video playback issues. And this was on a more expensive one with a dedicated GPU…

          And then Ryzen came out. You can get AMD laptops now, and I mean that both in the sense that they exist and that they’re actually nice. (I have one.)

          But in 2013 it was Intel or you were better off with nothing.

      • Rai@lemmy.dbzer0.com · 3 months ago

        Sorry, but after the amazing Athlon X2, the Core and Core 2 (then i-series) lines fuckin’ wrecked AMD for YEARS. Ryzen took the belt back, but AMD was absolutely wrecked through the Core and i-series era.

        Source: computer building company and also history

        tl;dr: AMD sucked ass for value and performance between Core 2 and Ryzen, then became amazing again after Ryzen was released.

        • CancerMancer@sh.itjust.works · 3 months ago

          I ran an AMD Phenom II x4 955 Black Edition for ~5 years, then gave it to a friend who ran it for another 5 years. We overclocked the hell out of it, up to 4 GHz, and there is no way you were getting gaming performance that good from Intel dollar-for-dollar, so no, AMD did not suck from Core 2 on. You need to shift that timeframe up to Bulldozer, and even then Bulldozer and the other FX CPUs aged better than their Intel counterparts and, at their adjusted prices, were at least reasonable products.

          Doesn’t change the fact that AMD lied about Bulldozer, nor does it change that Intel used its market-leader position to release single-digit performance increases for a decade and strip down everything i5 and below to artificially make the i7 more valuable. Funny how easy it is to forget how shit it was to be a PC gamer then, after two crypto booms.

  • MudMan@fedia.io · 3 months ago

    I have a 13th-series chip; it had some reproducible crashing issues that have so far subsided after downclocking. It is in the window they’ve shared for the oxidation issue. At this point there’s no reliable way of knowing to what degree I’m affected, by what type of issue, or whether I should wait for the upcoming patch or reach out to see if they’ll replace it.

    I am not happy about it.

    Obviously next time I’d go AMD, just on principle, but this isn’t the 90s anymore. I could do a drop-in replacement to another Intel chip, but switching platforms is a very expensive move these days. This isn’t just a bad CPU issue; this could mean swapping out two multi-hundred-dollar components, on what should have been a solidly future-proof setup for at least five or six years.

    I am VERY not happy about it.

    • Lets_Eat_Grandma@lemm.ee · 3 months ago

      switching platforms is a very expensive move these days.

      It’s just a motherboard and a CPU. Everything else is cross-compatible, likely even your CPU cooler. If you just buy another Intel chip… it’s just gonna oxidize again.

      $370 for a 7800x3d https://www.microcenter.com/product/674503/amd-ryzen-7-7800x3d-raphael-am5-42ghz-8-core-boxed-processor-heatsink-not-included

      ~$200 for a motherboard.

      Personally, I’d wait for the next release dropping in a month, or until your system crashes aren’t bearable and it’s worth making the change. I just don’t see the cost as prohibitive; it’s about on par with all the alternatives. Plus you could sell your old motherboard for something.

      • barsoap@lemm.ee · 3 months ago

        I’m not really that knowledgeable about AM5 mobos (still on AM4), but you should be able to get something perfectly sensible for 100 bucks. Are you going to get as much I/O and as many bells and whistles? No, but most people don’t need that stuff, and you don’t have to spend a lot of money to get a good VRM or clean traces to the DIMM slots.

        Then, possibly bad news: Intel Gen 13 supports DDR4, so you might need new RAM.

        • Lets_Eat_Grandma@lemm.ee · 3 months ago

          32GB of ddr5 can be found for ~$100, and any other upgrade from a ddr4 platform today is going to require new memory.

          So the DDR4 13th series folks can stay on their oxidized processors, or they can pay money to get something else. Not much else to do there.

          I upgraded my AM4 platform to a 5800X3D a while back and it’s still working just fine. I wouldn’t recommend buying into AM4 today, just because no more upgrades are coming… but AM5? Why not? It’ll be around until DDR6 is affordable, circa 2027.

          I’m super interested in seeing how Intel’s 15th gen turns out. We know it’s a new socket, so the buy-in cost is sky high, as everyone here has argued (that mobo/CPU/RAM combo is crazy expensive). I can only imagine they’ll drop the power limits to avoid more issues, but who can say. Maybe whatever design they’re using won’t have been tuned so aggressively, or, if they’re lucky, it hasn’t started physical production yet and they can modify it appropriately. Time will tell, and we won’t know whether it has the same issue until a year or so post-release.

  • TrickDacy@lemmy.world · 3 months ago

    AMD processors have literally always been a better value and have rarely been surpassed by much for long. The only problem they ever had was that back in the day they overheated easily. But I will never, ever buy an Intel processor on purpose, especially after this.

  • deltreed@lemmy.world · 3 months ago

    So like, did Intel lay off or disband its QA teams, similar to what Microsoft did with Windows? Remember when stability was key and everything else was secondary? Pepperidge Farm remembers.

  • demesisx@infosec.pub · 3 months ago

    The other day, when this news first hit, I bought two ITM put options on INTC. I waited three days and sold them for a 200% profit, then used the proceeds to invest in the SOXX ETF. Feels good to finally get some profit from INTC’s incompetence.

    edit: haters of gambling with the downvotes?

    • nek0d3r@lemmy.world · 3 months ago

      I genuinely think that was the best Intel generation. Things really started going downhill in my eyes after Skylake.

  • 2pt_perversion@lemmy.world · 3 months ago

    People are freaking out about the lack of a recall, but Intel says its patch will stop currently working CPUs from hitting the overvolt condition that is leading to the failures. So they don’t really need to do a recall if currently working CPUs stay working with the patch in place. As long as they offer some sort of free extended warranty and a good RMA process for the CPUs that are already damaged, I feel it’s fine.

    If they RMA with a bump in performance for those affected, it might even be positive PR, like “they stand by their products” — but if they’re stingy with responsibility, then we should obviously give them hell. We really have to see how they handle this.

    • BobGnarley@lemm.ee · 3 months ago

      Oh you mean they’re going to underclock the expensive new shit I bought and have it underperform to fix their fuck up?

      What an unacceptable solution.

      • Strykker@programming.dev · 3 months ago

        They aren’t overclocking or underclocking anything with the fix. The chip was just straight-up requesting more voltage than it actually needed; this gave no benefit and was probably an issue even without the damage it causes, due to the extra heat generated.
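        To put some rough numbers on why excess voltage is pure downside at the same clock: CMOS dynamic power scales roughly as P ≈ C·V²·f, so heat grows with the square of voltage while frequency stays put. A quick sketch (all numbers here are made up for illustration, not Intel's actual figures):

        ```python
        # Illustrative only: dynamic switching power P ≈ C * V^2 * f.
        # Extra voltage at the same clock adds heat quadratically, buying zero speed.

        def dynamic_power(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
            """Approximate CMOS switching power in watts."""
            return capacitance_f * voltage_v ** 2 * freq_hz

        C = 1e-9      # effective switched capacitance (hypothetical)
        f = 5.0e9     # 5 GHz, identical clock in both cases

        p_spec = dynamic_power(C, 1.25, f)   # voltage the chip actually needs
        p_over = dynamic_power(C, 1.50, f)   # the "requested too much" case

        print(f"at spec:    {p_spec:.2f} W")
        print(f"overvolted: {p_over:.2f} W")
        print(f"extra heat: {(p_over / p_spec - 1) * 100:.0f}% more power, same clock")
        ```

        A 20% voltage bump costing ~44% more power with no frequency gain is the general shape of the problem, whatever the real numbers are.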

        • nek0d3r@lemmy.world · 3 months ago

          Giving a CPU more voltage is just what overclocking is. Considering that most modern CPUs from both AMD and Intel are already designed to boost their clocks until they reach a temperature high enough to start thermal throttling, it’s likely there was a misstep in setting this threshold and the CPU doesn’t know when to quit until it kills itself. In the process it’s undoubtedly gaining more performance than it otherwise would, but probably not by much, considering a lot of high-end CPUs already have really high thresholds, some even at 90 or 100 °C.

          • Strykker@programming.dev · 3 months ago

            If you actually knew anything, you’d know that overclockers tend to manually reduce the voltage as they increase clock speeds to improve stability. That only works up to a point, but it clearly shows voltage does not directly determine clock speed.

  • ApollosArrow@lemmy.world · 3 months ago

    I have an Intel Core i9-14900K (3.2 GHz, 24-core, LGA 1700) purchased in March. Are there any guesses yet about the window of potentially affected CPUs?

      • ApollosArrow@lemmy.world · 3 months ago

        So the BIOS update wouldn’t already put my chip within safer operating parameters? Do I also have to undervolt it?

    • M0oP0o@mander.xyz · 3 months ago

      The window is the whole generation, and the generation before it, for any chip above the 600 tier. You are in this picture.
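      If you’re unsure which chip you have, a quick way to check the model string so you can compare it against Intel’s affected list (a sketch, assuming Linux and the standard /proc/cpuinfo interface):

      ```shell
      #!/bin/sh
      # Print the CPU model string once; check it against Intel's published list.
      grep -m1 'model name' /proc/cpuinfo
      ```

      On Windows the same information is under Task Manager → Performance → CPU.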

        • M0oP0o@mander.xyz · 3 months ago

          Yeah, this is going to be a mess for years. I’m not even sure they have enough chips for replacements.

          I would suggest getting a new CPU and then locking it down to a set speed.

  • w2tpmf@lemmy.world · 3 months ago

    In any real-world comparison (gaming frame rates, video encoding…), the 13700 beats the 7900X while being more energy efficient and costing less.

    That’s even giving AMD a handicap in the comparison, since the 7700X is supposed to be the direct competitor to the 13700.

    I say all this as a longtime AMD CPU customer. I had planned on buying their CPU before multiple sources of comparison steered me away this time.

    • M0oP0o@mander.xyz · 3 months ago

      OK, so maybe you are missing the part where the 13th and 14th gens are destroying themselves. No one really cares whether you use AMD or not; this little issue is Intel’s, and it makes any comparison of performance, power use, or cost moot, since the CPU’s ability to not hurt itself in its confusion will now always be in question.

      Also, I don’t think CPU speed has been a big bottleneck in the last few years, so why both AMD and Intel keep pushing this hard is just silly.

      • w2tpmf@lemmy.world · 3 months ago

        Yeah, that does suck. But I was replying specifically to the person saying Intel hasn’t been relevant for years because of a supposed performance dominance from AMD. That part just isn’t true.

        • M0oP0o@mander.xyz · 3 months ago

          Your comment doesn’t reply to anyone, though; it’s just floating out there on its own.

          And even taken as a reply, it still doesn’t make sense, since as of this “issue” any 13th- or 14th-gen Intel above a 600 is out of the running, because they can’t be trusted not to kill themselves.

  • gearheart@lemm.ee · 3 months ago

    This would be funny if it happened to Nvidia.

    Hope Intel recovers from this. Imagine if Nvidia was the only consumer hardware manufacturer…

    No one wants that.

    • mlg@lemmy.world · 3 months ago

      This would be funny if it happened to Nvidia.

      Hope Intel recovers from this. Imagine if Nvidia was the only consumer hardware manufacturer…

      Lol there was a reason Xbox 360s had a whopping 54% failure rate and every OEM was getting sued in the late 2000s for chip defects.

            • icedterminal@lemmy.world · 3 months ago

              Tagging on here: both the first-model PS3 and the Xbox 360 were hot boxes with insufficient cooling. Both got too hot too fast for their cooling solutions to keep up, resulting in hardware stress that weakened the chips’ solder joints until they eventually cracked.

              • john89@lemmy.ca · 3 months ago

                Owner of an original 60 GB PS3 here.

                It got very hot and eventually stopped working. It was under warranty, and I got an 80 GB replacement for $200 less, but I lost backwards compatibility, which really sucked because I had sold my PS2 to get the PS3.

                • lennivelkant@discuss.tchncs.de · 3 months ago

                Why would you want backwards compatibility? To play games you already own and like instead of buying new ones? Now now, don’t be ridiculous.

                Sarcasm aside, I do wonder how technically challenging it is to keep your system backwards-compatible. I understand console games are written for specific hardware specs, but I’d assume newer hardware still understands the old instructions. It could be an OS question, but again, I’d assume they would develop the newer version on top of their old, so I don’t know why it wouldn’t support the old features anymore.

                I don’t want to cynically claim that it’s only done for profit reasons, and I’m certainly out of my depth on the topic of developing an entire console system, so I want to assume there’s something I just don’t know about, but I’m curious what that might be.

                  • john89@lemmy.ca · 3 months ago

                  It’s my understanding that backwards-compatible PS3s actually had PS2 hardware in them.

                    We can play PS2 and PS1 games if they’re downloaded from the store, so emulation isn’t the issue. I think Sony looked at the data, saw they would make more money by removing backwards compatibility, and that’s what they did.

                  Thankfully the PS3 was my last console before standards got even lower and they started charging an additional fee to use my internet.

        • hardcoreufo@lemmy.world · 3 months ago

          I think the 360 failed for the same reason lots of early/mid-2000s PCs failed: chips lifting due to the move away from leaded solder. Over time the formulas improved, and we don’t see that as much anymore. At least that’s how I recall it.

    • brucethemoose@lemmy.world · 3 months ago

      This would be funny if it happened to Nvidia.

      It kinda has, with Fermi, lol. The GTX 480 was… something.

      Same reason too. They pushed the voltage too hard, to the point of stupidity.

      Nvidia doesn’t compete in this market, though, as much as they’d like to. They don’t make x86 CPUs, and frankly Intel is hard to displace since they have their own fab capacity. AMD can’t take the whole market themselves because there simply isn’t enough TSMC/Samsung capacity to go around.

      • Kyrgizion@lemmy.world · 3 months ago

        There’s also Intel holding the x86 patents and AMD holding the x86-64 patents. Those two aren’t going anywhere yet.

        • wax@feddit.nu · 3 months ago

          Actually, it looks like the base patents have expired. All the extensions (SSE, AVX) are still in effect, though.