• maynarkh@feddit.nl (+71/-1) · 6 months ago

    Thus, Windows will again be instrumental in driving growth for the minimum memory capacity acceptable in new PCs.

    I love that the primary driver towards more powerful hardware is Windows just bloating itself bigger and bigger. It’s a grift in its own way: consumers are subsidizing the hardware requirements for Microsoft’s idiotic data processing. And MSFT is not alone in this; Google doing away with cookies also conveniently shifts most ad processing from their servers into Chrome (while killing their competition).

    • thesorehead@lemmy.world (+6) · 6 months ago

      Google doing away with cookies also conveniently shifts away most ad processing from their servers into Chrome (while killing their competition).

      OOTL, what’s going on here? Distributed processing like Folding@Home, but for serving ads to make Google more money?

      • maynarkh@feddit.nl (+10) · 6 months ago

        They called it Federated Learning of Cohorts at one point. Instead of you sending raw activity data to Google servers and them running their models there, the model runs in Chrome and they only send back the ad targeting groups you belong to. All in the name of privacy of course.
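        The cohort idea described above can be sketched with a toy SimHash, the locality-sensitive hashing family FLoC was based on. This is an illustrative simplification, not Chrome's actual implementation; the function name and the 8-bit cohort size are made up for the example. The key property is that only the small cohort ID would leave the machine, not the raw history:

        ```python
        import hashlib

        def simhash_cohort(domains, bits=8):
            """Toy SimHash: collapse a browsing history into a small cohort ID.
            Similar histories tend to produce the same ID, so ad targeting can
            happen per-cohort without shipping the raw domain list anywhere."""
            totals = [0] * bits
            for domain in domains:
                digest = hashlib.sha256(domain.encode()).digest()
                for i in range(bits):
                    # Each domain votes +1/-1 on each bit position.
                    bit = (digest[i // 8] >> (i % 8)) & 1
                    totals[i] += 1 if bit else -1
            # The sign of each accumulated vote becomes one bit of the cohort ID.
            return sum((1 << i) for i, t in enumerate(totals) if t > 0)

        history = ["news.example", "sports.example", "cooking.example"]
        print(simhash_cohort(history))  # a number in 0..255, stable for this history
        ```

        The hashing and aggregation run entirely client-side, which is the "processing moves into Chrome" point being made above.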

  • cmnybo@discuss.tchncs.de (+41) · 6 months ago

    At least it should result in fewer laptops being made with ridiculously small amounts of non-upgradable RAM.

    Requiring a large amount of compute power for AI is just stupid though. It will probably come in the form of some sort of dedicated AI accelerator that’s not usable for general purpose computing.

    • Lee Duna@lemmy.nzOP (+21/-1) · 6 months ago

      And remember that your data and telemetry are sent to Microsoft servers to train Copilot AI. You may also need to pay a subscription for some advanced AI features.

      • DontMakeMoreBabies@kbin.social (+9/-2) · 6 months ago

        And that’s when I’ll start using Linux as my daily driver.

        Honestly installing Ubuntu is almost idiot proof at this point.

        • Lee Duna@lemmy.nzOP (+5/-1) · edited · 6 months ago

          I do agree with you; the obstacle is that many applications either aren’t available on Linux or aren’t as powerful as on Windows. For me it’s MS Excel: many of my office clients use VBA in Excel spreadsheets to do calculations.

          • Reptorian@lemmy.zip (+2) · edited · 6 months ago

            At least we might finally have a viable Photoshop replacement soon. GIMP is getting non-destructive editing (NDE), Krita might get a foreground extraction tool at some point, and Pixellator might get better tools too, though its NDE support is already solid. The thing is, all of them are missing something, but I’m betting on GIMP after CMYK_Student’s arrival to GIMP development.

            I tried adding foreground selection based on guided selection, but I was unable to fix noise in the in-between selections and was unable to build Krita. We would have Krita with foreground selection if it weren’t for that.

  • thecrotch@sh.itjust.works (+38/-2) · edited · 6 months ago

    Microsoft is desperate to regain the power it had in the ’00s and is scrambling to find that killer app. At least this time they’re not just copying Apple’s homework.

    • Toribor@corndog.social (+7/-1) · edited · 6 months ago

      They either force it on everyone or bundle it in the enterprise package that businesses already pay for and then raise the price.

      It never works, but maybe this time it will. I mean it won’t… But maybe.

      • tias@discuss.tchncs.de (+6) · edited · 6 months ago

        And maybe that’s why it isn’t working. They try too hard to persuade or force you, giving people icky feelings from the get go… and they try too little to just make a product that people want.

    • douglasg14b@lemmy.world (+21) · 6 months ago

      Yeah, and solder it onto the board while you’re at it! Whoever needs to upgrade or perform maintenance anyways?

      • bamboo@lemm.ee (+2/-15) · 6 months ago

        They do make the most of it though. Soldered RAM can be much faster than socketed RAM, which is why GPUs do it too.
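        The speed claim above is mostly about memory bandwidth: soldered LPDDR and GDDR parts can run wider buses at higher transfer rates than socketed SO-DIMMs. Some rough arithmetic with typical published figures (the specific bus widths and speeds below are illustrative assumptions, not measurements from any particular machine):

        ```python
        def bandwidth_gbs(bus_bits, mega_transfers_per_s):
            """Peak theoretical bandwidth in GB/s: bytes per transfer
            times millions of transfers per second, scaled to GB."""
            return (bus_bits / 8) * mega_transfers_per_s / 1000

        # Typical configurations (assumed; real platforms vary):
        sodimm_ddr5   = bandwidth_gbs(128, 5600)   # dual-channel socketed DDR5-5600
        soldered_lp5x = bandwidth_gbs(128, 8533)   # soldered LPDDR5X-8533
        gpu_gddr6     = bandwidth_gbs(192, 16000)  # mid-range GPU, 192-bit GDDR6

        print(f"{sodimm_ddr5:.1f} / {soldered_lp5x:.1f} / {gpu_gddr6:.1f} GB/s")
        # ~89.6 / ~136.5 / ~384.0 GB/s
        ```

        Same formula across the board; the soldered parts win by running faster signals over shorter, better-controlled traces, not because solder itself is magic.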

        • locuester@lemmy.zip (+9) · 6 months ago

          Nothing in my electrical engineering background suggests that solder itself increases performance. Do you have some more information on this?

          • bamboo@lemm.ee (+1/-1) · 6 months ago

            Well, that too, but that’s not particularly common on laptops or GPUs. Even in Apple silicon it’s not the same die, but it is the same package.

    • Shurimal@kbin.social (+14) · 6 months ago

      Unless it’s locally hosted, doesn’t scan every single file on my storage and doesn’t send everything I do with it to the manufacturer’s server.

      • evranch@lemmy.ca (+2) · 6 months ago

        Do it, it’s easy and fun and you’ll learn about the actual capabilities of the tech. Started a week ago and I’m a convert on the utility of local AI. Got to go back to Reddit for it but r/localllama has tons of good info. You can actually run useful models at a conversational pace.

        This whole thread is silly because VRAM is what you need; I’m running some pretty good coding and general-knowledge models on a 12GB Radeon. Almost none of my 32GB of system RAM is used, lol. Either Microsoft is out of touch or they’re hiding an amazing new algorithm.

        Running in system RAM works, but inference on the regular CPU is painfully slow, over 10x slower.
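        The "12GB of VRAM is plenty" point checks out with back-of-envelope arithmetic: weight memory is roughly parameter count times bits per weight, plus some headroom for the KV cache and activations. A sketch (the 20% overhead factor is an assumption and varies a lot with context length):

        ```python
        def model_memory_gb(params_billion, bits_per_weight, overhead=1.2):
            """Rough memory needed to run an LLM: weights at the given
            quantization, plus ~20% assumed for KV cache and activations."""
            return params_billion * bits_per_weight / 8 * overhead

        # A 7B model, 4-bit quantized vs. full fp16:
        print(round(model_memory_gb(7, 4), 1))   # ~4.2 GB -> fits a 12 GB card
        print(round(model_memory_gb(7, 16), 1))  # ~16.8 GB -> spills into system RAM
        ```

        Which is why quantized 7B-class models run comfortably on a 12GB GPU while unquantized ones fall back to the slow CPU path the comment describes.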

  • ANON@lemmy.ml (+21/-2) · edited · 6 months ago

    Everyone here is praising Microsoft when in fact you can just buy any PC with 16 gigs of RAM you like, without the additional AI spyware and (I assume) the added cost.

    • flamingarms@feddit.uk (+1) · 6 months ago

      I’m not seeing anyone here praising Microsoft; actually the opposite. Who’s praising Microsoft?

  • nyakojiru@lemmy.dbzer0.com (+16) · edited · 6 months ago

    They’ve been making a massive, slow effort for a long time now to finally get end users to migrate to Linux (and I’m a whole-life Windows guy).

  • DumbAceDragon@sh.itjust.works (+14) · 6 months ago

    “Wanna see me fill entire landfills with e-waste due to bullshit minimum requirements?”

    “Wanna see me do it again?”

  • query@lemmy.world (+13) · 6 months ago

    AI PC sounds like something that will be artificially personal more than anything else.

  • HidingCat@kbin.social (+14/-2) · 6 months ago

    Great, so it’ll take AI to set 16GB as minimum.

    I still shudder that there are machines still being sold with 8GB of RAM; that’s just barely enough.

    • douglasg14b@lemmy.world (+3) · edited · 6 months ago

      It’s honestly crazy to think about that we used to say the same about 4GB only 5-7 years ago…

      And the same about 2GB a measly 10 years ago…

      5 years ago I thought 32GB was great. Now I regularly cap out and start hitting the page file doing my normal day-to-day work on 48GB. It’s crazy now.

  • Poem_for_your_sprog@lemmy.world (+10/-2) · 6 months ago

    Opening Excel and Outlook on a Win11 PC brings you to almost 16GB of memory used. I don’t know how anybody is still selling computers with 8GB of RAM.

    • rbesfe@lemmy.ca (+5) · 6 months ago

      Uh… No, it doesn’t. 8GB is definitely tight these days, but for simple word processing, email, and spreadsheet use it still works fine.

  • banneryear1868@lemmy.world (+3) · 6 months ago

    Makes sense, 16GB is sort of the new “normal” although 8GB is still quite enough for everyday casual use. “AI PCs” being a marketing term just like “AI” itself.