• EatATaco@lemm.ee · 1 month ago

    I think we’re in an AI bubble because Nvidia is way overvalued, and I agree with you that people often flock to shiny new things. Many are taking risks in the hope of making it big…and many will get left holding the bag.

    But how do you go from NFTs, which never had widespread market support, to the market pumping a trillion dollars into Nvidia alone? That comparison makes no sense. And to downplay this as “just a bullshitter” leads me to believe you have like zero real-world experience with it. I use Copilot for coding and it’s been a boost to productivity for me, and I’m a seasoned vet. Even the AI search results, which have left me scratching my head plenty of times, have been a net benefit in time saved.

    And this is all still pretty new.

    While I think it is overhyped and people are being ridiculous about how much it will change things, at the very least this is going to be a huge new tool, and I think you’re setting yourself up to be left behind if you aren’t embracing it and learning how to leverage it.

    • NoMoreCocaine@lemmynsfw.com · 1 month ago

      The AI technology we’re using isn’t “new”: the core idea is several decades old, with only minor updates since then. We’re just using more parallel processing and bigger datasets to brute-force the “advances”. So, no, it’s not actually that new.

      We need a big breakthrough in the technology for it to actually get anywhere. Without one, the bubble will burst once the hype dies down.

      • Womble@lemmy.world · edited · 1 month ago

        The landmark paper that ushered in the current boom in generative AI (“Attention Is All You Need”, Vaswani et al. 2017) is less than a decade old (and attention itself as a mechanism dates to 2014), so I’m not sure where you’re getting the idea that the core idea is “decades” old. Unless you’re taking the core idea to mean neural networks, or digital computing?
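
        For reference, the “attention” in question boils down to a pretty compact operation. Here’s a rough NumPy sketch of scaled dot-product attention (my own illustration of the mechanism, not code from the paper):

        ```python
        import numpy as np

        def attention(Q, K, V):
            """softmax(Q K^T / sqrt(d_k)) V — each query takes a weighted mix of the values."""
            d_k = K.shape[-1]
            scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to each key
            weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
            weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
            return weights @ V                              # weighted combination of values

        # toy self-attention: 3 tokens, embedding dim 4 (Q = K = V = token embeddings)
        rng = np.random.default_rng(0)
        x = rng.normal(size=(3, 4))
        out = attention(x, x, x)
        print(out.shape)  # (3, 4)
        ```

        That’s more or less the whole trick; the 2017 paper’s contribution was showing you could build the entire model out of this plus feed-forward layers, with no recurrence at all.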