I’m usually the one saying “AI is already as good as it’s gonna get, for a long while.”

This article, in contrast, is quotes from the folks building the next generation of AI - saying the same thing.

    • blackbelt352@lemmy.world · ↑17 · 1 month ago

      It’s a lot. Like a lot a lot. GPUs have about 150 billion transistors, but each transistor makes only one connection, in what is essentially a 2D layout printed on silicon.

      Each neuron makes thousands of connections, and there are on the order of 100 billion neurons in a blobby lump of fat and neurons that takes up 3D space. Combine that with the fact that everything actually functions through patterns of multiple neurons firing, and you get an absurdly high ceiling on how powerful human brains can be.

      At this point, I’m not sure there are enough GPUs in the world to mimic what a human brain can do.
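The back-of-envelope numbers in the comment above can be checked in a few lines. This is a rough sketch using commonly cited figures (roughly 86 billion neurons, a few thousand synapses each, and the 150-billion-transistor GPU figure from the comment); all of these are order-of-magnitude estimates, not precise measurements:

```python
# Rough comparison of brain synapse count vs. GPU transistor count.
# All figures are approximate, order-of-magnitude estimates.

NEURONS = 86e9               # commonly cited estimate for the human brain
SYNAPSES_PER_NEURON = 7_000  # often-quoted average; varies widely by neuron type
GPU_TRANSISTORS = 150e9      # figure from the comment above

synapses = NEURONS * SYNAPSES_PER_NEURON
ratio = synapses / GPU_TRANSISTORS

print(f"~{synapses:.0e} synapses")           # ~6e+14 synapses
print(f"~{ratio:,.0f}x the transistor count") # roughly 4,000x
```

Even with these crude numbers, the synapse count lands three to four orders of magnitude above the transistor count of a single large GPU, which is the gap the commenter is gesturing at.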

      • cynar@lemmy.world · ↑11 · 1 month ago

        That’s also just the electrical portion of our mind. There are whole levels of chemistry, and chemical potentials, at work. Neurons will fire differently depending on the chemical soup around them. Most of our moods are chemically based; e.g., adrenaline and testosterone make us more aggressive.

        Our mind also extends beyond our heads. Organ transplant recipients have noted personality changes, food preferences being the most prevalent.

        The neurons only deal with ‘fast’ thinking. ‘Slow’ thinking is far more complex and distributed.

    • cron@feddit.org · ↑4 · 1 month ago

      I don’t think your brain can be reasonably compared with an LLM, just like it can’t be compared with a calculator.

      • GetOffMyLan@programming.dev · ↑3 · 1 month ago

        LLMs are based on neural networks, which are a massively simplified model of how our brain works. So you kind of can, as long as you keep in mind that they are orders of magnitude simpler.

        • utopiah@lemmy.world · ↑5 · 1 month ago

          At some point it becomes so “simplified” it’s arguably just not the same thing, even conceptually.

          • GetOffMyLan@programming.dev · ↑2 ↓1 · edited · 1 month ago

            It is conceptually the same thing: a series of interconnected neurons with firing thresholds and weighted connections.

            The simplification comes in how the information is transmitted and how our brain learns.

            Many functions in the human body rely on quantum mechanical effects to function correctly. So to simulate it properly, each connection would really need to be its own supercomputer.

            But it has been shown to encode information in a similar way. The learning part is not even close.
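The "interconnected neurons with firing thresholds and weighted connections" idea in the comment above can be sketched in a few lines. This is a toy McCulloch-Pitts-style unit for illustration only, not how any real LLM is implemented (modern networks use smooth activations and learned weights rather than hard thresholds):

```python
# A minimal artificial neuron: weighted inputs are summed,
# and the unit "fires" if the total crosses a threshold.

def neuron_fires(inputs, weights, threshold):
    """Return True if the weighted sum of inputs reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return activation >= threshold

# Example: a two-input unit wired up to act as an AND gate.
and_weights = [1.0, 1.0]
and_threshold = 1.5

print(neuron_fires([1, 1], and_weights, and_threshold))  # True
print(neuron_fires([1, 0], and_weights, and_threshold))  # False
```

The "orders of magnitude simpler" point lands here: a biological neuron's firing depends on its chemical environment, timing, and thousands of synapse types, while this model reduces all of that to one multiply-accumulate and a comparison.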

      • abcd@feddit.org · ↑2 ↓1 · 1 month ago

        The Answer to the Ultimate Question of Life, The Universe, and Everything