• Grimy@lemmy.world
    1 year ago

    I fully believe AI will be able to replace 50% or more of desk jobs in the near future. It’s definitely a complicated situation and you make good points.

    First and foremost, I think it’s imperative that the barrier to entry for model training be as low as possible. Anything else basically gives a select few companies the ability to charge a huge subscription fee on all our goods and services.

    The data needed is pretty heavy as well; it’s not very feasible to go off of donated or public domain data alone.

    I also think the job losses are virtually guaranteed, and trying to save those jobs is misguided and doesn’t really benefit most of the people affected.

    And yeah, the big companies win either way, but if this new tech is easy to use, we might not lose as badly. Disney, for instance, doesn’t have any real competition, but if a bunch of indie animation studios and groups start popping up, it levels the playing field a bit.

    • RedFox@infosec.pub
      1 year ago

      In many discussions I’ve seen, small or independent creators are one of the focuses of loss and protection.

      Also there’s the acknowledgement that existing jobs will be reduced, eliminated, or transformed.

      How different is this from the mass elimination of the stereotypical 1950s secretarial pool? We used to have rooms full of workers typing memos; then we got computers, copiers, etc.

      I know there’s a difference between a creator’s work and a job/task. I’m more curious whether these same conversations came up when office technology advances put those people out of work. You could find plenty more examples where advancement or efficiency gains reduced employment.

      Should technology advancement be tied to not eliminating jobs or taking away from people’s claim to work?

      I know there’s more complexity like greed and profits here.