  • That doesn’t work for capitalism at scale, where companies are eternally chasing profits by minimizing costs above all else.

    Every time I watch one of those capitalist BJ shows (e.g. Shark Tank, Gordon Ramsay’s Next Food Star, basically anything on CNBC), which I do far too often, and they ask the chum in the water whether the business can scale, I think about how they’re really asking whether the product can eventually be made terribly enough to net them a new megayacht.





  • It’s also going to take another leap in algorithms.

    It was a hard problem when Google’s founders cracked it, but it’s even harder now that state-of-the-art spam bots are filling the Internet with shit that looks like it was composed by humans.

    If someone cracks how to tell whether something is AI-generated (for real, not the fake solutions we have now), adds that to a good search algorithm, and filters the fake shit by default, they’ll have a hell of a product on their hands (rough sketch below).
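
    To make the shape of that product concrete, here is a minimal sketch in Python. Everything in it is an assumption for illustration: Result, detect_ai_likelihood, and the 0.5 cutoff are all hypothetical, and the detector is deliberately left unimplemented because that is exactly the unsolved part.

    from dataclasses import dataclass

    @dataclass
    class Result:
        url: str
        text: str
        relevance: float  # score from the underlying ranking algorithm

    def detect_ai_likelihood(text: str) -> float:
        """Hypothetical 0.0-1.0 detector -- the hard, unsolved part.
        Real-world detectors today are nowhere near reliable enough."""
        raise NotImplementedError("nobody has cracked this for real yet")

    def search(results: list[Result], filter_ai: bool = True,
               cutoff: float = 0.5) -> list[Result]:
        """Rank by relevance and, by default, drop suspected AI-generated pages."""
        ranked = sorted(results, key=lambda r: r.relevance, reverse=True)
        if not filter_ai:
            return ranked
        return [r for r in ranked if detect_ai_likelihood(r.text) < cutoff]

    The filter_ai=True default is the key design choice here: the filtering only matters as a product if users don’t have to opt in to it.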





  • It’s not naive to think that corporations will continue to win the “AI” war. It’s actually pretty naive to think otherwise.

    I also dunno why you think that all of the resources in OSS AI will be focused on making it easy to generate excellent, likely already illegal, deepfake porn of random teenagers in “one click”.

    I’ve been using OSS for decades, and almost nothing is that easy to do even when it could be. Why would people focus their efforts on this?

    Also also, I don’t get why you think that generating AI porn of people around you is:

    A) so much better than just watching the millions of hours of already available porn

    B) anything even remotely similar to “seeing someone naked”



  • I don’t even think that’s necessarily true. If you make it illegal and/or platforms ban it, you’re already taking a step toward making it more difficult to do.

    I think throughout this thread you’re mistaking the technically possible for the probable or likely.

    By making it illegal, you essentially eliminate the commercial incentive to make it easy. Every barrier to doing something makes it less likely that people will do it. I understand that there is an inherent motive for people to do it anyway, but every hoop they have to jump through (e.g. setting up their “own, local AI”) reduces the likelihood of them doing it.

    People don’t even run their own email servers, music servers, video servers, etc…most people don’t even jailbreak their devices…many don’t even store a local cache of regular porn…so why the hell would most people bother setting up a local generative AI instance for this purpose?

    Outlawing it and banning it from platforms pushes it into the realm of the creepy basement weirdo rather than making it as inevitably ubiquitous as you’re saying it will be.

    Policy is very often about the reduction of harms rather than the elimination of harms. It’s not the black-and-white realm you’re trying to make it out to be.





  • Arguing that because a crime can be committed with a tool, outlawing the crime outlaws the tool is a bad argument. Outlawing murder doesn’t outlaw knives.

    As for enforcement: the law may be enforced with varying degrees of success, but the possibility that someone gets away with a crime isn’t a reason not to make it a crime.

    If someone created deepfakes using locally run models, rubbed one out, and then deleted everything, they probably wouldn’t be caught…but who cares that they weren’t? It’s the harm to others that you’re largely trying to prevent, and if a person never distributed the image at all, their “getting away with it” doesn’t matter much.

    Edit: I think the argument that existing laws already cover this is more compelling than any of the above as a reason this new law shouldn’t be passed.