• 0 Posts
  • 238 Comments
Joined 1 year ago
Cake day: June 11th, 2023


  • I get it, I do. I’ve been a migrant in a place with a language barrier on top of sharing that general feeling, so… yeah, sure. In principle.

    But the times I’ve used it it’s twice the anxiety, in that I keep fearing I’ll mess up and need help, which is orders of magnitude worse than having to go through the register. Just the potential of issues is enough to deter me, but the times I’ve had the scales mess up or the payment method not go through were excruciating.



  • I don’t disagree on principle, but I do think it requires some thought.

    Also, that’s still a pretty significant backstop. You basically would need models to have a way to check generated content for copyright, in the way Youtube does, for instance. And that is already a big debate, whether enforcing that requirement is affordable to anybody but the big companies.

    But hey, maybe we can solve both issues the same way. We sure as hell need a better way to handle mass human-produced content and its interactions with IP. The current system does not work and it grandfathers in the big players in UGC, so whatever we come up with should work for both human and computer-generated content.


  • That’s not “coming”, it’s an ongoing process that has been going on for a couple hundred years, and it absolutely does not require ChatGPT.

    People genuinely underestimate how many of these things have been an ongoing concern. Much as crypto isn’t that different from what you can do with a regular server, “AI” isn’t a magic key that unlocks automation. I don’t even know how this mental model works. Is the idea that companies currently hiring millions of copywriters will just switch to automated tools? I get that yeah, a bunch of call center jobs may get cut (again, a process that has been ongoing for decades), but how is compensating Facebook for scrubbing their social media posts for text data going to make that happen less?

    Again, I think people don’t understand the parameters of the problem, which is different from saying that there is no problem here. If anything the conversation is a net positive in that we should have been having it in 2010 when Amazon and Facebook and Google were all-in on this process already through both ML tools and other forms of data analysis.


  • I’m gonna say those circumstances changed when digital copies and the Internet became a thing, but at least we’re having the conversation now, I suppose.

    I agree that ML image and text generation can create something that breaks copyright. You can certainly duplicate images or use copyrighted characters. This is also true of Youtube videos and Tiktoks and a lot of human-created art. I think it’s a fascinating question whether the infraction is in what the tool generates (i.e. it made a picture of Spider-Man and sold it to you for money, which is under copyright and thus can’t be used that way) or in the ingest that enables it to do that (i.e. it learned on pictures of Spider-Man available on the Internet, and thus all output is tainted because the images are copyrighted).

    The first option makes more sense to me than the second, but if I’m being honest I don’t know if the entire framework makes sense at this point at all.


  • A lot of this can be traced back to the invention of photography, which is a fun point of reference, if one goes to dig up the debate at the time.

    In any case, the idea that humans can only produce so fast for so long and somehow that keeps the channel clean just doesn’t track. We are already flooded by low quality content enabled by social media. There are seven billion of us, two or three billion of those are on social platforms, and a whole bunch of the content being shared in those channels is made with corporate tools by pointing phones at things. I guarantee that people will still go to museums to look at art regardless of how much cookie cutter AI stuff gets shared.

    However, I absolutely wouldn’t want a handful of corporations to have the ability to empower their employed artists with tools to run 10x faster than freelance artists. That is a horrifying proposition. Art is art. The difficulty isn’t in making the thing technically (say hello, Marcel Duchamp, I bet you thought you had already litigated this). Artists are gonna art, but it’s important that nobody has a monopoly on the tools to make art.


  • It’s not right to say that ML output isn’t good at practical tasks. It is, it’s already in use, and has been for ages. The conversation is skewed by the relatively anecdotal fact that chatbots and image generation got good enough to go viral, but ML models are being used for a bunch of practical purposes, from speeding up repetitive, time consuming tasks (e.g. cleaning up motion capture, facial modelling or lip animation in games and movies) to specialized ones (so much science research uses ML tools these days).

    Now, a lot of those are done using fully owned datasets, but not all, and the ramifications there are also important. People dramatically overestimate the impact of trash product flooding channels (which is already the case, as you say) and dramatically underestimate the applications of the underlying tech beyond the couple of viral apps they only got access to recently.


  • Yep. The effect of this as currently framed is that you get data ownership clauses in EULAs forever and only major data brokers like Google or Meta can afford to use this tech at all. It’s not even a new scenario, it already happened when those exact companies were pushing facial recognition and other big data tools.

    I agree that the basics of modern copyright don’t work great with ML in the mix (or with the Internet in the mix, while we’re at it), but people are leaning on the viral negativity to slip by very unwanted consequences before anybody can make a case for good use of the tech.


  • I think viral outrage aside, there is a very open question about what constitutes fair use in this application. And I think the viral outrage misunderstands the consequences of enforcing the notion that you can’t use openly scrapable online data to build ML models.

    Effectively what the copyright argument does here is make it so that ML models can only legally be made by Meta, Google, Microsoft and maybe a couple of other companies. OpenAI can say whatever, I’m not concerned about them, but I am concerned about open source alternatives getting priced out of that market. I am also concerned about what it does to previously available APIs, as we’ve seen with Twitter and Reddit.

    I get that it’s fashionable to hate on these things, and it’s fashionable to repeat the bit of misinformation about models being a copy or a collage of training data, but there are ramifications here people aren’t talking about and I fear we’re going to the worst possible future on this, where AI models are effectively ubiquitous but legally limited to major data brokers who added clauses to own AI training rights from their billions of users.



  • It seems like it would have been hard to avoid acknowledging the mistake, given that the mistake was very clearly lodged into somebody’s backyard, as opposed to still being attached to the rest of the plane, but alright.

    Hey, some people can have a human interaction when doing damage control during a crisis, and apparently this CEO I didn’t know about until just now isn’t one of those. There are now two different lessons to take away from this, apparently.

    For the record, flashy as this thing was it’s not that big of a deal, but it sure is funny and spectacular.


  • I don’t hate. I like a good keyboard.

    Now, do I think obsessing about the extremely specific properties of switches and keycaps and spending hours manually embedding each individual key component just to get a specific color combination makes sense as a hobby? Hell no. But then neither does collecting stamps or watching people’s grocery runs on Youtube. You do what you want, and this hobby at least lets you put whatever icon you please on the Bixby button.

    I’ll say this, though: that justification, which I have often used on myself and others, is a terrible rabbit hole of mismanaged finances. That is true of your monitor, your PC, your laptop, your phone, your keyboard, your chair, your desk… by the time you’re done you’ve spent a year’s salary setting up your workstation with absurdly luxurious, custom gear that sometimes makes no discernible difference. By all means get whatever stuff saves you from injury and provides comfort and satisfaction, but we all know in many of those categories the quality curve flattens out way before the price curve does.

    Also, I guarantee most people with a custom keyboard swap it out more often than people who are still using the crappy board that came free with their prebuilt or was given to them at work. I have dirt cheap Dell keyboards that still work fine. I may not love how they feel or sound, but it turns out we mastered the art of making buttons a while ago, and reliably closing a circuit with a conductive pad is not a particularly costly proposition. Hey, buy good keyboards for the feel or because you have a glitzy hobby, but don’t lie to yourself or me about it. You’re a grown person, own that superfluously expensive nerdy taste. If boomers could brag about their fountain pens, you can smugly bore your friends talking about how the injection-molded keycaps you bought match a specific Pantone.


  • Once the superheroes start to go it gets weird, because at some point the likeness is the least of the issues there. You’d probably want to redesign the costume anyway. Once you can publish stories with Superman or Batman and use the names and at least some of the core cast, why stick to the rest of the package, given how constantly it cycles?

    Only it’s all still going to be a minefield. Famously the Sherlock Holmes guys were out there trying to sue Netflix for having their Holmes be too emotional, which they argued was still protected. I mean, they lost, but outside of the fan productions that already exist are you gonna bet your business on that?



  • As far as I know there’s nothing keeping restaurants or bars from charging to use the toilets. Also as far as I know, and I’ve used public toilets in restaurants and bars in most of the countries you list many, many times over several decades, those are exceedingly rare and absolutely not the norm. That was true 40 years ago and it’s true today.

    The type of toilet is a different thing, and yeah, until maybe the late 90s a lot of Europe was no stranger to squat toilets. Honestly, for pubs and places where you’re mostly disposing of the drinks you’re having, I’m not even sure they’re a bad idea. Less accessible and whatnot, but I’m not sure a sit-down toilet with a patina of beer urine built up over years of sloppy drunken aim is a safer or cleaner proposition.