• 1 Post
  • 332 Comments
Joined 11 months ago
cake
Cake day: August 5th, 2023

  • I understand the gist, but I don't mean that it's actively looking up facts. I mean that it's using bad information to give a result (as in, the information it was trained on says 1+1=5, so it gives that result because that's what the training data contained). The hallucinations, as the people studying them call them, aren't that. They happen when the training data doesn't have an answer for 1+1, so the LLM can't determine that the next likely word is 2. It doesn't have a result at all, but it's programmed to give one, so it gives nonsense.
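To illustrate that last point, here is a toy sketch (purely hypothetical, nothing like a real LLM's implementation): a next-word predictor always samples *some* continuation from whatever distribution it learned, so even when the training data held no good answer, it still emits one rather than saying "I don't know."

```python
import random

# Toy "training data": counts of which word followed the prompt "1+1=".
# None of the observed continuations is actually correct.
next_word_counts = {"5": 3, "banana": 1, "probably": 1}

def next_word(counts):
    # Sample a continuation weighted by how often it appeared.
    # Plain sampling has no built-in "no answer" option: something
    # always comes out, sensible or not.
    words = list(counts)
    weights = [counts[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

print("1+1=", next_word(next_word_counts))
```

The point of the sketch is just that the sampling step is obligated to return a token; whether the output is right depends entirely on what the counts contain.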


  • DuckDuckGo suffers from a lot of the same problems as Google and other search engines; it's just not getting worse as fast as Google is. It has still been getting worse and worse over time. I really dislike it when people point to another search engine like it's the end-all be-all and won't acknowledge that each one has problems, many of which overlap significantly. None of that fixes the problem or makes any of these companies backtrack on their terrible anti-user, anti-consumer policies.


  • This comment is not arguing in the spirit of the original comments or my own. Healthy people absolutely do want this technology for the sheer convenience it could provide; hence the number of science fiction stories about it. Assuming that anyone who would sign up for a clinical trial must be sick is an interesting take, especially in response to someone else positing that anyone who would do it is stupid or crazy. People can be perfectly healthy and still participate in clinical trials, for lots of reasons, including simply wanting to advance the science.



  • I don’t even think hallucination is the right word for this. It’s got a source, and it is giving you information from that source. The problem is that it’s treating the words at that source as completely factual despite the fact that they are not. From what I’ve read, hallucinations are more like when it queries its data set, can’t find an answer, and then generates nonsense in order to provide an answer it doesn’t have. I don’t think that’s the same thing.


  • Even you think something must be wrong with them if they’re agreeing to this. Just because you lean toward an ailment that would make someone desperate, rather than a deficit in cognitive function, doesn’t mean you’re any better. Like, I get it. It’s hard to imagine a regular person just deciding one day that it’s a good idea to let a company run by Elon Musk implant anything into their body (especially their brain). But this is a bit of a high-horse comment, isn’t it?


  • This is perhaps the most ironic thing about the whole Reddit data-scraping affair and Spez selling out Reddit’s user data to LLMs. Like, we spent so much time posting nonsense. Then a bunch of people became mods to course-correct subreddits where that nonsense could be potentially fatal. Then Reddit got rid of those mods because they protested. And now it’s bots on bots on bots posting nonsense, and they want their LLMs trained on that nonsense because reasons.


  • Pretty much all the big tech firms have done this. The problem is we only blame the middlemen: Sony, Amazon, Google, or whoever. But the companies providing the licenses for them to “sell” are a big part of the problem, and nobody ever wants to listen when I say they should be on the hook too. Like, I appreciate that it’s messed up to have your purchased media shadow-ganked. But at the same time, it’s fucked up for the licensing agreements to be what they are in the first place, and that’s absolutely on the companies that own the rights to digital media, who continue to lobby to maintain the status quo.