If you have the impression that there’s a dominant, homogeneous “mass” sharing the same opinion, you are right there in the middle of an information bubble and a victim of those “algorithms”.
Would that make a difference?
I’d like to share your optimism, but what you’re suggesting we “deal with” isn’t “AI” (which has been present in web search for decades as increasingly clever summarization techniques…) but LLMs, a very specific and especially inscrutable class of AI designed to “sound convincing”, with no regard for correctness or truthfulness. Effectively, more of people’s time will be wasted reading invented or counterfeit stories (with no easy way to tell them apart); first-hand information will become harder to source and verify as it gets increasingly diluted into the AI-generated noise.
I also haven’t seen any practical advantage to using LLM prompts over traditional search engines in the general case: you end up typing more, for the sake of “babysitting” the LLM, and get more to read as a result (which is, again, aggravated by the fact that you are now given a single-source, one-sided view on the matter, without citations, references, or reproducible steps leading to that conclusion).
Last but not least, LLMs are an environmental disaster in the making: the computational cost is enormous (in new hardware and electricity), and we are at a point where all the companies partaking in this new gold rush are selling us a solution in need of a problem, every one of them having to justify the expenditure (so far, none is making a profit out of it, which would be the first step towards offsetting the incurred pollution).
It’s part of the reason why I think decentralized services could be the future. Lemmy or Mastodon can have a lot of small servers with reasonable costs spread across many admins, instead of one centralized service that costs a significant amount to run.
Ohh, absolutely, or rather, it is the past. I mean, the internet was built that way, as a resilient federation of networks and protocols. Lemmy could be seen as us just rediscovering email after the tech giants almost succeeded in killing it. We should approach all the services we use by asking ourselves basic sustainability questions:
is that thing open source?
self-hostable?
does it federate/interoperate with equivalent services?
can I pull my data out of it/relocate to another provider on a whim?
if not, is this a trustworthy and ethical business?
is it profitable?
are there open financial records available showing where/for what the money is going?
is it at risk of being acquired?
is it subject to foreign/unlawful interference?
Etc., etc.
Until I can give a laptop with Linux to my neighbour without also needing to provide support, it’s not there yet.
I mean, isn’t your neighbor already getting Windows support from his son or nephew anyway? Let’s not pretend that there exists a magical and perfect OS for those who don’t want to learn one. Some learning is required, whichever the OS, and it would be hard to convince me that a current preinstalled Linux is more difficult to handle than a current preinstalled Windows.
What Windows has going for it is that it’s the devil most people know/got exposure to (thanks to Microsoft’s schemes and monopolistic practices); there is nothing inherently better or easier about it (and arguably quite the opposite).
What I found compelling about the sync is that you can have your other machines’ histories there with you, but in the background, behind a different shortcut, just in case you need to re-run or check that command you ran somewhere else a few years ago…
As I said, I haven’t used that yet, but that’s in many ways more appealing than having to SSH onto said machine (assuming it’s even possible).
I figured starship.rs, but not the CTT part; any pointer to help me?
Been using it for months; haven’t gotten to use the sync yet. My only regret so far is that it doesn’t support case-insensitive search, which is a pretty big deal for me, unfortunately.
I mean, the internet was fine until the advent of global “engagement-driven social networks” that practically became filter bubbles optimizing for ads delivery, then echo chambers for political gain, down to self-sustained propaganda machines for geopolitical sabotage. The early internet felt like village-scale communities centered around a single purpose/interest, where people came in the first place to contribute something or help each other. Trolls did exist, but there was no tolerance for them, because the absence of centralization meant no community had to accept them in the first place.
Would be nice to be able to run WG on the NAS directly and not need a separate server, wouldn’t it? I believe there are a few Go/Rust userspace WG implementations out there, but I don’t know if anyone’s using them for anything like that.
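For illustration only, here’s a rough Go sketch of what embedding a userspace WireGuard endpoint could look like, using the wireguard-go packages; the interface name, keys and addresses below are placeholders I made up, not a vetted setup, and whether this fits on a given NAS is anyone’s guess:

```go
// Minimal sketch: embedding a userspace WireGuard endpoint via wireguard-go
// (no kernel module needed). Keys/IPs are dummy placeholders to be replaced.
package main

import (
	"log"

	"golang.zx2c4.com/wireguard/conn"
	"golang.zx2c4.com/wireguard/device"
	"golang.zx2c4.com/wireguard/tun"
)

func main() {
	// Create a userspace TUN interface (still needs the right privileges).
	tunDev, err := tun.CreateTUN("wg0", device.DefaultMTU)
	if err != nil {
		log.Fatalf("create tun: %v", err)
	}

	dev := device.NewDevice(tunDev, conn.NewDefaultBind(),
		device.NewLogger(device.LogLevelError, "wg: "))
	defer dev.Close()

	// Configuration goes through WireGuard's UAPI text format (hex-encoded
	// keys); the zero/one strings below are placeholders, not real keys.
	cfg := "private_key=0000000000000000000000000000000000000000000000000000000000000000\n" +
		"listen_port=51820\n" +
		"public_key=1111111111111111111111111111111111111111111111111111111111111111\n" +
		"allowed_ip=10.0.0.2/32\n"
	if err := dev.IpcSet(cfg); err != nil {
		log.Fatalf("configure: %v", err)
	}
	if err := dev.Up(); err != nil {
		log.Fatalf("bring up: %v", err)
	}
	select {} // keep serving
}
```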
What is “old arse” to you might be blazing fast and great for someone else (potentially in a less fortunate area of this world), and besides that, no matter your or my sensibilities, if it works, it works, and it should be kept that way as long as it has a purpose and the hardware permits it.
Except for a marginal fraction of the top YouTubers, aren’t most of them getting paid by injecting sponsored links and through donations/patronage these days? It seems that the deal you are referring to has been off the table for a majority of YouTubers for a very long time now, and I don’t see why other platforms couldn’t be just as good as, or even healthier than, YouTube at providing them that kind of revenue.
I can’t pretend to know the future, but if you read between the lines and the justifications provided, this isn’t really about the AGPL per se, but about Element brokering AGPL exceptions. Practically, we can expect all kinds of forks with open-core options that might enshittify the user experience in different ways, and a further solidification of Element’s single-handed control over Matrix (which has been a prime concern for many years). Matrix gets closer by the day to the closed-source centralized silos it was first pretending to oppose.
Interesting. Were the apps/features installed comparable between the OC and NC instances? I can’t even find an “email” equivalent app for owncloud from their marketplace.
I don’t want to sound like I’m coming to the defence of NC, but I’d be curious to find a comparison, as factual as possible, between “bare-bones NC” and “bare-bones OC”.
Matrix’s problems become unmanageable at scale, but the effects of the underlying complexity can be felt long before that: https://telegra.ph/why-not-matrix-08-07
If you read between the lines, Matrix 2 is practically about handing the client state over to the server (what they refer to as “sliding sync”). Realistically, this is an admission that the protocol is too complex to be handled efficiently on users’ devices. I’m not saying there aren’t clear benefits (and new trade-offs) to the approach, just that, in the grand scheme of things, the complexity is shifted elsewhere (and admins foot a larger bill).
Yup, like pretty much everyone else :) https://nlnet.nl/project/XMPP-MLS/
Please, don’t recommend Pidgin: it’s a security hellhole, and a pretty terrible XMPP client at that. If you want something with a similar vibe, check out https://dino.im/ or https://gajim.org/ if you are more on the “power-user” side of things :)
Isn’t that the essence of the issue, that those models are loaded with biases, that might or might not overlap with dominant ones in inscrutable ways, hence producing new levels of confusion and indirection?