Get a dumb video projector.
For considerations related to evaluating such hardware, cf https://lemmy.ml/post/20849010/14003579
I mean, it’s very complex and very expensive for “just” a key, but if you want something fully auditable, maybe Precursor.dev is a good fit. It’s more than a key, but the point is that it’s as open as it can be. Honestly I’d consider it more a learning adventure than a tool at this point, but still, see https://www.bunniestudios.com/blog/2022/towards-a-more-open-secure-element-chip/ for the philosophy and https://github.com/betrusted-io/xous-core with Vault for the key aspect specifically.
Yep, already happily running on my PineTab2 thanks to DanctNix!
Just yesterday I pinned VLC on my KDE Plasma Task Manager. Why? Because this way I can directly open “Recent Files” from it. I discovered this functionality just last week with LibreOffice Draw. It’s so efficient, it absolutely changed how I use my computer daily!
But… why do I bother with this long example? Because IMHO that’s from KDE, not Debian. When a distro improves the UX, as I also wish, it’s mostly by selecting the best software to package and maintain (e.g. here KDE, though yes, it could also be their own custom-made package, even though that requires a lot more resources AND other distros could use it back, assuming it’s FLOSS). Arguably, the UX of the distribution itself is mostly limited to the installation process.
more cutting edge than Debian
In what aspect? How about Debian Unstable?
I’m personally on Stable but I do also have some AppImages (and recently discovered AM https://github.com/ivan-hc/AM thanks to someone here), my own ~/bin directory and quite a few tools. I feel there are very few things, from an end-user standpoint, that need to be done only through the distribution package manager. I believe having a stable OS but “cutting edge” specific apps (say Cura, Blender, etc.) is a good compromise. As you mention, Firefox over a PPA (which I also have) is such a good compromise. So I’m curious (genuinely, not trying to “convert” you to Debian on desktop): what is better on that front on Ubuntu rather than Debian?
Edit: to clarify, I both pay my bills (literally, and work too) and play (including recent VR, Windows-only games) on my Debian Stable desktop.
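The ~/bin plus AppImage setup mentioned above takes only a couple of shell steps; a minimal sketch, where Blender.AppImage is just a placeholder name (here an empty file) standing in for whatever AppImage you actually downloaded:

```shell
# Personal bin directory and Apps directory, with ~/bin on PATH for this session
# (add the export line to ~/.bashrc to make it permanent)
mkdir -p ~/bin ~/Apps
export PATH="$HOME/bin:$PATH"

# Hypothetical AppImage placeholder: mark it executable, symlink it into ~/bin
touch ~/Apps/Blender.AppImage
chmod +x ~/Apps/Blender.AppImage
ln -sf ~/Apps/Blender.AppImage ~/bin/blender
```

Since everything lives under ~/Apps and ~/bin, moving an old /home to a new install carries the applications along, which is exactly the point.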
a shortage of meaningful innovation
Well… a distribution IS a selection of packages and a way to keep them working together. Arguably the “only” innovation in that context is HOW to do that and WHICH packages to rely on. For the first, the “latest” real change could be considered immutable distributions, as on the SteamDeck, and declarative setup, e.g. NixOS. For the second… well I don’t actually know if anybody is doing that, maybe things like PrimTux for kids at schools in France?
Anyway, I agree but I think it’s tricky to be innovative there so let me flip the question, what would YOU expect from an innovative distribution?
Funny, exactly what I mentioned in another thread https://lemmy.ml/post/21238446/14210360
That’s why I moved back to Debian a few weeks ago. I’m checking this thread and article precisely to see what I’m missing and… arguably not much. If it’s “just” updates of some applications without any meaningful change, I don’t really see the appeal anymore.
I hope everybody criticizing the move either does not use products from Mozilla or, if they do, contributes however they can, up to their own capabilities. If you don’t, if you ONLY criticize, yet use Firefox (or a derivative, e.g. LibreWolf) or, arguably worse, use something fueled by ads (e.g. Chromium-based browsers), then you are unfortunately contributing precisely to the model you are rejecting.
FWIW I installed Debian a few times this weekend, both Sid and Bookworm, with a 2080Ti, and IIRC following the official documentation, e.g. https://wiki.debian.org/NvidiaGraphicsDrivers#Debian_12_.22Bookworm.22, was enough. Nothing exotic needed, namely: enabling contrib and non-free, updating, installing the driver.

Another “trick” I use is having an ~/Apps directory in which I have AppImages, binaries, etc. that I can bring from an old /home to a new one. It’s not ideal, bypassing the package manager, and it makes quite a few assumptions, architecture first, but in practice it works.
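As a sketch, those steps on Bookworm per the wiki page linked above; the sed one-liner assumes the stock Debian 12 sources.list already lists main and non-free-firmware, so check your file before editing it blindly:

```shell
# Enable the contrib and non-free components (Debian 12 "Bookworm" sources.list)
sudo sed -i 's/main non-free-firmware/main contrib non-free non-free-firmware/' /etc/apt/sources.list

# Refresh the package index, then install the proprietary NVIDIA driver
sudo apt update
sudo apt install nvidia-driver firmware-misc-nonfree
```

A reboot afterwards loads the new kernel module.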
I did more than 5 installs this weekend (for … reasons) and the “trick” IMHO is …
Do NOT install things ahead of actually needing them. (Of course this assumes things take minutes to install and thus that you will have connectivity.)
For me it meant Firefox was top of the list, VLC or Steam (thus NVIDIA driver) second, vim as I had to edit crontab, etc.
Quite a few are important to me but NOT urgent, e.g. Cura (for 3D printing) and OpenSCAD (for parametric design) or Blender. So I didn’t even install them yet.
So IMHO, as others suggested, docker/docker-compose, but only for the backend.
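For the backend case, a minimal compose file sketch; the service name, image, and ports here are all placeholders, not a recommendation:

```yaml
# docker-compose.yml — hypothetical backend service, image and ports are stand-ins
services:
  backend:
    image: nginx:alpine      # swap in whatever backend image you actually run
    ports:
      - "8080:80"
    restart: unless-stopped  # survives reboots, which is the reproducibility win
```

One `docker compose up -d` then rebuilds the backend identically on any fresh install.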
Now… if you really want a reproducible desktop install: NixOS. You declare your setup rather than apt install -y and “hope” it will work out. Honestly I was tempted, but as installing a fresh Debian takes me about 1h and I do it maybe once a year at most, no need for me (yet).
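To make the “declarative” point concrete, a fragment of a NixOS configuration.nix; the package selection is purely illustrative:

```nix
# /etc/nixos/configuration.nix (fragment) — packages are declared, not installed imperatively
environment.systemPackages = with pkgs; [
  firefox
  vlc
  blender
];
# Applied with: sudo nixos-rebuild switch
```

The whole system state follows from that file, so reinstalling means copying one config rather than replaying a history of apt commands.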
Out of curiosity, why? They have their own TPU which they claim to be quite efficient. Is it because they can’t produce enough? Or because they have to resell NVIDIA for their own cloud, Google Cloud, to customers because they prefer to stick to CUDA? Or something else?
No and to be honest without a clear comparison with the advantages AND disadvantages with the most popular solutions, e.g containers with implementations like Docker or Podman, I don’t think I ever will.
Obviously it’s nice to have alternatives which I bet can be interesting in specific use cases but without a way to understand in which specific situations it would be worth investing to learn the tooling, principles, etc then I would, naively, stay with the status quo.
TL;DR: any comparison vs Docker?
It is; if you are not ready to tinker, I do not recommend it.
Yet, it works as-is, assuming you don’t need to work wirelessly. I use it on a nearly daily basis and it’s stable.
My documented process https://fabien.benetou.fr/Content/SelfHostingArtificialIntelligence but honestly I just tinker with this. Most of that isn’t useful IMHO except some pieces, e.g STT/TTS, from time to time. The LLM aspect itself is too unreliable, and I do like 2 relatively recent papers on the topic, namely :
which are respectively saying that the long-tail makes it practically impossible to train AI to be correct in rare cases, and that “hallucinations” is a misnomer for marketing purposes, to be replaced instead by “bullshit”: output used to convince people without caring for veracity.
Still, despite all this criticism it is a very popular topic, hyped up to be the “future” of computing. Consequently I did want to both try it and help others do so, rather than imagine it was restricted to a kind of “elite”. I try to keep the page up to date but so far, to be honest, I do it mostly defensively: to be able to genuinely criticize because I did take the time to try, not reject it wholesale.
PS: I do also try the state of the art, both closed and open-source, via APIs, e.g. OpenAI or Mistral, but only for evaluation purposes, not as tools that are part of my daily usage.
From video description:
Reason 1: Gaming
Reason 2: Creative Apps
Reason 3: Foobar2000 (my music player)
Reason 4 (bonus): Fussing, fussing, fussing!
Gaming? Fair point.
Unless it’s for games that use shitty anticheat solutions, it’s probably not a good reason anymore thanks to the SteamDeck: a LOT of games do work, and it’s possible to check beforehand via ProtonDB.
So it was a fair point 5 years ago; now most AAA games, including VR games, work without tinkering.
Too late, back to Debian proper.