• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • Yeah, I reckon having a split of the frontend and the backend results in about half the complexity in each. If you have multiple frontends, you can upgrade whichever one is least important to see if there are any problems.

    I didn’t really answer your original question.

    When I was using NUCs I was using Linux Mint, which uses Cinnamon by default as the desktop environment. Originally I changed it to some really minimal window manager like twm, but at some point it became practical to not use one at all and just run Kodi directly on X.

    If I was going back to a Linux frontend I’d probably evaluate LibreELEC, as it has a lot of the sharp edges sorted out.
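
    For reference, running Kodi without a window manager at all can be as simple as a minimal `~/.xinitrc` sketch like the one below (package names and paths vary by distro; `kodi-standalone` is the wrapper most distros ship for exactly this purpose):

```shell
# ~/.xinitrc -- minimal sketch: no window manager or desktop environment,
# Kodi is the only X client, and the X session ends when Kodi exits.
exec kodi-standalone
```

    Starting X with `startx` (or a display manager pointed at this session) then boots straight into Kodi.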


  • I used to run Kodi on Linux on Intel NUCs connected to all our TVs a while ago. I don’t remember it being particularly unreliable. The issue that made me change that setup was the lack of 4K hardware decoding support for newer codecs.

    What I’ve had doing that frontend function ( Kodi, Jellyfin, Disney Plus, Netflix etc. ) for the last few years is three Nvidia Shield TV Pros, which have been absolutely awesome. They are an old product now and I suspect Nvidia are too busy making money elsewhere to work on a newer generation version of them.

    The biggest surprise improvement was how easily I could configure their remotes to send power on/off and volume up/down IR codes to the TV or the AV amp they were using, so you only need a single remote.

    Separating the backend function out from the frontend in the lounge has drastically reduced the broken mess that used to happen around OS upgrades.



  • The most impressed I’ve been with hardware encoding and decoding is with the built-in graphics on my little NUC.

    I’m using a NUC10i5FNH, which was only barely able to transcode one vaguely decent bitrate stream in software. Passing the hardware transcoding through to a VM looked too messy for me, so I decided to reinstall Linux straight on the hardware.

    The hardware encoding and decoding performance was absolutely amazing. I must have opened up about 20 Jellyfin windows that were transcoding before I gave up and called it good enough. I only really need about 4 at most.

    The graphics on the 10th generation NUCs is the same sort of thing as on the 9th and 10th gen desktop CPUs, so if you have an Intel CPU with onboard graphics, give it a try.

    It’s way less trouble than the last time I built a similar setup with Nvidia. I haven’t tried a Radeon card yet, but the Jellyfin docs are a bit more negative about AMD.
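
    If you want to sanity-check the Intel path before pointing Jellyfin at it, something like this works (assumes the `intel-media-driver` and `libva-utils` packages are installed; `input.mkv` is a placeholder file):

```shell
# List the VAAPI encode/decode profiles the iGPU actually supports
vainfo

# The DRM render node Jellyfin's VAAPI transcoder needs access to
ls -l /dev/dri/renderD128

# One-off VAAPI transcode to confirm the path works end to end
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -hwaccel_output_format vaapi -i input.mkv \
       -c:v h264_vaapi -b:v 4M out.mkv
```

    If `vainfo` lists the codecs you care about and the ffmpeg run keeps the CPU mostly idle, Jellyfin’s VAAPI setting should behave the same way.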






  • I’ve been using Linux for something like 27 years, and I wouldn’t say I’m evangelical or particularly obsessed.

    I started using it because some of the guys showing up to my late-90’s LAN parties were dual booting Slackware, and it had cool looking boot up messages compared to DOS or Windows at the time. The whole idea of dual booting operating systems was pretty damn wild to me back then too.

    After a while it became obvious to me that Slackware '96 was way more reliable than DOS or Windows 95. A web browser like Netscape could take out the whole system pretty easily on Windows, but when Netscape crashed on Linux you just opened up a shell, killed off whatever was left of it and started a new one.

    I had machines that stayed up for years in the late 90’s and that was pretty well impossible on Windows.



  • I’ve been running Linux for 100% of my productive work since about 1995. I used to compile every kernel release and run it for the hell of it from about 1998 until something like 2002, and I worked for a company that sold and supported Linux servers as firewalls, file servers etc.

    I had used ET4000s, S3 968s and Trio 64s, the original i740, Matrox G400s with dual CRT monitors and tons of different Nvidia GPUs throughout the years, and hadn’t had a whole lot of trouble.

    The Nvidia Linux driver has made me despair for desktop Linux over the last few years. Not enough to actually run anything different, but it did seem like things were on a downward slide.

    Some of the bugs I hit:

    - Weird flashing of sections of other windows when dragging a window around.
    - Individual screens that would just start flashing sometimes.
    - Chunky slideshow window dragging when playing video on another screen.
    - Screens re-arranging themselves in baffling orientations after the machine came back from the screen being locked.
    - The animation rate stuck at 60 Hz on three 170 Hz monitors because I also had a TV connected to display network graphs ( that update once a minute ).
    - Panels on Cinnamon, and later on KDE, that I must have re-set up a hundred times because they would move to another monitor, sometimes underneath a different one, or just disappear altogether when I unlocked the screen.
    - A desktop environment at home that would sometimes just freeze up if the screen was DPMS blanked for more than a couple of hours, requiring me to log in from another machine and restart X.

    I had two different 6 GB 1060s and a 1080 Ti in different machines that would all have different combinations of these issues.

    I fixed maybe half of the issues that I had. Loaded custom EDID on specific monitors to avoid KDE swapping them around, did wacky stuff with environment variables to change the sync behaviour, used a totally different machine ( a little NUC ) to drive the graphs on the TV on the wall.
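
    The custom-EDID trick uses the kernel’s DRM firmware-EDID mechanism. A sketch of how it goes, with placeholder file and connector names:

```shell
# Pin a monitor's EDID so the connector always reports the same identity.
# my-monitor.bin is a placeholder EDID blob (e.g. dumped from
# /sys/class/drm/*/edid while the monitor is behaving).

# 1. Put the blob where the kernel firmware loader can find it
sudo mkdir -p /usr/lib/firmware/edid
sudo cp my-monitor.bin /usr/lib/firmware/edid/my-monitor.bin

# 2. Tell the DRM layer to use it for one specific connector by adding
#    this to the kernel command line (e.g. GRUB_CMDLINE_LINUX):
#      drm.edid_firmware=DP-1:edid/my-monitor.bin
```

    With that in place the desktop environment sees a stable monitor identity across hotplugs and lock/unlock cycles, which is what stops the swapping.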

    Because I had got bit pretty hard by the Radeon driver being a piece of trash back in something like 2012, I had the dated opinion that the proprietary Nvidia driver was better than the Radeon driver. It wasn’t til I saw multiple other folks adamant that the current amdgpu driver is pretty good that I bought some ex-mining AMD cards to try out on my desktop machines. I found out that most of the bugs that were driving me nuts were just Nvidia bugs, rather than bugs in Xorg or any other Linux component. KDE also did a bunch of awesome work on multi-monitor support, which meant I could stop all the hackery with custom EDIDs.

    A little after that I built a whole new work desktop PC with an AMD GPU ( and CPU FWIW ). It has been great. I’m down from about 15 annoying bugs to none that I can think of offhand running KDE. It all feels pretty fluid and tight now without any real work from a fresh install.


  • A 2 gigabit event isn’t big enough to be considered a real attack; a service like Cloudflare can sink a 2 terabit attack every day of the week.

    Building a DDoS protection service ( that isn’t just black holing traffic ) starts with having enough bandwidth to throw away the attack volume plus keep your desired traffic working and have a bit of overhead to work your mitigation strategies.

    What this means is that to DIY a useful service you start by buying a couple of terabits of bandwidth in ‘small’ chunks of a hundred gigabits or so in most peering locations around the globe, and then you build a proxy layer like Cloudflare’s on top of it with a team of smart dudes to automate outsmarting the bad guys.

    I don’t like Cloudflare either, but the barriers to entry in this industry are epic.
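
    To put rough numbers on the capacity math above ( all figures illustrative, not from any provider ):

```python
# Back-of-the-envelope DDoS-absorption capacity, per the recipe above:
# enough bandwidth to eat the attack, keep legit traffic flowing,
# plus some overhead for mitigation.
attack_gbps = 2000   # a 2 terabit/s attack
legit_gbps = 200     # traffic you actually want to keep serving
headroom = 1.5       # spare capacity for running mitigation strategies

needed_gbps = (attack_gbps + legit_gbps) * headroom

# Bandwidth bought in 'small' ~100 Gbit chunks spread across peering
# locations; ceiling division gives the number of chunks required.
chunk_gbps = 100
chunks_needed = int(-(-needed_gbps // chunk_gbps))

print(needed_gbps, chunks_needed)  # -> 3300.0 33
```

    So even a modest mitigation target lands you at tens of 100-gigabit commitments before you’ve written a single line of the proxy layer.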