There’s almost always at least a little ASM sprinkled into any kernel, so that’s not a big deal.
OTOH, there is the factor of “you know how Chrome takes up 2GB per tab? What if that was a whole OS?”
You’re probably in a country that got a ton of allocations in the 90s. If you came from a country that was a little late to build out their infrastructure, or even tried to set up a new ISP in just about any country, you would have a much harder time.
The Rust compiler tends to turn my impostor syndrome to 11. I assume she has some kind of humiliation kink and I do not consent.
JSON and XML can be “real” languages. Mostly because of people who didn’t stop to ask if they should.
This is the same language where you have to say PLEASE sometimes or it won’t compile. But if you say PLEASE too much, the compiler will think you’re pandering and also refuse to compile. The range between too polite and not polite enough is not specified and varies by implementation.
import tensorflow # we don't actually use this anywhere, but my boss told the client we use AI
Every time I see a defense of IPv4 and NAT, I think back to the days of trying to get myself and my roommate to play C&C: Generals together online, with a 2v2 game, with one of us hosting. Getting just the right combination of port forwarding working was more effort than us playing C&C: Red Alert on dial up when we both lived at home.
With IPv6, the answer is to open incoming traffic on the port(s) to the host machine (or just both machines, since the other guy might host next time). With IPv4, we have to have a conversation about port forwarding and possibly hairpin routes on top of that. This isn’t a gate for people “who know what they’re doing”, it’s just a bunch of extra bullshit to deal with.
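For illustration, here’s roughly what that difference looks like on a hypothetical Linux router running nftables (the addresses, port number, and table/chain names are made up for the example, not taken from any real setup):

```shell
# IPv6: the host already has a globally routable address, so one stateful
# firewall rule permitting inbound game traffic is all it takes.
nft add rule ip6 filter forward ip6 daddr 2001:db8::42 udp dport 8088 accept

# IPv4 behind NAT: you need a DNAT (port forward) rule on top of the filter
# rule, it can only point at ONE LAN host at a time, and LAN-to-LAN traffic
# via the public IP still needs hairpin NAT handled separately.
nft add rule ip nat prerouting udp dport 8088 dnat to 192.168.1.42
nft add rule ip filter forward ip daddr 192.168.1.42 udp dport 8088 accept
```

The IPv6 case is one rule per host; the IPv4 case is per-port plumbing that has to be renegotiated every time the other player wants to host.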
There’s one practical thing. Routers have had years to optimize IPv4 routing, which has to be redone for IPv6. Same with networking stacks in general.
In theory, IPv6 should be faster by not having to do bullshit like CGNAT. There’s every reason to think it’ll realize that advantage if we just make it happen.
The other app off the top of my head is VoIP. You should be able to “dial” a number directly. Most solutions go through the company’s data center first in order to pierce through NAT. Which makes it more expensive, less reliable, slower, and more susceptible to snooping.
There’s a “if you build it, they will come” effect here. Once you can address hosts directly, a whole bunch of things become better, and new ideas that were infeasible are now feasible. They don’t exist now because they can’t.
There ought to be more servers.
Will the app for the smart thermostat be updated three years from now and still be useful? If it was instead a web server app on a routable IP, it wouldn’t matter provided they didn’t fuck up the authentication and access control.
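To make the idea concrete, here’s a toy sketch of a thermostat exposing itself as a tiny authenticated web server (the token handling and names are invented for illustration, not any vendor’s API; a real device would need TLS and proper credential management):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical token; a real device would generate one at setup time.
API_TOKEN = "example-token"

class ThermostatHandler(BaseHTTPRequestHandler):
    setpoint = 21.5  # current setpoint in degrees C

    def do_GET(self):
        # Access control: refuse any request without the bearer token.
        if self.headers.get("Authorization") != f"Bearer {API_TOKEN}":
            self.send_response(401)
            self.end_headers()
            return
        body = str(self.setpoint).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def make_server(host="127.0.0.1", port=0):
    """Bind the server; port 0 lets the OS pick a free port.
    On a globally routable IPv6 host you'd bind "::" instead."""
    return HTTPServer((host, port), ThermostatHandler)
```

The point is that nothing here depends on a vendor’s cloud or app store staying alive: plain HTTP on a routable address keeps working as long as the auth isn’t botched.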
I’m starting to think the way to go isn’t set stories in the sprint at all. There’s a refined backlog in priority order. You grab one, do it, grab the next. At the end of the two week period, you can still have a retro to see how the team is doing, but don’t worry about rollover.
Alternatively, don’t think of Agile as a set instruction manual, but rather a group of suggestions. Have problem X? Solution Y worked for many teams, so try that.
They primarily use evaporative cooling. Way less energy use, but no, it doesn’t get returned.
Datacenters moved to using evaporative cooling to save power. Which it does, but at the cost of water usage.
Using salt water, or anything significantly contaminated like grey water, would mean sediment gets left behind that has to be cleaned up at greater cost. So yes, they generally do compete with drinking water sources.
There’s no way nuclear gets built out in less than 10 years.
Unlike purchasing things for imaginary gods, carbon credits could work in theory. At least well enough to be part of the solution. That is, if they were properly regulated around strategies that actually absorb carbon and everyone were forced to be honest and transparent.
Which none of them do, of course.
It’s definitely a movie cut from the same cloth as Catcher in the Rye. You love it as a teenager because you vibe with the main character. Then you grow up and see how self-polluting and obnoxious the character is.
I did love the exchange between Mary McDonnell’s character and the fundie lady. “Do you know who Graham Greene is?” “Please, I think we’ve all seen Bonanza”. It has a layer of humor that couldn’t have been intentional. The fundie lady is mixing up Graham Greene with Lorne Greene, and Mary McDonnell would go on to play the political half of Lorne Greene’s character in the Battlestar Galactica reboot a few years later.
I thought the first one was at least fun, but had some obviously annoying parts that should have been cut from any sequel.
Then the second one comes out, and the annoying parts of the first are the entire movie of the second.
So here’s two links about Alan Wake 2.
First, on a 1080ti: https://youtu.be/IShSQQxjoNk?si=E2NRiIxz54VAHStn
And then on a ROG Ally (which I picked because it’s a little more powerful than the current Steam Deck, and runs native Windows): https://youtu.be/hMV4b605c2o?si=1ijy_RDUMKwXKQQH
The ROG Ally seems to be doing a little better, but not by much. They’re both hitting sub-30fps at 720p.
My point is that if that kind of handheld hardware becomes typical, combined with the economic problems of continuing to make highly detailed games, then Alan Wake 2 is going to be an aberration. The industry could easily pull back on that, and I welcome it. The push for higher and higher detail has not resulted in good games.
A lot of those games are also hot garbage. Baldur’s Gate 3 may be the only standout title of late where you don’t have to qualify what you like about it.
I think the recent layoffs in the industry also portend things hitting a wall; games aren’t going to push limits as much as they used to. Combine that with the Steam Deck-likes becoming popular. Those could easily become the new baseline standard performance that games will target. If so, a 1080ti could be a very good card for a long time to come.
Yeah, if anything, Apple is behind the curve. Nvidia/AMD/Intel have gone full cocaine nose dive into AI already.
Not sure about GP, but that’s basically what we did under “SAFe” (Scaled Agile Framework). PI planning means taking most of a sprint to plan everything for the next quarter or so. It’s like a whole week of ticket refinement meetings. Or perhaps 3 days, but when you’ve had 3 days of ticket refinement meetings, it might as well be the whole work week for as much stuff as you’re going to get done otherwise.
It’s as horrible as you’re thinking, and after a lot of agitating, we stopped doing that shit.