I guess the libraries and schools can make the decision and throw out things they don’t find useful.
It probably also depends on the book. I have tons of outdated books on obscure topics within engineering, science, and computing. I doubt anyone would check out my 1995 book on the Vi text editor from a library. Although, if I’m lucky, perhaps it could be a collector’s item some day. In reality, I’m probably going to just say “thank you for helping me so many years ago” and respectfully recycle the book.
Mind if I ask where? I would love to see the glow worms some day. I have only seen videos, but it looks amazing.
A bit more historic, but still very relevant: the FBI used surveillance in repeated attempts to discredit Martin Luther King Jr. It’s chilling how they used the information they gathered to try to get rid of him any way they could. They even tried to use it to pressure him into committing suicide.
I have tried installing Linux on my Surface, but unfortunately I haven’t found a configuration that works for me yet. There are just a lot of small features that didn’t work, like the touch screen keyboard, the ability to scroll with a finger while writing with the stylus, and more. I can probably get it there with enough work, but for now I’m taking the lazy way out and running Windows. Still better than iOS though, yuck.
Or “let’s finish setting up your PC” full screen on a four-year-old system. Then you click through just to find the only options are 1) share more data with Microsoft, or 2) make Edge your default browser. The day I find a decent note-taking tablet running Linux, Windows is dead to me.
Regardless of the other stuff, deputy spam catcher is an extremely valuable contribution to the community. I have requested this of people on Lemmy and been pleasantly surprised by how willing people are to help.
This especially holds true for niche subjects. I love back country skiing so I created and moderate /c/Backcountry . The mirror community on Reddit is extremely small and dumb crypto spam sits around for days before removal because there is only one mod. He seems like a cool person dedicated to the sport, but he just can’t be there all the time. I created the community in the hopes that I can invite and keep a larger mod team.
When you fill out the web form, your password is stored in plain text in the browser and accessible via JavaScript. At that point, a JavaScript function checks requirements like length and then does the salting/hashing/etc. and sends the result to the server.
You could probably come up with a convoluted scheme to check requirements server side, but it would weaken the strength of the hash, so I doubt anyone does it this way. The downside of client-side checking is that a tenacious user could bypass the password requirements by modifying the JavaScript. But they could also just choose a dumb password within the requirements, so it doesn’t matter much… “h4xor!h4xor!h4xor!” fits most password requirements I have seen but would probably be tried pretty quickly by password crackers.
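For anyone curious, here’s a toy sketch of that client-side flow in Python (the requirement rules and PBKDF2 parameters are made up for illustration, not any particular site’s scheme):

```python
import hashlib
import os

# Illustrative requirements; real sites vary.
MIN_LENGTH = 12

def check_requirements(password: str) -> bool:
    """Client-side check run before the password ever leaves the form."""
    return (
        len(password) >= MIN_LENGTH
        and any(c.isdigit() for c in password)
        and any(not c.isalnum() for c in password)
    )

def hash_for_transmission(password: str, salt: bytes) -> bytes:
    """Salt and hash the plaintext so only the digest is sent to the server."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

password = "h4xor!h4xor!h4xor!"
assert check_requirements(password)  # passes typical length/digit/symbol rules
digest = hash_for_transmission(password, os.urandom(16))
# The server only ever sees `digest`, so it can't re-check the rules itself.
```

That last comment is the crux: once only the digest crosses the wire, the server has no way to verify the original password met the rules, which is why the checks end up client side.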
Perhaps they validate the passwords client side before hashing. The user could bypass the restrictions pretty easily by modifying the JavaScript of the website, but the password would not be transmitted un-hashed.
It is worth pointing out that nearly any password restriction like this can be made ineffective by the user anyway. Most people who are asked to put a special character in their password just add a ! to the end. I think length is still a good validation, though it runs into the same issue @randombullet@lemmy.world is asking about.
In the US it isn’t too hard to get a title for a kit car. It needs an inspection and emissions test in most states, but it’s certainly possible and people do it regularly.
No worries, thanks for the response!
Interesting answer. Scanning through the Wikipedia article on kiki/bouba, it makes sense that we don’t really have solid evidence that it isn’t a learned trait. It would be hard to find a population of people who developed language independently of all other humans and see whether they maintain the strong correlation in naming kiki and bouba.
So I guess that brings up another question I have kinda wondered about. What is the most “isolated” spoken language on the planet? By that, I mean the language that evolved most independently of other spoken languages. Is there anything interesting that can be learned by comparing such a language to the European languages that are dominant among the global population?
Computer science. However, statistics is more of a hobby than anything. I am just intrigued by the idea of federated social media in general, so I have thought a bit about how I would personally make it work. Perhaps I will write some more in-depth blog posts about my ideas at some point.
Spam detectors are pretty opaque by their nature. In contrast, karma is pretty easy to understand: “x number of people upvoted comments or posts from this user”. This lets people understand a score even if they don’t agree. If a karma replacement behaved like a spam detector, it would probably just annoy people.
Sports brackets may be a better analogy. They are developed with statistics in mind but are understandable to the average sports fan. I think a karma replacement should have similar properties.
Here is a good general explanation of Bayesian inference.
I think @jayrhacker@kbin.social is suggesting using such techniques to predict “troll” or “not troll” given posting history, removed comments, etc. My personal thought is that whatever system replaces karma should be understandable to the typical user. I think it’s possible Bayesian inference could be used in developing the system, but the end system should be explainable without it.
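To make that concrete, here’s a toy naive Bayes sketch of the “troll or not troll” idea (the comments and labels are entirely made up; a real system would train on actual posting histories):

```python
import math
from collections import Counter

# Toy training data: (words in a comment, label). Made up for illustration.
training = [
    ("free crypto click here".split(), "troll"),
    ("click this link now".split(), "troll"),
    ("great ski conditions today".split(), "ok"),
    ("thanks for the detailed answer".split(), "ok"),
]

# Count word occurrences per label.
word_counts = {"troll": Counter(), "ok": Counter()}
label_counts = Counter()
for words, label in training:
    word_counts[label].update(words)
    label_counts[label] += 1

vocab = {w for words, _ in training for w in words}

def log_posterior(words, label):
    """Naive Bayes: log P(label) + sum of log P(word|label), add-one smoothed."""
    total = sum(word_counts[label].values())
    lp = math.log(label_counts[label] / sum(label_counts.values()))
    for w in words:
        lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return lp

def classify(comment: str) -> str:
    words = comment.split()
    return max(("troll", "ok"), key=lambda lab: log_posterior(words, lab))

print(classify("free crypto"))          # leans troll
print(classify("thanks for the answer"))  # leans ok
```

The math is simple, but notice the output is just a label with no human-readable justification, which is exactly the explainability problem with replacing a karma number.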
It would be nice if you could whitelist sites for cookies. That way you can stay logged into things like email.
How common are things like the bouba/kiki effect in linguistics? It seems there are some sounds that are based on something other than learned behavior, how much does this cause commonality in real language?
I was wondering if someone would bring up search engine indexing. Google certainly has the upper hand for LLM training data after Reddit’s API change, since they have the comments indexed anyway. This is a big reason I fear these API changes: they concentrate power in the hands of already powerful companies.