I’m aware that, at the moment, Florida is deemed unsafe for people to travel to, but what is generally the worst state to live in? Factors such as education, religious extremism, crime, cost of food, healthcare, and access to other resources are relevant. Will “Deep South” states remain among the worst places to live due to traditionalism, or are more progressive states bound to grow worse?
Tell that to anyone in the LGBTQ+ community. Or anyone seeking an abortion. Or any child wanting to learn about systemic racism. Or anyone openly admitting to being atheist in school. Or anyone refusing to recite the Pledge of Allegiance.
The news about Florida is SPOT-FUCKING-ON, and is why we moved from there six years ago to find a better life.