I’m aware that, at the moment, Florida is deemed unsafe for people to travel to, but what is generally the worst state to live in? Factors such as education, religious extremism, crime, cost of food, healthcare, and access to other resources are relevant. Will the “Deep South” states remain among the worst places to live due to traditionalism, or are more progressive states bound to grow worse?
The Southern states are still the worst, and I assume they will continue to be due to conservatism.
And that will remain true until education improves.
Meanwhile, DeSantis is somehow making Florida’s education even worse.