Wish we could go back to the 90s. Back then there was never talk of white nationalism, global warming, "kids' choice" gender identity crises, racism, misogyny, or any of the other terms they like to throw out there to degrade people. Just imagine growing up in the 90s and being told that 20 years later people in America would be called Nazis, George Washington statues would be torn down, we would have to debate whether drag queen story hour for kids is okay, people would actually defend the perversion of kids, and people who hate billionaires would keep supporting billionaires like Soros's Open Society Foundations, the Bilderberg Group, Gates, Rockefeller, and so on, while attacking the people trying to expose their cultural manipulation. Does it sound like anything has actually gotten better? No. Over the last 20 years we've only found more social differences and cultural taboos to exploit, breaking down culture and dividing ourselves.