For me, it’s realizing that Hollywood isn't just a dream factory, but an industry where a bunch of powerful men have historically coerced women into sex in exchange for a chance to appear in a movie.
Also, learning how the media and political elites constantly push identity politics. It feels like a deliberate attempt to keep everybody fighting a culture war so they're too distracted to unite around a class war against economic exploitation.
And probably the biggest one is foreign policy. Growing up, you hear about America spreading freedom and democracy around the world. The more you learn, the more you realize it's actually about orchestrating coups, destabilizing regions, and stealing resources, all to exploit people and lower costs for corporations.



Manly video games like Gears of War. They are making everything woke, gay, and bland now. Even GTA 6 is going in that direction. I remember when Vice City came out, it became popular because of the media shitstorm it caused, with even politicians talking about regulating that kind of content.