Explain the culture war in the US to me.
How long has it been going on?
What is it about?
What issues does it touch?
Pic unrelated
cool book thread faggot
In the summer of 2001, Newsweek ran a theme issue on the US culture wars. I still have it.
It boils down to this: the culture in Europe is conservative, and in the US it's democratic.
That means Europe has a strong centre. Everyone must relate to it, and if the centre moves, the rest of society moves with it. The centre can give you a sense of security, so your status won't be imperiled if you go "slumming".
Because the US lacks a strong centre, there's a constant war over issues that Europeans made up their minds about decades ago.
Too bad that we Europeans tend to invest in the worst sides of the US, totally oblivious to the cultural differences.