I hear people on here defending the West a lot, but what intrinsically makes the West better than the East? Most of the West is weak and cucked beyond repair. Eastern Europeans, on the other hand, still have strong traditions and support masculinity. So why support the West? Wouldn't we be better off if we let it fall and let the East lead the world out of the cesspit of degeneracy?