How and why did feminism become the de facto religion of millennials and their liberal elites? There may be no official church and no official hierarchy of priestesses, but feminist dogmas and demands have steadily gained mainstream acceptance over recent decades, and are now adhered to by the younger generations as if they were gospel. Feminist thought influences minds, laws, and policies in a way that only Christianity once did for older generations. But why? And is there any hope of reversing this trend before it is too late to save Western society from self-destruction?