First, let me add a disclaimer: I don't hate Christians, nor do I hate Christianity. What I hate is Christian fundamentalism, or Christian nationalism — the idea that America is a Christian nation and should be governed as such.
The last few years have seen two things grow: awareness of this movement (through works like the documentary Jesus Camp and Michelle Goldberg's fascinating book, Kingdom Coming: The Rise of Christian Nationalism) and the movement itself.
What makes Christian nationalism so terrifying is that, if it succeeded, it would completely do away with our basic freedoms and rights. Freedom of religion would be gone, and women's rights would go down the drain, along with so much more that we currently take for granted.
Although many people consider them extremists (which they are) and don't think they'll ever actually succeed in politics, I feel they are a major threat. Children are being brainwashed into Jesus-loving zombies whose only goal is to spread their religion among the god-hating heathens. For an example, read this post from Feministe: Onward Christian Soldiers.
The idea of Christian nationalism, and of what would happen if its adherents took power in this country, has terrified me for years. What are your takes on this?