ClayTrainor
Yes, Christianity unequivocally teaches it. And not only that, the premise that men are evil by nature has been a solid foundation on which patriots throughout history have built the case against tyrants and against government power itself.
It is a GOOD thing to understand that men are evil by nature.
Thanks for clarifying.
Does this mean that babies are evil by nature too? If not, then at what point do they become naturally evil?