To think that the church has allowed itself to become mired in sexual morality. With wars, starvation, and greed destroying creation, we’ve chosen this as our hill to die on?
Looking back at this, I find myself left with even more thoughts and questions.
Christianity is supposed to be an active religion of service works. A faith where we work together to care for the poor, sick, and needy. A faith in which those works are a central evidence of one’s belief. “Thus you will recognize them by their fruits.” (Matthew 7:20).
What we have in modern American evangelical Christianity isn’t a spirit of doing works to serve others. We instead have a mashup of faith and politics whose primary goal is enforcing our views upon others. We’ve transformed from a Christ-centered faith of “we shall” into a legalistic tribe of “thou shalt not.”
Christianity has an extreme identity issue in the Western world. Our most visible proponents are woolen wolves who barely know the faith they claim to preach. By their actions, those of us who do follow the way, the truth, and the life are cast as hypocritical bigots, either intent on wealth and fame or woefully ignorant.
Because of those who’ve chosen to spend their ministries condemning others rather than serving them, Christianity in America is withering away to nothing. Who wants to follow a supposedly love-based faith that calls people hell-bound, immoral miscreants? I know I sure don’t. And the Americans who believe in God but don’t go to church don’t either.