Why is it that so many Christian conservatives are trying so hard to make a lost and sinful world adhere to the truth of God’s Word? Especially when the Church itself can’t even agree on what the Bible says about certain things, and Christians have as high a divorce rate as the world, etc. Where should the line be drawn between staying out of politics and being politically active to the point of trying to make sinners behave like Christians? America is a nation filled with many different faiths and with those who don’t have faith in God or any other deity. Wouldn’t the Christian’s time be better spent showing love to all people? (I’m not talking about telling people that it’s okay to sin.) Shouldn’t the Christian be out preaching Christ and him crucified instead of boycotting secular companies that decide to support gays? Shouldn’t Christians be working in Crisis Pregnancy Centers, trying to help girls who are going through possibly the most difficult time of their lives, instead of picketing abortion clinics?
I am not saying that Christians should have no part in politics. I’m just saying that maybe we should rethink the whole issue. Are we, as Christians, reaching out to a lost and dying world and pointing them to Christ? Do they see our love for one another? Or are we harshly judging the world for sin that they will already be eternally punished for? Are we trying to make them morally upright when, without Christ transforming their lives, they can’t understand the ways of God? Would God have us change our approach?