Americans cherish twin freedoms: freedom of religion and freedom of speech. But many overlook an important fact: unrestrained freedoms are sure to self-destruct.
Politics is, by its very nature, the art of compromise. Christianity, by contrast, is a voluntary faith that does not force itself on others. The temptation is to hope that Christianizing politics and government will somehow lead to the betterment of society.
I am not promoting Democrats, Republicans, or Independents, nor conservatism, liberalism, more social programs, or less spending. But I would like to ask: Why can’t we leave the word “Christian” out of politics?
National government is authorized by God to administer laws and punish crime. In the church of Christ, however, the members are subject to the higher law of God. Where the two conflict, “we ought to obey God rather than men” (Acts 5:29), as the early apostles did.
To this difficult question there is no easy answer. And yet God gives an answer to His children: trust Him. His reasons, thoughts, and intents are infinitely beyond human capacity to understand. Will we accept the fact that God owes us no explanation for what He allows?
Only the Gospel of Jesus has the answers to the world’s problems. Politics is a sideshow that Satan directs to divert our attention. Ignore the distraction and join the real program: Christ’s work on earth through the community of genuine believers.
As ambassadors of the kingdom of heaven, we have no role in the national politics of any nation on earth. Though Christians are citizens of particular nations, they cannot entangle themselves in national affairs. What are Christians to do when the laws of the land clash with God’s laws?
During national elections, Christians face extra pressure to become something they are not. Many times in election years the question is asked: “Why don’t you folks vote? Aren’t Christians to be salt and light? Why don’t you use the power of the vote and make a difference?”
Although we glibly talk about the land we “own,” we are actually using what belongs to another. The earth belongs to God because He created it. Has God relinquished His ownership?
In light of Jesus’ prophecy and present-day reality, is there any hope for a war-torn world? According to the Bible, there is!
As a Christian, when conflicts arise in our world, how do you know which side to “support” in your conversation?