The Christian Nation: Christianity's Camelot
by Jamal Smith

I think everyone is familiar with the story of King Arthur and his legendary kingdom of Camelot. According to many literature experts, it was supposed to represent the perfect world: a utopia where peace and justice ruled the land and evil was non-existent or held back beyond its borders. It ultimately fell to internal strife and betrayal, at least according to many versions of the story.
Camelot: the medieval incarnation of Eden. According to legend it is supposed to rise again along with its king, Arthur, and restore the utopia that once existed. This is regarded as a fairy tale today, even by most British citizens. Yet the essential concept of Camelot still exists, in the idea of a Christian nation.
What is a Christian nation?
The definition probably depends on whom you ask and what time period you're referring to, for the ideal of this 'Camelot' is not a new one. It goes back even further than the Arthurian legends, to nearly the dawn of Christianity. The first three or four centuries of the newborn faith were a time when it was considered an outsider's religion, but a growing one. Many Caesars and rulers deemed it a threat, so those first few centuries also saw intermittent periods of persecution. The religion was finally legitimized by Emperor Constantine, and a few decades later, under Theodosius, it was not only legitimized but made the official state religion: all other faiths were banned.
There were still conflicts within legal Christianity over doctrines and the formation of what is now known as the Bible, but after this period, the Roman empire was widely regarded as a Christian empire. This might be considered a bold and even presumptuous statement, but I think this mentality is uncovered in Augustine's 'The City of God'.
The background of this writing is that Rome had just fallen to the barbarians. Many Christians, whether in name or in faith, had come to regard the imperial capital as the center of the world, not only in politics and power but in religion as well, for the Pope also resided in Rome as head of the universal Catholic Church. When it fell, many felt as if the end of the world had come, the apocalyptic book of Revelation fulfilled.
Augustine was one of the leading theologians of his day, and from North Africa he wrote 'The City of God' to try to explain to people why what happened occurred and to encourage them as well. Yet the purpose of the book isn't the point of mentioning it. The point I'm making is that even after its conversion, people still considered the Roman empire a Christian empire.
This was not the last time this tendency would surface. Centuries later, when Charlemagne took over most of central and western Europe, the empire he founded came to be called the Holy Roman Empire, and for centuries that part of the world was known as 'Christendom'.
And even after that empire had come and gone, most western nations had some form of Christianity as their official state religion. So what was a Christian nation to these people? I think perhaps they would have defined it as a country or empire whose state religion, and whose policies of behavior, at least in public, followed along the lines of that religion. In other words, it was a show, a mask lending legitimacy to that state's existence and actions. Many are the wars that tore that continent apart which began, at least partially, with a call to restore righteousness and true godliness to a heretic nation or people.
And what of today?
I think by the middle of the last century, Europe had largely dropped all pretense of using religion in state affairs or policies. Nations took action because they had the power to, more than anything else. Christianity was just an afterthought that could be useful in making the act look legitimate. And by the turn of the century they had definitely taken on a more secular and pluralistic standpoint, while trying to respect the power the church still held in the hearts of many of their people. I think this religious and societal shift was in large part due to weariness with the wars of religion that had plagued them for so long, and with war in general.
Turn attention to America, however, and the picture becomes slightly different. It is still a relatively young nation compared to Europe, and its beginnings were different as well.
America was already settled by its local inhabitants, and Spain and France had established territories in Central and North America. Christianity came over with them in the form of missionaries and monks. Because it was part of the conquerors' religion, it inherited a dubious association. Many of the invaders were cruel and inhuman to the locals, and though the locals had been fighting amongst themselves long before the Europeans arrived, Christians often had a difficult time showing just how their society was any better. If anything it was a two-sided coin: on one side were many missionaries and monks trying to help the people with the diseases the invaders had brought over, along with efforts by some to stop the cruel acts the occupiers were committing against the people. On the other side were those who actively went around trying to erase every aspect of the conquered culture they deemed ungodly and heathen. This took the form of destroying buildings and forcing the natives to abandon their traditional garb and language for the more 'civilized' European fashion.
By the time of the Pilgrims, the nation was not yet a nation; it was either claimed territory or uncharted land yet to be claimed. These first settlers are the people many believers today would consider the founders of America's Christian ideals. Fleeing persecution in Europe, they came to the new world seeking a place to practice their brand of faith without fear.
Throughout the years, more and more people came, and because their land was just a territory and not really a state, their religious practices went largely unchallenged. When the American Revolution came around, it introduced a dichotomy that was the first of its kind in history: while God was mentioned in the nation's founding documents, the church was not to be part of the ruling government, nor was there to be an official state religion.
This was a huge statement, because every culture across the world has had, in some part of its foundation, some sort of divine mandate that everyone must follow and adhere to so that social order and sovereign power were maintained. In the American Revolution, however, while God was credited with bestowing inalienable human rights, it was of the people and by the people that the nation was to be run. Nowhere is this clearer than in the guarantee of freedom of religion.
And ever since, Christian influence has continually been part of this country. This leads to the inevitable question:
Is the United States of America truly the Christian nation that believers have dreamed of since Constantine made the faith legal seventeen centuries ago?
Is the United States for all intents and purposes, the legendary Camelot?
While many believers would not phrase it this way, they would probably say yes, or at least that it used to be. Today we live in a post-modern, pluralistic society where it seems to many that the influence Christianity once had on the culture has diminished. Often cited as proof are the banning of prayer from schools, the removal of the Ten Commandments from public and government buildings, and the legalization of abortion, among other things. The media is filled with images of sex and violence, and our society continues to degenerate. All of these are taken as signs that we have fallen from the grace of God.
This is the position of many believers, but not all. There are many other believers who, while not disputing the woes of society, don't even know what a Christian nation looks like! And there are others, especially among the new generation of believers, who don't believe in it at all!
I say that I don't believe America is Camelot. Nor do I believe it ever was Camelot, the Christian nation.
While the influence of Christianity on American history cannot be ignored, I think it is too egocentric to look at the Christian nation concept from a strictly American viewpoint. Hence the point of this article: American believers are not alone in the belief that their country, their empire, their territory, is a Christian one. Many have adopted this idea over the last two thousand years, and for the same reasons. And when the society they lived in seemed to start falling apart right before their eyes, the call often went out that the nation had fallen from God's grace and needed to return to him. One of the Roman persecutions of Christians began for this very reason.
What's more, one must not look only at the morals talked about, preached, or publicly upheld, but at the behaviors of the people within that society as well. Historically, the idea that a nation was Christian always went hand-in-hand with a number of immoral, ungodly, and even inhuman activities that everyone knew of but that many did nothing about or never spoke of. Morals preached from the pulpit often went out the window when they came into contact with everyday life and the personal choices of how people lived. People who did not go along with the public morals, and were open about it, were often judged, and even tortured or killed. These are facts that cannot be denied.
However, even if I were to deal with America alone, it was never a Christian nation. Why? Because the behaviors of the people, and even the very concept itself, don't seem to match up with the Bible as a whole. Let me explain.
It is true that God set apart a nation, Israel, to be a godly nation, or in today's language, a theocracy: a society ruled by God alone. Even then, God didn't judge whether they achieved this by what laws they followed, or even, to a degree, by what they did, but by where their hearts were. Often, while criticizing Israel for its lack of care for the poor, God cites, sometimes it seems with more passion, that their hearts had drifted away from him. They no longer followed the law out of love or even obedience to God, but because it had become a habit that meant nothing, an empty ritual.
And in the New Testament, Jesus even says that it is not what goes into a man's mouth that makes him evil, but what comes out of his heart. What separates the faith from all others is that it is not practiced for the sake of society, to fulfill a status quo, or to make ourselves feel good or better than others. The source and strength of the faith come from a heart of love: love for God and love for others as ourselves. Without these, all other actions have no meaning.
The behavior of America as a people cannot be ignored or glossed over if the country is still to be called a Christian nation. Much of this behavior is already well known and documented: slavery, the stealing of lands from the local inhabitants, the breaking of treaties, the judgment of others deemed not worthy of the same human rights or even of being human, the mistreatment of the poor and mentally ill, greed, etc.
To say that God is now angry with actions committed within the last forty or so years, but allowed the previous inhuman injustices to slide for the last four hundred, is hypocritical. It essentially says not only that God supports our interpretation of how society should be, but also, to those who have suffered under such injustices, that God doesn't care about them, that he shows favoritism. This perception throws John 3:16, 'for God so loved the world that he gave his only begotten son', right out the window.
So do I believe Camelot exists, or is it just a naïve dream?
A friend of mine summed it up best when he gave his take on the subject:
"I don't believe in the Christian nation because Christianity can't be taught or passed down. At some point it will lose all meaning to that society and it will become just another ritual. Christianity is a subjective religion where one must experience God for themselves in order for the actions they live out to be true. A Christian nation cannot accomplish this."
I'm not attacking Christian principles, though I understand that some people become so attached to a principle or an idea that they treat such comments as an attack against their own person. What I am challenging is a concept, a trap that we as believers have been falling into for centuries and have allowed to taint our view of the world, of those living in it, and of what God is doing and wants to do in it.
Under this mentality, we are the ones in control and not God. And anything that does not come from God, even with good intentions, is carnal and can therefore be manipulated by Satan. We are the ones telling God how his vision of society should work and look. We act like Peter when he tried to tell Jesus that he could not go to be handed over and eventually killed.
In the world as it is now, I do believe that we should live our lives in full and godly ways, but that is something that can never be forced upon another unless a person's dignity as a human being is at stake. To choose to live for God is a choice. However, I also believe in Camelot, a world of peace, love, and justice for all. And I believe that world should be striven for, not from a carnal sense of control but from a constant hope for the future. When all people embrace this by choice, then you will see the true Christian nation; but because this involves changing the hearts of people, only God can bring it about, not us.







Jamal Smith graduated from Roberts Wesleyan College with a BA. Brockport, NY. Email: [email protected]

Article Source: http://www.faithwriters.com






