Bob, I would agree that America has always been mostly Christian. During that time we ran the Indians off their land, killed most of them, and put the rest on reservations. We had a huge war mostly over slavery, an economic system defended by Southern Christians. We finally freed the slaves, but didn't allow them to vote until much, much later. They didn't get their civil rights until the 1960s, and there are still a lot of hate crimes against them. Nor were women treated with any equality at all until the last twenty-five years or so, and it has been a huge struggle.
So if Christians used to be the mainstream, do you think their beliefs did a lot for all Americans? I sure don't. The whole resurgence of the religious right now is because many people seek simple answers to complex problems. But Christianity was not the answer before, and it is not the answer now.
Do you really believe the media tells the public what to think? The media reflects the culture, for the most part, not the other way around. Again, an easy answer to a complex problem.
Were those cop killers militia members? The militia is Christian, and does talk about "end times". I am not sure what you were trying to say there. Can you clarify your point?