America is evolving. The capitalist theme of this country is beginning to be questioned for its true value to the people, and it is found wanting. Capitalism, once the proud signature of this nation, has become gnarled and self-absorbed. It no longer cares for the people or the community, only the profit margin. Some may say it never cared, but in the past the government held the power to curb it. That is no longer true, because Capitalism now employs the government itself, doling out financial charity to sycophantic political parties in return for preferential treatment and 'business as usual'. This cannot continue if we are to remain a free nation. We can no longer allow Capitalism to rule our lives and govern our politicians, and the only entity that can hope to combat it is the Federal Government. Our government must be answerable to the people, not to Capitalism. More must be done to curb the greed that has infested American big business. We are the richest nation in the world, yet most of us never see this wealth; what good is it to us, the people? None. Government must hold the reins of power in this country, because it is the only thing the people have left to trust. Not big business. Not politicians.
I'm not talking about Communism. That is too harsh, and violence is not a means to an end. I'm not even talking about pure Socialism. I am talking about a merging of Socialism and Democracy, in which the government oversees the process and protects the people. To continue on our present road of Capitalism will only result in our demise as a nation, as it inevitably devours itself out of greed and hunger. Maybe not today, but within two to three decades.