Notes on the end of the First American Republic

In 2003 I first suggested that the First American Republic was over. Although the idea has gained credibility in some progressive circles, it remains unimaginable to traditional media, politicians and academics. Instead, there has been a bounce in talk of American “exceptionalism,” even though when America was, in fact, exceptional, it seldom used the term. But then, denial of reality is one good sign that a society is on its way down. What follows is a series of notes on some of the factors that helped to end the First American Republic, some of which are a remix of earlier writings. If it all seems a bit depressing, check some of the essays listed at the end on what to do about it.

From Grand Old Party to Grumpy Old Prigs

Although there were signs of trouble as early as 1944, when the conservative Human Events magazine was launched, Republicans in general stayed within traditional American culture until the Reagan administration. There were exceptions, the most striking being Joseph McCarthy and his ilk, but on the whole Republicans represented a wing of American politics rather than, as at present, a political asteroid threatening to blow the whole place up. People such as Robert A. Taft and Margaret Chase Smith were like your grandfather and grandmother, out of touch with the times but still members of the family. Dwight Eisenhower was a moderate and Richard Nixon – for all his personal faults – was on domestic issues the last liberal president America has had.

That changed radically with Ronald Reagan, who applied principles he had used to sell Chesterfield cigarettes to hawk a toxic form of government described well by Robert Lekachman:

“Ronald Reagan must be the nicest president who ever destroyed a union, tried to cut school lunch milk rations from six to four ounces, and compelled families in need of public help to first dispose of household goods in excess of $1,000.”

Of course, Reagan had help, including from Margaret Thatcher who served as his brains as he went around oh shucksing the voters.

There is considerable evidence that the collapse of the First American Republic began in no small part with Reagan’s inauguration:

– The number of federal inmates increased from approximately 25,000 in FY1980 to nearly 219,000 in FY2012.

– According to the U.S. Census Bureau, an all-time record 49 percent of all Americans live in a home where at least one person receives financial assistance from the federal government. Back in 1983, that number was less than 30 percent.

– Black unemployment is the worst since 1984.

– From 1947 to 1979, family income of the bottom 20% went up 116% and that of the top 20% went up 99%. Between 1980 and 2009, the bottom 20% went up 15% while the top 20% went up 95%.

– Hours worked per employee are the highest since the 1980s.

– Middle class debt is the worst since the 1980s.

– Personal bankruptcies are up 400% since the 1980s.

– Student loan debt is the worst since the 1980s.

– In the 1980s there were 50 corporations controlling most of the major media. Now there are six.

– During the Reagan administration the number of families living below the poverty line increased by one-third.

There are other aspects of the Reagan years we tend to forget. For example, the Reagan administration was among the most corrupt in American history, including, by one estimate, 31 convictions of top officials. By comparison, 40 government officials were indicted or convicted in the wake of Watergate. And 47 individuals and businesses associated with the Clinton machine were convicted of or pleaded guilty to crimes, with 33 of these occurring during the Clinton administration itself.

David R. Simon and D. Stanley Eitzen, in Elite Deviance, report that 138 appointees of the Reagan administration either resigned under an ethical cloud or were criminally indicted.

The Reagan administration also had secret plans for an unconstitutional takeover of the federal government under an ill-defined national emergency. Members of the government created by the coup had been selected and included Richard Cheney.

Reagan’s policies also led to what was then the greatest financial scandal in American history: the savings and loan debacle, which cost taxpayers billions of dollars.

And according to Steve Kornacki in Salon, “By the summer of 1992, just 24 percent of Americans said their country was better off because of the Reagan years, while 40 percent said it was worse off — and that more Americans (48 percent) viewed Reagan unfavorably than favorably (46 percent).”

The two Bushes that followed now seem like mild regressions compared with the rise of the Tea Party and its supporters, a collection of GOP governors whose course was directed by extreme rightwing campaign contributors, and an unprecedented collection of retrograde members of Congress. Together they led the strongest effort to undo decades of positive American policy seen since the post-Reconstruction era.

Fortunately, this effort was driven by the desperation of those whose real fear was that, as older white men in a demographically and culturally changing land, this was their last chance. It was more an act of desperation than of revolution, albeit one with immense potential for damage to the country and its people.

The major media did little to reveal the flaws underlying the revival of Know Nothing hatred of immigrants, the childish economics of those such as Paul Ryan, or policies done in the name of Reagan that even Reagan would have avoided. Instead they presented prejudice unseen since the civil rights era, deficit obsession and similar dysfunction as though they were just normal issues about which the public would have to choose wisely.

In fact, as I noted last spring (see below), what the Republicans were up to had much more in common with a final movement among American Indians as they were about to lose their last hold on their native land. The Indian ghost dance cult and the Republican Tea Party are both examples of a last effort on behalf of a lost cause.

What I had not expected, however, was how quickly this would become apparent even to some of the media that had encouraged the rightwing absurdities in the first place.

Within months of Obama’s second inauguration, wiser Republicans, after counting heads and observing reality, produced a 50,000-word blueprint for revival. As Buzzfeed reported:

“Specifically, the word ‘Christian’ does not appear once … Nor does the word ‘church.’ Abortion and marriage, the two issues that most animate social conservatives, are nowhere to be found. There is nothing about the need to protect religious liberty, or promote Judeo-Christian values in society. The calmer voices, no longer frightened into silence, were trying to pick up the pieces.”

And, with almost perfect symmetry, the journal that in 1944 had helped launch the irrational right – Human Events – announced in February that it was ending its print edition. Human Events as late as 2005 had listed the works of John Dewey, Betty Friedan, and John Maynard Keynes among the most harmful books of the 19th and 20th centuries, along with Mein Kampf and Das Kapital. Making the honorable mention list were The Origin of Species by Charles Darwin, Unsafe at Any Speed by Ralph Nader and Silent Spring by Rachel Carson.

That was the good news. The bad news was that the Democratic Party had come to rely on the idiocies, excesses, and perverted thinking of the GOP as its own raison d’etre. Without Republican madness it had little to offer including hardly a significant new idea in 30 years. The GOP had not only killed itself; it had almost killed its opposition.

Sam Smith, April 2012 – The departure of two Ricks and Michele Bachmann, the collapse of Gingrich, as well as governors Scott, Walker, LePage, Kasich and Perry all having approval ratings below 45%, suggests that the Tea Party was somewhat overrated by the corporate media. It also gives me courage to suggest a theory that has been bouncing about in my mind: that the unprecedented craziness of the Republican Party leadership has been a reflection of pathology rather than of politics, and that what we have witnessed has been the last rites of those trying futilely to return America to a place they mistakenly thought it once was and which it will never be.

Real politicians, for example, don’t go out and deliberately alienate a demographic as large as women. That’s pure masochism. The Center for American Women and Politics at Rutgers University notes: “In recent elections, voter turnout rates for women have equaled or exceeded voter turnout rates for men. Women, who constitute more than half the population, have cast between four and seven million more votes than men in recent elections. In every presidential election since 1980, the proportion [of] female adults who voted has exceeded the proportion of male adults who voted.”

There simply aren’t enough old white guys to compensate for the anger being created by the GOP among women.

And consider a few of the other constituencies that prominent Republicans have insulted:

9/11 responders, AARP members, Americorps members, bicyclists, black men, children with pre-existing health conditions, college graduates, college students, consumers, cops, disabled people, disaster victims, ethnically mixed couples, gays, home owners, ill people who need medical marijuana, immigrants and their children, journalists, latinos, Methodists, minimum wage workers, residents of DC and Puerto Rico, scientists, Social Security recipients, state workers, and unemployed workers.

What may well have happened is what sometimes occurs when a longstanding culture finds itself facing near fatal attack.

For example, during a solar eclipse on January 1, 1889, an American Indian named Wovoka claimed to have had a dream in which all his fellow native Americans were taken into the sky as the Earth opened up and swallowed all the whites upon it. The earth then returned to its natural state as a land where native Americans could live in peace.

According to Wovoka, to make this dream real, his native Americans were to follow these instructions: “When you get home you must begin a dance and continue for five days. Dance for four successive nights, and on the last night continue dancing until the morning of the fifth day, when all must bathe in the river and then return to their homes. You must all do this in the same way. . . I want you to dance every six weeks. Make a feast at the dance and have food that everybody may eat.”

The ghost dance culture would sweep across the tribes of western America as the dancers were losing their last hold on their beloved lands.

There are other examples:

– As military supplies poured into the Pacific Islands during World War II, local peoples reacted to the sudden change by developing “cargo cults” that offered magical explanations for the flow of imports. When the war ended, members of the cults built imitation landing strips and aircraft to attempt to recreate the former reality and restart the influx of goods.

– The early 20th century Maji Maji rebellion in Africa was spurred by a medium who offered medicine he claimed would turn German colonials’ bullets into water.

– And sometimes the bad times produce not just the strange but the disastrous, as with the rise of Nazism.

Typically, such strange phenomena are a reaction to events that have overwhelmed many and led them to seek solace in a simplistic and seemingly comforting symbolic solution.

Nazism, for example, didn’t spring up as just an arbitrary evil virus. It fed on:

– Unhappiness in the wake of World War I, a war whose mass killings helped set a new low value on human life.

– The collapse of conventional liberal and conservative politics that bears uncomfortable similarities to what we are now experiencing.

– The gross mismanagement of the economy and of such key worker concerns as wages, inflation, pensions, layoffs, and rising property taxes. There were also bankruptcies, a negative trade balance, a major decline in national production, and a large rise in the national debt compensated for by foreign investment. In other words, a version of what America and its workers are experiencing today.

– The use of negative campaigning, a contribution to modern politics by Joseph Goebbels. The Nazi campaigns argued what was wrong with their opponents and ignored stating their own policies. Sound at all familiar?

– The collapse of the country’s self image, falling from world leadership in education, industry, science, and literacy.

Like the ghost dance cultists of the 19th century, World War II Pacific Islanders wondering where their cargo had gone, Africans beset by German colonialists, and Germans beset by economic and cultural decline, Americans today face an extraordinary assemblage of change, discouragement, challenges and uncertainties.

Add together climate change, the erosion of democracy, the greatest economic crisis since the 1930s, the decline of America’s position in the world, rapid changes in both technology and social values, and the collapse of conventional conservative and liberal politics and we’re lucky to have a reaction no stranger than that of the Tea Party movement.

The dismantling of the Democrats

The last clearly identifiable period during which the Democratic Party was a positive influence on the country was the Johnson administration. Since then, the Democratic Party’s improvements have been minor. If this seems excessively critical, consider the following list of things achieved before 1970 and compare it to more recent legislation:

– Regulation of banks and stock brokerage firms
– Protection of your bank account
– Social Security
– A minimum wage
– Legal alcohol
– Regulation of the stock exchanges
– Right of labor to bargain with employers
– Soil Conservation Service and other early environmental programs
– National parks and monuments
– Tennessee Valley Authority
– Rural electrification
– College education for innumerable veterans
– Housing loans for innumerable veterans
– FHA housing loans
– The bulk of hospital beds in the country
– Unemployment insurance
– Small Business Administration
– National Endowment for the Arts
– Medicare
– Peace Corps
– Veterans benefits including health and housing

While the current problems of the Democratic Party began in the Carter era, they escalated with the choice of Bill Clinton as its presidential candidate. He was selected by the Democratic Leadership Council in part for the purposes he served, namely undoing key elements of the Democratic past that even Ronald Reagan couldn’t undo, such as social welfare. Clinton has since been outdone by another DLC-vetted candidate, Barack Obama, who has not only shown contempt for the Constitution on various civil liberties issues but is the first president to propose reducing the benefits of Social Security and Medicare.

Further, since Clinton took office, the Republican Party has had no effective opposition and the Democratic Party has become overwhelmingly beholden to its corporate contributors.

Because the media has increasingly narrowed its political coverage to the presidency, it is not widely known, for example, that Democrats held a 1,542-seat lead in state legislatures in 1990. By 1998 that lead had shrunk to 288 – a loss of over 1,200 state legislative seats, nearly all of them under Clinton. Not to mention that over 400 Democratic office holders became Republicans during the Clinton regime.

By 2011, the Democratic state legislative lead had disappeared and the GOP was ahead by 622 seats.

Further, in 1992, the Democrats controlled 17 more state legislatures than the Republicans. After 1998, the Republicans controlled one more than the Democrats. Not only was this a loss of 9 legislatures under Clinton, but it was the first time since 1954 that the GOP had controlled more state legislatures than the Democrats (they tied in 1968). Today the GOP fully controls 8 more state legislatures than the Democrats.

As for the 2000 election, Democrats love to blame Ralph Nader for their loss, but the facts point in quite another direction: to Clinton and Gore. If, for example, you check the changes in Bush’s and Nader’s poll figures during the last month of the campaign, it is clear that Gore lost far more votes to Bush than to Nader.

It is also apparent that if Gore had disassociated himself from Clinton, he would have done far better in the campaign. According to the 2000 exit polls:

– 60% of voters disapproved of Clinton as a person
– 68% said he would go down in the history books for his scandals rather than for his leadership
– 44% thought the Clinton scandals were important or somewhat important.
– 15% of those who had voted for Clinton in 1996 voted for Bush in 2000.

And it was not just the leadership that was responsible for the party’s decline. As a portion of the Democratic base became more affluent, it increasingly separated itself from the concerns of less wealthy party members. This started the rise of the “Reagan Democrats” and continued as the party elite lost interest in economic issues that concerned a major portion of the American electorate.

There was nothing mutually exclusive between these economic issues and, say, gay or women’s rights, but Democrats lost the capacity to deal with both at the same time.

As the Review noted some time back, “Franklin Roosevelt’s labor secretary, Frances Perkins, was central to more progressive economic legislation than the entire liberal movement has been able to come up with in the past thirty years. It’s hard to get liberals excited anymore about issues like pensions or the minimum wage, and eventually politics reflects this fact. Consider the example of the women’s movement, which – with a few exceptions like the group Nine to Five – has been stunningly uninvolved with the most oppressed women in the country, those of lower incomes and social class. Further, treating those you should be organizing as just a bunch of Bible-thumping, gun-toting idiots doesn’t help much.”

And so it became possible for the Republicans to pose falsely as friends of the middle and lower classes, while the Democrats did little to prove this wrong, even as they dismissed constituencies that had once been central to the party.

Elite emigration

Sam Smith – There is major concern over the effects of immigration on our economy. That concern is badly misplaced.

The foreign-born population in the US – as a percentage of the total – is about what it was from 1860 to 1910: 13%. It has ranged from a low of 5% in 1970 to a high of 15% between 1880 and 1900. The only time it has fallen below 10% in the past century was between 1940 and 1980. As for the undocumented foreign born, the figure is less than five percent.

The truly bad effect on the American economy and on the First American Republic has not been the undocumented immigration of the poor but the undocumented emigration of the country’s elite, a record number of whom today have moved to a virtual global community without laws, without taxes, and without a nation and communities to which they feel loyalty and responsibility. While they still live in America, their jobs, employees, profits and souls all reside elsewhere.

We have never had an economic elite so fiscally separated from, and in many ways disloyal to, our land. Even the robber barons at least reinvested their money in American jobs and industry. The defection of our elite has been a major factor in the collapse of the First American Republic.

Some time ago I suggested testing out the role of immigrants with these questions:

  1. Has a Mexican ever fired or laid you off?
  2. Was the plant where you worked until it was sent overseas bought by Mexicans, or is it still owned by the same people you used to work for?
  3. Has a Mexican ever cut your pension or health benefits? Outsourced your job to India?
  4. Do you think Mexicans or the pharmaceutical corporations are more responsible for high drug costs?
  5. How much of the corruption in Washington has been instigated by Mexicans?
  6. Did the Mexicans make us invade Iraq?

I described this new reality in Shadows of Hope, a book on Bill Clinton, in 1993:

The real Clinton foreign policy is simply this: there are no foreign countries any more, there are only undeveloped markets. The slogan has become “Make quarterly earnings growth, not war!” Trade has replaced ideology as the engine of foreign affairs.

At one level this should be celebrated, since it is far less deadly. On the other hand, this development also means that politics, nationhood and the idea of place itself are being replaced by a huge, amorphous international corporate culture that rules not by force but by market share. This culture, in the words of French writer and advisor to François Mitterrand Jacques Attali, seeks an “ideologically homogenous market where life will be organized around common consumer desires.” It is a world that will become increasingly indifferent to local variation.

And when Attali speaks of American influence he says: “We have to build a word which would be ‘New York-Hollywoodization,’ because we are not Americanized in the sense that we are not going to be closer to St. Louis, Mo., or someplace else. These countries are far from us and we are far from them. They are less in advance, less influencing than New York and Hollywood.”

Here is a world in which Babar loses out to Mickey Mouse in France and where a sophisticated Frenchman speaks of St. Louis — but not Hollywood or Manhattan — as a foreign country. It is the world of what Marshall Blonsky calls “international man.”

International man — and he is mainly just that — is unlocalized. He wears a somewhat Italian suit, perhaps a vaguely British regimental tie, a faintly French shirt and shoes — says international man Furio Colombo, president of Fiat USA — “with an element of remembering New England boats and walking on the beach.” As Blonsky puts it, “You self-consciously splice genres, attitudes, styles.”

International man thrives in Washington. At the moment you call, though, he may well be in Tokyo, Bonn or London sharing with colleagues who are nominally Japanese, German or British a common heritage in the land of the perpetually mobile.

It is this unnamed country of international law, trade and finance, with its anthem to “global competition in the first half of the 21st century,” that is increasingly providing the substance and the style of our politics. It is their dual citizenship in America and in the Great Global Glob that characterizes the most powerful among us, now more than ever including even our own political leaders. International man dreams of things like NAFTA and GATT and then gets them passed. And he knows that he, as a corporate executive or licensed professional, will pass quickly through Mexican customs in his somewhat Italian suit and shoes with a hint of a New England beach, because the agreement he helped to draft and pass has declared him entitled to such consideration. The union worker and the tourist from St. Louis are, under the new world order, from far countries, and so it will take a while longer.

This then is the Clinton foreign policy: it is the policy of International Man, a policy that brings Mexico City ever nearer and starts to make St. Louis a stranger in its own land. . .

How could any republic survive when its own elite deserts it the way ours has?

Top rot

Sam Smith – One good way to judge the state of a culture is to check out its leadership. Absent a strong, continuing rebellion from the bottom or a conscious devolution of power, this leadership leaves its mark on every aspect of society.

For us, this is not a new problem. Shortly before 9/11 I wrote:

Over lunch one day, I asked journalist Stephen Goode how he would describe our era. Without hesitation, he said it was a time of epigons.

An epigon, he explained to my perplexed frown, is one who is a poor imitation of those who have preceded him. The word comes from the epigoni — the afterborn — specifically the sons of the seven Greek chieftains killed in their attempt to take Thebes. The kids avenged the deaths by capturing Thebes – but they also destroyed it. They were generally not considered as admirable and competent as their fathers.

Being around epigons is like being trapped at a bad craft fair where everything you see seems to have been made before, only better.

Of course, you cannot expect your leaders in politics, academia, business or the media to tell you this. People have to figure it out for themselves, and that can take a long time.

In politics, for example, our conservative leaders are devoting themselves to the revival of segregation, even attacking the unfettered voting that civil rights activists long ago targeted as essential for a decent society. These conservatives do not use the slur words and other crudities of the old South, but in some ways they are even more dangerous, since they seek to discriminate formally not only against blacks but against every ethnic minority, women, gays and anyone not making enough money to contribute to their campaigns. Because the mass media has so thoroughly embedded itself in the culture of our leadership, there is little hint of this on the evening news. Stealing the voting rights of citizens is treated as just another political issue to ponder soberly and not – as it should be – as a crime.

In a collapsing society, one of the best clues is to look at the supposed good guys. What we find in politics is a Democratic Party that has not only betrayed its present supporters, but is actively undoing the progress it made over 80 years in economic equity and civil liberties, just to name two examples.

We are presented with heroes like Bill Clinton, a politician with an exceptionally seedy past who cut social welfare and helped create the fiscal disaster from which we have yet to recover. And Barack Obama, who has shown unprecedented contempt for civil liberties for a White House Democrat, is also the first to favor cutting Social Security, and generally supports reactionary solutions to our economic crisis.

We like to think that changing parties alters far more than it often does. There may be a cultural trend that affects, for better or worse, whoever is in power. Thus, we have the irony of Richard Nixon being the last president whose domestic politics could be fairly described as liberal, while people like Clinton and Obama have moved the country to the right.

Driving this now is a culture of impunity, which, some years back, I described this way:

In a culture of impunity, rules serve the internal logic of the system rather than whatever values typically guide a country, such as those of its constitution, church or tradition. The culture of impunity encourages coups and cruelty, and at best practices only titular democracy. A culture of impunity differs from ordinary political corruption in that the latter represents deviance from the culture while the former becomes the culture. Such a new culture does not announce itself.

In a culture of impunity, what replaces constitution, precedent, values, tradition, fairness, consensus, debate and all that sort of arcane stuff? Mainly greed. We find ourselves without heroism, without debate over right and wrong, with little but an endless narcissistic struggle by the powerful to get more money, more power, and more press than the next person. In the chase, anything goes and the only standard is whether you win, lose, or get caught.

The major political struggle has become not between conservative and liberal but between ourselves and our political, economic, social and media elites. Between the toxic and the natural, the corporate and the communal, the technocratic and the human, the competitive and the cooperative, the efficient and the just, meaningless data and meaningful understanding, the destructive and the decent.

Today almost every principle upon which this country was founded is being turned on its head. Instead of liberty we are being taught to prefer order, instead of democracy we are taught to follow directions, and instead of debate we are inundated with propaganda. Most profoundly, American citizens are no longer considered by their elites to be members or even worker drones of society, but rather targets – targets of opportunity by corporations and of suspicion and control by government.

One of the major shifting moments in our politics was the arrival of television. Up to that time, there was no media powerful enough to put the whole country on the same wavelength, using the same clichés, and assuming the same myths. Over time, television would transform politics from a community based activity to one in which you could simply buy your status.

This affected not only normal government but the nature of political corruption. Until television, political corruption in America was a feudal system: politicians gained power but in return were expected to provide distinct services to the voters. Now, with the size of campaign contributors being the major factor and advertising the major voice of politics, voters no longer matter the way they once did.

As one example of what has happened, let’s take another look at a politician widely regarded as corrupt: former DC mayor Marion Barry. In fact, the story is far more complicated.

I sometimes even describe Barry as the last of the great white mayors.

Barry was raised in Memphis, Tennessee, at the end of almost a half century of political dominance of that city and of Tennessee by the Crump machine. EH Crump was mayor only twice during that period but he controlled the politics for decades. Crump, like Earl Long of Louisiana, was rare among white southern politicians in that he actively organized and sought the support of black voters. One of Crump’s lieutenants, for example, was a black funeral director named Harold Ford, whose grandson of the same name served in the House as recently as 2007. And the blues godfather, WC Handy, even wrote a song, the “EH Crump Blues,” that would be sung on street corners to garner crowds for rallies.

I handled press for Barry in the 1960s when he was head of Washington’s SNCC. Our relations would sour over the years (“Where’s that son of a bitch?” he once asked my wife at a party), but even with his drug conviction and corrupt activities, you still had to admit that he was an exceptional politician. And as time went on, his politics seemed from out of another era, from that of Crump, Daley, and Curley – white mayors who defined urban politics at a time when the underclasses were struggling for equity against city elites. It was not that they weren’t corrupt, but that they gave something back to ordinary citizens in return.

For example, Barry started an important summer youth jobs program in the city when he became mayor. With instructive irony, a leading DC councilmember got sentenced to prison last year for embezzling over $350,000 – from a youth program.

Similarly, now-councilmember Barry just held hearings on finding ways to help seniors get more food and other essentials, while the nation’s designated black role model, Barack Obama, is cutting services to the poor and Social Security for seniors.

Not bad metaphors for the collapse of our political leadership.

Meanwhile, in academia, theoretically a source of national wisdom, a strange silence prevails. Could it be the effect, as reported by the Congressional Research Service, of the approximately 60% of federal research and development funds that go to academic institutions?

And what about a media controlled by fewer and fewer large corporations with journalists who have an increasingly hard time separating their jobs from the social and economic benefits of being nice to those in power? Just one small example:

“Project Censored researched the board members of 10 major media organizations from newspaper to television to radio. Of these ten organizations, we found there are 118 people who sit on 288 different American and international corporate boards proving a close on-going interlock between big media and corporate America. We found media directors who also were former Senators or Representatives in the House.”

And how can one brag of business leadership in an era when our economy is in such terrible shape?

Finally, we have from the arts little leadership as well, including not one major protest subculture since the punk rebellion.

It is not just the people who are the problem. It is the ideas and values that they spread with their power.

Consider how business school clichés have infected even non-profit organizations and how the media just accepts them as truths.

Or how our government is driven by data drones, process obsessives, legal lemmings, and test tyrants.

Or how our public school system is being wrecked by grotesquely erroneous concepts of education.

Or how our freedoms are being slashed in the name of a security that is drifting ever further away.

Or how truth, reality, decency, integrity, fairness and cooperation have become, in the eyes of our elite and its subservient media, just toys of the left and the naïve.

In such an environment, it is hard for anything good and valuable to survive, including a democratic republic.

Blowin’ in the winds of cultural decay

This is another in a series of essays on the end of the First American Republic, this one mostly written in 2005.

Sam Smith – Thomas Jefferson saw it coming. He warned, “From the conclusion of this war we shall be going down hill. It will not then be necessary to resort every moment to the people for support. They will be forgotten, therefore, and their rights disregarded. They will forget themselves, but in the sole faculty of making money, and will never think of uniting to effect a due respect for their rights. The shackles, therefore, which shall not be knocked off at the conclusion of this war, will remain on us long, will be made heavier and heavier, till our rights shall revive or expire in a convulsion.”

Among the conceits of our elite and media is the assumption that America, in the form that they wish to imagine it, is immortal. Part of this is the arrogance of the big, part comes from an admirable if naive faith in progress, part of it is pathological delusion. For a host of reasons, beginning with our own survival, it is long past time to permit the question to be raised: is America collapsing as a culture?

It is easy to forget that history is strewn with the rubble of collapsed civilizations, entropic remains of once sturdy cultures, societies we now remember only thanks to a handful of artifacts guarded in museums.

Our own country was built on the wreckage of Indian culture. Guatemalans use Timex watches rather than checking the Mayan Calendar. The European Union is a covert chapter of Empires Anonymous. And in the Peruvian desert there are huge spirals in the earth and straight lines that stretch for miles whose origins are totally forgotten.

Some sixty years ago, anthropologist Alfred Kroeber noted that elements of a culture do die out, “dissolve away, disappear, and are replaced by new ones. The elements of the content of such cultures may have previously spread to other cultures and survive there. Or their place may be taken at home by elements introduced from abroad. Or they may survive, with or without modification, at home, in the different configuration that gradually takes the place of the old one as a successor culture.” Thus even if American democracy dies here, pieces of it may survive somewhere else, or we may become the largest latino culture in the world and, in any event, the Thais may keep the faith of the iPod alive regardless of what happens to us.

As an example, Kroeber says that there came a time when the ancient Egyptians had clearly attained “the greatest military might, expansion, wealth, excellence of art and development of thought. The inherent patterns of their culture may be said to have been fully realized or to have been saturated then. After that, with pattern potentialities exhausted, there could be only diminished or devitalized repletion; unless the patterns can be reformulated in the direction of a new set of values – which would be equivalent to recasting the civilization into a new one or into a thoroughly new phase of one. This latter did not happen in Egypt; so more and more sluggish mechanical repetition within the realized but fully exhausted patterns became the universal vogue.”

Does this begin to sound a bit familiar?

Let’s take the example of popular music, useful because music is a creative discipline with a mathematical base, thus lending itself to more objective analysis than some of its artistic colleagues. In fact, you can write a succinct history of western music by simply outlining the progression of chords used and their relationship with one another. This is what Ward Cannel, a journalist, and Fred Marx, a classical pianist, did in a remarkable guide, “How to Play Piano Despite Years of Lessons.”

Charting the basic chords – separated by a common distance of notes and placed around a circle like guests at a large dinner table – you can describe the rise of western music by simply checking off which of these chords were being used by musicians at a particular time. Thus with folk music, children’s songs, early hymns and Bach’s Minuet in G, it was typical to use one chord and its neighbor on either side.

In later classical harmony, composers moved from the base chord to another, say, three or four seats away counterclockwise and then began a slow procession home, stopping at the other chairs. Examples would include Bach’s Well-Tempered Clavier. It doesn’t seem like much, but in the history of music, it was a revolutionary change.

Along the way, there were other variations such as starting at the second or third chair and moving back towards home as in Honeysuckle Rose.

If you really wanted to be wild, you threw in a chord not on the way home at all, but in the other direction.

Then came a new stage and the game was played on the clockwise side of the circle. Later a tune might work its way entirely around the circle. Or if you wanted to be really hip, you could leap across the circle to the other side.

Similarly, the baker’s dozen of notes in the western scale have been rearranged over time in increasingly complex ways, starting with the simple chords we associate with folk music and moving on to add the 7th, flatted 9th, 13th and so forth.
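The dinner-table circle of chords that Cannel and Marx chart is, in conventional terms, the circle of fifths, and the seating arrangement can be sketched in a few lines of Python. This is only an illustrative sketch of the idea; the note spellings and function names here are my own, not anything from their book:

```python
# A sketch of the chord circle: the twelve roots of the western
# scale seated so that each chord is a fifth (seven semitones)
# from its neighbor.

NOTES = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]

def circle_of_fifths(start="C"):
    """Seat the 12 chord roots around the circle, a fifth apart."""
    i = NOTES.index(start)
    return [NOTES[(i + 7 * k) % 12] for k in range(12)]

def seats_apart(circle, home, chord):
    """How many seats clockwise you must travel from home to a chord."""
    return (circle.index(chord) - circle.index(home)) % 12

circle = circle_of_fifths("C")
# Folk harmony: home (C) plus the neighbor on either side --
# G one seat clockwise, F one seat counterclockwise.
neighbors = (circle[1], circle[-1])
```

Counting “seats apart” this way is what lets you classify a tune’s harmony: one seat on either side of home for folk songs, longer counterclockwise processions home for classical harmony, and leaps across the circle for the truly hip.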

If you were to take every piece of music in America ever written and categorize it by these standards – the number and placement of chords and their complexity – you would find that musical opportunity grew with the rest of the republic.

This didn’t mean that you had to use all these opportunities to make good music – bluegrass and the blues prove that – only that the potential for musicians and composers was ever expanding, a sign of a thriving culture. As Thelonious Monk put it, “I’m after new chords, new ways of syncopating, new figures, new runs. How to use notes differently. That’s it. Just using notes differently.”

Unfortunately, however, there are only so many chairs at the table and there are only so many combinations of movement. Eventually you run out of chairs for chords, variations on the order you play them, and their complexity. You reach the point that Kroeber described: “With pattern potentialities exhausted, there could be only diminished or devitalized repletion. . . so more and more sluggish mechanical repetition within the realized but fully exhausted patterns became the universal vogue.”

Which is to say, much of the music of today.

There is, to be sure, another major source of change: other cultures. American folk music, for example, is a history of immigration translated into notes. The blues, it has been suggested, originated in a blend of the western and African scale. As early as Jelly Roll Morton, jazz musicians were borrowing from latin sounds with a more recent folk example being the blending of Paul Simon and Ladysmith Black Mambazo in ‘Graceland.’

This continues today but in a critically modified form: Jelly Roll Morton and Paul Simon were inventive musicians seeking the best in another culture; Ricky Martin and Gloria Estefan are products of a huge anglo recording company looking for something new to exploit.

I suspect the decay of American music may have begun with the disco drum machine of the 1970s, the beginning of percussion mechanicus to go along with Erich Fromm’s homo mechanicus. Both share a problem: they aren’t human. A live drummer is constantly listening to the other musicians, finding new ways to back them up, discovering a groove by intent or accident, making a two bar point, or just showing off. If you were to analyze the sound with lab equipment you might be amazed at how irregular it actually is – the inevitable result of being human rather than mechanical.

But that is part of the secret of real music. Much of the appeal of jazz, for example, comes from listening to the alteration, manipulation or distortion of the familiar. Thus a singer may hold a note longer than expected or lend it excruciating pain when you were expecting nothing more than a simple B flat. One writer described it as repetition just to the point of boredom – at which something new and unexpected happens.

As amplifiers replaced acoustic sound, there were other changes in music. The recording companies began dumbing down music, reducing the number of chords, replacing melody with repetitive phrases, emphasizing only the extreme end of the dynamic range, and in the end – with rap – doing away with the need for music almost entirely.

This is not to say that there was not merit within these forms – the pain and rebellion of punk, the soul of rap – but rather that for the most part the corporate monopolies had seized control of our ear drums and locked them down in a few tiny cells.

The result is telling. In 2002, ABC asked respondents for the top rock ’n’ roll star of all time. Elvis Presley got 38%, no one else got more than 5%, and listed in the top ten were such golden oldies as Jimi Hendrix, John Lennon, Mick Jagger, Bruce Springsteen, Paul McCartney and Eric Clapton. Michael Jackson got 2%.

Rolling Stone’s listing of top 40 songs found only 8% more recent than 1980.

Thus when you ask, what’s been happening in American popular music over the past 25 years, a reasonable answer is: not much.

You find similarities in other arts. For example, a Modern Library critics’ listing of the 100 best English language novels of the 20th century includes only one written after 1980: Ironweed by William Kennedy, written in 1983.

One list of the 100 most acclaimed films finds only nine post-1980. The American Film Institute’s list includes only 13.

Another American Film Institute listing, of top film musicals, found only 12% after the 1970s and none in the 1980s.

The Art Wolf’s listing of top 50 artists could come up with only Andy Warhol and Jean-Michel Basquiat as recent examples.

One may quarrel with such lists, but a culture that is truly thriving will tend, if anything, to overvalue its own contributions and downplay those of the past. You may argue, for example, with those who claimed to come from ‘the greatest generation,’ but you can’t argue that they felt that way. Now, instead of bragging, we just order Butch Cassidy from Netflix one more time.

A vibrant culture will be spurred by what it considers greatness. This doesn’t mean that it necessarily is, but the mere presumption affects how the society behaves.

For example, Victor Davis Hanson wrote that “Whether or not you agreed with them, university presidents used to be dignified figures on the American scene. They often were distinguished scholars, capable of bringing their own brand of independent thinking to bear on the operation and reform of their institutions. Above all, they took seriously the university’s mission to seek and transmit the truth, and thereby to strengthen the free society that made such inquiry possible.

“But it has been a long time since Woodrow Wilson (at Princeton), Robert Hutchins (at Chicago) or James Bryant Conant (at Harvard) set the tone for American campuses. Over the past year, four university presidents have been in the news – from Harvard; the University of California, Santa Cruz; the University of Colorado; and the University of California, Berkeley. In each case, the curtains have briefly parted, allowing the public to glimpse the campus wizards working the levers behind the scenes, and confirming that something has gone terribly wrong at our best public and private universities.”

Of course, Woodrow Wilson spread segregation in the government and James Conant may have done public education incalculable damage by setting it on a course of gargantuan factory-like school districts, but that is not the point. The point is that they were icons of a society that thought it knew where it was going and what it admired.

Today, such figures have largely been reduced to talk of their fundraising skill or excessive expense accounts. Few suggest that they are people we should actually admire.

Similarly, in the churches there is a stunning lack of models. This is not merely the fault of the neo-Gantries who have taken over much of American Christianity but of other Protestant sects that say not a mumblin’ word about the theological hijacking by the right and who offer little alternative in such areas as social justice and world peace. Judaism, which once helped carry the banner for social change, has largely abandoned that field in favor of supporting Israel. As for the Catholics, the best they can do is try to find ways to prove that they’re not a bunch of perverts.

The dearth of greatness is most painfully obvious perhaps in the nation’s capital, in its politics, think tanks and media. To be sure, a pantomime is performed, but everyone knows it is just for television. Obama compares himself to Roosevelt, Koppel pretends he’s Murrow, but nobody’s really fooled. The disappearance of greatness – whether rightly or wrongly recognized as such – is common throughout American society – from football coaches to moral leaders. In the end we are left with Justin Bieber and Lindsay Lohan.

Part of the problem was identified as far back as the 1920s by Julien Benda in his book, The Treason of the Intellectuals: “At the very top of the scale of moral values [the intellectuals] place the possession of concrete advantages, of material power and the means by which they are procured; and they hold up to scorn the pursuit of truly spiritual advantages, of non-practical or disinterested values.”

Instead of being outsiders, critics and moral observers, the American intelligentsia have become players accepting many of the values of the system they should be scorning.

Benda listed some of these values:

– “The extolling of courage at the expense of other virtues. . .

– “The extolling of harshness and the scorn for human love — pity, charity, benevolence. . .

– “The teaching which says that when a will is successful that fact alone gives it a moral value, whereas the will which fails is for that reason alone deserving of contempt.”

In my last book, Why Bother?, I wrote:

Older Americans remember the victories and their celebrations; they remember Norman Rockwell men standing motionless for the national anthem in baseball stadiums with fedoras held over their hearts; a government that did more than regulate or arrest you; politicians who were revered; newscasters who were trusted; and music that dripped syrup over our spirits and made them sweet and sticky. They remember when there was a right and wrong and who and what belonged with each, whether it was true or not. They remember a time when those in power lied and were actually able to fool us. They remember what a real myth was like even when it was false, cruel, deceptive, and the property of only a few.

Now, despite the improved economic and social status of women and minorities, despite decades of economic progress, despite Velcro, SUVs, MTV, NASA, DVD, cell phones, and the Internet you can’t raise a majority that is proud of this country. We neither enjoy our myths nor our reality. We hate our politicians, ignore our moral voices, and distrust our media. We have destroyed natural habitats, created the nation’s first downwardly mobile generation, stagnated their parents’ income, and removed the jobs of each to distant lands. We have created rapacious oligopolies of defense and medicine, frittered away public revenues and watched indifferently as, around the world, the homeless and the miserable pile up. Our leaders and the media speak less and less of freedom, democracy, justice, or of their own land. Perhaps most telling, we are no longer able to react, but only to gawk.

To be sure, many of the symbols of America remain, but they have become crude — desperately or only commercially imitative of something that has faded. We still stand for the Star Spangled Banner, but we no longer know what to do while on our feet. We still subscribe to the morning paper but it reads like stale beer. And some of us even still vote, but expect ever less in return. Where once we failed to practice our principles, now we no longer even profess to honor them.

Media: from watch dog to lap dog

Sam Smith – In the late 1930s a survey asked Washington journalists for their reaction to the following statement:

It is almost impossible to be objective. You read your paper, notice its editorials, get praised for some stories and criticized for others. You ‘sense policy’ and are psychologically driven to slant the stories accordingly.

Sixty percent of the respondents agreed.

To understand the role of the media in the collapse of the First American Republic, it doesn’t help to cling to romantic notions of what journalism once was; the days for which some yearn never existed.

Admittedly there were differences that today seem almost bizarre. Eighty years ago, 40% of the Washington correspondents surveyed were born in towns of less than 2,500 population, and only 16% came from towns of 100,000 or more. In 1936, the Socialist candidate for president was supported by 5% of the Washington journalists polled and one even cast a ballot for the Communists. One third of Washington correspondents, the cream of the trade, lacked a college degree in 1937.

And what also existed was much more competition in the news industry. By the 1980s, most of what Americans saw, read, or heard was controlled by fewer than two dozen corporations. By the 1990s just five corporations controlled all or part of 26 cable channels. Some 75% of all dailies are now in the hands of chains and just four of these chains own 21% of all the country’s daily papers. The situation since has deteriorated further, including the poor financial shape of many newspapers that has put banks and hedge funds silently in charge of them.

With these changes, the chances of getting the story right deteriorated and America was increasingly informed and persuaded by members of the same oligarchy that was running it.

And it was not only the owners. The national press corps was losing its relationship with America.

When I started out as a Washington reporter in the 1950s, only about half of American journalists had more than a high school degree. They naturally identified with their readership rather than with their publishers or elite sources. I didn’t let anyone know I had gone to Harvard because that would not have improved my standing either with staffers on the Hill or colleagues in the media.

Ben Bagdikian, a bit older than myself, described the craft in his memoir, Double Vision, this way:

“Before the war a common source of the reporter was an energetic kid who ran newsroom errands for a few years before he was permitted to accompany the most glamorous character on the staff, the rough-tough, seen-it-all, blood-and-guts police reporter. Or else, as in my case, on a paper with low standards, reporters started off as merely warm bodies that could type and would accept $18 a week with no benefits.

“Some of us on that long-ago paper had college educations but we learned to keep quiet about it; there was a suspicion that a degree turned men into sissies. Only after the war did the US Labor Department’s annual summary of job possibilities in journalism state that a college degree is ‘sometimes preferred.'”

And there were changes at the top as well. One of my first major shocks about my chosen trade was listening to a top Washington editor talking about how he had been discussing with the White House the best way to handle the arrest of Walter Jenkins, LBJ’s top aide, who had been caught giving a blow job to a man at a local YMCA. It had never occurred to me that an editor would actually consult with politicians on how their stories were to be covered. But in a few decades journalists would be thoroughly “embedded” both in war zones and at the White House and find nothing strange about it.

Journalists were changing socially as well. In the late 1960s, the Washington Post replaced its women’s section with one called “Style” and before long members of a once scrubby journalism trade were turning up in it as participants at major social events. They were no longer just interviewing people leaving the party, they were part of the party.

Another media factor that dramatically changed the nature of American politics was television. Over time, television transformed politics from a community-based culture to one in which you could simply buy your status. Our social culture also became distorted by television until the major things we shared as a society were no longer values as much as symbols like Lindsay Lohan, Justin Bieber, Hillary Clinton and Barack Obama.

And it was all brought to us by someone. According to CBS, Jay Walker-Smith of the market research firm Yankelovich estimated that in the 1970s we saw about 500 ads a day. By 2009 it was up to as many as 5,000.

The Internet was supposed to save us, but it hasn’t. In fact, the country has moved to the right since its creation. It was a problem that cropped up early, as I described in my 1994 book, Shadows of Hope:

“The computer, once considered primarily a tool of orthodoxy, has now become a major weapon against authoritarianism. The highly effective campus anti-apartheid protests were organized with the help of a computer bulletin board that advised newcomers how to plan demonstrations and deal with the media. In the last days of the Soviet Union, the relative security of computer information provided dissidents a means of communications with each other and with the outside world. More recently, computers have established the first strong link among environmentalists working to save Lake Baikal in Siberia. . . And thousands of miles away, in the Silicon Valley community of Sunnyvale, CA, a city councilman was elected with 60% of the vote after campaigning almost exclusively on the Internet computer network.”

But I also sensed a problem:

“Yet the very anarchistic nature of our new sources of data — including computer services, cable channels, special interest magazines, and the archives of our video store — also means that we may have less information in common. At a time when communications and transportation make it ever simpler to cross geographic and cultural borders, we increasingly make the trip alone. We see far more than we understand or are understood. Louis Farrakhan and the Anti-Defamation League have the same technology available to them but they are checking in at different bulletin boards.”

Thus, in a variety of ways journalism lost its capacity as a primary guard of our democratic republic. There was growing ownership concentration, the changing social status of journalists, a shift in journalists from being private eyes of the public to being private tools of the establishment, and technology driving citizens into safe political niches where the idea of varied groups joining together in some larger cause faded away.

Journalism had lost its historic virtue of getting the bastards before they got you. And the First Republic took another hit.

The corporate war against the American republic

Sam Smith – To cite greed, off-shoring of jobs, oligarchies, and other rapacious behavior when speaking of the modern American economy has become almost redundant. But to understand the scale of what has happened and how it has encouraged the collapse of the first American Republic, it helps to look at some of the history behind it.

Most free workers (i.e. white males) in this country were self-employed well into the 19th century. They were thus economic as well as political citizens.

Further, until the last decades of the 19th century, Americans believed in a degree of fair distribution of wealth that would shock many today. James L. Huston writes in the American Historical Review:

“Americans believed that if property were concentrated in the hands of a few in a republic, those few would use their wealth to control other citizens, seize political power, and warp the republic into an oligarchy. Thus to avoid descent into despotism or oligarchy, republics had to possess an equitable distribution of wealth.”

Such a distribution, in theory at least, came from enjoying the “fruits of one’s labor” but no more. Businesses that sprang up didn’t flourish on competition because there generally wasn’t any and, besides, cooperation worked better. You didn’t need two banks or two drug stores in the average town. Prices and business ethics were not regulated by the marketplace but by a complicated cultural code and the fact that the banker went to church with his depositors.

Although the practice was centuries old, the term capitalism — and thus the religion of the same name — didn’t even exist until the middle of the 19th century.

Americans were instead intensely commercial, but this spirit was propelled not by Reaganesque fantasies about competition but by the freedom that engaging in business provided from the hierarchical social and economic system of a monarchy. Business, including the exchange as well as the making of goods, was seen as a natural state allowing a community and individuals to get ahead and to prosper without the blessing of nobility or feudalism.

In the beginning, if you wanted to form a corporation you needed a state charter and had to prove it was in the public interest, convenience and necessity. During the entire colonial period only about a half-dozen business corporations were chartered; between the end of the Revolution and 1795 this rose to about 150. Jefferson to the end opposed liberal grants of corporate charters and argued that states should be allowed to intervene in corporate matters or take back a charter if necessary.

With the pressure for more commerce and indications that corporate grants were becoming a form of patronage, states began passing free incorporation laws and before long Massachusetts had thirty times as many corporations as there were in all of Europe.

Still it wasn’t until after the Civil War that economic conditions turned sharply in favor of the large corporation.

These corporations, says Huston, “killed the republican theory of the distribution of wealth and probably ended whatever was left of the political theory of republicanism as well. . . .[The] corporation brought about a new form of dependency. Instead of industry, frugality, and initiatives producing fruits, underlings in the corporate hierarchy had to be aware of style, manners, office politics, and choice of patrons — very reminiscent of the Old Whig corruption in England at the time of the revolution — what is today called ‘corporate culture.'”

Concludes Huston:

“The rise of Big Business generated the most important transformation of American life that North America has ever experienced.”

By the end of the 19th century the Supreme Court had declared corporations to be persons under the 14th Amendment, entitled to the same protections as human beings. As Morton Mintz pointed out in the National Law Journal, this 1888 case ignored the fact that “the only ‘person’ Congress had in mind when it adopted the 14th Amendment in 1866 was the newly freed slave.” Justice Black observed in the 1930s that in the first fifty years following the adoption of the 14th Amendment, “less than one-half of 1 percent [of Supreme Court cases] invoked it in protection of the Negro race, and more than 50 percent asked that its benefits be extended to corporations.” During this period the courts moved to limit democratic power in other ways as well. For example, the Supreme Court restricted the common law right of juries to nullify a wrongful law; other courts erected barriers against third parties such as banning fusion slates.

It was during this same time that the myth of competitive virtue sprouted, helping to justify one of the great rapacious periods of American business. It was a time when J.P. Morgan would come to own half the railroad mileage in the country — the same J. P. Morgan who got his start during the Civil War by buying defective rifles for $3.50 each from an army arsenal and then selling them to a general in the field for $22 apiece. The founding principles of what we now proudly call the “American free market system” flowered in an era of enormous bribes, massive legislative corruption, and the creation of great anti-competitive cartels. It was a time when the government, in a precursor to modern industrial policy, gave two railroad companies 21 million acres of free land.

The populist, progressive and New Deal eras helped to reverse much of this. But all that is now in the past. The new robber barons are not only underpaying and mistreating their workers, they are moving their jobs overseas. As for the great reforms of the past century, one Google chart showing the frequency of use of the phrase “anti-trust” in books from 1880 to the present tells it well.

The political movement of populism, which Jonathan Rowe calls the “last spasm of economic freedom in an American context,” did battle with the new corporations but lost, as did the eurocentric socialists who followed. Save during the depression, generations of Americans would come to accept the myth of the free markets and free enterprise. Today, only about 7 percent of non-farm workers are self-employed.

From activism to clicktivism

Sam Smith – It is hard to retain a democratic republic without movements effectively demanding that it not give up its purpose and integrity. While there are still such efforts – the Occupiers and 350 come quickly to mind – effective pressure on the establishment has largely waned over the past 30 years. Manifestations include:

  • The absence of an effective antiwar movement despite two of the longest and most pointless wars in American history.
  • The success of various older movements – civil rights, women’s and labor, for example – has increased the distance between their leaders and the people they represent. Glass ceilings have been broken but not locked doors, and the leadership lacks the passion to do something about it.

  • The most dramatic example, perhaps, is the lack of liberal effort on economic issues that had once defined liberalism.

  • The Internet has had a counterproductive effect on activism, making it more difficult to bring people together physically rather than just in cyberspace and fostering the assumption that clicktivism and fundraising are all you really need to change things.

One other factor that deserves attention – because it illustrates the complexity and subtlety of forces working against change – is the way funding affects activism these days. This is just one example of the problems that efforts at change face, but it is particularly instructive because it gets so little attention:

Bob Feldman, Where’s the Change, 2009 – In her groundbreaking Foundations and Public Policy book, Joan Roelofs begins a chapter that examines foundation influence on social change organizations by asserting that “philanthropy suggests yet another explanation for the decline of the 1960s and 1970s protest movements.” In Roelofs’ view, “radical activism often was transformed by grants and technical assistance from liberal foundations into fragmented and local organizations subject to elite control” and “energies were channeled into safe, legalistic, bureaucratic activities.”

Left media and left think tank staff people generally deny that the acceptance by their organizations of grants from liberal foundations has “transformed” their organizational priorities, made them “subject to elite control” or channeled their energies into “safe, legalistic, bureaucratic” activities…

Yet in a 1998 article in The Nation, the executive director of the Institute for Policy Studies between 1992 and 1998, Michael Shuman, wrote:

“A number of program officers at progressive foundations are former activists who decided to move from the demand to the supply side to enjoy better salaries, benefits and working hours. Yet they still want to live like activists vicariously. . . by exercising influence over grantees through innumerable meetings, reports, conferences and `suggestions.’. . . Many progressive funders treat their grantees like disobedient children who need to be constantly watched and disciplined.” . . .

In a September 2002 e-mail, the executive director of the left media web site, John Moyers (a former executive with the Schumann Foundation as well), also stated: “Like any other grantee, I must report fully my activities and finances to all of my funders, including Schumann, on an annual basis…If they don’t like what we’re doing, we don’t get funded for the next year.”

Like the left media, left think tanks have also been receiving large amounts of money from liberal foundations since the 1990s. As Roelofs observes:

“There are some think tanks considered left wing or progressive. They do important work, especially in documenting the activities, and consequences of corporate and government policies. Nevertheless, almost all are funded by the liberal foundations; their challenges to the system are muted. . . There are several possible explanations for the mellowing that has occurred, including foundation funding and, sometimes, foundation staff joining the boards of funded institutes.” . . .

In her Foundations and Public Policy chapter on “Social Change Organizations,” Roelofs indicates the various ways foundation funding of U.S. left groups appears to have exercised a special influence over the political direction of the U.S. left since the 1970s. Foundation grants to one left group rather than another enable liberal power-elite foundations to steer the U.S. left’s agenda so that “threatening alternatives” don’t appear on the serious political agenda. More militant left groups which the elite foundation boards or program managers regard as “irresponsible” or “unrealistic” are not funded and, as a result, are more easily excluded from left political discourse than are the left groups favored with foundation grants.

Foundations can influence unfunded left groups to change the design of their projects and structure in accordance with a foundation board’s special agenda in order to qualify for grants from a particular foundation. Foundations can influence a left group’s choice of leaders by only giving grants to left groups whose leaders they regard as politically unthreatening. Foundations can promote “the fragmentation of protest” on the U.S. left by using their grants to create and sustain “a universe of overlapping and competing social change organizations” and by discouraging the unification of U.S. left dissident groups.

As Roelofs notes:

“It is to the elite’s advantage to be countered by a ‘mass movement’ consisting of fragmented, segmented, local, and non-ideological bureaucracies doing good works and, furthermore, being dependent on foundations for support. Diverse organizations emphasize differences among the disadvantaged: ethnic, racial, sexual, rural-urban, or age, and they discourage a broad left recognizing common interests.”

The liberal and progressive effort is largely dominated by groups modeled on the classic Washington or state lobby, groups that purport to represent a particular interest but do so in a limited fashion, notably excluding effective mass participation.

These groups compete with one another for funding, achieve that funding through niche rather than holistic programs and have little vested interest in joining diverse coalitions. For example, the development director of one such state group described to me the troubles he faced in fundraising because his organization had joined others in opposition to a tax proposal. Some funders clearly did not like this detour from the group’s stated focus. You don’t need too many experiences like that before you learn to mind your own business. Good for the bottom line; lousy for an effective movement.

There is also the problem that so much funding comes from centrist foundations that use their financial power to tame the groups they support. A covert trade of soul for dollars has increasingly been part of the American liberal story.

The gradocracy

Sam Smith – About sixty years ago, America was just a decade past the last war it would ever win. The length of the average work week was down significantly from the 1930s, but real income had been soaring and would continue to do so through the 1970s. We had a positive trade balance, and the share of total income gained by the top 1% of the country was only around 8%, down from 24% in the 1930s.

As Jermie D. Cullip describes it:

“From 1950 to 1959, the total number of females employed increased by 18%. The standard of living during the fifties also steadily rose. Most people expected to own a car and a house, and believed that life for their children would be even better. . . The number of college students doubled. Getting a college education was no longer for the rich or elite

“The decade of the fifties was a decade of major breakthroughs in technology. James Watson and Francis Crick won the Nobel Prize for decoding the molecular structure of DNA. Tuberculosis had all but disappeared, and Jonas Salk’s vaccine was wiping out polio in the United States. . .

“Over the decade the housing supply increased 27 percent . . . Growth in the economy also led to increasing popularity of other financial intermediaries. Life insurance companies flourished for the first half of the decade and a large number of new private firms entered the market to absorb the excesses of personal savings.

“Savings and Loan Association holdings of mortgage loans during the decade clearly demonstrate the boom in construction at this time. In 1950 $13.6 billion was held rising to $60.1 billion in 1960. Another important growth in the 1950s capital markets was in pension funds. This industry grew from $11 billion in 1950 to $44 billion in 1960.

“By mid-1955, the country had pulled out of the previous year’s recession and gross national product was growing at a rate of 7.6 percent. The boom was so great that the budget for 1956 predicted a surplus of $4.1 billion. With the surges in production and the economy, the 1950s is often recognized as the decade that eliminated poverty for the great majority of Americans. Over the decade, GNP per capita almost doubled and the public welfare reacted accordingly as the cost of living index rose by just 1 percent and unemployment dropped to 4.1 percent.”

All in all, not a bad decade to be in if you were running a business. So much so, in fact, that some began griping about it all in books like The Organization Man and plays like Death of a Salesman.

But here is the truly amazing part – given all we have been taught in recent years: America did it even as its universities were turning out fewer than 5,000 MBAs a year.

By 2005 these schools graduated 142,000 MBAs in one year.

There are plenty of worthy arguments to be made correlating the rise of business school culture with the decline of our economy and our country. A cursory examination of American business suggests that its major product has become wasted energy. And not just the physical sort. Compute all the energy loss created by corporate lawyers, Washington lobbyists, marketing consultants, CEO benefits, advertising agencies, leadership seminars, human resource supervisors, strategic planners and industry conventions and it is amazing that this country has any manufacturing base at all. We have created an economy based not on actually doing anything, but on facilitating, supervising, planning, managing, analyzing, tax advising, marketing, consulting or defending in court what might be done if we had time to do it. The few remaining truly productive companies become immediate targets for other entropic activities: the leveraged buyout and the killer hedge fund.

And it was not just business school graduates who were the problem. In 2009, the Washingtonian Magazine estimated there were 80,000 lawyers in Washington.

The law has always been a favored profession for the Congress. Even Thomas Jefferson complained, “If the present Congress errs in too much talking, how can it be otherwise in a body to which the people send one hundred and fifty lawyers, whose trade it is to question everything, yield nothing, and talk by the hour?”

But the interesting thing about lawyers in Washington is that the percentage in Congress has actually declined in recent years. Using the Washingtonian’s estimates, about a third of the attorneys are in the government bureaucracy, and a large part of the other two-thirds are paid to influence them.

In short, instead of having lawyers just writing laws, we have them administering government and lobbying those who do.

As for our presidents, while 40% in the past century have had law degrees, Barack Obama and William Howard Taft are bookends in the sense that they were far more into the law than almost all their colleagues, many of whom seem to have used the law as an early way station on their road to something important.

Taft was an assistant prosecutor, a superior court judge, solicitor general and a federal court of appeals judge.

On the other hand, Gerald Ford opened a law firm and one year later was an ensign in the World War II Navy. Coolidge was a country lawyer. Bill Clinton had his eye on bigger things, serving as a law professor for just a year before running for Congress. FDR was in the state house within two years of his law degree.

In fact, the commitment to law was so weak that Richard Nixon could declare that he was, in the words of Wikipedia, “the only modern president to have worked as a practicing attorney.” He had risen to full partner.

It was a given until recent times that, from a political point of view, understanding law or economics or business was a valuable asset but one that fell far behind the social intelligence upon which successful politics relied. As my father, a lawyer who worked in the New Deal, would tell my buddies, “Go to law school, then do something else.” Roosevelt wasn’t as gracious toward the academic elites: “I took economics courses in college for four years, and everything I was taught was wrong.”

Obama thus represents a new era in American politics: the ultimate triumph of the gradocracy. Here is Wikipedia’s summary of his early career:

“In late 1988, Obama entered Harvard Law School. He was selected as an editor of the Harvard Law Review at the end of his first year and president of the journal in his second year. During his summers, he returned to Chicago, where he worked as an associate at the law firms of Sidley Austin in 1989 and Hopkins & Sutter in 1990. After graduating with a J.D. magna cum laude from Harvard in 1991, he returned to Chicago.

“In 1991, Obama accepted a two-year position as Visiting Law and Government Fellow at the University of Chicago Law School to work on his first book. He then taught at the University of Chicago Law School for twelve years—as a Lecturer from 1992 to 1996, and as a Senior Lecturer from 1996 to 2004—teaching constitutional law.

“In 1993, he joined Davis, Miner, Barnhill & Galland, a 13-attorney law firm specializing in civil rights litigation and neighborhood economic development, where he was an associate for three years from 1993 to 1996, then of counsel from 1996 to 2004. His law license became inactive in 2007.”

Key to such a career is intense attention to process, regulations, and the manipulation of language and data. Applied to politics, this means the human factor can start to bring up the rear. Politics is then no longer like music, in which soul and skill are melded; instead it becomes another bureaucracy. Good evidence of this in the Obama years would be Obamacare, a two-thousand-page, hard-to-decipher collection of virtue, uncertain results, payoffs to the health industry, and excessive paperwork. A good politician of another time would have led with something that everyone understood, such as lowering the age of Medicare, and then added on their favorite sweetheart deals.

Another example of gradocracy is what has happened to public education. A two-hundred-year-old hallmark of American democracy is now being dismantled for a combination of corrupt profit and distorted theory. Data collection – i.e. standardized tests – has taken time previously used for history, civics, and other things that gave mere facts some context. And it has taken time away from sports and theater, things that forced one to apply skill and knowledge in a cooperative manner.

Theory – subject to no testing at all – has replaced empirical wisdom. And teachers have been reduced to minor bureaucrats dutifully fulfilling procedures of dubious or destructive value. Add to this the corrupt goals of the education industry that is driving the war on public education and you have one of the most profound examples of child abuse that we have known.

It is not that it is wrong to study or practice the law, economics, business or education. But for these fields to usurp other skills, behavior, empirical knowledge and types of wisdom makes no more sense than for a dentist to instruct an attorney on how to address the court because he’s an expert on teeth.

Finally, at times, it seems that there are no governments anymore, only budget offices. As the numerologists have risen in power, programs increasingly became transformed into line items. Numbers began serving as adjectives, ideas were reduced to figures and policy became a matter of where one placed the decimal point.

We have been taken over by legal lemmings, process perverts, and data drones.

But then, as a former head of the British Civil Service put it, in a remark recorded in Peter Hennessy’s Whitehall: “The business of the civil service is the orderly management of decline.”

This concludes the sad part of the story, overwhelming evidence that America’s first republic has been wrecked and that its culture is but a bad imitation of what it once was. This evidence has come from politics, education, business, the arts, and the media.

Even what is perhaps the best exception is also a highly ironic one: remarkable advances in cyber technology have encouraged us to be more isolated from communities and more defined by our niche interests than the common values that create a functioning society.

About the most important job of a democracy — next to serving its people — is to make sure it stays a democracy. Forms of government don’t have tenure, and governments that rely on the consent of the governed — rather than, say, on tanks and prisons — require constant tending. As things now stand, we could easily become the first people in history to lose democracy and its constitutional freedoms simply because we have forgotten what they are about.

The major political struggle has become not between conservative and liberal but between ourselves and our political, economic, social and media elites. Between the toxic and the natural, the corporate and the communal, the technocratic and the human, the competitive and the cooperative, the efficient and the just, meaningless data and meaningful understanding, the destructive and the decent.

Today almost every principle upon which this country was founded is being turned on its head. Instead of liberty we are being taught to prefer order; instead of democracy we are taught to follow directions; instead of debate we are inundated with propaganda. Most profoundly, American citizens are no longer considered by their elites to be members or even worker drones of society, but rather as targets – of opportunity by corporations and of suspicion and control by government.

So what the hell do we do about it?

In Washington there is a neighborhood known as Shaw that, until the modern civil rights movement and desegregation, was an African-American community shut out: without a vote, without economic power, without access, and without any real hope that any of this would change.

Its response was remarkable. For example, in 1886 there were only about 15 black businesses in the area. By 1920, with segregation in full fury, there were more than 300.

Every aspect of the community followed suit. Among the institutions created within these few square miles were a building and loan association, a savings bank, the only good hotel in Washington where blacks could stay, the first full-service black YMCA in the country, the Howard Theatre (opened with black capital twenty years before Harlem’s Apollo became a black stage) and two first-rate movie palaces.

There were the Odd Fellows, the True Reformers, and the Prince Hall Lodge. There were churches and religious organizations, a summer camp, a photography club, settlement houses, and the Washington Urban League.

Denied access to white schools, the community created a self-sufficient educational system good enough to attract suburban African-American students as well as teachers with advanced degrees from all over the country. And just to the north, Howard University became the intellectual center of black America. You might have run into Langston Hughes, Alain Locke, or Duke Ellington, all of whom made the U Street area their home before moving to New York.

All this occurred while black Washingtonians were being subjected to extraordinary economic obstacles and being socially and politically ostracized. If there ever was a culture entitled to despair and apathy it was black America under segregation.

Yet not only did these African-Americans develop self-sufficiency, they did so without taking their eyes off the prize. Among the other people you might have found on U Street were Thurgood Marshall and Charles Houston, laying the groundwork for the modern civil rights movement.

Older residents would remember the former neighborhood with a mixture of pain and pride — not unlike the ambivalence found in veterans recalling a war. None would voluntarily return to either segregation or the battlefield but many would know that some of their own best moments of courage, skill, and heart had come when the times were at their worst.

Another example is Umbria, a section of Italy north of Rome remarkably indifferent to 500 years of its history, where even the homes and whole villages seem to grow like native plants out of the rural earth rather than being placed there by human effort. Yet the Umbrians have been invaded, burned, or bullied by the Etruscans, the Roman Empire, the Goths, the Longobards, Charlemagne, Pippin the Short, the Vatican, Mussolini, the German Nazis, and, most recently, the World Trade Organization. Umbria is a reminder of the durability of the human spirit during history’s tumults, an extremely comforting thought to an American these days.

Or consider the increasingly cited novel, 1984. Orwell saw it coming; only his timing was off. The dystopia described in 1984 is so overwhelming that one almost forgets that most residents of Oceania didn’t live in it. Only about two percent were in the Inner Party and another 13 percent in the Outer Party. The rest, numbering some 100 million, were the proles.

Orwell’s division of labor and power was almost precisely replicated in East Germany decades later, where about one percent belonged to the General Secretariat of the Communist Party and another 13 percent were far less powerful party members.

As we move towards – and even surpass – the fictional bad dreams of Orwell’s 1984 and of Aldous Huxley’s in many ways more prescient Brave New World, it is helpful to remember that these nightmares were actually the curse of the elites and not of those who lived in the quaint primitive manner of humans rather than joining the living dead at the zenith of illusionary power.

This bifurcation of society into a weak, struggling, but sane, mass and a manic depressive elite that is alternately vicious and afraid, unlimited and imprisoned, foreshadows what we find today – an elite willing, on the one hand, to occupy any corner of the world and, on the other, terrified of young men with minimal weapons.

Strange as it may seem, it is in this dismal dichotomy between countryside and the political and economic capitals that the hope for saving America’s soul resides. The geographical and conceptual parochialism of those who have made this mess leaves vast acres of our land still free in which to nurture hopes, dreams, and perhaps even to foster the eventual eviction of those who have done us such wrong.

Successfully confronting the present disaster will require far more than attempting to serially blockade its serial evils, necessary as this is. There must also be a guerilla democracy that defends, fosters, and celebrates our better selves – not only to provide an alternative but to create physical space for decent Americans to enjoy their lives while waiting for things to get better. It may, after all, take the rest of their lifetimes. We must not only condemn the worst, but offer witness for the better. And create places in which to live it.

