The gradocracy

Sam Smith – About sixty years ago, America was just a decade past the last war it would ever win. The length of the average work week was down significantly from the 1930s, but real income had been soaring and would continue to do so through the 1970s. We had a positive trade balance, and the share of total income received by the top 1% of the country was only around 8%, down from 24% in the 1930s.

As Jermie D. Cullip describes it:

“From 1950 to 1959, the total number of females employed increased by 18%. The standard of living during the fifties also steadily rose. Most people expected to own a car and a house, and believed that life for their children would be even better. . . The number of college students doubled. Getting a college education was no longer for the rich or elite.

“The decade of the fifties was a decade of major breakthroughs in technology. James Watson and Francis Crick won the Nobel Prize for decoding the molecular structure of DNA. Tuberculosis had all but disappeared, and Jonas Salk’s vaccine was wiping out polio in the United States. . .

“Over the decade the housing supply increased 27 percent . . . Growth in the economy also led to increasing popularity of other financial intermediaries. Life insurance companies flourished for the first half of the decade and a large number of new private firms entered the market to absorb the excesses of personal savings.

“Savings and Loan Association holdings of mortgage loans during the decade clearly demonstrate the boom in construction at this time. In 1950 $13.6 billion was held rising to $60.1 billion in 1960. Another important growth in the 1950s capital markets was in pension funds. This industry grew from $11 billion in 1950 to $44 billion in 1960.

“By mid-1955, the country had pulled out of the previous year’s recession and gross national product was growing at a rate of 7.6 percent. The boom was so great that the budget for 1956 predicted a surplus of $4.1 billion. With the surges in production and the economy, the 1950s is often recognized as the decade that eliminated poverty for the great majority of Americans. Over the decade, GNP per capita almost doubled and the public welfare reacted accordingly as the cost of living index rose by just 1 percent and unemployment dropped to 4.1 percent.”

All in all, not a bad decade to be in if you were running a business. So much so, in fact, that some began griping about it all in books like The Organization Man and plays like Death of a Salesman.

But here is the truly amazing part – given all we have been taught in recent years: America did it even as its universities were turning out fewer than 5,000 MBAs a year.

By 2005 these schools graduated 142,000 MBAs in one year.

There are plenty of worthy arguments to be made correlating the rise of business school culture with the decline of our economy and our country. A cursory examination of American business suggests that its major product has become wasted energy. And not just the physical sort. Compute all the energy loss created by corporate lawyers, Washington lobbyists, marketing consultants, CEO benefits, advertising agencies, leadership seminars, human resource supervisors, strategic planners and industry conventions, and it is amazing that this country has any manufacturing base at all. We have created an economy based not on actually doing anything, but on facilitating, supervising, planning, managing, analyzing, tax advising, marketing, consulting or defending in court what might be done if we had time to do it. The few remaining truly productive companies become immediate targets for another entropic activity: the leveraged buyout and the rise of the killer hedge fund.

And it was not just business school graduates who were the problem. In 2009, Washingtonian magazine estimated there were 80,000 lawyers in Washington.
The law has always been a favored profession for the Congress. Even Thomas Jefferson complained, “If the present Congress errs in too much talking, how can it be otherwise in a body to which the people send one hundred and fifty lawyers, whose trade it is to question everything, yield nothing, and talk by the hour?”
But the interesting thing about lawyers in Washington is that the percentage in Congress has actually declined in recent years. Using the Washingtonian’s estimates, about a third of the attorneys are in the government bureaucracy, and a large part of the other two thirds are paid to influence them.
In short, instead of having lawyers just writing laws, we have them administering government and lobbying those who do.
As for our presidents, while 40% in the past century have had law degrees, Barack Obama and William Howard Taft are bookends in the sense that they were far more into the law than almost all their colleagues, many of whom seem to have used the law as an early way station on their road to something important.
Taft was an assistant prosecutor, superior court judge, solicitor general and a federal court of appeals judge.
On the other hand, Gerald Ford opened a law firm and one year later was an ensign in the World War II Navy. Coolidge was a country lawyer. Bill Clinton had his eye on bigger things, serving as a law professor for just a year before running for Congress. FDR was in the state house within two years of his law degree.
In fact, the commitment to law was so weak that Richard Nixon could declare that he was, in the words of Wikipedia, “the only modern president to have worked as a practicing attorney.” He had risen to full partner.
It was a given until recent times that, from a political point of view, understanding law or economics or business was a valuable asset, but one that fell far behind the social intelligence upon which successful politics relied. As my father, a lawyer who worked in the New Deal, would tell my buddies, “Go to law school, then do something else.” Roosevelt wasn’t as gracious towards the academic elites: “I took economics courses in college for four years, and everything I was taught was wrong.”
Obama thus represents a new era in American politics: the ultimate triumph of the gradocracy. Here is Wikipedia’s summary of his early career:
“In late 1988, Obama entered Harvard Law School. He was selected as an editor of the Harvard Law Review at the end of his first year and president of the journal in his second year.  During his summers, he returned to Chicago, where he worked as an associate at the law firms of Sidley Austin in 1989 and Hopkins & Sutter in 1990. After graduating with a J.D. magna cum laude from Harvard in 1991, he returned to Chicago.
“In 1991, Obama accepted a two-year position as Visiting Law and Government Fellow at the University of Chicago Law School to work on his first book. He then taught at the University of Chicago Law School for twelve years—as a Lecturer from 1992 to 1996, and as a Senior Lecturer from 1996 to 2004—teaching constitutional law.
“In 1993, he joined Davis, Miner, Barnhill & Galland, a 13-attorney law firm specializing in civil rights litigation and neighborhood economic development, where he was an associate for three years from 1993 to 1996, then of counsel from 1996 to 2004. His law license became inactive in 2007.”

Key to such a career is intense attention to process, regulations, and the manipulation of language and data. Applied to politics, this means the human factor can start to bring up the rear. Politics is then no longer like music, in which soul and skill are melded; instead it becomes another bureaucracy. Good evidence of this in the Obama years would be Obamacare, a two-thousand-page, hard-to-decipher collection of virtue, uncertain results, payoffs to the health industry, and excessive paperwork. A good politician of another time would have led with something that everyone understood, such as lowering the age of Medicare, and then added on their favorite sweetheart deals.

Another example of gradocracy is what has happened to public education. A two-hundred-year-old hallmark of American democracy is now being dismantled for a combination of corrupt profit and distorted theory. Data collection – i.e. standardized tests – has taken time previously used for history, civics, and other things that gave mere facts some context. And it has taken time away from sports and theater, things that forced one to apply skill and knowledge in a cooperative manner.

Theory – subject to no testing at all – has replaced empirical wisdom. And teachers have been reduced to minor bureaucrats dutifully fulfilling procedures of dubious or destructive value. Add to this the corrupt goals of the education industry that is driving the war on public education and you have one of the most profound examples of child abuse that we have known.

It is not that it is wrong to study or practice the law, economics, business or education. But to usurp other skills, behavior, empirical knowledge and types of wisdom makes no more sense than for a dentist to attempt to instruct an attorney on how to address the court because he’s an expert on teeth.

Finally, at times, it seems that there are no governments anymore, only budget offices. As the numerologists have risen in power, programs have increasingly been transformed into line items. Numbers began serving as adjectives, ideas were reduced to figures, and policy became a matter of where one placed the decimal point.

We have been taken over by legal lemmings, process perverts, and data drones.

But then, as Peter Hennessy reports in his history of the British Civil Service, Whitehall, one of its former heads put it: “The business of the civil service is the orderly management of decline.”

This concludes the sad part of the story, overwhelming evidence that America’s first republic has been wrecked and that its culture is but a bad imitation of what it once was. This evidence has come from politics, education, business, the arts, and the media.
Even what is perhaps the best exception is also a highly ironic one: remarkable advances in cyber technology have encouraged us to be more isolated from communities and more defined by our niche interests than the common values that create a functioning society.
About the most important job of a democracy — next to serving its people — is to make sure it stays a democracy. Forms of government don’t have tenure, and governments that rely on the consent of the governed — rather than, say, on tanks and prisons — require constant tending. As things now stand, we could easily become the first people in history to lose democracy and its constitutional freedoms simply because we have forgotten what they are about.

The major political struggle has become not between conservative and liberal but between ourselves and our political, economic, social and media elites. Between the toxic and the natural, the corporate and the communal, the technocratic and the human, the competitive and the cooperative, the efficient and the just, meaningless data and meaningful understanding, the destructive and the decent.

Today almost every principle upon which this country was founded is being turned on its head. Instead of liberty we are being taught to prefer order, instead of democracy we are taught to follow directions, instead of debate we are inundated with propaganda. Most profoundly, American citizens are no longer considered by their elites to be members or even worker drones of society, but rather as targets – targets of opportunity by corporations and of suspicion and control by government.

So what the hell do we do about it?

In Washington there is a neighborhood known as Shaw that, until the modern civil rights movement and desegregation, was an African-American community shut out without a vote, without economic power, without access, and without any real hope that any of this would change.

Its response was remarkable. For example, in 1886 there were only about 15 black businesses in the area. By 1920, with segregation in full fury, there were more than 300.

Every aspect of the community followed suit. Among the institutions created within these few square miles was a building and loan association, a savings bank, the only good hotel in Washington where blacks could stay, the first full-service black YMCA in the country, the Howard Theatre (opened with black capital twenty years before Harlem’s Apollo became a black stage) and two first-rate movie palaces.

There were the Odd Fellows, the True Reformers, and the Prince Hall Lodge. There were churches and religious organizations, a summer camp, a photography club, settlement houses, and the Washington Urban League.

Denied access to white schools, the community created a self-sufficient educational system good enough to attract suburban African-American students as well as teachers with advanced degrees from all over the country. And just to the north, Howard University became the intellectual center of black America. You might have run into Langston Hughes, Alain Locke, or Duke Ellington, all of whom made the U Street area their home before moving to New York.

All this occurred while black Washingtonians were being subjected to extraordinary economic obstacles and being socially and politically ostracized. If there ever was a culture entitled to despair and apathy it was black America under segregation.

Yet not only did these African-Americans develop self-sufficiency, they did so without taking their eyes off the prize. Among the other people you might have found on U Street were Thurgood Marshall and Charles Houston, laying the groundwork for the modern civil rights movement.

Older residents would remember the former neighborhood with a mixture of pain and pride — not unlike the ambivalence found in veterans recalling a war. None would voluntarily return to either segregation or the battlefield but many would know that some of their own best moments of courage, skill, and heart had come when the times were at their worst.

Another example is Umbria, a section of Italy north of Rome remarkably indifferent to 500 years of its history, where even the homes and whole villages seem to grow like native plants out of the rural earth rather than being placed there by human effort. Yet the Umbrians have been invaded, burned, or bullied by the Etruscans, the Roman Empire, the Goths, the Longobards, Charlemagne, Pippin the Short, the Vatican, Mussolini, the German Nazis, and, most recently, the World Trade Organization. Umbria is a reminder of the durability of the human spirit during history’s tumults, an extremely comforting thought to an American these days.

Or consider the increasingly cited novel, 1984. Orwell saw it coming; only his timing was off. The dystopia described in 1984 is so overwhelming that one almost forgets that most residents of Oceania didn’t live in it. Only about two percent were in the Inner Party and another 13% in the Outer Party. The rest, numbering some 100 million, were the proles.

Orwell’s division of labor and power was almost precisely replicated in East Germany decades later, where about one percent belonged to the General Secretariat of the Communist Party, and another 13% were far less powerful party members.

As we move towards – and even surpass – the fictional bad dreams of Orwell and of Aldous Huxley’s in many ways more prescient Brave New World, it is helpful to remember that these nightmares were actually the curse of the elites and not of those who lived in the quaint primitive manner of humans rather than joining the living dead at the zenith of illusionary power.

This bifurcation of society into a weak, struggling, but sane mass and a manic-depressive elite that is alternately vicious and afraid, unlimited and imprisoned, foreshadows what we find today – an elite willing, on the one hand, to occupy any corner of the world and, on the other, terrified of young men with minimal weapons.

Strange as it may seem, it is in this dismal dichotomy between countryside and the political and economic capitals that the hope for saving America’s soul resides. The geographical and conceptual parochialism of those who have made this mess leaves vast acres of our land still free in which to nurture hopes, dreams, and perhaps even to foster the eventual eviction of those who have done us such wrong.

Successfully confronting the present disaster will require far more than attempting to serially blockade its serial evils, necessary as this is. There must also be a guerilla democracy that defends, fosters, and celebrates our better selves – not only to provide an alternative but to create physical space for decent Americans to enjoy their lives while waiting for things to get better. It may, after all, take the rest of their lifetimes. We must not only condemn the worst, but offer witness for the better. And create places in which to live it.


The media: from watch dog to lap dog

Sam Smith – In the late 1930s a survey asked Washington journalists for their reaction to the following statement:

It is almost impossible to be objective. You read your paper, notice its editorials, get praised for some stories and criticized for others. You ‘sense policy’ and are psychologically driven to slant the stories accordingly.

Sixty percent of the respondents agreed.

To understand the role of the media in the collapse of the First American Republic, it doesn’t help to cling to romantic notions of what journalism once was; the days for which some yearn never existed.

Admittedly there were differences that today seem almost bizarre. Eighty years ago, 40% of the Washington correspondents surveyed were born in towns of less than 2,500 population, and only 16% came from towns of 100,000 or more. In 1936, the Socialist candidate for president was supported by 5% of the Washington journalists polled and one even cast a ballot for the Communists. One third of Washington correspondents, the cream of the trade, lacked a college degree in 1937.

And what also existed was much more competition in the news industry. By the 1980s, most of what Americans saw, read, or heard was controlled by fewer than two dozen corporations. By the 1990s, just five corporations controlled all or part of 26 cable channels. Some 75% of all dailies are now in the hands of chains, and just four of these chains own 21% of all the country’s daily papers. The situation has since deteriorated further, including the poor financial shape of many newspapers, which has silently put banks and hedge funds in charge of them.

With these changes, the chances of getting the story right deteriorated and America was increasingly informed and persuaded by members of the same oligarchy that was running it.

And it was not only the owners. The national press corps was losing its relationship with America.

When I started out as a Washington reporter in the 1950s, only about half of American journalists had more than a high school degree. They naturally identified with their readership rather than with their publishers or elite sources. I didn’t let anyone know I had gone to Harvard because that would not have improved my standing either with staffers on the Hill or colleagues in the media.

Ben Bagdikian, a bit older than myself, described the craft in his memoir, Double Vision, this way:

“Before the war a common source of the reporter was an energetic kid who ran newsroom errands for a few years before he was permitted to accompany the most glamorous character on the staff, the rough-tough, seen-it-all, blood-and-guts police reporter. Or else, as in my case, on a paper with low standards, reporters started off as merely warm bodies that could type and would accept $18 a week with no benefits.

“Some of us on that long-ago paper had college educations but we learned to keep quiet about it; there was a suspicion that a degree turned men into sissies. Only after the war did the US Labor Department’s annual summary of job possibilities in journalism state that a college degree is ‘sometimes preferred.'”

And there were changes at the top as well. One of my first major shocks about my chosen trade was listening to a top Washington editor talking about how he had been discussing with the White House the best way to handle the arrest of Walter Jenkins, LBJ’s top aide, who had been caught giving a blow job to a man at a local YMCA. It had never occurred to me that an editor would actually consult with politicians on how their stories were to be covered. But within a few decades journalists would be thoroughly “embedded” both in war zones and at the White House and find nothing strange about it.

Journalists were changing socially as well. In the late 1960s, the Washington Post replaced its women’s section with one called “Style” and before long members of a once scrubby journalism trade were turning up in it as participants at major social events. They were no longer just interviewing people leaving the party, they were part of the party.

Another media factor that dramatically changed the nature of American politics was television. Over time, television transformed politics from a community-based culture to one in which you could simply buy your status. Our social culture also became distorted by television until the major things we shared as a society were no longer values so much as symbols like Lindsay Lohan, Justin Bieber, Hillary Clinton and Barack Obama.

And it was all brought to us by someone. According to CBS, J. Walker Smith of the market research firm Yankelovich estimated that in the 1970s we saw about 500 ads a day. By 2009 it was up to as many as 5,000.

The Internet was supposed to save us, but it hasn’t. In fact, the country has moved to the right since its creation. It was a problem that cropped up early, as I described in my 1994 book, Shadows of Hope:

“The computer, once considered primarily a tool of orthodoxy, has now become a major weapon against authoritarianism. The highly effective campus anti-apartheid protests were organized with the help of a computer bulletin board that advised newcomers how to plan demonstrations and deal with the media. In the last days of the Soviet Union, the relative security of computer information provided dissidents a means of communications with each other and with the outside world. More recently, computers have established the first strong link among environmentalists working to save Lake Baikal in Siberia. . . And thousands of miles away, in the Silicon Valley community of Sunnyvale CA, a city councilman was elected with 60% of the vote after campaigning almost exclusively on the Internet computer network.”

But I also sensed a problem:

“Yet the very anarchistic nature of our new sources of data – including computer services, cable channels, special interest magazines, and the archives of our video store – also means that we may have less information in common. At a time when communications and transportation make it ever simpler to cross geographic and cultural borders, we increasingly make the trip alone. We see far more than we understand or are understood. Louis Farrakhan and the Anti-Defamation League have the same technology available to them but they are checking in at different bulletin boards.”

Thus, in a variety of ways journalism lost its capacity as a primary guard of our democratic republic. There was growing ownership concentration, the changing social status of journalists, a shift in journalists from being private eyes of the public to being private tools of the establishment, and technology driving citizens into safe political niches where the idea of varied groups joining together in some larger cause faded away.

Journalism had lost its historic virtue of getting the bastards before they got you. And the First Republic took another hit.

This article is a partial remix of earlier pieces by the author.

Blowin’ in the winds of change

This is the fifth in a series of essays on the end of the First American Republic, this one mostly written in 2005.

Sam Smith – Thomas Jefferson saw it coming. He warned, “From the conclusion of this war we shall be going down hill. It will not then be necessary to resort every moment to the people for support. They will be forgotten, therefore, and their rights disregarded. They will forget themselves, but in the sole faculty of making money, and will never think of uniting to effect a due respect for their rights. The shackles, therefore, which shall not be knocked off at the conclusion of this war, will remain on us long, will be made heavier and heavier, till our rights shall revive or expire in a convulsion.”

Among the conceits of our elite and media is the assumption that America, in the form that they wish to imagine it, is immortal. Part of this is the arrogance of the big, part comes from an admirable if naive faith in progress, part of it is pathological delusion. For a host of reasons, beginning with our own survival, it is long past time to permit the question to be raised: is America collapsing as a culture?

It is easy to forget that history is strewn with the rubble of collapsed civilizations, entropic remains of once sturdy cultures, societies we now remember only thanks to a handful of artifacts guarded in museums.

Our own country was built on the wreckage of Indian culture. Guatemalans use Timex watches rather than checking the Mayan Calendar. The European Union is a covert chapter of Empires Anonymous. And in the Peruvian desert there are huge spirals in the earth and straight lines that stretch for miles whose origins are totally forgotten.

Some sixty years ago, anthropologist Alfred Kroeber noted that elements of a culture do die out, “dissolve away, disappear, and are replaced by new ones. The elements of the content of such cultures may have previously spread to other cultures and survive there. Or their place may be taken at home by elements introduced from abroad. Or they may survive, with or without modification, at home, in the different configuration that gradually takes the place of the old one as a successor culture.” Thus even if American democracy dies here, pieces of it may survive somewhere else, or we may become the largest latino culture in the world and, in any event, the Thais may keep the faith of the iPod alive regardless of what happens to us.

As an example, Kroeber says that there came a time when the ancient Egyptians had clearly attained “the greatest military might, expansion, wealth, excellence of art and development of thought. The inherent patterns of their culture may be said to have been fully realized or to have been saturated then. After that, with pattern potentialities exhausted, there could be only diminished or devitalized repletion; unless the patterns can be reformulated in the direction of a new set of values – which would be equivalent to recasting the civilization into a new one or into a thoroughly new phase of one. This latter did not happen in Egypt; so more and more sluggish mechanical repetition within the realized but fully exhausted patterns became the universal vogue.”

Does this begin to sound a bit familiar?

Let’s take the example of popular music, useful because music is a creative discipline with a mathematical base, thus lending itself to more objective analysis than some of its artistic colleagues. In fact, you can write a succinct history of western music by simply outlining the progression of chords used and their relationship with one another. This is what Ward Cannel, a journalist, and Fred Marx, a classical pianist, did in a remarkable guide, “How to Play the Piano Despite Years of Lessons.”

Charting the basic chords – separated by a common distance of notes and placed around a circle like guests at a large dinner table – you can describe the rise of western music by simply checking off which of these chords were being used by musicians at a particular time. Thus with folk music, children’s songs, early hymns and Bach’s Minuet In G, it was typical to use one chord and its neighbor on either side.

In later classical harmony, composers moved from the base chord to another, say, three or four seats away counterclockwise, and then began a slow procession home, stopping at the other chairs. Examples would include Bach’s Well-Tempered Clavier. It doesn’t seem like much, but in the history of music it was a revolutionary change.

Along the way, there were other variations such as starting at the second or third chair and moving back towards home as in Honeysuckle Rose.

If you really wanted to be wild, you threw in a chord not on the way home at all, but in the other direction.

Then came a new stage and the game was played on the clockwise side of the circle. Later a tune might work its way entirely around the circle. Or if you want to be really hip, you could leap across the circle to the other side.

Similarly, the baker’s dozen of notes in the western scale have been rearranged over time in increasingly complex ways, starting with the simple chords we associate with folk music and moving on to add the 7th, flatted 9th, 13th and so forth.
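The circle of chords described above can be sketched in a few lines of code. This is only a hypothetical illustration of the "dinner table" idea, not Cannel and Marx's own notation: the chords sit around a circle of fifths, and a tune's adventurousness can be scored by how many seats it strays from home.

```python
# A minimal sketch of the chord circle described above. The chord names
# and the distance measure are illustrative assumptions, not taken from
# Cannel and Marx's book.

CIRCLE_OF_FIFTHS = ["C", "G", "D", "A", "E", "B", "F#", "Db", "Ab", "Eb", "Bb", "F"]

def seats_away(home: str, chord: str) -> int:
    """Shortest distance around the circle between two chords,
    like counting seats between guests at a round dinner table."""
    i = CIRCLE_OF_FIFTHS.index(home)
    j = CIRCLE_OF_FIFTHS.index(chord)
    d = abs(i - j)
    return min(d, len(CIRCLE_OF_FIFTHS) - d)

# Folk tunes, children's songs, and early hymns mostly sit at the home
# chord and its immediate neighbors, one seat away on either side:
print([seats_away("C", c) for c in ("C", "G", "F")])   # [0, 1, 1]

# Leaping "across the circle to the other side" is as far as you can get:
print(seats_away("C", "F#"))                            # 6
```

The scoring suggests why exhaustion eventually sets in: with only twelve seats, there are only so many distances and orderings available before, as Kroeber put it, the pattern potentialities are exhausted.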

If you were to take every piece of music in America ever written and categorize it by these standards – the number and placement of chords and their complexity – you would find that musical opportunity grew with the rest of the republic.

This didn’t mean that you had to use all these opportunities to make good music – bluegrass and the blues prove that – only that the potential for musicians and composers was ever expanding, a sign of a thriving culture. As Thelonious Monk put it, “I’m after new chords, new ways of syncopating, new figures, new runs. How to use notes differently. That’s it. Just using notes differently.”

Unfortunately, however, there are only so many chairs at the table and there are only so many combinations of movement. Eventually you run out of chairs for chords, variations on the order you play them, and their complexity. You reach the point that Kroeber described: “With pattern potentialities exhausted, there could be only diminished or devitalized repletion. . . so more and more sluggish mechanical repetition within the realized but fully exhausted patterns became the universal vogue.”

Which is to say, much of the music of today.

There is, to be sure, another major source of change: other cultures. American folk music, for example, is a history of immigration translated into notes. The blues, it has been suggested, originated in a blend of the western and African scale. As early as Jelly Roll Morton, jazz musicians were borrowing from latin sounds with a more recent folk example being the blending of Paul Simon and Ladysmith Black Mambazo in ‘Graceland.’

This continues today but in a critically modified form: Jelly Roll Morton and Paul Simon were inventive musicians seeking the best in another culture; Ricky Martin and Gloria Estefan are products of a huge anglo recording company looking for something new to exploit.

I suspect the decay of American music may have begun with the disco drum machine of the 1970s, the beginning of percussion mechanicus to go along with Erich Fromm’s homo mechanicus. Both share a problem: they aren’t human. A live drummer is constantly listening to the other musicians, finding new ways to back them up, discovering a groove by intent or accident, making a two-bar point, or just showing off. If you were to analyze the sound with lab equipment you might be amazed at how irregular it actually is – the inevitable result of being human rather than mechanical.

But that is part of the secret of real music. Much of the appeal of jazz, for example, comes from listening to the alteration, manipulation or distortion of the familiar. Thus a singer may hold a note longer than expected or lend it excruciating pain when you were expecting nothing more than a simple B flat. One writer described it as repetition just to the point of boredom – at which something new and unexpected happens.

As amplifiers replaced acoustic sound, there were other changes in music. The recording companies began dumbing down music, reducing the number of chords, replacing melody with repetitive phrases, emphasizing only the extreme end of the dynamic range, and in the end – with rap – doing away with the need for music almost entirely.

This is not to say that there was not merit within these forms – the pain and rebellion of punk, the soul of rap – but rather that for the most part the corporate monopolies had seized control of our ear drums and locked them down in a few tiny cells.

The result is telling. In 2002, ABC asked respondents to name the top rock ’n’ roll star of all time. Elvis Presley got 38%, no one else got more than 5%, and the top ten included such golden oldies as Jimi Hendrix, John Lennon, Mick Jagger, Bruce Springsteen, Paul McCartney and Eric Clapton. Michael Jackson got 2%.

Rolling Stone’s listing of top 40 songs found only 8% more recent than 1980.

Thus when you ask, what’s been happening in American popular music over the past 25 years, a reasonable answer is: not much.

You find similarities in other arts. For example, a Modern Library critics’ listing of the 100 best English language novels of the 20th century includes only one written after 1980: Ironweed by William Kennedy, written in 1983.

One list of the 100 most acclaimed films finds only nine post-1980. The American Film Institute’s list includes only 13.

The American Film Institute’s listing of top film musicals found only 12% from after the 1970s and none from the 1980s.

The Art Wolf’s listing of the top 50 artists could come up with only Andy Warhol and Jean-Michel Basquiat as recent examples.

One may quarrel with such lists, but a culture that is truly thriving will tend, if anything, to overvalue its own contributions and downplay those of the past. You may argue, for example, with those who claimed to come from ‘the greatest generation,’ but you can’t argue that they felt that way. Now, instead of bragging, we just order Butch Cassidy from Netflix one more time.

A vibrant culture will be spurred by what it considers greatness. This doesn’t mean that it necessarily is, but the mere presumption affects how the society behaves.

For example, Victor Davis Hanson wrote that “Whether or not you agreed with them, university presidents used to be dignified figures on the American scene. They often were distinguished scholars, capable of bringing their own brand of independent thinking to bear on the operation and reform of their institutions. Above all, they took seriously the university’s mission to seek and transmit the truth, and thereby to strengthen the free society that made such inquiry possible.

“But it has been a long time since Woodrow Wilson (at Princeton), Robert Hutchins (at Chicago) or James Bryant Conant (at Harvard) set the tone for American campuses. Over the past year, four university presidents have been in the news – from Harvard; the University of California, Santa Cruz; the University of Colorado; and the University of California, Berkeley. In each case, the curtains have briefly parted, allowing the public to glimpse the campus wizards working the levers behind the scenes, and confirming that something has gone terribly wrong at our best public and private universities.”

Of course, Woodrow Wilson spread segregation in the government and James Conant may have done public education incalculable damage by setting it on a course of gargantuan factory-like school districts, but that is not the point. The point is that they were icons of a society that thought it knew where it was going and what it admired.

Today, such figures have largely been reduced to talk of their fundraising skill or excessive expense accounts. Few suggest that they are people we should actually admire.

Similarly, in the churches there is a stunning lack of models. This is not merely the fault of the neo-Gantries who have taken over much of American Christianity but of other Protestant sects that say not a mumblin’ word about the theological hijacking by the right and who offer little alternative in such areas as social justice and world peace. Judaism, which once helped carry the banner for social change, has largely abandoned that field in favor of supporting Israel. As for the Catholics, the best they can do is try to find ways to prove that they’re not a bunch of perverts.

The dearth of greatness is perhaps most painfully obvious in the nation’s capital, in its politics, think tanks and media. To be sure, a pantomime is performed, but everyone knows it is just for television. Obama compares himself to Roosevelt, Koppel pretends he’s Murrow, but nobody’s really fooled. The disappearance of greatness – whether rightly or wrongly recognized as such – is common throughout American society, from football coaches to moral leaders. In the end we are left with Justin Bieber and Lindsay Lohan.

Part of the problem was identified as far back as the 1920s by Julien Benda in his book, The Treason of the Intellectuals: “At the very top of the scale of moral values [the intellectuals] place the possession of concrete advantages, of material power and the means by which they are procured; and they hold up to scorn the pursuit of truly spiritual advantages, of non-practical or disinterested values.”

Instead of being outsiders, critics and moral observers, the American intelligentsia have become players accepting many of the values of the system they should be scorning.

Benda listed some of these values:

– “The extolling of courage at the expense of other virtues. . .

– “The extolling of harshness and the scorn for human love — pity, charity, benevolence. . .

– “The teaching which says that when a will is successful that fact alone gives it a moral value, whereas the will which fails is for that reason alone deserving of contempt.”

In my last book, Why Bother?, I wrote:

Older Americans remember the victories and their celebrations; they remember Norman Rockwell men standing motionless for the national anthem in baseball stadiums with fedoras held over their hearts; a government that did more than regulate or arrest you; politicians who were revered; newscasters who were trusted; and music that dripped syrup over our spirits and made them sweet and sticky. They remember when there was a right and wrong and who and what belonged with each, whether it was true or not. They remember a time when those in power lied and were actually able to fool us. They remember what a real myth was like even when it was false, cruel, deceptive, and the property of only a few.

Now, despite the improved economic and social status of women and minorities, despite decades of economic progress, despite Velcro, SUVs, MTV, NASA, DVD, cell phones, and the Internet you can’t raise a majority that is proud of this country. We neither enjoy our myths nor our reality. We hate our politicians, ignore our moral voices, and distrust our media. We have destroyed natural habitats, created the nation’s first downwardly mobile generation, stagnated their parents’ income, and removed the jobs of each to distant lands. We have created rapacious oligopolies of defense and medicine, frittered away public revenues and watched indifferently as, around the world, the homeless and the miserable pile up. Our leaders and the media speak less and less of freedom, democracy, justice, or of their own land. Perhaps most telling, we are no longer able to react, but only to gawk.

To be sure, many of the symbols of America remain, but they have become crude — desperately or only commercially imitative of something that has faded. We still stand for the Star Spangled Banner, but we no longer know what to do while on our feet. We still subscribe to the morning paper but it reads like stale beer. And some of us even still vote, but expect ever less in return. Where once we failed to practice our principles, now we no longer even profess to honor them.

Top rot

Sam Smith – One good way to judge the state of a culture is to check out its leadership. Absent a strong, continuing rebellion from the bottom or a conscious devolution of power, this leadership leaves its mark on every aspect of society.

For us, this is not a new problem. Shortly before 9/11 I wrote:

Over lunch one day, I asked journalist Stephen Goode how he would describe our era. Without hesitation, he said it was a time of epigons.

An epigon, he explained to my perplexed frown, is one who is a poor imitation of those who have preceded him. The word comes from the epigoni — the afterborn — specifically the sons of the seven Greek chieftains killed in their attempt to take Thebes. The kids avenged the deaths by capturing Thebes – but they also destroyed it. They were generally not considered as admirable and competent as their fathers.

Being around epigons is like being trapped at a bad craft fair where everything you see seems to have been made before, only better.

Of course, you cannot expect your leaders in politics, academia, business or the media to tell you this. People have to figure it out for themselves, and that can take a long time.

In politics, for example, our conservative leaders are devoting themselves to the revival of segregation, even attacking the unfettered voting that civil rights activists long ago targeted as essential for a decent society. These conservatives do not use the slur words and other crudities of the old South but in some ways are even more dangerous, since they seek to discriminate formally not only against blacks but against every ethnic minority, women, gays and anyone not making enough money to contribute to their campaigns. Because the mass media has so thoroughly embedded itself in the culture of our leadership, there is little hint of this on the evening news. Stealing the voting rights of citizens is treated as just another political issue to ponder soberly and not – as it should be – a crime.

In a collapsing society, one of the best clues is to look at the supposed good guys. What we find in politics is a Democratic Party that has not only betrayed its present supporters, but is actively undoing the progress it made over 80 years in economic equity and civil liberties, just to name two examples.

We are presented with heroes like Bill Clinton, a politician with an exceptionally seedy past who cut social welfare and helped create the fiscal disaster from which we have yet to recover. And Barack Obama, who has shown unprecedented contempt for civil liberties for a White House Democrat, is also the first to favor cutting Social Security, and generally supports reactionary solutions to our economic crisis.

We like to think that changing parties alters far more than it often does. There may be a cultural trend that affects, for better or worse, whoever is in power. Thus, we have the irony of Richard Nixon being the last president whose domestic politics could be fairly described as liberal, while people like Clinton and Obama have moved the country to the right.

Driving this now is a culture of impunity, which, some years back, I described this way:

In a culture of impunity, rules serve the internal logic of the system rather than whatever values typically guide a country, such as those of its constitution, church or tradition. The culture of impunity encourages coups and cruelty, and at best practices only titular democracy. A culture of impunity differs from ordinary political corruption in that the latter represents deviance from the culture while the former becomes the culture. Such a new culture does not announce itself.

In a culture of impunity, what replaces constitution, precedent, values, tradition, fairness, consensus, debate and all that sort of arcane stuff? Mainly greed. We find ourselves without heroism, without debate over right and wrong, with little but an endless narcissistic struggle by the powerful to get more money, more power, and more press than the next person. In the chase, anything goes and the only standard is whether you win, lose, or get caught.

The major political struggle has become not between conservative and liberal but between ourselves and our political, economic, social and media elites. Between the toxic and the natural, the corporate and the communal, the technocratic and the human, the competitive and the cooperative, the efficient and the just, meaningless data and meaningful understanding, the destructive and the decent.

Today almost every principle upon which this country was founded is being turned on its head. Instead of liberty we are being taught to prefer order, instead of democracy we are taught to follow directions, instead of debate we are inundated with propaganda. Most profoundly, American citizens are no longer considered by their elites to be members or even worker drones of society, but rather as targets – targets of opportunity for corporations and of suspicion and control for government.

One of the major shifting moments in our politics was the arrival of television. Until then, no medium was powerful enough to put the whole country on the same wavelength, using the same clichés, and assuming the same myths. Over time, television would transform politics from a community-based activity to one in which you could simply buy your status.

This affected not only normal government but the nature of political corruption. Until television, political corruption in America was a feudal system: politicians gained power but in return were expected to provide distinct services to the voters. Now, with the size of campaign contributions being the major factor and advertising the major voice of politics, voters no longer matter the way they once did.

As one example of what has happened, let’s take another look at a politician widely regarded as corrupt: former DC mayor Marion Barry. In fact, the story is far more complicated.

I sometimes even describe Barry as the last of the great white mayors.

Barry was raised in Memphis, Tennessee, at the end of almost a half century of political dominance of that city and the state by the Crump machine. EH Crump was mayor only twice during that period but he controlled the politics for decades. Crump, like Earl Long of Louisiana, was rare among white southern politicians in that he actively organized and sought the support of black voters. One of Crump’s lieutenants, for example, was a black funeral director named Harold Ford, whose grandson of the same name served in the House as recently as 2007. And the blues godfather, WC Handy, even wrote a song, the “EH Crump Blues,” that would be sung on street corners to garner crowds for rallies.

I handled press for Barry in the 1960s when he was head of Washington’s SNCC. Our relations would sour over the years (“Where’s that son of a bitch?” he once asked my wife at a party), but even with his drug conviction and corrupt activities, you still had to admit that he was an exceptional politician. And as time went on, his politics seemed from out of another era, from that of Crump, Daley, and Curley – white mayors who defined urban politics at a time when the underclasses were struggling for equity against city elites. It was not that they weren’t corrupt, but that they gave something back to ordinary citizens in return.

For example, Barry started an important summer youth jobs program in the city when he became mayor. With instructive irony, a leading DC councilmember got sentenced to prison last year for embezzling over $350,000 – from a youth program.

Similarly, now councilmember Barry just held hearings on finding ways to help seniors get more food and other essentials while the nation’s designated black role model, Barack Obama, is cutting services to the poor and Social Security for seniors.

Not bad metaphors for the collapse of our political leadership.

Meanwhile, in academia, theoretically a source of national wisdom, a strange silence prevails. Could it be the effect, as reported by the Congressional Research Service, of the approximately 60% of federal research and development funds that go to academic institutions?

And what about a media controlled by fewer and fewer large corporations with journalists who have an increasingly hard time separating their jobs from the social and economic benefits of being nice to those in power? Just one small example:

“Project Censored researched the board members of 10 major media organizations from newspaper to television to radio. Of these ten organizations, we found there are 118 people who sit on 288 different American and international corporate boards proving a close on-going interlock between big media and corporate America. We found media directors who also were former Senators or Representatives in the House.”

And how can one brag of business leadership in an era when our economy is in such terrible shape?

Finally, we have from the arts little leadership as well, including not one major protest subculture since the punk rebellion.

It is not just the people who are the problem. It is the ideas and values that they spread with their power.

Consider how business school clichés have infected even non-profit organizations and how the media just accepts them as truths.

Or how our government is driven by data drones, process obsessives, legal lemmings, and test tyrants.

Or how our public school system is being wrecked by grotesquely erroneous concepts of education.

Or how our freedoms are being slashed in the name of a security that is drifting ever further away.

Or how truth, reality, decency, integrity, fairness and cooperation have become, in the eyes of our elite and its subservient media, just toys of the left and the naïve.

In such an environment, it is hard for anything good and valuable to survive, including a democratic republic.

This is the fourth in a series of essays on the end of the First American Republic. The earlier essays are posted here, along with some alternative approaches we might take.

Elite emigration

Sam Smith – There is major concern over the effects of immigration on our economy. That concern is badly misplaced.

The foreign born population in the US – as a percentage of the total – is about what it was from 1860 to 1910: 13%. It has ranged from a low of 5% in 1970 to a high of 15% between 1880 and 1900. The only time it has fallen below 10% in the past century was between 1940 and 1980. As for the undocumented foreign born, the figure is less than five percent.

The truly bad effect on the American economy and on the First American Republic has come not from the undocumented immigration of the poor but from the undocumented emigration of the country’s elite, a record number of whom today have moved to a virtual global community without laws, without taxes, and without a nation or communities to which they feel loyalty and responsibility. While they still live in America, their jobs, employees, profits and souls all reside elsewhere.

We have never had an economic elite so fiscally separated from, and in many ways disloyal to, our land. Even the robber barons at least reinvested their money in American jobs and industry. The defection of our elite has been a major factor in the collapse of the First American Republic.

Some time ago I suggested testing out the role of immigrants with these questions:

1. Has a Mexican ever fired or laid you off?

2. Was the plant where you worked until it was sent overseas bought by Mexicans, or is it still owned by the same people you used to work for?

3. Has a Mexican ever cut your pension or health benefits? Outsourced your job to India?

4. Do you think Mexicans or the pharmaceutical corporations are more responsible for high drug costs?

5. How much of the corruption in Washington has been instigated by the Mexicans?

6. Did the Mexicans make us invade Iraq?

I described the new reality this way in Shadows of Hope, a book on Bill Clinton, in 1993:

The real Clinton foreign policy is simply this: there are no foreign countries any more, there are only undeveloped markets. The slogan has become “Make quarterly earnings growth, not war!” Trade has replaced ideology as the engine of foreign affairs.

At one level this should be celebrated, since it is far less deadly. On the other hand, this development also means that politics, nationhood and the idea of place itself are being replaced by a huge, amorphous international corporate culture that rules not by force but by market share. This culture, in the words of French writer and advisor to François Mitterrand Jacques Attali, seeks an “ideologically homogenous market where life will be organized around common consumer desires.” It is a world that will become increasingly indifferent to local variation.

And when Attali speaks of American influence he says: “We have to build a word which would be ‘New York-Hollywoodization,’ because we are not Americanized in the sense that we are not going to be closer to St. Louis, Mo., or someplace else. These countries are far from us and we are far from them. They are less in advance, less influencing than New York and Hollywood.”

Here is a world in which Babar loses out to Mickey Mouse in France and where a sophisticated Frenchman speaks of St. Louis — but not Hollywood or Manhattan — as a foreign country. It is the world of what Marshall Blonsky calls “international man.”

International man — and he is mainly just that — is unlocalized. He wears a somewhat Italian suit, perhaps a vaguely British regimental tie, a faintly French shirt and shoes — says international man Furio Columbo, president of Fiat USA — “with an element of remembering New England boats and walking on the beach.” As Blonsky puts it, “You self-consciously splice genres, attitudes, styles.”

International man thrives in Washington. At the moment you call, though, he may well be in Tokyo, Bonn or London sharing with colleagues who are nominally Japanese, German or British a common heritage in the land of the perpetually mobile.

It is this unnamed country of international law, trade and finance, with its anthem to “global competition in the first half of the 21st century,” that is increasingly providing the substance and the style to our politics. It is their dual citizenship in America and in the Great Global Glob that characterizes the most powerful among us, now more than ever including even our own political leaders. International man dreams of things like NAFTA and GATT and then gets them passed. And he knows that he, as a corporate executive or licensed professional, will pass quickly through Mexican customs in his somewhat Italian suit and shoes with a hint of a New England beach, because the agreement he helped to draft and pass has declared him entitled to such consideration. The union worker, the tourist from St. Louis, are, under the new world order, from far countries, and so it will take a while longer.

This then is the Clinton foreign policy: it is the policy of International Man, a policy that brings Mexico City ever nearer and starts to make St. Louis a stranger in its own land. . .

How could any republic survive when its own elite deserts it the way ours has?

Dismantling the Democrats

Sam Smith – The last clearly identifiable period during which the Democratic Party was a positive influence on the country was the Johnson administration. Since then, Democratic Party-led improvements have been minor. If this seems excessively critical, consider the following list of things achieved before 1970 and compare it to more recent legislation:

– Regulation of banks and stock brokerage firms
– Protection of your bank account
– Social Security
– A minimum wage
– Legal alcohol
– Regulation of the stock exchanges
– Right of labor to bargain with employers
– Soil Conservation Service and other early environmental programs
– National parks and monuments
– Tennessee Valley Authority
– Rural electrification
– College education for innumerable veterans
– Housing loans for innumerable veterans
– FHA housing loans
– The bulk of hospital beds in the country
– Unemployment insurance
– Small Business Administration
– National Endowment for the Arts
– Medicare
– Peace Corps
– Veterans benefits including health and housing

While the current problems of the Democratic Party began in the Carter era, they escalated with the choice of Bill Clinton as its presidential candidate. He was selected by the Democratic Leadership Council in part for the purposes he served, namely undoing key elements of the Democratic past that even Ronald Reagan couldn’t touch, such as social welfare. Clinton has been outdone by another DLC-vetted candidate, Barack Obama, who has not only shown contempt for the Constitution on various civil liberties issues but is the first president to propose reducing the benefits of Social Security and Medicare.

Further, since Clinton took office, the Republican Party has had no effective opposition and the Democratic Party has become overwhelmingly beholden to its corporate contributors.

Because the media has increasingly narrowed its political coverage to the presidency, it is not widely known, for example, that Democrats held a 1,542-seat lead in state legislatures in 1990. As of 1998 that lead had shrunk to 288 – a loss of over 1,200 state legislative seats, nearly all of them under Clinton. Not to mention that over 400 Democratic officeholders became Republicans during the Clinton regime.

By 2011, the Democratic state legislative lead had disappeared and the GOP was ahead by 622 seats.

Further, in 1992, the Democrats controlled 17 more state legislatures than the Republicans. After 1998, the Republicans controlled one more than the Democrats. Not only was this a loss of nine legislatures under Clinton, it was the first time since 1954 that the GOP had controlled more state legislatures than the Democrats (the parties tied in 1968). Today the GOP fully controls eight more state legislatures than the Democrats.

As for the 2000 election, Democrats love to blame Ralph Nader for their loss, but the facts point in quite another direction: to Clinton and Gore. If, for example, you check the changes in Bush’s and Nader’s poll figures during the last month of the campaign, it is clear that Gore lost far more votes to Bush than to Nader.

It is also apparent that if Gore had disassociated himself from Clinton, he would have done far better in the campaign. According to the 2000 exit polls:

– 60% of voters disapproved of Clinton as a person
– 68% said he would go down in the history books for his scandals rather than for his leadership
– 44% thought the Clinton scandals were important or somewhat important.
– 15% of those who had voted for Clinton in 1996 voted for Bush in 2000.

And it was not just the leadership that was responsible for the party’s decline. As a portion of the Democratic base became more affluent, it increasingly separated itself from the concerns of less wealthy party members. This started the rise of the “Reagan Democrats” and continued as the party elite lost interest in economic issues that concerned a major portion of the American electorate.

There was nothing mutually exclusive between these economic issues and, say, gay or women’s rights, but Democrats lost the capacity to deal with both at the same time. As the Review noted some time back, “Franklin Roosevelt’s labor secretary, Frances Perkins, was central to more progressive economic legislation than the entire liberal movement has been able to come up with in the past thirty years. It’s hard to get liberals excited anymore about issues like pensions or the minimum wage and eventually politics reflects this fact. Consider the example of the women’s movement, which – with a few exceptions like the group Nine to Five – has been stunningly uninvolved with the most oppressed women in the country, those of lower incomes and social class. Further, treating those you should be organizing as just a bunch of Bible thumping, gun toting idiots doesn’t help much.”

And so it became possible for the Republicans to pose falsely as friends of the middle and lower classes, while the Democrats did little to prove them wrong and dismissed constituencies that once had been central to the party.

In short, over the past twenty years the Democratic Party has been neither purposeful, progressive nor particularly honest. It has been one more case study in the collapse of the First American Republic.