An excerpt from Why Bother?
by Sam Smith
All through my boyhood I had a profound conviction that I was no good, that I was wasting my time, wrecking my talents, behaving with monstrous folly and wickedness and ingratitude – and all this, it seemed, was inescapable, because I lived among laws which were absolute, like the law of gravity, but which it was not possible for me to keep. . . .
But this sense of guilt and inevitable failure was balanced by something else; that is, the instinct to survive. Even a creature that is weak, ugly, cowardly, smelly and in no way justifiable still wants to stay alive and be happy after its own fashion. — George Orwell in “Such, Such Were the Joys . . .”
Orwell, a lower-income boy in an upscale British public school, found himself alone, without mentor, parent, or therapist. Instead he faced “the schoolmasters with their canes, the millionaires with their Scottish castles, the athletes with their curly hair — these were the armies of the unalterable law. It was not easy, at that date, to realize that in fact it was alterable. And according to that law I was damned.”
Orwell went to school in another place and another time, but nearer and more recent is a school in Littleton, Colorado, where 13 people were killed by a pair of fellow students. “Far from being a united, happy bunch,” wrote the London Guardian afterwards, “Columbine students operated a fiercely regimented social hierarchy”:
There were the jocks, principally the football team, regarded by the rest as being allowed to operate as a law unto themselves by the school authorities. There were the preppies, the rich kids, despised by their peers because of a perception that they could buy their way through life. There were the skateboard punks, the cool kids envied for their street style. And, right at the bottom of the food chain, there were the students who could not fit into any of the other groups, the quiet, brooding, intelligent ones …. These pupils were invariably shunned by the other tribes, and frequently bullied, verbally and physically.
Sara Rimer in the New York Times found something similar:
The ‘individuals’ shun the Gap and Abercrombie & Fitch, the labels favored by the jocks and preppies. [Jessica] would not be able to afford the $30 shirts and $25 hats even if she liked them. …. “One jock has a Hummer,” Jessica said. “He totaled one Hummer, and his dad bought him another.”
Susan Greene in the Denver Post quoted an anonymous 18-year-old who said he was taunted and terrorized by his schoolmates,
so-called jocks who called him ‘faggot,’ bashed him into lockers and threw rocks at him from their cars while he rode his bike home from school …. Jocks would ‘speed past at 40, 50 mph’ and toss pop cans or cups full of sticky soda at him. Sometimes they threw rocks or even sideswiped his bike with their cars. …. In the cafeteria, he continued, jocks threw mashed potatoes at him.
Long before cable TV and fashion magazines and high school students driving Hummers, the idea of an ideal lured Americans like sirens calling from the ledge. Tocqueville caught it early:
Among democratic nations men easily attain a certain equality of condition, but they never can attain as much as they desire. It perpetually retires from before them, yet without hiding itself from their sight, and in retiring draws them on …. They are near enough to see its charms, but too far off to enjoy them; and before they have fully tasted its delights, they die.
Long before MTV and chicken dressed in 16 herbs, there was an industrial revolution and in its wake, said Erich Fromm, man created a “world of man-made things as it never existed before. He has constructed a complicated social machine to administer the technical machine he built. The more powerful and gigantic the forces are which he unleashes, the more powerless he feels himself as a human being. He is owned by his own creation, and has lost ownership of himself.”
Long before Prozac and Ritalin and ecstasy there was soma, the drug of choice in Aldous Huxley’s Brave New World. Long before global eavesdropping by the National Security Agency there was 1984.
Yet something seems different. For example, suicide rates among the young have risen for four decades. Between 1980 and 1996, the suicide rate for young black men increased 105% and there has been a similar leap in suicides by all children between 10 and 14. According to one calculation, more than 1.5 million young people under 15 are seriously depressed. There has been a 40% increase in prescriptions for anti-depressants in recent years.
Kay Redfield Jamison, a leading researcher on suicide, tells of a 1997 study that found fully one in five high school students had considered killing themselves in the past year. School slayings — though peaking in 1992-93 and but a fraction of youths murdered by members of their own families — reflect a change as well. In earlier years, reported the New York Times, most of these deaths were gang related, or were stabbings, or involved money or a fight over a girlfriend. Towards the end of the decade, the motive changed. According to Dr. Bill Reisman, who profiles youth behavior for law enforcement officials, the most common factor was deep depression: “They’ll all have depression in the state in which they do these things. When they’re cornered, the first thing they say is, ‘Kill me.’ It’s suicide by cop.”
A study of 2,000 young men and women found 43% saying that they sometimes are pushed too far and feel like they might explode. And 58% of this group said they would use a gun “if they had to.” The authors of the study, Liz Nickles and Laurie Ashcraft, observed that most people assume that “violent tendencies are the result of hands-off parenting.” Said Nickles to the New York Times, “In the population we studied the opposite is the case.” And Ashcraft added, “Over-scheduled, pressured children are an emotional powder keg.”
Evidence of changes in the life of even younger children was found in a University of Michigan study that examined how time was spent by those aged 3-12 in 1981 and 1997, based on diaries kept by the youngsters or their parents. Among the changes:
Time spent playing: down 25%
Time spent in school, organized post-school programs and child care: up 8 hours a week
Time playing organized team sports: almost doubled
Time spent eating meals: down an hour a week
Time spent sitting and talking to someone at home: down 50%, to one-half hour a week.
When the worst happens, the “adult” reaction is typically to expand our automated distrust of the young. Politicians and the media often deal with public tragedies of the young as though they were endemic rather than exceptions. Instead of reconciliation and comprehension, we demand still more control. But it never works quite the way we think it will, in part because the solutions are aimed at specific acts rather than the culture and problems that bred them. Thus we add surveillance cameras and guards at schools, and in so doing increase the sense of repression, distrust, and disconnection from the adult world that helped create the problem in the first place.
Corey Lyons, writing in the youth on-line zine, Brat, described it this way:
So here’s what we have: kids who are constantly feeling repressed. They feel like the world is out to get them. Everything from school to television to church tells them they aren’t good enough, that they are the problem. Then at school they face the same accusations of inadequacy by the self-righteous cliques that revel in our culture’s mundane standards. The world suppresses their feelings until they feel they have no choice but to explode.
Or withdraw. At the start of the Nineties, 40% of freshmen said that keeping up to date with political affairs was important. By 1998 only 27% said so. The number of freshmen participating in student elections dropped from 77% in 1968 to 21% three decades later. Some 36% said they were frequently bored in class, up ten points from 1985. And 26% said their parents were either divorced or not living together, three times more than when first asked in 1972.
And it’s not just the young. Over half of Americans in 1996 saw a strong or moderate decline in the quality of television and entertainment, moral and ethical standards, family life, education and schools, the quality of our national leaders, our health care system, the work ethic, and our standard of living. In not a single area did more than a quarter of those responding see moderate or strong improvements. Worse, more and more Americans seem incapable of imagining life any other way or the possibility that they might have a role in making it otherwise. They have become passive consumers of their own demise.
Though we may concur in our critique, we don’t know what to do about it and we don’t know how to do it or whom to do it with. And as we struggle, all around us are shops with beautiful goods, photographs with beautiful bodies, politicians with beautiful economies, and the repetitive assurance that everything is all right. If we see it otherwise, it becomes our problem.
As our problem, we are quickly guided to beautiful solutions — to glib self-help literature, to multi-point nostrums, towards the smug assurance by the protected and entitled as they offer counsel to those who have never known a true triumph. This is America. If you can’t make it, it’s your own fault.
Even if you achieve a reputation for individuality, it becomes quickly stereotyped and you are expected to manifest your eccentricities in comfortably familiar and predictable ways. Composer John Cage once said that whenever he did anything new, people just wanted him to keep doing it over and over again. And Cage, unlike famous rock musicians, didn’t even have to contend with tens of thousands of fans in a stadium, an omnipresent media, and a cornucopia of temptations making choice as difficult as scarcity does.
Kurt Cobain sang, “I feel stupid and contagious . . . Here we are now, entertain us.” When he was 12, he recalled, “I wanted to be a rock and roll star, and I thought that would be my pay-back to all of the jocks who got girlfriends all of the time. But I realized way before I became a rock star that was stupid.” Years later Cobain saw the pay-back as even less appealing:
I think of myself as a success because I still haven’t compromised my music, but that’s just speaking on an artistic level. Obviously, all the other parts that belong with success are just driving me insane. What I really can’t stand about being successful is when people confront me and say, ‘Oh, you should just mellow out and enjoy it’. I don’t know how many times I have to fucking say this. I never wanted it in the first place.
And in his suicide note, he wrote
Sometimes I feel as if I should have a punch-in time clock before I walk out on stage
Cobain was far from alone. Here are some other musicians, many compiled by Michael Woodall, who killed themselves or died of drug-induced causes. Not included are those who died in airplane crashes or under disputed circumstances, or who were murdered:
Johnny Ace, Chris Acland, John Belushi, Mike Bloomfield, Tommy Bolin, Graham Bond, John Bonham, Adrian Borland, Roy Buchanan, Tim Buckley, Paul Butterfield, Glen Buxton, David Byron, Steve Clarke, Kurt Cobain, Ian Curtis, Nick Drake, Tom Evans, Peter Farndon, Bobby Fuller, Danny Gatton, Lowell George, Ric Grech, Pete Ham, Donny Hathaway, James Honeyman-Scott, Shannon Hoon, Douglas Hopkins, Randy Jo Hobbs, Michael Hutchence, Robert Johnson, Billy Jones, Brian Jones, Janis Joplin, Paul Kossoff, Frankie Lymon, Richard Manuel, Robbie McIntosh, Joe Meek, Jonathan Melvoin, Keith Moon, Billy Murcia, Brent Mydland, Bradley Nowell, Phil Ochs, Brian O’Hara, Gram Parsons, Kristen Pfaff, Danny Rapp, Bon Scott, Del Shannon, Mel Street, Screaming Lord Sutch, Gary Thain, Johnny Thunders, E. William Tucker, Sid Vicious, Paul Williams, Rozz Williams, Wendy O. Williams, Kevin Wilkinson, Alan “Blind Owl” Wilson.
These were voices with whom contemporary youth on both sides of the Atlantic grew up. Their average age at time of self-inflicted death: 34.1 years.
Cobain sang, “Gonna do it gonna die/Slowly, lonely, holy, lonely.”
Somebody quoted Nine Inch Nails on a Nirvana web bulletin board: “I hurt myself today to see if I still feel. I focus on the pain, the only thing that’s real.”
And Marilyn Manson told a Milwaukee newspaper, “I try to show people that everything is a lie — pick the lie you like best — and I hope mine is the best.”
Ever since I read Thoreau in high school and adopted as my own his declaration that he would rather sit alone on a pumpkin than be crowded on a velvet cushion, I have made the pursuit of individual freedom a part of my daily business. I follow it like others follow football. I know the game, the players, and the rules. And one of the most important things I have discovered is how few people are able to help you much. The psychiatrist with his elegant degree, the minister with an eye on the vestry’s budget, the philosophy professor just short of tenure, and, yes, even the author of this book are likely to fail because their work has become partly the suppression of their own free will.
This compromise may in the end represent a better balance of desire and action than that of which you dream, but it will not necessarily guide you well. It will not teach you how to do something even though you are afraid; even though no one else is suggesting that you do it let alone making you do it; it will not tell you when to take the risk of doing the right thing or trying something that no one has done before.
Which may be why one of the wisest observations on this subject came to me from a former LA narcotics detective. This detective, investigating corruption and involvement by intelligence agencies in the drug trade, has repeatedly put his life at risk to get to the bottom of an extremely dirty business. He has two bullet holes in his left arm and one in his left ear. While describing his efforts to a small group, someone asked how he managed to stick at it year after year despite having been shot at and threatened. He said he had borrowed a trick another cop had taught him; when in danger he simply considered himself already dead. Then he was able to move without fear.
Such an ability to confront and transcend — rather than deny, adjust to, replace, recover from, or succumb to — the universe in which you find yourself is among the things that permits freedom. This man, with Buddhist-like deconstruction and Christian-like rebirth, had taken apart the pieces of his fear and dumped them on the ground — a mercy killing of dreams and nightmares on behalf of survival.
Yet imagine the health column in your local paper suggesting that you cure your phobia by going about pretending you are dead. Or phoning your psychiatrist in the middle of the night and having her tell you, “Just do the drop-dead thing I told you about and call me in the morning.”
There is something almost perverse and subversive about the idea, like playing Russian roulette with your brain. One of the rules of our culture is not to let our minds come too close to death.
Albert Camus, on the other hand, claimed that the only serious philosophical question is suicide. Of this provocation, Christopher Scott Wyatt has written:
According to Camus, suicide was a sign that one lacked the strength to face “nothing.” Life is an adventure without final meaning, but still worth experiencing. Since there is nothing else, life should be lived to its fullest and derive meaning from human existence.
Kierkegaard said that “the more consciousness, the more intense the despair,” and that the torment of despair is not being able to die. But lurking in this death wish, paradoxically, is a passion for life. If one survives this perilous proximity of death, one consciously chooses life.
But just what has been chosen? Certainly not a pristine and puerile Pleasantville of the soul. Perhaps it is what one of Camus’ characters says: “I have surrendered myself to the magnificent indifference of the universe.” Or perhaps we make Kierkegaard’s “leap of faith,” of which Donald Palmer wrote:
The negative is present in all consciousness. Doubt accentuates the negative. Belief chooses to cancel the negative. Every mortal act is composed of doubt and belief …. It is belief that sustains thought and holds the world together. Nevertheless, belief understands itself as uncertain, as not justified by any objective fact.
It is, in the end, your choice. Take a leap of faith and end up at the local church hoping someone can explain all this better to your kids in Sunday school than you can, or ride bareback across the philosophical and theological plains. In either case, as long as you fully engage with life, doubt will not be far away.
The next witness is Corporal Gary Schluter of the Florida Highway Patrol. He has worked the Sunshine Skyway Bridge across Tampa Bay, where his tasks included stopping people from killing themselves. He had little formal training, but over a three-year period he and his colleagues helped to save over half the 53 people who tried to jump. In one instance, after talking for 40 minutes to a would-be jumper sitting on the rail of the bridge, legs over the side, the man said, “Gary, I’m really sorry, but I really have to go.” Schluter and another officer inched close, finally near enough to gently lay a hand on the man’s leg. “That’s what I needed,” the man said, and came down off the rail. Just a touch.
Schluter’s partner, James Covert, told the New York Times’ Rick Bragg, “I believe that everyone who goes up there has the intention of going through with it. They feel they’ve exhausted their options, and this is the last part of their lives they have control of. I tell them this is a permanent solution to a temporary problem.”
In fact, a study done of attempted suicides on the Golden Gate Bridge — where 1,200 have jumped to their deaths over six decades — found that those who survive rarely try it a second time. As Emerson said, “We learn geology the day after the earthquake.”
Happily, there is only one suicide each year for every 9,000 people in this country. There is less than a one in a million chance a student will be murdered on campus. Twice as many people were killed by lightning in 1997 as died in such incidents.
Still, about a half million Americans are treated in emergency rooms each year after trying to kill themselves. And as the suicide rate of people over 50 decreased in recent decades, the adolescent suicide rate almost tripled between 1960 and 1990. Further, the ratio of attempted to completed suicides may be 20 times higher for the young than for adults.
If you come down off the bridge (metaphorical or real) and resume endlessly pushing the stone up the hill so it can roll back down again, you find yourself once more living with the inexplicable, the insoluble, the absurd. Camus pulled no punches on this score: “Living the absurd… means a total lack of hope (which is not the same as despair), a permanent rejection (which is not the same as renunciation), and a conscious dissatisfaction (which is not the same as juvenile anxiety).”
Can we handle it? Or do we escape by saving our bodies and letting our souls and minds leap for us? Do we become among those who, as Benjamin Franklin suggested, die at 25 but aren’t buried until they are 70?
Camus and Kierkegaard are called existentialists. When you see that term these days it is often moored alongside another: angst. To suffer public angst or ask deep questions without good answers is to be a bit quaint and out of touch — a Woody Allen in a world full of Bill and Hillary Clintons. In fact, even to admit such doubts is a sign of weakness that might cost you another date, if not a promotion or an election.
We prefer something less difficult, such as the comfort of conformity, illusory perfection, and the presumption of immortality. We live in a society that expects machines, including the human one, to work without flaw, and when they don’t, we feel confused and let down. Such expectations and the optimism that drives them have helped to give us great medical, scientific and technological advances, but they have also made us less wise and alive than we might otherwise be. And so popular literature overflows with prescriptions, reforms, recipes, repairs and remodeling. As a good American, I have contributed my share, but like other good Americans I have also left out something important: What to do when it doesn’t work? When nothing works? When no one even seems to care whether it works? And if no one else cares, why should you? Why bother?
The most common reaction to despair may be no more dramatic than a sense of boredom, apathy, and indifference. In many ways, this is precisely the response our culture would prefer. It makes us ideal consumers of experience and excitement and assures that we won’t interfere with the flow of goods and services by introducing novel notions of how society might be better rearranged.
Or one might take that leap of faith towards something that protects us from the unknown. “Life is at the start a chaos in which one is lost,” wrote José Ortega y Gasset:
The individual suspects this, but he is frightened at finding himself face to face with this terrible reality, and tries to cover it over with a curtain of fantasy, where everything is clear. It does not worry him that his ‘ideas’ are not true, he uses them as trenches for the defense of his existence, as scarecrows to frighten away reality.
And here lies the paradox of therapy or, as Ernest Becker calls it, psychological rebirth:
If you get rid of the four-layered neurotic shield, the armor that covers the characterological lie about life, how can you talk about ‘enjoying’ this Pyrrhic victory? The person gives up something restricting and illusory, it is true, but only to come face to face with something even more awful: genuine despair. Full humanness means full fear and trembling, at least some of the waking day. When you get a person to emerge into life, away from his dependencies, his automatic safety in the cloak of someone else’s power, what joy can you promise him with the burden of his aloneness?
You don’t have to be a psychiatrist to confront this anomaly. I have spent my journalistic life attempting to tell people things that will help them understand what is really happening around them. Yet the closer I have come to succeeding, the more resistance I have found. For some, even asking hard questions is a suspect activity. And why not? After all I am stealing their scarecrows.
Fortunately, not everybody thinks so. My last book was about politics. After it was published, however, something unexpected happened. I found people writing and talking to me not so much about the ideas in the book — agreeing with this or castigating that — but about its hope, about the possibility that one could still change things for the better. These people hadn’t been ready for politics because they simply didn’t believe it would work.
Here’s part of one letter from a recent college graduate who had read a chapter excerpted in Utne Reader:
I am just out of college here in central Illinois and, while I hang out with a varied and intelligent group of friends, I am often disturbed by our collective sense of boredom — or even apathy — towards many of the very aspects of society we seem to rail against continually. I gave your article to some of these friends of mine and everyone agreed that we had been going about our business all wrong. The only effective and lasting way to tackle the tough, entrenched problems America presents is in a positive, almost cheerful mood; taking the negative all the time only drags the protester down, and usually nothing will get done.
Funny thing, I’ve gone through most of my life assuming things will just take care of themselves and in any case how could one person make any kind of a difference anyway? . . . I was driving to work and it occurred to me that despite the huge machine we all rather dumbly assume is taking care of everything, it really takes just a handful of people to make things right. And then your [radio] interview came on . . . The point you were making about human beings being the ones to make change — the first step of which is to care — really struck home since that realization had just hit me.
Another woman wrote me to say that the book had caused her to look anew at her political involvement which, she said, had moved from apathetic to pathetic.
But then a man in his 20s came up to me after a talk and said, “I arrived late and I heard you mentioning choice, so I thought you must be talking about abortion.” He politely suggested that I be more cautious in using the term.
A word that was for me at the core of what it means to be alive and human was for him simply the right to select a certain clinical procedure that would, justifiably or not, actually prevent life. Between our differing views of that one word’s meaning lay several decades of freedom’s decay, constriction of choice, and evaporation of hope.
One side of the divide still hears echoes of Emerson: “The office of America is to liberate, to abolish kingcraft, priestcraft, caste, monopoly, to pull down the gallows, to burn up the bloody statute-book, to take in the immigrant, to open the doors of the sea and the fields of the earth.” From the other side comes the voice of New York mayor Rudolph Giuliani: “Freedom is about authority. Freedom is about the willingness of every single human being to cede to lawful authority a great deal of discretion about what you do.”
It’s not just the words that have changed. More than 30 years ago, a small group of Washington citizens set out to stop the construction of freeways that were headed through their homes, their neighborhoods, and one of the more attractive cities in America. My wife Kathy recalls going with me to a meeting of anti-freeway activists shortly after we married in 1966. Recently arrived from neat, orderly Wisconsin, she couldn’t believe that this mere handful of people thought they actually were going to stop a freeway.
In fact, we weren’t able to stop the one we discussed that night, but before the fight was over we had halted much of a road system that would have turned Washington into an east coast Los Angeles.
We used every tool within our reach, including music and art. When city council meetings went awry, protesters would stand and sing, “Oh beautiful for spacious roads. . .” When the leader of the movement, commercial artist Sammie Abbott, discovered secret plans for a route through the middle of the black center city, he designed an oversized two-color poster that was plastered all over the affected neighborhood. Under the headline “White Men’s Roads Through Black Men’s Homes,” it clearly outlined in red every building that would be destroyed. Fearing a riot, the city backed off the freeway plan within a few weeks.
The homes in the path of the bulldozers belonged to both whites and blacks, and the movement left a heritage of biracial politics that would soften some of the ethnic polarization of the city. Besides, when you held a rally and the main speakers were Grosvenor Chapman, president of the all-white Georgetown Citizens Association, and Reginald Booker, president of the militant Niggers Inc., even the most dismissive politician had to take notice.
All that, though, was more than 30 years ago. Anne Heutte, one of those deeply involved, recalls that “the hard thing during the freeway fight was getting people to believe that it was really possible to change things.” Today, what was once hard now seems impossible. A woman told me recently, “A lot of people are afraid of politics. A lot of people want to be told what to do.” A Northwestern University student says of her classmates that they “seem to be scared of activism. It’s better just to be quiet and go with the flow.”
In 1999, the Washington Post published an article by the chair of the Metropolitan Board of Trade. In it John Schwieters argued that Washington had made a terrible mistake by not becoming an east coast Los Angeles. Schwieters’ pitch: let’s do now for freeways what those “special interests,” i.e. ordinary citizens, prevented the city from doing in the 1960s. As I read the article, I began counting the troops currently on our side, our arsenal, our artillery, our allies. Then I counted those who had moved out, sold out, died out, or burned out and tried to calculate the entropy of spirit, hope and will involved. It was no illusion. Times had indeed changed.
Those around in times when effort bore some rough correlation to result tend to overrate themselves and underrate their era in crediting their good fortune. Hard as we may have rowed, we were borne on favorable currents. And when we flagged, there were plenty of others who pulled our weight.
It’s not like that now. Consider, for example, the problem of discovering unpleasant truths about our land. If a revolution takes place in the forest and no one reports it, does it make a sound? If the second coming occurred tomorrow, would the media cover it? There seems little doubt that the civil rights, peace, and women’s movements would have had far less salutary outcomes had they been forced to confront today’s media and the skill with which it ignores what it doesn’t like. Gone is the ground rule that once required social and political change to be covered — even if the publisher didn’t approve of it. Gone is the notion that if you made news, they would come. In an age of corporatist journalism, in which Peter Jennings has become the professional colleague of Mickey Mouse and Donald Duck, it no longer matters. News is just another item in the multinational product line with little value outside of its contribution to market share and other corporate objectives.
Worse, it has become just about impossible to find anyone in power who is ashamed of this. In fact, it is just about impossible to find anyone in power who is ashamed of anything. For centuries, shame has been one of the most useful restraints on power. As Edmund Burke noted, “Whilst shame keeps its watch, virtue is not wholly extinguished in the heart.” But one of the perks of contemporary power is to exist without shame.
Shame and its benign cousin, conscience, once served a less public but equally vital role. The belief that if you tried hard enough you could draw clean water even from a seemingly dry well kept many an activist striving beyond rational expectations.
But disillusionment set in. The civil rights activist John Lewis would later recall the attempt to unseat the all-white Mississippi delegation at the 1964 Democratic convention: “This was the turning point for the civil rights movement …. Until then, despite every setback …. the belief still prevailed that the system would work, the system would listen …. We had played by the rules, done everything we were supposed to do, had arrived at the door-step and found the door slammed in our face.” The writer Dorothy Allison has also spoken of betrayed optimism: “I had the idea that if you took America and shook it really hard it would do the right thing.”
As such possibilities faded we eventually found ourselves in a time when the concept of wrong was just one more social construct to be argued about on a talk show, one more small obstacle people put in your way on your climb to the top. The effect on efforts for change was like trying to bake bread without yeast.
Moral sensibility became part of the folkways of the masses, exploitable for ratings, but not necessary or desirable for actual policy or for those who made it. Thus the time between when the prominent declared their shock at some behavior and when they declared that we “should move on and put this behind us” became ever shorter.
The reporter risking status by telling the truth, the government official risking employment by exposing the wrong, the civic leader refusing to go with the flow — these are all essential catalysts of change. A transformation in the order of things is not the product of immaculate conception; rather it is the end of something that starts with the willingness of just a few people to do something differently. There must then come a critical second wave of others stepping out of character long enough to help something happen — such as the white Mississippian who spoke out for civil rights, the housewife who read Betty Friedan and became a feminist, the parents of a gay son angered by the prejudice surrounding him. But for such dynamics to work there must be space for non-conformity and places for new ideas and the chance to be left alone by those who would manipulate, commodify, or destroy our every thought.
To be sure, thirty years ago some of those seeking change — especially those demanding justice in the south — found themselves confronted with far more life-threatening dangers than does today’s cultural rebel. But on average, activists today face a more hostile media, a more repressive government, a more passive and defeated potential constituency, and an extraordinary competition for people’s time and interest. One reason for this is that the dogs and clubs of Bull Connor’s cops have been replaced by far more subtle stratagems. For example, if you choose to challenge authority, you may be labeled delusional, dangerous, or both. In recent years, both state and media have taken to dubbing someone a “paranoid” or a “conspiracy theorist” simply for not accepting the conventional wisdom about a politician or issue.
Meanwhile the reliably non-rebellious are kept that way in part by a surge of messages implying that they are but a stone’s throw from inadequacy or danger of some sort or other. For example, one of the greatest changes in the media in recent decades has been the spread of a nanny-like tone to its ministrations, exemplified by the mass hypochondria encouraged by disease-of-the-week TV shows and sternly prescriptive health columns.
Usually missing from these accounts, however, is any suggestion that corporations, government, and the culture as a whole might play a role in our health. For example, breast cancer rates have increased throughout the world since 1930 by about 1 to 2 percent a year. According to World Watch, when you take all the controllable and uncontrollable personal factors into account — from diet to age of first menstruation — only 20-30% of all breast cancers can be explained. Does the environment explain much of the rest? We don’t know, but we do know politicians and the media aren’t particularly interested in finding out. Nor was the national newsweekly that trumpeted a two-decade explosion in the rate of skin cancer without a single mention of changes in environmental factors. And you won’t find a magazine in your doctor’s office with the headline, “How Government Reform Can Reduce Breast Cancer Six Ways.”
Similarly, economics can affect your health in ways ranging from your sense of well-being to not being able to afford a doctor, but you will never see an obituary that says, “Jones had been in poor wealth for many years.” And culture plays its part. For example, French men are 50% more likely than American men to possess two of the four major heart attack risk factors, yet French men are one-third less likely to die of the disease than are US males. On the other hand, the supposedly nonchalant Frenchman is 50% more likely to kill himself than is his American counterpart, nearly half again as likely to take a tranquilizer, and 13 times more likely to go to a doctor because of depression.
Yet despite the importance of such external factors as economics, culture, and the environment, the overwhelming message of contemporary America is that you bear full responsibility for your own health. The personal role is awesome enough without adding to our lives blame for whatever physical, cultural, political, or psychological poisons are also involved.
This is no less true of one’s state of mind. To view our times as decadent and dangerous, to mistrust the government, to imagine that those in power are not concerned with our best interests is not paranoid but perceptive; to be depressed, angry or confused about such things is not delusional but a sign of consciousness. Yet our culture suggests otherwise.
But if all this is true, then why not despair? The simple answer is this: despair is the suicide of imagination. Whatever reality presses upon us, there still remains the possibility of imagining something better, and in this dream remains the frontier of our humanity and its possibilities. To despair is to voluntarily close a door that has not yet shut. The task is to bear knowledge without letting it destroy us, to challenge the wrong without ending up on its casualty list. “You don’t have to change the world,” the writer Colman McCarthy has argued. “Just keep the world from changing you.”
Oddly, those who instinctively understand this best are often those who seem to have the least reason to do so – survivors of abuse, oppression, and isolation who somehow discover not so much how to beat the odds, but how to wriggle around them. They have, without formal instruction, learned two of the most fundamental lessons of psychiatry, philosophy, and religion:
You are not responsible for that into which you were born.
You are responsible for doing something about it.
These individuals move through life like a skilled mariner in a storm rather than as a victim at a sacrifice. Relatively unburdened by pointless and debilitating guilt about the past, uninterested in the endless regurgitation of the unalterable, they free themselves to concentrate upon the present and the future. They face the gale as a sturdy combatant rather than as a cowering supplicant.
Judith Herman, a specialist in psychological trauma, says the most important principles of recovery for abused persons are “restoring power, choice, and control” and helping the abused reconnect with people who are important to them. In short: choice and community. Survivors understand this implicitly even if they can’t or don’t express it.
My friend Steven Wolin is a psychiatrist who has been particularly interested in children of alcoholics. Wolin described his training:
In my psychiatric residency that followed medical school, I glibly applied the terminology of physical disease to the ‘disorders’ of behavior and the mind. Eventually, I became so immersed in pathology that I no longer even used the word healthy. Instead, I conceived of health as the absence of illness and referred to people who were well as “asymptomatic,” “nonclinical,” “unhospitalized,” or as having “no severe disturbance.” In retrospect, the worst offender was the term “unidentified,” as if the only way I could know a person was by his or her sickness. The peculiar vocabulary that my colleagues and I used to describe our patients reflected our meager regard for the forces that keep people healthy. . . .
Wolin would come to call this the “damage model,” in which “pathologies are layered on pathologies, and eventually the survivor is no better off than his or her troubled parents.”
But slowly, both in his research and therapy, he found what the profession calls “clinical failures,” which is to say that the damage model didn’t hold up. Not only did Wolin discover that the transmission of addictive drinking from parent to child was not as predictable as he had expected, not only did some of the children lead surprisingly satisfying lives, but the prescribed therapeutic techniques did not always work as well as they might. Among the reasons:
· There was too much focus on the past and not enough on how to build a future
· Instead of being energized, some survivors fell into what Wolin calls the “victim’s trap”
· The model overemphasized pain at the expense of possibility, causing some survivors, who had left the past behind, to begin feeling like walking time bombs trapped by the inference that family problems inevitably repeat themselves from generation to generation.
Some of Wolin’s patients had done extremely well on their own. Among their techniques: not dwelling on the past, not blaming their parents, and not becoming victims. They discovered and built on their own strengths, deliberately improved upon their parents’ lifestyles, consciously married into strong families, and replaced memories of bad family gatherings with satisfying rituals of their own.
Out of this tough and honest reevaluation of his own work, Wolin and his wife (a developmental psychologist) came up with what they called the “challenge model” in which the experience of damage is balanced by conscious and unconscious resiliencies, in which trouble is not denied but neither is it allowed to rule.
In The Resilient Self, Steven and Sybil Wolin list ways in which survivors reframe personal stories in order to rise above the troubles of their past: insight, independence, relationships, initiative, humor, creativity, and morality. Survivors often strike out on their own, find other adults to help them when their own family fails them, and reject their parents’ image of themselves.
The book is not only a personal guide for those who are or would be survivors. It is, whether intended or not, also a political guide. After all, our country and culture often stand in loco parentis, and many of the pathologies we associate with families are mirrored and magnified in the larger society. Yet when we seek political therapy we repeatedly run up against a damage model enticing or forcing whole communities or groups into victimhood and leading them towards blame or surrender rather than resilience.
If insight, independence, relationships, initiative, humor, creativity, and morality form sturdy support for personal resilience, might they not also serve us collectively as the abused offspring of a culture that is chronically drunk on its own power and conceits?
Not far away from the Wolins’ Washington office is a community centered on U Street, now known as Shaw, where for decades just such a collective form of survival thrived. It has been a particular interest of my wife, historian Kathryn Schneider Smith. In the wake of the Civil War, this area north of Washington’s downtown — originally occupied by both whites and blacks — experienced a building boom. With Jim Crow and the coming of the streetcar, whites moved beyond the center city and blacks increasingly found themselves isolated. Until the modern civil rights movement and desegregation, this African-American community was shut out without a vote, without economic power, without access, and without any real hope that any of this would change.
Its response was remarkable. For example, in 1886 there were only about 15 black businesses in the area. By 1920, with segregation in full fury, there were more than 300.
Every aspect of the community followed suit. Among the institutions created within these few square miles were a building and loan association, a savings bank, the only good hotel in Washington where blacks could stay, the first full-service black YMCA in the country, the Howard Theatre (opened with black capital twenty years before Harlem’s Apollo converted to black performances), and two first-rate movie palaces.
There were the Odd Fellows, the True Reformers, and the Prince Hall Lodge. There were churches and religious organizations, a summer camp, a photography club that produced a number of professional photographers, settlement houses, and the Washington Urban League.
Denied access to white schools, the community created a self-sufficient educational system good enough to attract suburban African-American students as well as teachers from all over the country. And just to the north, Howard University became the intellectual center of black America. You might have run into Langston Hughes, Alain Locke, or Duke Ellington, all of whom made the U Street area their home before moving to New York.
This was a proud community. “We had everything we needed,” recalls one older resident. “And we felt good about it. Our churches, our schools, banks, department stores, food stores. And we did very well.”
The community shared responsibility for its children. A typical story went like this: “There was no family my family didn’t know or that didn’t know me. I couldn’t go three blocks without people knowing exactly where I had been and everything I did on the way. It wasn’t just the schools. We learned from everyone. We learned as much from Aunt So-and-So down the street, who was not even related to us.”
All this occurred while black Washingtonians were being subjected to extraordinary economic obstacles and being socially and politically ostracized. If there ever was a culture entitled to despair and apathy it was black America under segregation.
Yet not only did these African-Americans develop self-sufficiency, they did so without taking their eyes off the prize. Among the other people you might have found on U Street were Thurgood Marshall and Charles Houston, laying the groundwork for the modern civil rights movement.
Years later, while serving on a NAACP task force on police and justice, I would go to a large hall in the organization’s headquarters on U Street — at the same address that was on the 1940s flyers calling for civil rights protests. In that hall, except for the addition of a few plaques, nothing much had changed over the decades. We only needed two tables pushed together, so there was plenty of room for the ghosts of those who once sat around such tables asking the same questions, seeking the same solutions, striving for some way for decency to get a foothold. Basic legal strategies for the civil rights movement were planned along this street. Did perhaps Thurgood Marshall or Clarence Mitchell once sit at one end of this hall and also wonder what to do next? Just the question lent courage.
With the end of segregation, as free choice replaced a community of necessity, the area around U Street began to change. The black residents dispersed. Eventually the street would become better known for its crime, drugs, and as the birthplace of the 1968 riots. The older residents would remember the former neighborhood with a mixture of pain and pride — not unlike the ambivalence found in veterans recalling a war. None would voluntarily return to either segregation or the battlefield but many would know that some of their own best moments of courage, skill, and heart had come when the times were at their worst. Some of the people in this community were only a couple of generations away from slavery, some had come from Washington’s early free black community. But whatever their provenance, they had learned to become self-sufficient in fact and spirit even as they battled to end the injustices that required them to be so.
One black woman with whom I worked closely, Josephine Butler, first went on a picket line in the 1930s. Only multiple heart attacks in the mid-1990s stopped her from doing so again. The last time I saw Jo Butler in the intensive care unit we discussed books. Though burdened with the cold, involuntary appendages of medical technology, Jo spoke with the same enthusiasm she applied to the latest political developments.
In fact, there was little — from earthworms to earth-shaking — that did not stir Jo’s curiosity and, when required, her compassionate and effective concern. Though her heart might be filled with the overwhelming political and social problems of our time, her eye was always on the sparrow and she seldom wasted much time on sorrow.
I loved to run into Jo on the street — her bag overflowing with yet to be distributed documents of truth and her hat bedizened with buttons — campaign ribbons from the endless battlefields where she had stood on the side of the fair, the decent, and the just. She carried the spirit of the city and the spirit of hope not as a possession or a totem, but as seeds to share with anyone who would stop and talk for a moment or two.
She had worked longer for, and lost more battles on behalf of, justice than anyone I ever met, yet I never saw her fearful, impatient or exhausted. She would, from time to time, show up on my block of Connecticut Avenue like some angel on a temporal inspection tour. We would talk, and laugh, and worry together and when we parted I would always feel more directed, more responsible for what was happening around me, but also happier and braver and more willing to try the difficult one more time. She lived that life so well described by the poet Samuel Hazo, filled with “hard questions and the nights to answer them, and grace of disappointment, and the right to seem the fool for justice.”