Friday, August 10, 2012

Stupidity, Postmodernism, and the Shallows

Occasional columnist and former radio personality Bruce Maiman had a piece a couple of days ago in the Sacramento Bee entitled “As stupidity becomes fashionable, we all lose,” which unwittingly makes the author complicit in the title’s assertion.

The Recovering Bureaucrat almost always reads Mr. Maiman’s columns, not so much because of his powerful insights about the problems of our political economy and what to do about them, but because at least he cares about their complexity and is not necessarily wed to conventional viewpoints of them.

This column is an example of the value of his contributions.  Almost no one in the MSM tackles the issue of the dumbing down of our public discourse—unless it’s to offer the inanity that conservatives are stupider than liberals, based upon nothing more substantial than projection.  Since such contentions simply confirm both liberals and conservatives in their prejudices about each other, it’s essentially just another rotten tomato in the left-right food fight.

Mr. Maiman uses the occasion of the death of author and liberal gadfly Gore Vidal to wonder how we all got so stupid in the course of a generation.
We had a moment some 50 years ago when intellectualism and literacy were prized in American culture, when writers, artists and scientists were also genuine celebrities, fixtures on talk shows and in socialite columns, not only for what they wrote, painted or theorized, but because they were thinkers cultivated by a nation that aspired to intelligence.
Since Mr. Maiman is a Baby Boomer himself, it’s a puzzle to the RB that he actually believes this goofiness.  The RB was ten when John F. Kennedy brought Camelot to Washington DC, but he remembers—and histories of the period confirm—that the glitz was more important than the commitment to serious inquiry.

The 60s were the zenith of the Industrial Age with its blind belief in the physical sciences as the epitome of all necessary knowledge.  The Kennedy-Johnson years celebrated the muscularity of the era by assiduously promoting the Washington-New York power axis, built upon the unprecedented wealth of America’s industrial might, which had dominated the global economy since the beginning of World War II.  As ruthlessly portrayed in the AMC series Mad Men, the America of that period was already being seduced by the bling of the emerging consumer economy that the profits of the time made possible on a national scale.

Among the “innovations” of the period was the creation and spread of what we now deride as “the mainstream media.”  But the dynamics of the age required significant accumulations of capital and therefore political power to wring as much value out of industrial production as possible.  Profitability was based on mass production and distribution, and by the mid-1960s the mass consumer market impacted almost all elements of the American middle class.  As Wired magazine editor Chris Anderson demonstrated cogently in his insightful book The Long Tail, the communications technology of the times dictated that only large corporate organizations, such as the three national television broadcast networks, nationally distributed magazines like Life and Time, and large-circulation daily newspapers in every urban center, could operate profitably.

At the same time, the limitations of these media dictated that only items of mass appeal could be profitably sold to an emerging national market; out of these contingencies was born the mass culture of the Boomers’ childhood and adolescent experience.  Anderson writes:
The past century of entertainment has offered an easy solution to these constraints. Hits fill theaters, fly off shelves, and keep listeners and viewers from touching their dials and remotes. Nothing wrong with that; indeed, sociologists will tell you that hits are hardwired into human psychology, the combinatorial effect of conformity and word of mouth. And to be sure, a healthy share of hits earn their place: Great songs, movies, and books attract big, broad audiences.
But most of us want more than just hits. Everyone's taste departs from the mainstream somewhere, and the more we explore alternatives, the more we're drawn to them. Unfortunately, in recent decades such alternatives have been pushed to the fringes by pumped-up marketing vehicles built to order by industries that desperately need them.

Hit-driven economics is a creation of an age without enough room to carry everything for everybody. Not enough shelf space for all the CDs, DVDs, and games produced. Not enough screens to show all the available movies. Not enough channels to broadcast all the TV programs, not enough radio waves to play all the music created, and not enough hours in the day to squeeze everything out through either of those sets of slots.

This is the world of scarcity.
The intellectual culture Mr. Maiman remembers is just another result of the mass marketing of the Washington-New York megalopolis’ preferences, with an eager assist from Hollywood.  The dynamics of scarcity lent themselves perfectly to this dominance, for the concentrations of people and capital meant that this market could overshadow all the smaller ones.  Even so, there is little evidence that this eastern seaboard market, much less American mass culture, ever actually prized “intellectualism and literacy” as cultural achievements.

This is not to deny that the elites of the Washington-New York axis told themselves this comforting myth; there is plenty of evidence to indicate they actually believed their own PR.

The reality on the ground for most Americans was quite different.  Mr. Maiman quotes Susan Jacoby, Baby Boomer “atheist and secularist” and author of The Age of American Unreason, as acknowledging “America’s endemic anti-intellectual tendencies,” which, she says—accurately, in the RB’s humble opinion—“have been grievously exacerbated by a new species of semiconscious anti-rationalism, feeding on and fed by an ignorant popular culture of video images and unremitting noise that leaves no room for contemplation or logic.”

Two trends, not acknowledged by Mr. Maiman (although Ms. Jacoby gets one of them), have been at work here.  The first is the rapid shift of communications media from literary instruments—books, newspapers, and magazines—to visual instruments—movies, television, and now Internet productions.  The second is the postmodern overthrow of Reason as a central dynamic of human analysis and decision-making.  This was enabled by the occupation of European and American universities by the postmodern deconstructionists, embraced by the Boomers to our everlasting shame.

(In fact, most of the officials of the Church of the All Powerful State were trained in these institutions.)

Unfortunately, the conspiracy to dethrone Reason has been greatly aided by the replacement of the literary realm by the visual imagery domain.

The Decline of Literacy

As far back as 1985, social critic Neil Postman analyzed this phenomenon in his chilling Amusing Ourselves to Death, where he observed that the areas of the brain stimulated by reading are quite different from those stimulated by visual imagery.  He chronicled at length the impact on society of a shift from literary culture to one dependent upon television.  In 1994 in A Is for Ox, Pitzer College Professor of the History of Ideas Barry Sanders looked at the impact of the old visual media of movies and television, and of the new media made available by computers and the Internet, on the very sense of personhood that had developed in human culture out of the Renaissance and the Enlightenment.

More recently, Nicholas Carr reviewed the latest research in brain structure and chemistry as impacted by decades of saturation of the culture by these same media.  In his 2010 book The Shallows: What the Internet Is Doing to Our Brains, Carr takes Postman a step further by demonstrating the accuracy of his predictions of 25 years ago.  The plasticity of the brain that neuroscientists have been studying means that brain development follows those areas that are habitually and regularly stimulated.  He chronicles the shift underway from the structures that support logic and cognitive analysis to those that support emotion and instant gratification.

In the period of oral culture, which predominated until the invention of the printing press helped buttress the emergence of individuality from tribal consciousness during and after the Renaissance, the brain needed simply to maintain an awareness of the stories that formed the basis for human culture.  “In a purely oral culture,” Carr writes, “thinking is governed by the capacity of human memory.  Knowledge is what you recall, and what you recall is limited to what you can hold in your mind.”

With the introduction of reading, the brain had to develop not only the capacity to interpret the symbols of writing, but to cultivate logic, rigor, and self-reliance.  “The written word liberated knowledge from the bounds of individual memory and freed language from the rhythmical and formulaic structures required to support memorization and recitation.  It opened to the mind broad new frontiers of thought and expression.”

Quoting classical scholar Walter J. Ong, Carr says that
literacy “is absolutely necessary for the development not only of science, but also of history, philosophy, explicative understanding of literature and of any art, and indeed for the explanation of language (including oral speech) itself.”  The ability to write is “utterly invaluable and indeed essential for the realization of fuller, interior, human potentials,” Ong concluded.  “Writing heightens consciousness.”
The brave new world of the Information Age with its Googlized networks and linkages requires a different activity from the brain than reading demands.  Combining the shorter attention span that “surfing the ’net” stimulates with the emotional centers that visual stimuli activate, we find over time that our exercise of logic, reason, and self-awareness can atrophy with alarming speed.  We numb our awareness of our bodies, our thoughts, and our immediate environments as we project ourselves into the artificial realms made available in exponentially expanding programs.  We then experience reaction to the stimuli much more than thoughtful analysis of what is being presented, and of our relationship to it.

The brain structures necessary to manage this new set of activities are, not surprisingly, different from those required for literary-based thought processing.  “One of the greatest dangers we face as we automate the work of our minds,” Carr says, “as we cede control over the flow of our thoughts and memories to a powerful electronic system, is the one that informs the fears of both the scientist Joseph Weizenbaum and the artist Richard Foreman: a slow erosion of our humanness and our humanity.”

The stupidity in our public sphere that Maiman laments is partially a product of this subtle shift in our individual and collective brain structures.  We have shifted from the deeper challenges of reading and thoughtfulness to the shallower stimuli of visual imagery and quick glosses over factoids.  What appears to be stupidity is actually in large part the transition out of literary culture into the new computerized world.

But that is not the full story of the stupidity.

The Postmodern Rejection of Reason

Long before the shift to a post-literary environment, leftist intellectuals had begun a powerful assault against the assumptions of the classical liberal order that was the organizing principle of American society and, to a lesser extent, most of Western Europe.  Starting with the French existentialists in the immediate postwar period, who were inspired by Nietzsche and Kierkegaard, increasing numbers of the soi-disant intelligentsia on both continents began an overt rejection of Reason as the basis for the social order, which necessarily promoted emotion in its place. 

Stephen Hicks, in his great primer on the subject entitled Explaining Postmodernism: Skepticism and Socialism from Rousseau to Foucault, traces the philosophical roots of this supplanting of Reason by emotion in both America and Europe. 
From the postmodern anti-realist metaphysics and anti-reason epistemology, the postmodern social consequences follow almost directly.  Once we set aside reality and reason, what are we left with to go on?  We can, as the conservatives would prefer, simply turn to our group’s traditions and follow them.  Or we can, as the postmodernists will prefer, turn to our feelings and follow them. If we then ask what our core feelings are, we connect with the answers from the past century’s dominant theories of human nature.  From Kierkegaard and Heidegger, we learn that our emotional core is a deep sense of dread and guilt.  From Marx, we feel a deep sense of alienation, victimization, and rage.  From Nietzsche, we discover a deep need for power.  From Freud, we uncover the urgings of dark and aggressive sexuality.  Rage, power, guilt, lust, and dread constitute the center of the postmodern emotional universe.

Postmodernists split over whether those core feelings are determined biologically or socially, with the social version running as the strong favorite.  In either case, however, individuals are not in control of their feelings: their identities are a product of their group memberships, whether economic, sexual, or racial.  Since the shaping economic, sexual, or racial experiences or developments vary from group to group, differing groups have no common experiential framework.  With no objective standard by which to mediate their different perspectives and feelings, and with no appeal to reason possible, group balkanization and conflict must necessarily result.

Nasty political correctness as a tactic then makes perfect sense.  Having rejected reason, we will not expect ourselves or others to behave reasonably.  Having put our passions to the fore, we will act and react more crudely and range-of-the-moment.  Having lost our sense of ourselves as individuals, we will seek identities in our groups.  Having little in common with different groups, we will see them as competitive enemies.  Having abandoned recourse to rational and neutral standards, violent competition will seem practical.  And having abandoned peaceful conflict resolution, prudence will dictate that only the most ruthless will survive.
The postmodern takeover of American academia began at Berkeley and Columbia with the “Free Speech” movement of the mid-1960s, and spread rapidly during the antiwar protests that held our campuses in thrall until the Nixon administration ended the draft in 1972.  This coincided with the publication a year earlier of Saul Alinsky’s Rules for Radicals, a blueprint for subversion of organizations that decidedly included universities.

This occurred also in part because of the rapid spread of the leftist critique of “Eurocentrism” and the values of its Enlightenment, which came to be dismissed as the ideals of “dead white men.”  Assumptions about the “Western canon” based upon the breakthroughs of Greek philosophy and Christianity were replaced by the imposition of “multiculturalism,” with courses like “Queer Lit” and “Wymins Studies” pandering to the oppressed masses.  Many of those rejecting the old philosophies and political structures were committed to eliminating capricious legal and social oppression of people populating arbitrary categories; unfortunately this noble movement over time devolved into a witch hunt against anything offending the multiculturalist ethos.  The liberated simply became a new oppressor class; college campuses began to mimic George Orwell’s Animal Farm, where all animals are equal, but some are more equal than others.

This rejection of Reason and the tradition that created and advanced it left only emotionalism in its place.  If I don’t have a reasoned argument to make, I can at least display my feelings and work to manipulate the situation by playing the right feelings off against my opponent.  Thus political correctness has become a species of bullying by which objections to its assertions based on logic are cavalierly dismissed as “hurtful” or “patriarchal” or any other such emotion-based claim.

The obvious problem with this turn of events is that the rejection of reason and its analog in the rule of law can only result in “might makes right” as the basis for resolving disagreements, particularly in the political realm.  The post-Cold War devolution of a bipartisan consensus on civility in the public realm has been fueled by the postmodernist disdain of rational debate and discussion in favor of feeling.

The impact of this was on full display in the recent ploy by Nevada Senator Harry Reid in his calculated and cynical claim that Mitt Romney didn’t pay income taxes for ten years based solely on alleged hearsay.  Mr. Reid was simply encouraging the presumed emotion of jealousy and anger among voters who don’t or can’t evade taxes; facts were unnecessary for and probably in the way of accomplishing this goal.  Further, as a political tactic it was simply a slug to the jaw with no other aim but to knock Mr. Romney off his stride and diminish his chances of victory.

This wave of “stupidity” that Mr. Maiman laments has been swelling for decades, and the Baby Boomers are primarily responsible for it.  We eagerly agreed that “if it feels good, do it,” while proudly rejecting authority.  Walter Russell Mead said it starkly in a major post late last year entitled “Listen Up, Boomers: the Backlash Has Begun”:
Too many Boomers high and low clung to the ideology of youth we developed back when we didn’t trust anybody over thirty and believed that simply by virtue of our then-recent vintage we represented a unique step forward in planetary wisdom and human capability; those illusions are pardonable in a twenty year old but contemptible in those whose advancing years should bring wisdom.  Too many of us clung to that shiny image of youth and potential too long, and blighted our promise because we were hypnotized by it. This is of course narcissism, our greatest and most characteristic failing as a generation, and like Narcissus our generation missed greatness because of our fascination with our glittering selves.

What begins in arrogance often ends in shame; there are some ominous signs that the Boomers are headed down that path. Sooner or later, the kids were going to note what a mess we have made of so many things, and now, it seems, the backlash has begun.
Now the world in which the Boomers came of age is breaking down, and no serious leadership has yet emerged with a vision for actual progress.  The level of denial and the echo chamber of the MSM that supports it just grows apace.  Here in California the citizenry is apathetic and sullen, but continues to suffer in relative silence its sclerotic Democratic leadership and the completely irresponsible Republican abdication.  Polls currently show Boomer Barack Obama—the beau ideal of the multicultural fantasy—favored to win re-election in spite of his hostility to our founding principles and his allegiance to the postmodernist progressivist state.

Until we wake up—which is unlikely to occur this side of global disaster—we will sink abysmally into the ever-widening swamp of collective stupidity, made all the more ironic because it is a swamp entirely of our own making.

In the meantime, citizens who love their country and are unwilling to wait for the inevitable have the opportunity to organize a new way of addressing the problems of our political economy.  We can resist the temptations of focusing too much upon the meaningless presidential campaign and engage ourselves in the imperative of statesmanship, whose application is too important to be left to the politicians.  Our founding principles are the bedrock upon which we can do this.  A recommitment to individual sovereignty under the rule of law and consent of the governed is the starting point for constituting a postmodern, postindustrial society that liberates the creative potential of the citizenry for the good of all.
