Sunday, March 30, 2014

Five Names

Many years ago, I came across an interesting mental exercise, intended, if I recall, as a means of clarifying one’s most important personal values. The exercise involved selecting five persons, living or dead, whom one would like to have on hand as personal counselors or advisors: a sort of “life committee” of role models who would guide one in setting goals and making important, consequential choices. I found it to be a fascinating exercise, and have often shared it with other people, curious to learn which persons they would select for their own life committees. The choices that I have encountered have ranged across the globe, and across many centuries. I have noticed that people do tend to pick committee members who share something in common with themselves: their gender, their race, their nationality, their religious faith, and so on. In my own choices, I followed the same pattern: all of the members of my committee were male and Caucasian, and three of them were Americans. The members of my life committee were the following, in the chronological order in which they lived: 1) St. Paul, 2) George Washington, 3) Clarence Darrow, 4) Winston Churchill, and 5) Saul Alinsky. I will describe, in turn, the reasons for each of my choices:

I have always been fascinated with St. Paul. He has been hailed both as the architect of Christianity and as its most eloquent prophet, but also condemned by some as the person who distorted the original message and mission of Jesus and, in doing so, paved the way for some of the darker paths that Christianity has gone down in the past two millennia. Paul, of course, had originally been a persecutor of the small sect of Jesus’s followers, bent on destroying it, but he then underwent a sudden, transformational conversion, after which he joined the new sect and eventually became one of its leaders, providing guidance to the many new churches that were being formed throughout Asia Minor and Greece. Having read many accounts and studies of Paul’s life (which, admittedly, can only be based on the scant writings by and about Paul that have survived from that time), I have come to believe that Paul’s conversion experience was the result of a cognitive dissonance that had been growing within him for much of his life and had reached a breaking point. He was a man who had wanted to devote himself to religious service but, perhaps because he was a Jew raised outside of Judea in the region of Asia Minor, had not been able to receive the level of training that would allow him to join the ranks of the Pharisees in Judea, and so he attempted to fulfill his need for religious service by zealously opposing what appeared to be heretical sects. The dissonance arose from the fact that, rather than bringing him closer to God, these acts of persecution only alienated him further from the true form of spiritual life that he aspired to. He then found – in the very heretical sect that he had been pursuing – a pathway to his own salvation.
For the appeal of the Christian message extended beyond the Jews who had embraced it: it also answered the call of a growing number of Gentiles throughout the Roman Empire who found only spiritual poverty in their own pagan religions. As St. Augustine, centuries later, colorfully explicated in his masterwork, City of God, there was a pronounced depravity among the gods and goddesses that populated the pagan pantheon of Greece and Rome, and in the time of the early Christians, this depravity was only exacerbated by the tendency of decadent Roman emperors, such as Caligula, to declare themselves gods. Many of the spiritually starved Gentile subjects of Rome found, in Judaism, a moral God who was worthy of worship and allegiance, but were unable to find, from their perspective, a feasible means of entering into a relationship with that God. Christianity answered that need, and for Paul, who counted himself as much a citizen of Rome as a Jew, to align himself with the Christian mission was to restore to him a more wholesome and fulfilling spiritual calling. Paul became “the apostle to the Gentiles”, and in that role, he indelibly altered the course of human history. I have always been awed by the imprint that Paul has made upon civilization. Many philosophers, poets, and dramatists have entertained dreams of having their writings revered by posterity, but none have come close to the posterity of St. Paul. Consider that in just about every city of the Western world, for the past two millennia, on any given Sunday, there has been at least one, and probably dozens, or even hundreds, of congregations where excerpts from one or more of Paul’s dozen or so letters (“epistles”) to the early churches are read and then expounded upon in sermons. That is a monumental legacy that even the most ambitious of thinkers would never dare aspire to.

George Washington has always been a personal hero of mine, as much for what he did as for what he did not do. His early career as a soldier started rather inauspiciously when, during the French and Indian War, as a colonial officer from Virginia, he led two retreats after unsuccessful attempts to repel French troops from frontier territory that Britain claimed as its own. After this unpromising beginning in military service, he resigned himself to the life of a country squire on a plantation in Virginia. It was not until more than twenty years later, when the American colonies were on the brink of war with England, that he appeared at the assembly of the Continental Congress, in uniform, and volunteered his services as a military officer. His career from that point, of course, has become the stuff of legend, as he led the revolutionary army to ultimate victory, winning independence for the colonies, then presided over the convention that drafted a constitution, and finally served as the new nation’s first president. Many were shocked when, upon the successful conclusion of the war, Washington resigned his commission and retired again to his plantation in Virginia, resisting any temptation to use his military power to gain a permanent hold on the affairs of the new country. (It is said that when King George III inquired what General Washington would do now that he had won the war, and was told that the General would return to his farm, the King replied, “If he does that, he will be the greatest man in the world.”) And some were equally surprised when, upon the completion of his second four-year term as president, Washington declined to run again, and ceded the office to his elected successor, John Adams. But one of Washington’s boyhood heroes had been the Roman general Cincinnatus, an aristocratic farmer living in the fifth century B.C.
who had been granted dictatorial powers by the Roman Senate twice during his lifetime, the first time to repel foreign invaders, and the second to quell a domestic crisis, and on both occasions, after succeeding in his mission, he resigned the dictatorship and returned to his farm. For Washington, Cincinnatus represented the epitome of heroism, and he more than emulated the model in his own life and career.

Winston Churchill certainly had his vices: he drank constantly and heavily, smoked several cigars a day, and relished a good hearty meal. I suspect that many in today’s effete society would find such vices insufferable, and would disapprove of the man for these alone. But he had one important, enduring virtue, and that was an implacable resolve in the face of adversity. Recently, in America, there has been a resurgent interest in what has come to be called its “greatest generation”: the men and women who defended the country in World War II. But I think it would be well for Americans to remember, and to feel a special gratitude for, Britain’s own “greatest generation”, because for two harrowing years, while Americans were still embracing an ill-conceived isolationism, Britain stood virtually alone in staving off the Nazi menace. I have always felt a deep admiration for those RAF pilots who, during the Battle of Britain, took to the skies to confront German bombers and fighters, knowing that if they survived the day, they would have to do the same the next day, and the next. These pilots were inspired, in turn, by a prime minister who famously said,
We shall go on to the end. We shall fight in France, we shall fight on the seas and oceans, we shall fight with growing confidence and growing strength in the air, we shall defend our island, whatever the cost may be. We shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender . . .
Churchill’s oratory had great power not just because of its elegance, but because his words were spoken by a man who had demonstrated, throughout his entire career, not only great personal courage, but a steely resolve and an uncompromising commitment to the goals and principles he cherished most – among them the rule of law, and defiance in the face of unprovoked aggression.

Clarence Darrow was a Chicago attorney and ardent champion of civil liberties, aligning himself with causes of social reform, and defending clients who, because of the heinous nature of their alleged crimes, were considered by the general populace to be unworthy of having their say in court. (Because of this latter activity, Darrow was called “the attorney for the damned”.) He is probably best remembered for two of his cases, both of which have been portrayed in movies: the defense of “thrill killers” Nathan Leopold and Richard Loeb, and the “Scopes Monkey Trial.” In his defense of Leopold and Loeb, two teenagers from wealthy Chicago families who killed a younger boy in a sadistic fashion, merely for sport, Darrow pitted himself against a public that could not conceive of anything less than violent death as an appropriate fate for the killers. His closing argument, in which he did not contest their guilt but merely pleaded for mercy, is perhaps one of the most famous in legal history, and has been dramatized in at least two films (Compulsion, starring Orson Welles, and Darrow, starring Kevin Spacey). At the heart of his argument was the claim that society’s response to such a terrible crime must be based upon a desire for justice, rather than pleasure in seeing the perpetrators killed, lest society itself succumb to the same evil that motivated the crime. In the “Scopes Monkey Trial” (immortalized in the play and movie Inherit the Wind), Darrow championed the rights of a young schoolteacher who had been charged with a crime for teaching evolution in a school where such teaching had been prohibited.
While he technically lost the case (in spite of his scathing critique of the opposition’s arguments in support of the law), the worldwide attention that the trial drew to the attempts of certain states to stifle science education on religious grounds eventually turned the tide against such efforts in the United States.

I first became aware of Saul Alinsky, a 20th-century American community organizer and social activist, when I discovered his book, Rules for Radicals, as a very young man. I was at that stage in life where many are looking for a lofty ideal upon which to build their life aspirations, and for a means of attaining that ideal. Alinsky’s Rules for Radicals provided substance for both. The ideal was no less than social justice; the means, the effective organization of those without power against those who had it and were abusing it. I’m not sure I ever actually finished the book (though I still have it in my library), but what I had read made such an impression upon me that Saul Alinsky remained forever embedded in my mind as somebody I would choose as one of my personal life guides. He was tremendously successful in mobilizing communities in Chicago, and his methods continue to inspire those who are endeavoring to effect social change, across the entire political spectrum. Among those currently prominent in American politics, both Barack Obama and the conservative “tea party” movement have drawn upon the lessons of Saul Alinsky.

As I’ve thought about my five choices over the years, I’ve asked myself what it was in particular about these five that had so resonated with me, and what, if anything, they had in common. I realized, as I reflected upon their lives and legacies, that what had made their lives meaningful to me was that each, through his particular choices and actions, had made future generations better off in some way. St. Paul took a small and struggling cult in Judea and, through his personal dedication and the power of his inspired writing, turned it into a religious mass movement that has provided a spiritual mooring for billions of human beings over a span of nearly two thousand years. George Washington, through his courage and his self-control, established precedents that ensured the success of the infant republic, which in turn would serve as a model for future republics throughout the world. Winston Churchill defended civilization itself against organized tyranny, and his bold tenacity has inspired leaders of many nations in the generations that followed his own. Clarence Darrow used the rule of law to defend those who had difficulty finding an advocate in society, and to stave off those who would attempt to silence principled activists of social conscience through legal means. And Saul Alinsky taught scores of reformers and champions of social justice – through his actions as well as his writings – how to achieve their goals effectively. I truly believe that the world is a better place because these men lived and acted the way that they did.

Of course, each of them had his shortcomings. Paul, who as a Roman citizen perhaps felt no common cause with the Judeans – including the Jewish Christians – who were engaged in the struggle against Roman tyranny, endeavored to distance his brand of Christianity from them in favor of a more pro-Roman variety that enabled him to spread his message throughout the empire without hindrance, and in so doing he might have set the tone for the anti-Semitic elements of Christianity that have persisted even into the modern age. George Washington championed an isolationism for America that did not serve it well in the early 20th century – particularly during the opening years of World War II. Winston Churchill’s dogged stand against tyranny and oppression could be rather myopic when it came to the subjects of the British Empire, particularly India. Clarence Darrow at times apparently resorted to extra-legal means to advance his goals, which nearly ruined his career, and he also failed to appreciate, in his crusade against the abuses of religious fundamentalism, that an equally slavish devotion to an amoral scientism could lead to societal outcomes at least as pernicious as those produced by religion. And Saul Alinsky, who believed that activism was most effective when directed against a perceived common enemy, perhaps did not appreciate that such methods – when indiscriminately applied – can become indistinguishable from the tactics of fascism.

But can we ever demand perfection of our heroes? We are, after all, a race of human beings, not gods. And if, at the end of our lives, we can say that we have improved, even in some small way, the lot of those who follow us, then that is no mean accomplishment.

I invite you to do the same exercise and determine who your own life counselors would be, and ask yourself what it was about their lives and accomplishments that resonated with you. I suspect that your life will be greatly enriched as a result of the exercise.

Thursday, February 27, 2014

Through a Glass Darkly

Is the world really as it seems? How can I know that what’s “out there” (reality) is the same as – or even similar to – what’s “in here” (my mind)? Many wise persons have stated that there is a difference, and that the difference has consequences. In a famous parable (as related in Plato’s Republic), the philosopher Socrates compares our experiences to those of prisoners bound in a dark cave, who can only see shadows moving about on the wall in front of them, cast there by a fire blazing behind them. These unfortunate souls have no idea of what is really producing the shadows on the wall, and so must content themselves with trying to make some sense of these fleeting images. Occasionally, according to Socrates, one of these prisoners might have the good fortune to break free, and emerge from the dark cave into the light of the surrounding world. At first, the escapee is overwhelmed by what he sees, and finds it impossible to comprehend. But after he finally becomes accustomed to the light, and familiar with his new surroundings, he returns to the cave to describe his enlightening experiences to his comrades still bound within. Returning into the darkness, however, he is disoriented, and finds it difficult to make his way about. Even worse, his description of what he has seen and discovered is incomprehensible to the other prisoners, and in fact his words, along with his faltering steps, lead them to conclude that he has gone mad. Socrates suggests that this is what happens to anyone in our world who chances to escape the bounds of our limited experience and gets a taste of the greater reality: we think them mad, delusional, or impaired in some other way. St. Paul, too, spoke of our experiences in this lifetime as “looking through a glass darkly”, and said that only after we shake off the bonds of this mortal existence can we see “face to face”.

This question – about the correspondence between what we think we experience and know, and what actually exists and can be known – is one that philosophers have contended with for centuries; they have given the study of it a name: epistemology. The ancients believed that we experienced only one stratum of reality, and that other intelligent beings coexisted with us among other strata, as dimly aware of our existence as we were of theirs. Science has given us a modern version of the same idea: we can see only a limited range of the spectrum of light, and can hear only a limited range of sound frequencies. Our sense perceptions, in fact, really do provide us with only a limited sampling of the exterior world. And while the elves, fairies, angels, and demons of the ancients are no longer a part of this world view, we are told instead that there are microorganisms of which we have no direct perception but which, like these mythical entities, can from time to time disrupt our lives in very tangible and even calamitous ways.

But science on its own cannot provide a satisfactory explanation of why what’s really “out there” has any sort of meaningful correspondence with what is inside of our minds. After all, machines can be made to register light frequencies, or sound waves, or pulsations of pressure, but these “perceptions”, either singly or in combination, do not produce anything like an experience – let alone an experience that corresponds to the machine’s external environment. Even if we concede that the complexity of the living, organic, body and brain has somehow managed this feat – translating perceptions into authentic experience – the fact still remains that the perceptions are limited ones, and so the experience itself must be only an approximation, rather than a reflection, of reality. Philosophers continue to struggle to find a suitable explanation for why there can be any sort of correspondence between the contents of our minds and the reality that comprises the world around us.

The philosopher Robert Nozick provided an interesting insight – though, as he admits, it constitutes less than a concrete explanation, let alone a proof, of why such a correspondence might exist – based on the theory of evolution. Living creatures, in their struggle to survive and reproduce, would have to evolve some mechanism for perceiving their environment in a meaningful way, and would do so in a way that was parsimonious: developing just enough of a sensory apparatus – along with an ability to process these sensations – to be able to sustain themselves and defend themselves against predators. This would explain why only certain wavelengths of light are perceived, and certain bandwidths of sound, along with a limited attentive focus and a pragmatic, selective retention of memories. We don’t see, hear, comprehend, and remember everything, because the brainpower required, and the energy required to sustain such a brain, would simply be inefficient. Hence, all living creatures get by through “sampling” their surroundings, leaving out much more than they take in. We are all, in a sense, partially blind, fumbling about just well enough to get by. It is a practical explanation, though, as Nozick admits, it undermines the popular truism that we all use only a fraction of our total mental capacity. If Nozick is right, we are pretty much using all of the parsimonious capacity that nature has allotted to us.

One wonders, if this is true, how we can communicate with our fellow beings at all. The answer is that we all tend to be blind to the same things: within species – and, for that matter, within cultures – we share identical limitations, so that the fragments of reality that we do take in, we tend to share in common with those other beings with which we are most in contact. And conversely, we have a tendency, when part of a group, to block out the same things, and perhaps even collectively to forget the same things as well.

But “seeing through a glass darkly” still presents its problems, and these are not inconsequential. Like the viruses and bacteria that can make us miserable, without having even a dim awareness of the consequences of their actions, we, too, through our activities, often do things that have profound – and even devastating – impacts that we are completely unaware of. I remember once, while living in an apartment, the misery of having a next door neighbor who seemed oblivious to the fact that the loud music which he often played traveled easily through our common wall, regularly disrupting my life. Even when I complained of the noise, he seemed unable or unwilling to believe that what he was doing was causing displeasure to somebody else. To my great embarrassment, I discovered that I was guilty of the same lapse in sensitivity years later, when a neighbor who lived in an apartment below mine complained that the music that I was playing traveled through the floor of my apartment, into her own, disrupting her life.

Our limited awareness often blinds us to the consequences of our actions. I have even wondered if that might be the real “judgment” that we will face after death, if we truly move into a state of being that is liberated from the shackles of a limited consciousness. Perhaps, after death, we will be able to perceive and experience in a very real and compelling way the impact that every action we ever took in life had on other beings: feeling their grief, their pain, and their bitterness over wrongs we had committed against them. If many or most of these impacts are negative, and we truly can feel the weight of their consequences on others, this might constitute a “hell” that is unbearable to experience, and which actually compels us to want to somehow atone for or correct our negative actions. And if, on the other hand, like the character George Bailey in the movie It’s a Wonderful Life, we touched many lives in positive, compassionate, and loving ways, then the lifting of our blinders after death will allow us to fully experience the joy that we were responsible for: a sort of “heaven”. Of course, for most of us, if this actually does happen, we will have a mix of both heaven and hell. If the “negative” side of the balance sheet is significant, would we yearn for some sort of tangible way to correct it? Would we be given the opportunity to do so, by being reborn into the world as a new human being?

This, as I understand it, is actually something of the rationale behind the Hindu concept of reincarnation: we go through a chain of several lives, fixing ourselves and any wrongs that we committed in past lives, until we finally liberate ourselves from the law of “karma”, or cause and effect. If I caused harm to you in a past life, I might be given the opportunity to redress the wrong when I encounter you again in a future life, or perhaps atone for it in a less direct way, if our chains of lives never actually intersect again. There seems to be a problem, however, with this mechanism, if it does exist, since most of us are born with no recollections of past lives. It would seem then, that without the benefit of remembering our past sins and mistakes, we will be doomed to repeat them: slipping on the same banana peel over and over and over again. Perhaps we remember them at some subconscious or preconscious level, so that there are karmic motivations in the actions and choices that we make in our present lifetimes that we are completely unaware of.

And how culpable are we anyway, for things that we have done that have caused harm to others, if we were unaware – or imperfectly aware – of the consequences of our actions? After all, we are all doomed, in our mortal lives, to “see through a glass darkly” and can never fully perceive or comprehend what the implications are of everything we do. Doesn’t this absolve us of most of the negative consequences of our actions?

I remember reading, as a young man, the autobiography of Albert Speer, who was the armaments minister in Nazi Germany. Speer was, by his own account, a loving husband and father; he was not an ideological Nazi, and not even an anti-Semite. His job, and his sole focus of attention, was armaments production, and he was a diligent, hard-working, and industrious manager. He probably resembled, in personality, lifestyle, and demeanor, a successful executive in any Fortune 500 company today. And because of his managerial effectiveness, he rose to become, for a time, the second most powerful man in Germany. Can he be absolved of the Nazi crimes of genocide because, as he claimed, he had no knowledge that the large-scale, systematic murders were taking place? Speer, in his autobiography, says that he began to hear rumors of the death camps, and was on the verge of investigating them, until a friend and colleague warned him, earnestly, that this was something he didn’t want to know about. And so he abandoned his plan to learn more.

I wonder, sometimes, if the artful avoidance of certain questions or investigations has allowed me to shield myself from any terrible things that I – or my country – may have been responsible for. There are the seemingly little things, like the kinds of foods that I choose to buy and eat, or the companies I support with my purchasing dollar. But there are larger things as well. I remember having a conversation with a colleague a few years ago, and the subject of Africa came up. I expressed regret over the fact that Africa has just seemed incapable of entering into a path of genuine economic development, and wondered if this was the lingering effects of colonialism. My colleague, with some irritation, retorted that apologists have trotted out the “colonialism excuse” for Africa’s continued stagnation for too long, and that it was time for Africans to finally take responsibility for their own destiny. I must confess that at that time I was inclined to agree with her. But recently I happened to attend a conference that was addressing the subject of “conflict minerals”, which are metals that, like conflict diamonds, are extracted in Africa under brutal conditions, as rival militias in certain areas subject the locals to virtual slavery in order to mine these materials and profit from them. The country where this is occurring is the Democratic Republic of the Congo. And I learned, at this conference, that the United States has had an active hand – as recently as the late 20th century – in both propping up dictators there who didn’t have the best interests of their people at heart, and in toppling governments that did. The resources there, after all, are quite valuable, and any interruption in their flow might have threatened “the American way of life”. 
Happily, America’s policies there are more enlightened now, but the electronics companies, such as cell phone manufacturers, who are primary users of metals refined from conflict minerals in their products, are only just beginning to investigate their supply chains to determine if they are supporting inhuman enslavement and brutality elsewhere in the world.

It may be true that we all must resign ourselves to “looking through a glass darkly”, but I suspect that sometimes, with just a little effort, we can clear the glass – at least a bit. If only this were done more often, and more diligently, during one’s lifetime, perhaps that final meeting, “face to face”, would not be such an unpleasant one.

Tuesday, January 28, 2014

The Past Imperfect

There was an item in the news earlier this month that two of the world’s most powerful telescopes, the Hubble and the Spitzer, are operating in tandem to gather images of the universe in its relative infancy, by focusing on galaxies more than 12 billion light-years away (and hence sending us images more than 12 billion years old, in a universe currently estimated to be about 13.7 billion years old), and that there are plans for another telescope, in 2018, to gather even older images, corresponding to events that occurred a mere few hundred million years after the Big Bang.

This is just the most recent example of an interesting phenomenon that occurs as our civilization continues to evolve: we develop greater and greater capabilities for recapturing our past.  In 1993, moviegoers were entertained by Jurassic Park, a film about an enterprising group of scientists who resurrect extinct dinosaurs through DNA sequencing and cloning technology, and in the years since, there has been serious discussion about doing exactly that – at least with more recently extinct species, such as the woolly mammoth.  And DNA sequencing has allowed us to better understand both how species have evolved and how our own human ancestors diversified and migrated, forming the races, tribes, and nations of modern times.

Even in our personal lives, modernity has been giving us an increasing capability to retain and recapture our earliest past.  The field of psychiatry known as psychoanalysis, when it came into vogue at the end of the Victorian era, suggested that we might resolve our most serious psychological issues and lead more productive, happy lives if we delved deeply enough, and far enough back, into our life histories, unearthing and resolving conflicts from our childhood relationships with our parents, and its practitioners developed techniques that made it possible for us to do so.  Technology has certainly helped us to preserve more of our personal and social history, with the evolution of photography, sound recording, and now both sound and video recording with the simple use of a smart phone.  The capability for recording and storing records of our individual and collective lives has increased immensely in just the past generation.

Why is it, as we mature and move forward in time, that we have a growing desire to recapture the past?  The desire to preserve can certainly become pathological, as currently illustrated in the American television program Hoarders, about persons who retain nearly everything, and throw little if anything away.  They seem to be desperate to hold onto anything that has ever come into their lives.  I must confess that when I hear of stories like this, I look at my own life and say “There but for the Grace of God go I,” because there are some things that I have been very reluctant to throw away.  It has been almost impossible for me to let go of any book that I have ever owned, and so I find myself having to put an additional bookshelf into my home about once every five years.  (Perhaps Kindle will now save me from eventually walling myself in with bookshelves, while at the same time making it even easier for me to retain every book that I have ever read.) 

In many, if not most, cases, I think that the physical objects we hold onto provide tangible counterparts to important events in our lives.  Clearly this is the case with wedding rings, or college diplomas, or the birth certificates of children.  They give our memories of these events substance: something that we can look at, and reach out and touch, so that they are not merely thoughts in our minds – thoughts which will pass away when we pass away.  Of course, the meaningfulness of these physical objects is far from universal, and their value is often completely lost on others, even those close to us.  (Hence the ordeal of having to sit through a presentation of somebody else’s stack of vacation photographs.)

Many years ago, during an unhappy period of my life, I was driving one morning to a workshop that I had to attend, and stopped at a fast food restaurant for breakfast.  The restaurant happened to be giving away stuffed animals as part of a promotion for a new movie, and so I took one before resuming my trip.  And because it was right around that time that the circumstances of my life improved rather dramatically, the stuffed animal came to be permanently linked with a happy memory for me.  So to this day, the tiny, smiling “lucky Simba” sits on a shelf in my bedroom.  It will probably still be there on the day that I die, and when the “junk” in my home is committed to the flames, like the “Rosebud” sled in Citizen Kane, the stuffed toy will be cast away without the slightest suspicion that it meant anything to anybody.

When I was a boy, attending with my parents a holiday party at my grandfather’s house, I noticed a large Bible sitting in a prominent place on a shelf in his living room.  It just so happened that I had embarked on an ambitious project that year to read the Bible from cover to cover, and so, in order to impress my grandfather, I asked him if I could pick it up and read it.  To my shock (as well as that of my parents, and the others in attendance), he angrily shouted at me not to touch it.  A while later, it was explained that this Bible had been a prized possession of my grandmother, who had recently passed away, and my grief-stricken grandfather had never wanted it moved from the place where she had kept it.  Of course I didn’t understand his feelings then . . . but I do now.

What then, is it that compels us to capture more of the past, and to retain it, through material objects?  I think that we are always endeavoring to give our individual and collective pasts a more enduring existence that we hope will survive us, somehow, after the ephemeral imprints of our memories fade away.  And, by capturing more of our pasts, we hope to compile a meaningful story of our existence, with a beginning, middle, and end, which will endow it with a significance that will transcend the transitory nature of our time on earth.  Individually, and collectively, as nations and as a species, we want to believe that we are part of a drama that has an ultimate purpose – a destiny to be fulfilled, and by better understanding the most distant reaches of our past, we hope to be better able to trace out the trajectory of that drama.

In the Japanese film After Life, recently deceased persons are directed to find a single happy memory, which they will then be able to re-experience for eternity.  Is that what a real heaven might be: to collect a sort of “greatest hits” compilation of our memories, and be able to relive them for eternity?  For the German philosopher Nietzsche, such a prospect, “eternal recurrence”, presented an ongoing challenge to live a meaningful life:
What, if some day or night a demon were to steal after you into your loneliest loneliness and say to you: 'This life as you now live it and have lived it, you will have to live once more and innumerable times more' ... Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: 'You are a god and never have I heard anything more divine.' [The Gay Science, §341]
In the same vein, might we be forced to undergo a sort of trial in the afterlife, as in the American film Defending Your Life, and learn through this that the real secret of fulfillment had been to overcome one’s fears, and live life to the fullest?  Will we have second, third, and multiple chances to do so, as in the movie Groundhog Day?

Perhaps, with the continuing advance of technology, we will someday be able to memorialize everything that passes through our minds in a more permanent, substantial way.  And then it will be possible for others to recall each and every one of our lives, and review and examine them completely.  But even if this comes about, what would compel anyone to do so?  The sheer number of individual human existences seems to undermine the special value of what each of them had lived and experienced.  Still, there is something precious about every human existence, and perhaps when the capability is realized to see each one in its fullness, then future lives will be enriched by reviewing them, examining them, and drawing tangible lessons about how they spent their limited spans of time on this planet.  Maybe, in this manner, future human beings will find the blueprint for living lives that are truly worth preserving in memory, and even reliving, over and over and over again.

Tuesday, December 24, 2013

Three Books

Recently, I had the opportunity to see again the 1960 film version of H.G. Wells’ classic science fiction work, The Time Machine. The film starred Rod Taylor as the time traveler, H. George Wells, and in the final scene, after George disappears from his home and returns to an era in the distant future where he had discovered that civilization had lapsed, one of the friends whom he left behind discovers that he has taken three books with him. It is unclear which books he has chosen, and the friend wonders aloud to the housekeeper which three books they would have selected, had the choice been theirs to make.

It is an interesting question, not unlike the one that I posed in my very first blog entry one year ago, when I wondered what lessons our own civilization might like to leave behind to some other civilization in the distant future – one perhaps coming out of a dark age, having either just a dim memory, or no recollection at all, of this one that preceded it.

As I watched that concluding scene of The Time Machine, I couldn’t resist wondering what three books I might have chosen, to serve as a legacy and a lesson to a future people whose civilization had lapsed. It brought to mind a life project that I embarked upon as a young man when, while in college, I came across a list of the two hundred greatest philosophy books ever written. I kept that list, and set for myself the goal of reading those books during the course of my remaining life, believing that if anything came close to comprising the collected wisdom of our civilization, then this must be it.

I must confess that, in the decades since setting that goal, I have fallen far short of it, having only read fifty-one of the two hundred books on that list. And I also have to confess that I don’t think that the ones that I have read have made me a better, or even a wiser, person than anyone who may have never read a philosophy book in his or her entire life. Still, the experience, at times, was an exhilarating one, and I’ve come to the conclusion that a really great work of philosophy is one that quickens the mind of the reader, enticing it to consider new and different ways of looking at the world, and existence, and of one’s role and place in the universe. Sadly, only a small number of the books that I encountered had this effect, while many had the opposite effect, with their pedantry actually dulling the mind, rather than exciting it. But the good ones made the entire venture worthwhile, and I have never regretted the time that I devoted to it.

Of the great ones that I encountered – the ones that quickened the mind, and opened entirely new vistas – I would include the following: Plato’s Republic, in which the legendary Socrates attempts to make a case for why a person should act justly, rather than otherwise – a case not based on fear of divine or human punishment. Although he was not completely successful in this attempt, the questions that he poses (in classic Socratic fashion) to his youthful audience, and the stories that he weaves, are profound and enlightening. The Republic also touched on the issue of how a society and government should be ordered, and Aristotle, in his Politics, addresses this issue as well, in a more systematic, but equally illuminating, manner. The works of Plato and Aristotle really do constitute a golden age of philosophy, and I have never come across any work by these authors that is not worth reading and contemplating. It seems that, in the centuries following theirs, philosophy descended into a sort of dark age of its own, with writers engaging in tendentious debates about inconsequential things, until its revival in the early modern era. This renaissance would later be carried forward by writers like Berkeley and Hume, but to me it is the works of René Descartes, who preceded them – his Discourse on Method and Meditations on First Philosophy – that kick-started philosophy in the modern age in a very exciting and refreshing way, as he attempted to resurrect the search for ultimate truth from the ground up, relying upon first principles derived from reason and simple, direct introspection. The rebirth of philosophy in the modern age found its greatest light, however, in the German philosopher Immanuel Kant, a man who has been called – and very deservedly so, I think – the greatest philosopher since Aristotle. 
In the debate that had been raging in his time over whether ultimate reality rested in mind (idealism) or matter (materialism), Kant’s unique and revolutionary insight – in his Critique of Pure Reason – was that while there might be an ultimate “something” out there, we can never know what that “something” is, since our minds play an active role in mediating how external reality is presented to us and becomes a part of our perceived awareness. Reading Kant’s Critique was a dizzying experience: I – like most readers of it, I suspect, in my time as well as his – was not able to completely comprehend it, but still had a sense on every single page that something very profound, very important, and very exciting was being presented. Whitehead once said that all of Western philosophy is a series of footnotes to Plato: it seems to me that all philosophy of the past two centuries has been a footnote to Kant. Building on Kant’s insights, Arthur Schopenhauer, in The World as Will and Idea, attempted to bridge them with the wisdom of the Eastern schools of thought. Schopenhauer is often branded as a philosopher of pessimism, but his pessimism is really no different from that embraced in the first of the Buddha’s Four Noble Truths: that in existence there is suffering.

Sadly, it seems that since that second golden age of the nineteenth century, philosophy has been descending again into pedantic, arid controversies that dull the mind rather than quicken it, but there are a few lights in the twentieth century that were a joy to read, or at least inspired awe. One such awe-inspiring work was Alfred North Whitehead’s Process and Reality, which represented a herculean attempt, and perhaps a successful one (as with Kant’s Critique, I have to admit that my capacity for understanding the work was limited), to create a systematic, holistic model of existence that incorporated the most important insights of all of the great philosophers who preceded him. A more accessible, but equally inspiring, writer of the twentieth century was the French philosopher Albert Camus, who in such works as The Rebel, The Stranger, The Plague, and The Myth of Sisyphus (it was the first of these which had been on the list, but all are worthy rivals for a place on it), addressed the challenge, the burden, and the tragedy of contending with existential freedom. And, finally, the most recent of the works that made it onto that list of two hundred, John Rawls’ A Theory of Justice, presented a novel approach to designing a just society, by envisioning a thought experiment in which its architects, while crafting its rules, were unaware of what their stations in life (rich, poor, male, female, etc.) would be after the project was completed. A book that did not make the list, but which constitutes an ingenious critique of and counterpoint to Rawls’ conclusions, is Robert Nozick’s Anarchy, State, and Utopia. The two should really be read together. I am glad that I did. (And I believe that Nozick’s Philosophical Explanations really deserved a place on the list of two hundred as well. Its final section, on the meaning of life, is one of the most insightful, profound, and provocative treatments of that subject that I have ever read.)

These are just some of the more memorable works that I encountered, as I worked my way through the list, randomly selecting titles, and I am sure that there are many others, which I may someday read, or may never get to, that have the same potential to awe and to enlighten. But would any of these be included among the three books that I would select, to be preserved after the memory of this civilization has faded?

There are other books that would be of more practical value. As an extreme case, I think of the many books that have been written to provide advice on personal success. I have a shelf full of these, such as Napoleon Hill’s Think and Grow Rich, Dale Carnegie’s How to Win Friends and Influence People, and Robert Collier’s The Secret of the Ages. Most of these books begin with the promise that by following the insights and principles contained within them, one can transform one’s life, and find success in both the personal and professional sphere. But I cannot think of a single one of them that affected my life in a profound, transformational, and permanent way, with perhaps one exception. This was a little book entitled The Richest Man in Babylon, by George Clason. Written in the form of an extended parable, it contains some very simple maxims on how to manage one’s money. I took them to heart, and put them into practice, and have always appreciated the wisdom embodied in them. Still, I couldn’t possibly imagine including even this book as one of the three written legacies to be left behind by this civilization to serve as a guiding light to another.

There are so many other types of works to consider – great novels, romances, poetry, works of religion – which might serve as a testament to our civilization, and provide an echo of its greatest moments. But in the end, my choices were still guided by pragmatism, more than anything else. What, I asked myself, would be of most practical benefit to some future age, where our own civilization had been forgotten?

My first selection would be a book on general science that contains the foundational principles and discoveries of biology and physics, and, ideally, some rudimentary mathematics as well. Now I am cheating here a bit, since I don’t actually have such a book, and so wouldn’t be able to take it off of my shelf, if, like George Wells in the movie, I was about to embark on my final one-way trip to the distant future in a time machine. I believe I still have my college physics textbook, which was pretty comprehensive in scope, so in a pinch I would probably take that. But a quick search on Amazon.com tells me that I could buy a book on general science and have it delivered to me in three days, so only a moderate delay would be required to have this book available.

My second selection would be a one-volume edition of world history, because in my opinion one of the most important legacies to be left behind by any civilization is a complete record of both its triumphs and its failures. There is much truth, I believe, in the familiar quotation that those who forget history are doomed to repeat it. Here, I would be better prepared, as in my personal collection I have at least two one-volume histories of the world: one that was published in 1906 (which I referred to in my last blog entry, “Time’s Arrow”), and a Columbia History of the World published in 1981, which, while more recent, is by now also a little dated. I even have a book conveniently titled The Lessons of History by Will and Ariel Durant, but it is rather short, and therefore light on actual history. And so here, too, I might be tempted to delay my selection of an actual book until I can find one that brings the story of world history a little more up to date.

My third and final selection, and one that I actually have in my possession, is a one-volume collection of the complete works of Plato. This would seem to be the least practical of my three choices, and, for that reason, might appear to be the weakest. But I believe that there is a need for philosophy in civilization, and that it is essential that certain fundamental questions about the nature and purpose of our existence be asked. Plato, and in Plato’s works, Socrates, raised the most important of these questions, and addressed each of them with a depth of insight and lack of prejudice that continues to be unrivaled by any other great thinker before or since. And because of the foundational nature of the questions addressed, a study of Plato, in some distant future civilization, would provide fertile soil for the growth of other great ideas, germinating in future great minds, perhaps rivaling or even surpassing those that had graced our own civilization.

One book that many – at least many living in the Western world – might find to be conspicuous in its absence is the Bible. I know that others would consider this to be an essential – perhaps the most essential – book to be included as part of our legacy to the future. I disagree. And I won’t defend my omission by resorting to the charge made by many agnostics and atheists: that religion has done more harm than good in the world, or that, at the very least, it has been responsible for much of the mischief (wars, pogroms, repressions, and resistance to scientific advances) that has permeated our history. Rather, I contend that the search for God, and for a relationship with God, is one that is dynamic, and defined by the person, or the culture, that is engaged in it. If our Bible, and the other great religious works that have appeared among the extant and recorded civilizations of our planet, truly represent the inspired word of God, then I have to believe that any future civilization, with no memory of ours, will have its own prophets and channels for receiving God’s inspired words. And these words will be expressed in its language, and in the context of its own unique history, culture, and development. They will speak to its people in ways that our own never possibly could. Similarly, if the great poems, dramas, romances, songs, and collective dreams of our people must someday be forgotten, we can take some consolation in the fact that if there is a future age, then it will produce great poets, dramatists, composers, dreamers, and prophets that will move and inspire their audiences in ways that our own works perhaps never could.

It would be a great consolation to know that we will leave at least some of our words as a legacy for that future age. But whatever words we leave behind for the inhabitants of that civilization, I have every confidence that they will be able to provide music at least as unique, as inspired, and as beautiful as any that we ever produced.

Saturday, November 30, 2013

Time's Arrow

During this, the week of the U.S. Thanksgiving holiday, I would like to talk about something that I have been feeling thankful for. It is that the world is getting better.

Now that seems like a rather shocking – or at least very naïve – thing to believe, given the many terrible things and disturbing trends occurring in the world today, and I am the first to admit that I run the great risk of having this read by someone at some distant future time, wondering what could have possibly made me believe it. If so, this would not be the first time that great optimism for the future would be shown to have been unwarranted in the worst possible way.

When I was young – perhaps still a child – my mother bought a book on world history for me from a rummage sale. The book was very old, with a tattered cover, and sat on my shelf for years, unread, though I kept it with me through the years, promising myself that I would get to it someday. When I finally did get around to reading it, and reached the final chapter (this was about sixteen years ago), I was intrigued by some statements by the author which represented his appraisal of the destiny of the world in the coming decades. “Throughout the last century,” it says, “the sentiment of the brotherhood of man has been greatly deepened and strengthened. This new moral sentiment constitutes a force which is working irresistibly in the interest of a world union based on international amity and good will.

“It is most significant,” the passage continues, “that at the same time these movements towards world unity have characterized progress in the political and moral realms, wonderful discoveries and inventions in the physical domain – the steam railway, the steamship, the telegraph, the telephone, wireless telegraphy, and a hundred others – have brought the isolated nations close alongside one another and have made easily possible, in truth made necessary, the formation of the world union.”

This passage – and the book that contained it – was published in 1906.

What a sad irony that within a decade of this book’s printing, the advanced nations of the world descended into the bloodiest and most devastating war in the history of civilization, and that this was followed by an even more devastating war in the very next generation. Alongside these wars, atrocities and mass murders were committed upon millions of persons by their own governments, including in nations that had been considered among the most enlightened and civilized in the world. And by the end of the century, weapons of war had been developed capable of destroying the entire planet, with the growing danger that fanatics just crazy enough to use them might someday acquire access to them.

But we must not be too quick to judge the faulty vision of that writer in 1906: after all, many events in the century preceding that book would have inspired one to optimism, including the widespread abolition of slavery, the growth of women’s suffrage movements, and a dizzying array of new inventions and technologies that had been unimaginable just a hundred years earlier, such as the telephone, the horseless carriage, the electric light bulb, the phonograph, and the heavier-than-air flying machine. The growing “brotherhood of man”, in addition to manifesting itself in temperance movements and other social welfare initiatives, was also evidenced by the growth of international trade, which linked the advanced economies of the world so tightly that it seemed a war of any kind would be too self-destructive to be warranted for any reason.

Given the blood-stained record of the twentieth century in retrospect, can we really afford to be optimistic, or bold enough to say that things, in general, have gotten better? I believe that we can.

I believe this, because in the midst of all of the calamities, tragedies, and outrages of our civilization, there seem to be real marks of forward progress, and not just in the area of scientific invention and technological advancement. The history of blacks in the U.S. exemplifies this very well. Many if not most of them lived as slaves in the early decades of the country’s history, until the institution was abolished with the passage of the Thirteenth Amendment to the U.S. Constitution, after the Civil War. But freedom did not bring equal treatment under the law. Voting rights were routinely denied to blacks in many states. In the two World Wars, units of American black soldiers were segregated from white soldiers, and the U.S. armed forces were not integrated until 1948. Black soldiers in these wars justifiably might have wondered what they were fighting for, since many of their relatives back home were being excluded from a decent education, banned from certain establishments, and forced to drink from separate drinking fountains and to use separate bathrooms. Civil rights would finally come after decades of domestic struggle, and in 1989, Americans would see General Colin Powell rise to become the first black Chairman of the Joint Chiefs of Staff. Two decades after that, America elected its first black President.

I have seen other, dramatic signs of progress and change within my lifetime. I remember a story that my mother told me of how, when she was a child, living in a suburb of Chicago, she used to walk past a country club which had a prominent sign that read “No Jews or dogs allowed”. Such a thing would be unthinkable today. I remember, too, in my own childhood, living in a culture that believed that women were incapable of working in many of the professional occupations held by men. A popular riddle exemplified these prejudices: A man and his child are in an automobile accident; the man dies, and the child, alive but seriously injured, is rushed to the hospital. The attending doctor in the emergency room that evening takes one look at the victim and says, “I can’t operate on this child – he’s my son.” Very few persons back then were capable of arriving at the solution to this riddle, which seems so obvious today: that the doctor was the child’s mother. And I remember the derogatory racial and ethnic words that were so casually and regularly used by persons of all ages – words which are now rarely heard, if ever at all.

Similar tales could be told in Europe, and Asia, and Latin America: of the growth of liberality, and breaking down of old barriers based upon gender, racial, and ethnic prejudices. And it does seem that these developments are just the latest in what has been a long and sometimes halting progression which has been a central feature of the story of human civilization. But the progression has had disturbing undercurrents.

One undercurrent is that not all of the gains are necessarily permanent ones: there is always the risk of a retrogression. I remember well a striking example of this back in 2001. A news program recounted the shameful treatment of Japanese-Americans after the U.S. entry into World War II, as many families at that time were resettled into detainment camps. The program condemned this policy, of course, and its narrator wondered how such a thing had ever been possible, even by a government and citizenry shocked and terrified by the Japanese attack on Pearl Harbor. The conclusion seemed to be that the persons of that generation were simply more bigoted and prone to racial paranoia than we are now, in this more enlightened age. And then, a short time after this documentary aired, the twin towers of the World Trade Center were brought down on September 11. Panic ensued, radical airport security measures were introduced, and when it was revealed that racial profiling was now being used in security screenings, with particular attention being paid to persons who appeared to be of Arab descent, the policy was roundly applauded by a frightened and insecure populace.

Another undercurrent is that, alongside the gains of progress, there is a countervailing, dangerous trend that increasingly threatens to undermine all of these gains. Schools are no longer segregated, but now many have metal detectors at their entrances, to protect the students from being knifed or shot. The Chicago of the nineteenth century was one in which members of any particular white immigrant ethnic group – German, Polish, Irish, or Italian – dared not venture into a neighborhood belonging to a rival group, for fear of being beaten up or killed. Now, anyone can venture into any neighborhood of downtown Chicago or its adjacent suburbs without any fear of reprisal, and the only ethnic markers, if they exist at all, are the food and drink specialties exhibited in the neighborhood restaurants and pubs. The ethnic differences – both personal and geographic – have blurred beyond distinction, and a typical native Chicagoan numbers among his ancestors representatives of several ethnic groups. And yet, today, a little further south, among the poorer, predominantly black neighborhoods of the Chicago suburbs, murders due to gangland violence – with even children numbered among the victims – have reached epidemic proportions. The Prohibition-era gangs of Chicago and other large cities are a thing of the past, but modern gangs deal in drugs that are much more dangerous and addictive than alcohol. Every mark of progress seems to be accompanied by an underlying countercurrent of violence and barbarism.

The evolution of Halloween as it is celebrated in the U.S. provides a very telling example of this strange phenomenon of forward and backward movements occurring together. According to folk history, the celebration of the holiday had its roots in a sort of ritualized extortion practiced by marginally delinquent youths upon potential adult victims, as the youths threatened vandalism to their property unless given some sort of reward, as exemplified in the demand: “Trick or treat”. But this evolved into the harmless holiday ritual – the one that I remember in my own youth – of groups of children going from door to door in store-bought costumes and getting little treats of candy from the amused homeowners – most of whom had their own children roaming the neighborhood in costume as well. Today, the ritual survives, but hardly any children now roam the neighborhood without their parents in tow, standing nervously nearby, terrified that children left to do this unaccompanied could be abducted or otherwise molested by adult predators.

It is a strange paradox that as the world – or at least the more civilized nations of the world – seems to become progressively more enlightened, it also becomes progressively more dangerous, and to the same degree. Slavery – at least state-sanctioned slavery – is universally abolished, but human trafficking is now a worldwide epidemic. And while democracy seems to be on the rise, so, it seems, is extremism among ever larger and more heavily armed groups of people, fueled by religious fanaticism, anger at perceived slights or injuries suffered at the hands of others, or simply the desire to subvert the regional balance of power by any means necessary. Our modern economies are capable of providing more things to more people more efficiently than ever before in the history of our civilization, but the gulf between rich and poor is growing menacingly large. Technology continues to produce dazzling new miracles on many fronts, but with industrial and technological progress have come negative environmental consequences that threaten the sustainability of the world’s entire ecosystem.

It would be Pollyannaish to downplay these negative undercurrents which pervade the march of civilization, or to not acknowledge the fact that a severe economic crisis, terrorist act, large-scale war, or effective demagogue playing upon lingering resentments and prejudices could send the edifices of our civilization crashing down, at least for a time. We have fallen down many times over the centuries, and every nation has chapters within its own history that represent shameful episodes that it must contend with: episodes that it would like to blot out of its collective memory, but knows that, to truly continue on any kind of forward march to progress, it must never blot out. And yet, in the face of all of this, it seems that after every time we pick ourselves up, we are a little better, and a little wiser, and there is an enduring, permanent improvement to our collective ethos that we retain. It is this that leaves me feeling grateful, and hopeful – that no matter how dangerous the challenges we face in the future, we have a better, broader, more resilient character that will enable us to deal with them more effectively, and more wisely, than those before us were ever capable of doing.

Thursday, October 31, 2013

Let There Be Light

I thought it might be fitting, on Halloween, to describe a personal experience that I once had of Hell:

It happened when I was a young man, of twenty-one or twenty-two years of age. This was a period of my life when I was experimenting with drugs. I think that it is safe for me to admit this, given that decades have since passed, and also because even U.S. Presidential candidates have admitted to using drugs in the days of their youth. And for me, at that time, “experimenting” is the appropriate word to use. I was never addicted to any particular drug, but was fascinated by the effects that drugs had upon the mind. Some affected mood, others bolstered self-confidence and induced more spontaneous, self-expressive behavior, while others seemed to alter perception itself, tempting the user to believe that he or she was encountering profound, enlightening insights. And, given my propensity for an analytical frame of mind, I actually recorded, in a notebook, the various effects of different drugs upon me. I did not do this dispassionately: In my youth I suffered from a painful shyness, which persisted into adulthood, and, even though I had passed out of my teens, the lingering effects of adolescence only magnified the angst that I felt because of it. I wondered if there was some perfect combination of drugs that might make me more comfortably assertive, spontaneous, and expressive. It really was a sort of “Jekyll and Hyde” experiment that I was engaging in: a “personality dialectic”, in which I hoped to release those elements of my personality which had been suppressed, and which only emerged intermittently, generally as the result of a unique combination of social circumstances and/or mood-altering substances.

After several trials, with several different drugs, I finally decided upon a specific “cocktail” of drugs: a combination which I felt would produce the desired effect. My plan was to ingest these at home, and then travel to a nightclub which was about a half hour’s drive from my home. All went according to plan, at least up to the point where I arrived at the club. I sat down at the bar and waited, excitedly, for the drugs to take full effect.

But then something horribly wrong began to happen. I noticed it first when I realized that the music playing in the background no longer seemed to have any rhythm or recognizable, coherent melody. And a bartender who was speaking to me was completely incomprehensible, as if he were speaking in a foreign language. Then I noticed that the bottle of beer that I had ordered was lying horizontally on the bar. I managed to set the bottle upright (I think), but immediately retreated outside to the parking lot, and headed back to the van that I had driven to get there.

What happened next almost defies description. There was blackness, just blackness. Only gradually did I become aware of the fact that I even existed. But I had no idea where I was. I could see nothing, hear nothing, feel nothing. Even worse, I had no idea what I was. I was this entity, in the middle of nowhere, that didn’t know who it was, what it looked like, what its history was, or how to even find answers to any of these questions. In this empty void, I tried to convince myself that there was something out there besides myself, although there was no evidence to support this belief. I cried out, in a language without words, to this “thing”, begging it to make itself known, and to tell me who – or at least what – I was. But there was only silence, and the void. Words cannot convey the terror that I felt, and the lonely isolation. I was a being with no identity, no history, no belonging, and no connection.

I don’t know how long I was in this state, because there was no standard by which to judge the passage of time. In desperation, I tried to conjure up a memory: if not of myself, then at least of some other being that had known me, and had interacted with me. I reasoned that if I could remember such a being, then, through that being’s reactions, I could surmise who or what I was. Finally, a recognizable image appeared in my mind. It was the image of my mother. And, seeing what she looked like, I began to piece together what I might look like: a human being, with a face, two arms, and two legs. The images of friends then began to follow, and memories soon returned in their wake. I remembered the name that I had been called by these others, and soon was able to reconstruct, in my mind, a complete image of myself, and a history of what I had been.

Not long after this restoration of identity had been completed, I was able to restore my sense of perception as well, and locate myself within my van, within the parking lot of that nightclub. After another stretch of time, I found the strength and willpower to position myself on the driver’s seat, start the van, drive back home, and get into bed.

It took me a couple of weeks before I was completely back to normal (during that time I experienced difficulties with both sleeping and “taking in” the world around me), but I was eventually able to restore a normal sense of equilibrium to my life. The irony was that I really had succeeded in what I had set out to do that night: I had managed to destroy the personality that seemed so awkward and ill-suited to me at the time, but in its absence there was apparently nothing left to replace it with.

And the experience also left me with a revelation of what a real “hell” would be like: a state of existence that is completely separate and unconnected from anything or anyone else. Hell, I realized, is separation – total isolation; no communion with any other sentient being.

As I have reflected on this nightmarish experience, in the many years since it happened, I have occasionally wondered: isn’t this the supposed goal of many “enlightened” spiritual practices - to annihilate the self? But upon further reflection, I realized that this is not, in fact, what had happened to me. It was not the “self” that was annihilated, but rather any and all connections that this self had with any external reality. It was left completely and utterly alone and isolated, without even the consolation of memories of connectedness to ground its being. Enlightenment traditions, on the other hand, seem to counsel a sort of dissolution of the self, along with a merging with some greater reality. In my own personal experiences of meditation, when I have managed to quiet the mind, and attain a quiescent state in which distracting trains of thought subside, leaving only a sort of empty, non-reflective awareness of the world, the ensuing feeling of peaceful bliss does not arise from having severed my connections with everyone and everything around me. Rather, it stems from feeling more grounded and connected, with everything, and less wedded to an abstract concept of the self. And yet, no matter how far into this meditative state I have gone, I have never lost a sense of who I was, or of my own personal history, or of where I was at the moment. And so I can only conclude that what I experienced, during that altered, drug-induced state so many years ago, was a sort of anti-enlightenment: the opposite of what it is that so many spiritual, meditative disciplines exhort us to attain.

But I have also wondered: wouldn’t God have experienced something like what I had during that great stretch of time (an eternity, in fact), before the universe was created: a sense of being an entity in a void, with no identity, no past, and no connection with anything else? And during this infinite stretch of complete, empty solitude, how would an entity know that it even had the power to change this situation? After all, if things had been this way for an eternity, what evidence would there be that anything different was even possible? To me, such an existence would not only be unbearably lonely, but unbearably terrifying as well. I recently put this question to a friend of mine, and he replied that such a scenario would not present a problem or a difficulty to God, since God is perfect. Now, such an answer is rather trite, but I have to confess that there is a certain logic in it. After all, any being that perceived a sense of lack in its existence could surely not be perfect.

And yet, there was – according to so many religious traditions – a moment when a perfect, supreme being willed the universe into existence, a moment when the Creator declared, in the words of the Old Testament, “Let there be light”. It seems unthinkable that such an act would occur without an underlying need or desire to perform it, but “need” and “desire” are words that would be entirely incongruent with an uncaused First Cause, or unmoved Prime Mover. One can understand, when confronting this puzzle, why the Gnostics believed that the “god” that created this universe, and who identified himself as the “creator”, was in fact created by some higher, more sublime Being. Perhaps the Kabbalists are closer to the truth with their theory that Godhood manifests itself through a series of emanations, called “sefirot”, which arise from a primordial source known as “Ein Sof”, a term which has been interpreted to mean “nothingness”, but also “without end or limit”.

In the year preceding the one during which I performed my terrifying experiment, I had written the following poem, which I titled “In the Beginning. . .”:

In the beginning there was Change.
God created Change in his
own image . . .

And yet God remained a static
force during an eternity before
Change.

There was no change of time
nor change of place
No change of mind
nor change of face

In Change there was a
Beginning.

I suppose that it is unthinkable – maybe even blasphemous – to imagine that a Supreme Being once experienced the horrifying loneliness of being an isolated entity with no history, and no self-concept, as I did during that bad drug experience. All that I can conclude is that such an experience, if permanent, would truly be an unimaginable hell for any ordinary conscious being. And, as an ordinary, conscious being, I am now permanently grateful that I am not alone in this huge universe, and will leave the question of how this existence came about to greater minds than my own.

Monday, September 30, 2013

The Great Divide

I had the great privilege of attending a ceremony recently in Washington, D.C. at which Olympia Snowe was given the 2013 Paul H. Douglas Award for Ethics in Government. (The late Paul H. Douglas, a man of strong moral convictions, a liberal who championed fiscal conservatism, and an ardent crusader for civil rights in the mid-twentieth century, was once described by the Rev. Martin Luther King as “the greatest of all the Senators”.) Senator Snowe had a long and distinguished career, serving in both houses of the U.S. Congress. As I listened to the speeches of some of her peers in government, along with her own acceptance speech, I could perceive a common theme that emerged among them. This was that Senator Snowe, and others like her, were able to achieve great things in government because they were willing to work with members of the opposite political party to achieve important goals. “Compromise” was a word that came up more than once during the ceremony, and it was not used in a pejorative sense. Rather, it described the ability of Snowe and other legislators to make small sacrifices in return for significant gains: pieces of legislation that – while not entirely satisfying the original objectives of either party – nevertheless represented tangible and important contributions to the nation that could find support in both parties.

How different things seem now, in a Congress where “compromise” has become a dirty word. Factions regularly prefer to hold the entire government hostage through their intransigence in such important matters as long-term national debt reduction, rather than work with elements in both parties to effect a workable compromise. In a recent marathon 21-hour speech, a senator dredged up the name of Neville Chamberlain, suggesting that to compromise with his political opposition on a budget bill was comparable to that British prime minister’s policy of appeasement with Adolf Hitler.

When did “compromise” become such an ugly word in politics? It has certainly been an element of the U.S. political tradition, going back to the drafting of the Constitution itself. That was an instance where the perfect was recognized as the enemy of the good, and the founders – after several weeks of intensive, old-fashioned “horse-trading” – produced an instrument of government that merely succeeded in satisfying, rather than impressing, most of them. As Benjamin Franklin put it, shortly after the document was completed:

I confess that there are several parts of this Constitution which I do not at present approve, but I am not sure I shall never approve them: For having lived long, I have experienced many instances of being obliged by better information or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others. . . .

I doubt too whether any other Convention we can obtain may be able to make a better Constitution. For when you assemble a number of men to have the advantage of their joint wisdom, you inevitably assemble with those men, all their prejudices, their passions, their errors of opinion, their local interests, and their selfish views. From such an Assembly can a perfect production be expected? It therefore astonishes me, Sir, to find this system approaching so near to perfection as it does; and I think it will astonish our enemies, who are waiting with confidence to hear that our councils are confounded like those of the Builders of Babel; and that our States are on the point of separation, only to meet hereafter for the purpose of cutting one another's throats. Thus I consent, Sir, to this Constitution because I expect no better, and because I am not sure, that it is not the best. . . .

On the whole, Sir, I cannot help expressing a wish that every member of the Convention who may still have objections to it, would with me, on this occasion doubt a little of his own infallibility -- and to make manifest our unanimity, put his name to this instrument.

Sadly, the wisdom of a Benjamin Franklin – or a Paul H. Douglas or Olympia Snowe – seems to be in increasingly short supply in the U.S. Congress, which has now become that confounding Babel of discordant, uncompromising, selfish interests, some of whom are driven by simple self-aggrandizement, others by an almost fanatical devotion to ideology, and others by a craven timidity, fearing that any overture to compromise will prematurely end their political careers.

Meanwhile, the country continues to careen toward disaster, with an unsustainable growth in national debt, underemployed young people who cannot afford a decent college education without burying themselves under a hopeless mountain of debt, a crumbling infrastructure, and a shrinking middle class that is leaving in its wake a growing divide between the very rich and the very poor.

We can only hope that a growing number of our political representatives will learn – and learn quickly – that brinkmanship is not statesmanship, that compromise in politics is not the same as compromise with a dictator, terrorist, or foreign enemy, and that the higher ground is only reached when we are able to understand and work with others who are not like us, who do not share all of our particular views, but who nevertheless want to bring about a future that is better for all of us.