Monday, June 30, 2014

Who Is "Number 1"?

There was an item in the news recently that the government of Thailand, currently under martial law, banned a screening of Nineteen Eighty-Four, the film version of George Orwell’s classic dystopian novel about totalitarianism.  That book has become very popular in Thailand recently, after the military seized power from Thailand’s democratically-elected government last month.

Democracy has actually become unfashionable in recent years, as its shortcomings seem to have been highlighted by the failure of the “Arab Spring” revolutions, by the apparent incapacity of the United States Congress to work together effectively in addressing any of the most serious problems that the country is facing, and, in contrast, by China’s continued success in raising the standard of living of its citizens and continuing on a path toward becoming the world’s largest economy, in spite of the fact that it is a very undemocratic country.

Is democracy passé?  Is it a quaint, archaic concept that has not stood the test of time in facing the challenges and rigors of modern civilization?  And, if so, then what should take the place of democracy?  Was Plato correct, when he argued in The Republic that an ideal society would be governed by those who were best endowed to exert authority over others, rather than by the arbitrary whims of a voting populace?

I, for one, am very much in agreement with Winston Churchill, who declared, “It has been said that democracy is the worst form of government except all those other forms that have been tried.”  Of course, Churchill was also quoted as saying, “The best argument against democracy is a five-minute conversation with the average voter.”  There is wisdom in both of these remarks, and, with respect to the second, it is frustrating to see how easily the voting public can be distracted by crafty politicians from addressing themselves to the most important issues at hand, if, indeed, they are willing to spend the time to familiarize themselves with these issues at all.  But even if Plato had been right – that that government is governed best which is governed by the best – then how does one ensure that the best actually gain the reins of power?  And, having somehow succeeded at this, how can we the governed be assured that this ruling elite does not become intoxicated by the power that they enjoy, and direct the machinery of the economy and the state exclusively to maintaining themselves in power, and maximizing their enjoyment of it, regardless of the consequences to the rest of us?  These, I think, are ultimately intractable problems, and comprise the reasons why democracy, for all of its limitations, is still the system that comes closest to adequately providing for the welfare of all of the persons who comprise a society.

Democracy is a very fragile institution, however, and depends upon both the capacity and the willingness of most or all of its citizens to defend the institutions and customs that support it.  It is a tribute to the institution that it can survive and endure despite widespread voter apathy, but the ultimate test of its survival is the extent to which its citizens will put themselves at risk to defend it when it is under attack, either from subversion within its borders, or from invasion without.

When the institutions of democracy begin to fail, it rests upon the citizenry to arrest the damage, and to repair it.  But in order to do so, it must 1) realize that the damage has occurred or is occurring and 2) have the willpower to address it.  The first is a problem of perception, the second a problem of will.  But there is an important third element, which is just as essential to preserving the institutions of democracy:  in the face of injustice, where resistance is required, it must be known against whom or what the force of resistance must be applied.

In totalitarian states, and other repressive regimes, particularly ones with a relatively short history, the face of oppression is generally a very prominent one, because the locus of power is very transparent.  Topple the dictator, and there is a very good chance (but far from a certainty) that the institutions of oppression will fall away with him.  On the other hand, long established institutions of power are harder to contend with, because they often are more nebulous.  Who, or what, is it that must be toppled, or pushed back, or constrained?

I remember a poster that a boss of mine had in his office, back in the early 1980s.  It was a picture of Leonardo da Vinci, and underneath it was the caption, “We are in control.”  I always wondered what that caption meant.  Who was in control?  Did it mean that it was the inventors and artists of our society who truly ran things?  That seemed rather unlikely to me.  Perhaps, I speculated, it was a reference to the individuals who manned the engines of our civilization: the capitalist entrepreneurs and industry barons who were lionized in the novels of Ayn Rand.  The question, “Who is in control?” has often taken on a special urgency when the “controllers” have been perceived as being responsible for the oppression of various segments of humanity.  The idea of a collegial elite, meeting in secrecy to make critical decisions about the fate of civilization, has been a recurring popular one, and there have been various suspect groups and institutions.  In the 1960s, in America, the power elite were referred to as the “Establishment”, although it was unclear who the denizens of this particular group were.  (The injunction popular at that time – “Don’t trust anybody over the age of thirty” – fell out of fashion when the revolutionaries and social reformers of that era moved well past that age.)  In the 1970s and 1980s, other associations were held under suspicion, including the Trilateral Commission and the Bilderberg Group.  Wilder conspiracy theories have focused on historical fraternities, such as the Freemasons, Rosicrucians, or Illuminati.  The image of a shadowy conspiracy of elderly and middle-aged Caucasian men, sitting around a table in a darkened, paneled room, making decisions on all matters of importance, including the outcomes of certain sporting events, even made its way into popular television, in such programs as The X-Files.

In the 1960s, a British television program, The Prisoner, starring Patrick McGoohan, painted a particularly compelling portrait of the evils of institutionalized oppression.  The title character, played by McGoohan, is an ex-operative of his government who wakes up one day to find himself living in an Alice-in-Wonderland society – called simply “The Village” – peopled by citizens who have numbers rather than names, and who content themselves by whiling away their time in completely inane activities.  Privacy is non-existent in this little “village”, because surveillance cameras are literally everywhere.  Ostensibly, the title character is being held here simply as part of an elaborate form of interrogation, with his keepers wanting to know what prompted him to resign from his occupation as a government secret agent.  But as the series progresses, a much more insidious goal becomes apparent – both to him and to us the viewers: that of inducing him to accept and even embrace his new role as a citizen of this dystopian community.

In each episode, the Prisoner is confronted by a new nominal head of the Village, known simply as “Number 2”, and each of these brings a fresh technique for trying to get the Prisoner to crack.  “Who is Number 1?” the Prisoner logically asks, suspecting that there is a single, unchanging locus of power hiding behind the scenes, pulling the strings of this parade of puppet rulers.  The question is asked at the beginning of every episode, and no answer is given.

In one of my favorite episodes of the series, the Prisoner makes a keen observation about the inhabitants of the Village that he believes will allow him to upend the entire structure of power there.  He realizes that there are two types of inhabitants in this society: jailers and inmates.  Although everyone has a number, and everyone dresses alike, there are clear differences in the behaviors of these two classes of people.  The inmates are fearful, submissive, and eager to avoid creating disturbances which might bring negative attention upon themselves.  The jailers, on the other hand, are haughty and overbearing, and take it upon themselves to ensure that the rules and customs of the Village are being observed by everyone around them.  The Prisoner discovers that by mimicking the behavior of the jailers, he is treated like a peer by the other, genuine, jailers, and treated with deference by the inmates.  He reasons that he can use his newly won social prestige to orchestrate a general social revolution, by counting on the loyalty of the obsequious inmates.  His plan backfires, however, as he discovers that the inmates do not trust him – or his intentions – because they have come to genuinely believe that he is one of the jailers, trying to deceive them into revealing themselves as potentially disloyal.

In the penultimate episode of the series, the reigning Number 2 resorts to one final, seemingly foolproof, tactic to break the Prisoner’s will: he subjects him to a sort of psychoanalysis, in order to “cure” him of his anti-social, non-conformist behaviors.  The intense analysis turns into a personal battle of wills between the two men, and when it is actually Number 2 who breaks down, the Prisoner is allowed to ask for anything that he wants.  He requests to be taken to Number 1: to find out who or what the real power is that has been manipulating this macabre society.

If the ending of this series was ultimately an unsatisfying one, it is because there really is not a nice, neat answer to the question: “Who is Number 1?”  Who is it that actually holds the reins of power?  Who is making the really important decisions that affect all of our lives, and our collective destiny?

In my life experiences working in the corporate world, and for other organizations, I have found that those who are in control – who are making the important decisions – are not necessarily those at the top of the organizational chart.  In one particularly extreme case, several years ago, I worked for a company that brought on a president who was completely oblivious to the machinations of power all around him, and remained so until his relatively brief tenure there ended.  I remember sitting in a meeting, ostensibly being run by this man, where the plans for a very important project were being crafted.  The meeting itself consisted of a series of rather inane slide presentations, discussing the project in very general terms.  And all during the meeting, two rival factions within the company, consisting of men and women at various levels in the organization, were hashing out the real features of the project, as they huddled together in small groups outside of the meeting room, particularly during breaks.  The president had no idea that a herculean power struggle – which actually became extremely contentious – was going on just outside of the perimeter of this meeting, and when the final slide presentation had ended, he lauded the group of attendees, saying how proud he was to be in charge of such a talented, harmonious team.  When the project eventually did take final form, he had nothing to do with its actual development, or even with deciding which of the various rival features were adopted.

In every organization that I have ever been a part of, I’ve noticed that there is a genuine architecture of power, and that one must look beyond job titles and reporting responsibilities to find it.  Like the “Village” with its jailers and inmates, there are persons who are actively making decisions, orchestrating changes, and behaving as if they have a personal stake in the outcome, while others are seemingly content to just show up for work, avoid bringing unpleasant attention upon themselves, and dutifully follow any assignments that are given to them.  Of course, complicating things is that the architecture of power is never a rigid one – it is fluid, ever-changing.  I have seen many “heirs apparent” – executives who are seemingly next in line for the top leadership position in the organization – suddenly ousted from the organization entirely.  Even political tyrants with absolute or near-absolute power – as history has shown over and over again – can find themselves unexpectedly divested of all of their power – and in many cases their lives as well.

And, to complicate things even further, often power is wielded by persons who are not even part of the official power hierarchy.  In the Persian and Byzantine Empires, eunuchs could exercise a great deal of influence in the royal court.  In modern corporations, consultants often play a critical role in carrying out important projects or even determining the future course of the organization, on either a temporary or an ongoing basis, and in fact there is often a “revolving door” relationship between the two, with successful consultants becoming executives at corporations they were once hired to serve, and retiring executives joining consulting firms.  A similar relationship often exists between lobbyists and political organizations.

Perhaps, at each of the apexes of the power hierarchies of our civilization, there isn’t a person, or a caste, or a cabal.  At times, it seems that we are all just pawns, being swept along by the tide of socio-historical forces, and that the most powerful among us have merely deluded themselves into believing that they are shaping destiny rather than merely acting as its most prominent agents.  Some post-modernist philosophers, as I described in my blog entry “Apocalypse Then” (April 2013), believe that we are – or are moving toward – a civilization in which no person will genuinely be in control of anything, because the desires, goals, and beliefs of all humanity will be completely shaped and conditioned by the impersonal machinery of civilization itself.

And yet, both the triumphs and tragedies of history – ancient as well as modern – provide ample evidence that human beings can and do exercise power in ways that go against the tide.  Despotic regimes are toppled within days, while other societies that seemed to have been following, for generations, a trajectory toward greater freedom and tolerance suddenly descend into nightmares of oppression and chaos.  Within every corporate organization, within every political and religious movement, and within every government, there are real people, exercising real power, for a variety of different ends, in ways that are not completely transparent or comprehensible.


The exercise of power is older than civilization itself, but the science of power – the understanding of its architecture and its sociology – is still a relatively young one, hardly past the phase of observation and rudimentary explanation.  There should be a renewed sense of urgency in advancing this science, because the technology of power has been growing at an alarming rate over this past century.  In the United States, we recently discovered just how little privacy we really do have.  The all-seeing eyes of Big Brother in Orwell’s 1984, and of the Village in The Prisoner, are no longer elements of science fiction.  But surveillance is just a part of the exercise of power.  There is also the imposition of control.  And the psychology of compliance is one feature of the technology of power that has been advancing at a particularly alarming rate.

Monday, May 26, 2014

Thoughts on the Future of the Electricity Industry

(The following is a slightly modified and abridged version of a dinner speech that I gave at the Rutgers Center for Research in Regulated Industries annual Eastern Conference in Pennsylvania earlier this month.)

What I am going to do is to sketch out, in broad strokes, an outline of the core issues that the American electricity industry is facing, and what I feel are the methods that it will have to adopt to effectively face these issues. And I’m going to do this in a very personal sort of way, by highlighting some of the life lessons and experiences that have come to mind as I’ve pondered the challenges confronting it.

Let me begin with an experience that I had as a very young man, when I was working my way through college as a lab technician for a metallurgical company. Now the manager of this laboratory prided himself on having state-of-the-art equipment, and apparently equipment manufacturers had picked up on this, because it seemed that we had a steady stream of salesmen passing through that laboratory, each trying to convince him that they had the next big thing in laboratory devices. One of the more laborious tasks that we technicians regularly had to perform was to polish little test pieces of metal so that we could examine them under a metallograph. This was done by pressing them – one piece at a time – down onto a rotating polishing wheel, and it would sometimes take several minutes to get the necessary flawless finish that would enable testing of the piece. And then, one day, a salesman came into our lab with something that he promised would change our lives. It was an automatic polisher! Several little metal pieces could be attached to mechanical arms, which would then polish them, all at the same time, and no lab technician would ever have to sit hunched over a polishing wheel again. It sounded wonderful. It sounded fantastic. It sounded too good to be true. We lab technicians were given a demonstration of the machine. About a half dozen pieces were polished at once. One of the lab technicians was given one of these pieces after the machine had finished, and he was asked to render his opinion. “Terrific!” he said. “Let’s buy it!” I picked up another of the pieces and looked at it. It was terrible. The piece had not been polished properly at all, and was far from suitable for inspection under a metallograph. I brought this to the manager’s attention. The salesman protested that the machine had probably just not had its settings properly calibrated. I challenged him to a contest: his machine versus me. 
He could have as many tries as he liked, while calibrating his machine, but he had to eventually demonstrate that his machine could produce suitably polished test pieces as rapidly as I could. After several trials, with the machine still not able to properly polish a single piece, the salesman eventually gave up, saying that the machine just wasn’t suited for the particular kind of lab work we did. He was sent packing. Meanwhile, that other lab technician – the one who said we should buy the thing – pulled me aside and apologized for his early endorsement. “I was just trying to be a good company man,” he explained. “A good company man,” I thought to myself, “What did he mean by that? How was giving his rubber stamp approval to a machine that would have cost us thousands of dollars, and would have been completely worthless, doing a favor to the company?” I remembered this incident when I heard the president of an electric utility give a speech recently, talking about many of the dubious “innovations” that others have been trying to foist on his industry, and particularly when he quoted Stanford economist Thomas Sowell, who said, “Much of the social history of the Western world, over the past three decades, has been a history of replacing what worked with what sounded good.”

Three decades sounds about right, because it was approximately three decades ago when Coca-Cola nearly made one of the most devastating product “innovations” in its entire history.  Having been convinced by a third party – its competitor, Pepsi – that its formula was no longer a winning one – in spite of its continued dominance in the soft drink market – Coca-Cola redesigned its signature brand, labeling it “New Coke”.  Public reaction was swift and negative, and had Coca-Cola not quickly realized its error and reintroduced its signature brand as “Coke Classic”, this misstep might have resulted in complete disaster for the company.  Here, then, was a case of replacing something that worked with something that sounded good.  Coca-Cola’s product designers assured upper management that, based upon taste tests, the new formula would be preferred over Coke’s “classic” one as well as the one used by its principal rival, Pepsi.  What management failed to realize was that the long-running success of Coca-Cola was due to a product brand that had been thoroughly embraced by a loyal customer base which expected a corresponding loyalty from its provider.  The switch to New Coke constituted, in their eyes, a betrayal of the worst sort, and one which they could only grudgingly forgive when the original brand was restored to them.

Now in the midst of all of this clamoring for change in the electricity industry, its leaders must be careful to not lose sight of what their own special “brand” is, and thereby risk losing it – and with it, everything that contributed to the levels of customer satisfaction that the industry may have enjoyed in the past. And I do think that it has a brand: its own version of a “secret sauce” or formula that worked for its customers. That brand, quite simply, was the ability of a customer to flip on the switch to any electrical appliance in their home, and know with almost perfect confidence that the appliance would operate. It was a combination of simplicity and reliability. There were no complex procedures involved in bringing electricity into the home – no market transactions, or negotiations, or elaborate sequences of necessary steps to make it happen. Electricity, quite frankly, has always been something that we have never had to think about. We don’t have to care about where it comes from, or how it gets into our home, or whether we’ll have enough of it from minute to minute, or even hour to hour. We flip a switch, and the light comes on. That’s all there is to it. End of story. It is the same magic that underlies all of our most precious services – the utilities: natural gas, water, the telephone, and electric service.

This is electricity’s brand, and if electricity providers depart from it, they do so at their own great risk. I had an experience of this first hand, when I worked for a natural gas utility several years ago. We introduced a customer choice program: not because our customers demanded it, but because we became convinced by third parties – like Coke did in the mid-1980s – that it was a change that would be for the better. Customers were given the option to choose a different natural gas supplier, while still receiving delivery service from us. I’ll never forget the experience that I had one afternoon, when I was invited to speak to a group of senior citizens about my company and some of the new services that it was offering. I waxed eloquent about our new customer choice program, saying that it was a bold and wonderful step into the future, and how it would improve the quality of the lives of all of our customers. After I gave my talk, and invited questions from the audience, the last woman who stood up with a question said this: “I don’t see what’s so great about your ‘choice’ program. Since you’ve introduced it, I’ve gotten a barrage of calls from gas telemarketers, confusing me with offers that I don’t understand. And all of this time, my gas bill has actually gone up rather than down. Your program has been nothing but a source of grief to me.” It was a real wake-up call: here was something that most customers didn’t want, and at least some of them genuinely resented. It was a change that sounded good, but ultimately sounded better than it actually was.

Now at this point I know that it sounds as if I am arguing against change – that change would not be a good thing when it comes to electricity. But nothing could be further from the truth. I believe that change is going to have to occur. Let me explain why: There was a historical event in North America, called the Great Blackout of 1965. It was a massive power outage that affected parts of Canada and the northeastern United States. It was not just the geographical scope, but the duration of the outage that made it so memorable. Over thirty million customers were left without power for nearly thirteen hours. Thirteen hours! Now the duration of that outage doesn’t produce the same reaction of shock and horror from those reading or hearing about this event as it did, say, ten or twenty years ago. At least it doesn’t from me. For most of the past few years, if the worst outage that I had during the entire year was only thirteen hours in length, rather than a few days in length, I would have counted that as a good year. Sadly, we have seen a marked drop in electricity reliability, in an era when continuous electricity service is more important than it has ever been. Thirty or forty years ago, if we found ourselves without electricity, we might spend the time sitting on the front porch with a glass of lemonade, talking with our neighbors. But now, not just our business life, but our social and leisure life as well is contingent upon being continuously connected electronically to a network. Why has electricity reliability declined? We have an aging infrastructure in this industry, just as we do in the rest of the country. The American Society of Civil Engineers gives the nation a grade of “D” for the quality of its infrastructure. It gives the electricity industry a “D+”. I guess that means the industry is “above average”, but that sounds like a dubious honor. 
But we’ve also seen an increasing frequency of very disastrous weather events in this country which have caused widespread outages. My vocabulary for these calamities has expanded in just the past few years, with words like “derecho” and “polar vortex”.

And this leads me to the second reason that change has to occur: the environment. Producing electricity is a dirty business. That has always been true, and electricity producers have already made great strides in cleaning up their power plants. But more needs to be done, and the growing consensus that greenhouse gas emissions are moving our climate along a trajectory to disaster only adds to the urgency of this task. One-fourth of all non-natural greenhouse gas emissions that have been produced since the beginning of the industrial revolution have come from the United States alone. And currently one-fourth of all non-natural greenhouse gas emissions produced in the United States come from electricity power plants. I know that there is a lingering debate about what the real impact of these emissions is, upon temperatures, and upon climate in general, but among the scientific community there really is an overwhelming consensus that climate change is real, and that it is dangerous. I know that I’m a believer, and so are many if not most of the CEOs of our electric utilities in the U.S. As Jim Rogers, former President and CEO of Duke Energy, once said, climate change is a serious problem, electricity power producers are a significant part of the problem, and electricity providers have to be a part of the solution.

This, I think, is the essence of what is driving change in our industry, from the customers’ perspective. But there are all sorts of other drivers that have come into the national conversation on this issue. Managers of investor-owned utilities are concerned about flat or declining sales, and how they will be able to maintain earnings growth. National policymakers, think tanks, and other third parties have become intoxicated with the idea of a decentralized grid, with electricity supply and delivery being managed by just about everybody, using solar-paneled roofs, microturbines, windmills, electricity storage, and price-responsive devices. But ultimately, it is what the customer wants that will drive the really important and substantial changes to the electricity system. And what I believe that the customer wants is a more reliable, and a cleaner, electricity service. That’s it. In spite of all of this talk of “smart” this or “smart” that, “cyber” this or “cyber” that, “prices to devices”, et cetera, et cetera, when you get to the real base of it, that’s really what our customers – and our citizens – are looking for. And of course a customer is always sensitive to price. It really comes down to a very simple formula: Achieve a desired level of reliability and clean energy at as low a cost to the customer as possible. And every public policy initiative, regulatory action, and business decision made by electricity providers should be done in the context of this formula.

Now I know that there is this other conversation going on, about how there is a new breed of customers who are more “tech-savvy”, and who want to play a greater role in managing their electricity service, as they do in other areas of their lives. And I don’t dispute that there probably are some people out there who would love to have a “smart app” with which they can turn on or shut off their water heater at any time of the day, in response to hourly electricity prices, or have a “smart toaster”, or a “smart thermostat”. I suppose that I myself could eventually warm up to the idea of being able to remotely run the electric appliances in my home, so that potential burglars, seeing certain lights being turned on throughout the day, would not realize that I am away, and it would be nice to be able to run my heating and air conditioning units remotely so that the house will be at an optimal temperature by the time that I get home from work. The rise of smart phones and smart phone "apps" is often pointed to as a prominent example of how people want to use modern information technology in ways that were unimaginable just a few years ago. And this is true. Up to this point, I have been stressing the fact that a significant part of the value proposition for electricity is that customers don’t have to think – or do – too much about it. They flip a switch and it’s there. There was a time when the same thing could have been said about telephone service. So what is it that motivates somebody to invest more time (and money) in a service that they are receiving, rather than less? Why are all these smart phone “apps” so popular, and is there a similar potential hiding somewhere in electricity service?

I have been a student of business transformation, and it was questions like these that eventually led me to a fundamental insight. The greatest product innovations in our economy have one thing in common: they have moved customers from a condition of “bad time” to one of “good time”.

What do I mean by this? I believe that the single most important feature of each of our lives is how we spend our time, and there is a continuum stretching from extremely unpleasant to extremely pleasant ways to spend it. Let me give you a few examples. Bad time includes the performance of drudgery: scrubbing floors, mowing the lawn, or doing some mundane task over and over and over again. It includes unpleasant interactions with other human beings, like a rude clerk in a store, an annoying coworker, or an insensitive customer service representative. It includes long waits, whether in line or on one of those annoying calls to a customer service number, where a recorded message comes on every thirty seconds or so, saying “Your call is very important to us”: a lie which makes the long wait even more unbearable. Another example of bad time is when we have to drive all over town to try to find something that we want or need. Good time, on the other hand, corresponds to those experiences that we like to savor, and preserve in memory. Of course, the greatest of these involve happy times with significant persons in our lives, such as family members, spouses, or close friends. Entertainments – music, television, and movies – are also important examples of good time. But good time also includes new experiences, such as novel encounters, interesting new information, or other discoveries that are of interest or practical value to us. Even a pleasant interaction with a customer service representative, or the website of a company, might be counted as an example of good time.

All of the greatest product or service innovations have moved us from bad time to good time. Think of the vacuum cleaner, or the washing machine, or the dishwasher. Or think of the radio, and television, and the internet. Even bank ATMs count as a significant example of this. I remember some people actually saying, when ATMs first came out, “Oh, they’re so terrible – they’ve replaced the experience of interacting with real human beings with that of interacting with machines!” Well, I can tell you firsthand, I would much rather deal with an ATM than with a rude or indifferent bank teller, and I would definitely prefer using an ATM rather than waiting in a line at a bank. And as we move to more recent times, and the great successes which have emerged in the past couple of decades, the same phenomenon can be observed. It is certainly true that Borders Books made the process of book-buying more pleasurable: they added chairs in their stores where people could read books at their leisure, a cafeteria where people could buy coffee and snacks, and even allowed their customers to do other things, like play chess, in their stores. But Amazon.com did something even better. They made it unnecessary for book buyers to even leave their homes! Borders improved the quality of book-buyers’ time by giving them a more pleasant environment in which to shop, but Amazon improved it even more by making it unnecessary to make a shopping trip at all. And what about those “apps” on smart phones? What is the value in those? I think that the BlackBerry was the first device that demonstrated to customers how they could improve the quality of their time just about anywhere. We have all been in meetings, or other events, where we feel that our time would be better served by doing something else. The BlackBerry made that possible, by allowing us to check our e-mails, or even go onto the internet to catch up on the news.
They gave us a means by which we could move from bad time to good time – or rather, inject good time right into the midst of bad time. And contemporary smart phones have only expanded on that service, allowing us to communicate with our peers via text messaging, play games, or even listen to music.

Every major product innovation – and every major industry overhaul – came about as the result of somebody figuring out a way to move customers from bad time to good time, or from good time to better time. I remember a personal experience of being put into bad time. Blockbuster (remember them?) called me one evening to tell me that I had never returned a rented movie, and that I owed a large fine for holding it past its due date. I argued with that person for twenty minutes, explaining that I had returned the movie weeks ago, and finally, after putting me on hold for several minutes while she checked her records, the Blockbuster employee came back on the phone and told me that I was right. And then she hung up. No apology – she just hung up. That experience, of course, made me furious. But when Blockbuster subjected somebody else to a similar experience, he did more than just get angry. I’m talking about a gentleman named Reed Hastings. He was charged $40 in overdue fines by Blockbuster for holding a movie too long. You might recognize the name: he was one of the cofounders of Netflix.

Have utilities ever put their customers into a “bad time” experience? I’ve already explained how one of my former company’s customers felt about our choice program. She definitely had a “bad time” experience. And, closer to home, I remember when my neighborhood was out of power for several days a couple of summers ago, after a severe storm. As we watched all of the surrounding neighborhoods get their lights back on, while we remained in the dark, our feelings of despair and frustration grew with each passing hour. Finally, when I was walking to the subway station one morning, I noticed that somebody had posted a sign facing a major street which bordered our neighborhood. It was addressed to our electric utility and said, “Please don’t forget about us. We’re your paying customers, too!” That desperate sign was evidence that a lot of people had been experiencing really, really bad time.

And so, as the movers and shapers of the electricity industry look to their future, they have to ask themselves what needs to be done to move their customers from bad time to good time, or from good time to better time. It’s really as simple as that. Whatever the eventual winning strategy is for future success, I am convinced that it will answer that basic call, better than any other strategy that has been tried or proposed. Like Coca-Cola, the electric industry should never forget what its longstanding secret recipe for success has been, and what it should continue to be: give customers access to all of the electricity that they will ever need, any time, and in a way that they don’t have to think too much about it. And yes, there may be customers who want to take a more active role in managing their electricity supply. There may even be customers who want to produce their own electricity. And we all want an electric system that will not do irreparable damage to our environment, either locally or globally. The successful utility will be there, for all of us, finding ways to give us what we want, and in so doing, move us into a happier state. We have to be careful, though – all of us, including regulators and other policymakers – and avoid being lured into believing that something that sounds good should replace something that works. There are already many versions in the electricity industry of the “automatic polisher” that I described earlier – a device that sounded good, but that would have ultimately been more expensive, more time-consuming, and more unpleasant than the systems already in place. The successful entrepreneur, the successful innovator, and the successful incumbent provider have always succeeded by focusing on what is truly important and valuable from the perspective of their customers, and then finding the optimal way to improve the quality of their customers’ time and their lives – and keep them in that happy place.

Wednesday, April 30, 2014

American Atlantis

This month I’d like to return to some of the themes that started this series. Recently, the Intergovernmental Panel on Climate Change released a new status report on global warming, and the news was not good, as the report again concluded, with fresh evidence, that the consequences of inaction in stemming the production of greenhouse gas emissions will be catastrophic. Meanwhile, the search for a missing airliner brought to light the sorry state of the world’s oceans, as various supposed signs of the aircraft’s remains turned out to be massive agglomerations of floating garbage. The impending ecological collapse of the world’s oceans, of which greenhouse gas emissions are only one of several causes, would lead to the collapse of the world’s ecosystem as a whole, and that would be a disaster that no form of technological adaptation could protect us from.

Another recent news item was that astronomers have discovered, in a distant star system, a planet similar to the Earth that has a very high probability of sustaining life. This is just the latest of a growing number of these types of discoveries, and many scientists have concluded that there are an uncountable number of earth-like planets throughout the universe, and that therefore it is very likely that life has evolved on other worlds, including intelligent life. As I have pondered over this, I have wondered if planets such as these (including our own) are, in a sense, wombs, fostering the growth and development of a life force that eventually becomes – through its most intelligent species – self-aware. It would seem that a successful birth would consist in the ability of that species to free itself from the bonds of its mother world, thereby ensuring its permanent survival, and its ability to grow and thrive beyond the confines of its home and its “mother”. But as with human births, success is not ensured. A species that was unable to free itself in this manner – a species that exhausted the resources within its womb without being able to escape it – would be like a stillbirth. And, in the worst cases, the tragedy would kill child and mother alike, leaving the world incapable of supporting new life that might succeed where this species had failed.

We seem to be reaching a critical state in our own “birthing” as a species, and I see a number of distinct outcomes. First, there is always the possibility that our collective wisdom will actually catch up with our technological capabilities, we will restore a healthy equilibrium within our ecosystem, and a collective pattern of behavior will be adopted in which we foster the health and sustainability of all systems of life on this planet. If and when we do develop the means to leave the bounds of this planet in any permanent sort of way, it will not be because we are fleeing from it, but rather because we have proven ourselves capable of living harmoniously and sustainably within the ecosystem of which we are a part, and can take that wisdom with us in our colonizing expeditions into the expanses of interstellar space. Second, there is the possibility that while we are headed for catastrophe, it is not one that will destroy us as a species, but will merely throw us into another dark age. According to Plato, the Egyptian high priests of his day believed that the civilizations of the world, including their own, had already gone through several such cycles: civilization, catastrophe, dark age (in which all of the collective learning and wisdom of the previous civilization had been forgotten), and re-emergence. A third outcome is that the human species will produce a catastrophe on a global scale of such severity that it – along with many other species of life on the planet – will be destroyed, but the earth, after a sufficiently long expanse of time, will be able to regenerate a new ecology, with entirely new life forms, including, perhaps, one that will become self-aware: a younger “sibling” (but one that need not be humanoid) which will have the opportunity to succeed in the birthing process where we failed. 
And the fourth outcome, of course, would be the greatest catastrophe of all: one which left the earth completely barren and lifeless, with no capacity to produce new life forms of any kind. (There is a fifth possibility, which is the stuff of science fiction, and which I call “Intervention”. In this scenario, there are other intelligent species in the universe, more advanced than our own, at least one of which is aware of our existence, and this other species will, when it becomes apparent that global catastrophe is inevitable, forcibly prevent this catastrophe from occurring. Assuming that it is a species more enlightened than our own, I imagine that it would refrain from doing this until the last possible moment, because it would understand the deep trauma that such an action would have on a world civilization which believes that its members are the only sentient beings in the universe.)

Given America’s unique place in modern civilization – having the world’s largest economy, the most powerful military, and a ubiquitous presence and level of global influence due to American television and cinema – it is rational to assume that the fate of our civilization will be inextricably linked to the fate of America. And if the outcome is an unpleasant one, then there will be many among the survivors (if any) who will lay much of the blame on the United States for it. They will point out that in spite of harboring only one-twentieth of the world’s population, the U.S. consumed at least one-fifth of its energy output and its extracted natural resources, and that about one-fourth of cumulative man-made greenhouse gases emitted since the dawn of the industrial revolution came from the U.S. Of course, those countries in the world currently under the sway of religious extremism would add the less tangible – but to them more damning – complaint that America’s moral decadence poisoned civilization, and planted the seeds of its eventual universal decay. If the second of my five outcomes is the one that lies in our future, then it is not unlikely that America will loom large in the dimmed memory of the great civilization that had fallen, like the legendary Atlantis, believed to have been the most powerful nation of its day, which, after reaching the height of its powers and the depths of its moral decadence, collapsed and disappeared in the wake of a single, powerful, cataclysmic event.

In this future Dark Age, in a world stripped of technology, and with only scant surviving records of the time before the catastrophe (much of which might be incomprehensible to most of the population), the history passed on from generation to generation would consist mainly of a collection of legends and tall tales, colorfully told by village elders. Children would be told that a great and powerful kingdom once existed where all of the races of the world lived together in harmony. The people were not oppressed: there were no tyrants, and no fear of foreign invasion. The civilization of that age had developed a powerful sorcery, with flying boats in which people could soar through the air at very high speeds, crossing a continent or an ocean in just a matter of hours. A person living in that time could communicate with any other person, any place in the world, instantly. Cures had been found for many of the most debilitating diseases. People living in this kingdom did so in great comfort, in shelters that were never too cold or too hot, and they had abundant food from all types of animals, even though most of them had never seen a farm – never had to raise a crop or hunt and kill prey. Of course, there would be even more far-fetched tales, shared by some of the more colorful storytellers. America, they would claim, was so powerful that it possessed weapons that could destroy entire cities. And some of its men, they would insist, had even walked on the surface of the moon. 

The legends would tell how this mythical nation had risen to global preeminence after playing a critical role in a war of epic proportions, affecting all of the peoples on the earth, in which the forces of good and evil had been pitted against each other. In the aftermath of the final victory, this nation offered hope, as the guarantor of freedom, prosperity, and universal goodwill for all future generations, all over the world. But then something went wrong. To many peoples living outside of its borders, it seemed that it was exercising its power abroad not to support these ideals, but to maintain the level of comfort and security that its own citizens had grown accustomed to. And these citizens seemed to cultivate a growing insularity toward the rest of the world, limiting their attention to their own escapist entertainments, and to petty personal dramas that were only of real importance to themselves. As they distracted themselves in these ways, the richest and most powerful among them pursued a cynical course of self-aggrandizement. A growing divide emerged between the wealthy and the poor, and the wealthy, through their pawns who led the nation on their behalf, came up with increasingly draconian ways to keep the poor in check. Those who would not be satisfied with the meager alms and entertainments offered to them were imprisoned, or marginalized in other ways. But as the ranks of the poor and destitute grew, it became increasingly difficult to keep their growing discontent at bay.

And this American Atlantis found itself facing external problems as well. Although it was by far the most powerful nation of any on the planet, it soon found its resources overextended, as former allies, disillusioned by its policies, could no longer be counted upon to support it, while the ranks of its foreign enemies continued to grow. It found itself increasingly isolated, and shocked at its growing impotence to effectively manage crises beyond its borders and, eventually, within its borders as well. As the end times approached, it tried to withdraw from the affairs of the world, and concentrate its powers on domestic matters. But by then, these, too, had become irrevocably tattered, divided between a wealthy minority and a mass of discontented, impoverished, angry peasants, with an increasingly militarized police force keeping the precarious peace.

The legends would be unclear on how the final fall happened. It seemed to come about as the result of both global calamities and internal upheaval. All that would be certain to posterity is that when this great nation fell, there were few that mourned its passing. But there was no room for celebration either, because with the collapse of this former beacon of hope, and guarantor of peace and safety, the world itself descended into chaos. Within a single generation, the magic disappeared, and the destitute survivors were left to ponder only how to fill their bellies from one day to the next.

Centuries would have passed since the great cataclysm, and the brief Golden Age that had preceded it. The nations that existed then would be only unrecognizable names to the survivors, their boundaries long forgotten. But many would still hold a special reverence for this great semi-mythical land – this America – that had once been the crown jewel of the diadem of human civilization’s greatest era. Some would even fancy themselves direct descendants of the remnant of its survivors, and feel a special sense of mission to reclaim its past glory.

What had been the cause of America’s greatness? Did it simply come about as the result of a fortunate endowment of resources? Or was there a way of life, a code, a religion, or some other system of ethics and practices that was responsible? Was it the wisdom of its leaders that shaped its destiny, or the character of its people? What did its people believe? What precepts did they live by? How did they conduct their day-to-day lives? How were their ideals and standards different from those held and practiced by the uncountable generations of peoples that had lived before them?

And, of equal importance, what had gone wrong? Had America merely been a victim of its own success, following the inexorable path of growth, dominance, and decline that beset every other empire still vaguely remembered in the patchwork history of this future age? Had hubris destroyed it, or corruption brought on by an excess of wealth and power? Was there some defect latent in the American character that had lain dormant during the younger and more vigorous phase of its history, which had finally emerged when there were no longer any external checks to keep it at bay? Had something been lost: an idea, or an ideal, or a set of guiding principles? Or, on the other hand, had some dangerous innovation taken place – the introduction of some pernicious new idea or practice that acted like a toxin or a virus, invading the body, weakening it, sickening it, and then destroying it, as well as all with whom it came into contact?

In this distant, post-apocalyptic Dark Age, the primitive, semi-barbaric remnant might despair of ever rediscovering the powerful magic once practiced by their ancestors, if they believed that such magic had ever really existed at all. But the more intelligent among them – the more enlightened – would want to try to preserve and study whatever could be retrieved from the oral and written remnant of that Golden Age, and its America, hoping that with patient diligence, they might someday understand the wisdom in this legacy, and by doing so bring back to the world a method for resurrecting at least a rudiment of the former greatness, purged of the poisons that had brought that greatness to a disastrous and calamitous end.

As I envision this dark scenario of a post-apocalyptic future, I can’t help but remember the climactic scene in Dickens’ A Christmas Carol, when Ebenezer Scrooge is presented with a vision of his own end by the Ghost of Christmas Yet to Come. He is shown a group of businessmen who have learned of the death of a “wretched man”, and say that they will attend the funeral only if lunch is provided. He is also shown former servants plundering the bedroom fixtures of the deceased. None, it seems, genuinely lament the passing of this man. And finally, when Scrooge is taken to the neglected tombstone with his own name written upon it, he begs the silent spirit to tell him whether the things that he has been shown are images of an inevitable future, or whether it is still possible to change it. The book has a happy ending, as Scrooge awakens to find himself safely in his own bed, alive and well, on Christmas morning, happily prepared to live the life of a repentant man. I can only hope that we – as a nation and as a civilization – after fully appreciating the disastrous course that we have currently set ourselves upon, will have a similar epiphany, and find ourselves facing a new morning, with fresh possibilities for writing a happier ending, and a corresponding resolve to do so.

Sunday, March 30, 2014

Five Names

Many years ago, I came across an interesting mental exercise, intended, if I recall, as a means of helping one identify one’s most important personal values. The exercise involved selecting five persons, living or dead, whom one would like to have on hand as personal counselors or advisors: a sort of “life committee” consisting of role models that would guide one in setting goals and making important, consequential choices. I found it to be a fascinating exercise, and have often shared it with other people, curious to learn which persons they would select to be part of their life committee. The choices that I have encountered have ranged across the globe, and across many centuries. I have noticed that people do tend to pick members of their committees who share something in common with themselves: their gender, their race, their nationality, their religious faith, etc. And in my own personal choices, I followed the same pattern: all of the members of my committee were male and Caucasian, and three of them were Americans. The members of my life committee were the following, in chronological order of when they lived: 1) St. Paul, 2) George Washington, 3) Clarence Darrow, 4) Winston Churchill, and 5) Saul Alinsky. I will describe, in turn, the reasons for each of my particular choices:

I have always been fascinated with St. Paul. He has been hailed by some as the architect of Christianity and its most eloquent prophet, but condemned by others as the person who distorted the original message and mission of Jesus and, in doing so, paved the way for some of the darker paths that Christianity has gone down in the past two millennia. Paul, of course, had originally been a persecutor of the small sect that had been followers of Jesus, bent on destroying it, but had then undergone a sudden, transformational conversion, after which he joined the new sect and eventually became one of its leaders, providing guidance to the many new churches that were being formed throughout Greece and Italy. Having read many accounts and studies of Paul’s life (which, admittedly, can only be based on the scant writings by and about Paul that have survived from that time), I have come to believe that Paul’s conversion experience was the result of a sort of cognitive dissonance that had been growing within Paul for much of his life and had reached a breaking point. He was a man who had wanted to devote himself to religious service, but, perhaps because he had been a Jew raised outside of Judea in the region of Asia Minor, had not been able to receive the level of training that would allow him to join the ranks of the Pharisees in Judea, and so attempted to fulfill his need to engage in religious service by zealously opposing what appeared to be heretical sects. The dissonance arose from the fact that, rather than bringing him closer to God, these acts of persecution only alienated him further from the true form of spiritual life that he aspired to. He then found – in the very heretical sect that he had been pursuing – a pathway to his own salvation.
For the appeal of the Christian message extended beyond the Jews who had embraced it: it also answered the call of a growing number of Gentiles throughout the Roman Empire who found only spiritual poverty in their own pagan religions. As St. Augustine, centuries later, colorfully explicated in his masterwork, City of God, there was a pronounced depravity among the gods and goddesses that populated the pagan pantheon of Greece and Rome, and in the time of the early Christians, this depravity was only exacerbated by the tendency of decadent Roman emperors, such as Caligula, to declare themselves gods. Many of the spiritually-starved Gentile subjects of Rome found, in Judaism, a moral God that was worthy of worship and allegiance, but were unable to find, from their perspective, a feasible means of entering into a relationship with that God. Christianity answered that call, and for Paul, who counted himself as much a citizen of Rome as a Jew, to align himself with the Christian mission was to restore to him a more wholesome and fulfilling spiritual calling. Paul became “the apostle to the Gentiles”, and in that role, he indelibly altered the course of human history. I have always been awed by the actual imprint that Paul has made upon civilization. Many philosophers, poets, and dramatists have entertained dreams of having their writings revered by posterity, but none have come close to the posterity of St. Paul. Consider that in just about every civilized city of the Western world, for the past two millennia, on any given Sunday, there is at least one, and probably dozens, or even hundreds, of congregations where excerpts of one or more of Paul’s dozen or so letters (“epistles”) to the early churches are read and then expounded upon in sermons. That is a monumental legacy that even the most ambitious of thinkers would never dare aspire to.

George Washington has always been a personal hero of mine, as much for what he did as for what he did not do. His early career as a soldier actually started rather inauspiciously, when, during the French and Indian War, as a colonial officer from Virginia, he led two retreats after unsuccessful attempts to repel French troops from British territory which they had claimed for France. After this unpromising beginning in military service, he resigned himself to the life of a country squire on a plantation in Virginia. It was not until more than twenty years later, when the American colonies were on the brink of war with England, that he appeared at the assembly of the Continental Congress, in uniform, and volunteered his services as a military officer. His career from that point, of course, has become the stuff of legend, as he led the revolutionary army to ultimate victory, winning independence for the colonies, and then presided over the convention that drafted the Constitution, and finally served as the new nation’s first president. Many were shocked when, upon the successful conclusion of the war, Washington resigned his commission and retired again to his plantation in Virginia, apparently resisting any temptation to use his military power to gain a permanent hold in leading the affairs of the new country. (It is said that when King George III inquired what General Washington would do now that he had won the war, and was told that the General would return to his farm, the King replied, “If he does that, he will be the greatest man in the world.”) And some were equally surprised when, upon the completion of his second four-year term as president, Washington declined to run again, and ceded the office to his elected successor, John Adams. But one of Washington’s boyhood heroes had been the Roman general Cincinnatus, an aristocratic farmer living in the fifth century B.C.
who had been granted dictatorial powers by the Roman Senate twice during his lifetime, the first time to repel foreign invaders, and the second to quell a domestic crisis, and on both occasions, after succeeding in his mission, he resigned the dictatorship and returned to his farm. For Washington, Cincinnatus represented the epitome of heroism, and he more than emulated the model in his own life and career.

Winston Churchill certainly had his vices: He drank constantly and heavily, smoked several cigars a day, and relished a good hearty meal. I suspect that many in today’s effete society would find such vices insufferable, and would disapprove of the man for these alone. But he had one important, enduring virtue, and that was an implacable resolve in the face of adversity. Recently, in America, there has been a resurgent interest in what has now come to be called its “greatest generation”: those men and women who defended the country in World War II. But I think it would be well for Americans to remember and to feel a special gratitude for Britain’s own “greatest generation”, because for two harrowing years, while Americans were still embracing an ill-conceived isolationism, Britain stood virtually alone in staving off the Nazi menace. I have always felt a deep admiration for those RAF pilots who, during the Battle of Britain, had to take to the skies to confront German aerial bombers and fighters, knowing that if they survived the day, they would have to take to the air and fight again the next day. These pilots were inspired, in turn, by a prime minister who famously said,
We shall go on to the end. We shall fight in France, we shall fight on the seas and oceans, we shall fight with growing confidence and growing strength in the air, we shall defend our island, whatever the cost may be. We shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender . . .
Churchill’s oratory had great power not just because of its elegance, but because it was delivered by a man who had demonstrated, throughout his entire career, not only great personal courage, but a steely resolve and an uncompromising commitment to those goals and principles which he cherished the most – among them the rule of law, and defiance in the face of unprovoked aggression.

Clarence Darrow was a Chicago attorney and ardent champion of civil rights, aligning himself with causes associated with social reform, and defending clients who, because of the heinous nature of their alleged crimes, were considered by the general populace to be unworthy of having their say in court. (Because of this latter activity, Clarence Darrow was called “the attorney for the damned”.) Darrow is probably best remembered for two of his cases, both of which have been portrayed in movies, the first being the defense of “thrill killers” Nathan Leopold and Richard Loeb, and the second being the “Scopes Monkey Trial.” In his defense of Leopold and Loeb, two teenagers from wealthy Chicago families who killed a younger boy in a sadistic fashion, merely for sport, Darrow pitted himself against a public that could not conceive of anything less than violent death as an appropriate fate for the killers. His closing argument, in which he did not contest their guilt, but merely pleaded for mercy, is perhaps one of the most famous in legal history, and has been dramatized in at least two movies (Compulsion, starring Orson Welles, and Darrow, starring Kevin Spacey). At the heart of his argument was the contention that the response of society to such a terrible crime must be based upon a desire for justice, rather than pleasure in seeing the perpetrators killed, lest society itself succumb to the same evil that motivated the crime. In the “Scopes Monkey Trial” (immortalized in the play and movie, Inherit the Wind), Darrow championed the rights of a young schoolteacher who had been charged with a crime for teaching evolution in a school where such teaching had been prohibited.
While he technically lost the case (in spite of his scathing critique of the arguments advanced in support of the law), the worldwide attention that the trial drew to the attempts of certain states to stifle science education on religious grounds eventually turned the tide of this trend in the United States.

I first became aware of Saul Alinsky, a 20th Century American community organizer and social activist, when I discovered his book, Rules for Radicals, as a very young man. I was at that stage in life where many are looking for a lofty ideal to build their life aspirations upon, and a means for attaining that ideal. Alinsky’s Rules for Radicals provided substance for both. The ideal was no less than social justice, achieved by effectively organizing those without power against those who had it and were abusing it. I’m not sure I ever actually finished the book (though I still have it in my library), but what I had read made such an impression upon me that Saul Alinsky remained forever embedded in my mind as somebody whom I would choose as one of my personal life guides. He was tremendously successful in mobilizing communities in Chicago, and his methods continue to inspire those who are endeavoring to effect social change, across the entire political spectrum. Among those currently prominent in American politics, both Barack Obama and the conservative “tea party” movement have drawn upon the lessons of Saul Alinsky.

As I’ve thought about my five choices over the years, I’ve asked myself what it was in particular about these five that had so resonated with me, and what, if anything, they had in common. I realized, as I reflected upon their lives and legacies, that what had made their lives meaningful to me was that each, through his particular choices and actions, had made future generations better off in some way. St. Paul had taken a small and struggling cult in Judea and, through his personal dedication and the power of his inspired writing, turned it into a religious mass movement that has provided a spiritual mooring for billions of human beings over a span of nearly two thousand years. George Washington, through his courage and his self-control, established precedents that ensured the success of the infant republic, which in turn would serve as a model for future republics throughout the world. Winston Churchill defended civilization itself against organized tyranny, and his bold tenacity has inspired leaders of many nations in the generations that followed his own. Clarence Darrow used the rule of law to defend those who had difficulty finding an advocate in society, and to stave off those who would attempt to silence principled activists of social conscience through legal means. And Saul Alinsky taught scores of reformers and champions of social justice – through his actions as well as his writings – how to effectively achieve their goals. I truly believe that the world is a better place because these men lived and acted the way that they did.

Of course, each of them had his shortcomings. Paul, who perhaps as a Roman citizen did not perceive a common cause with the Judeans – including the Jewish Christians – who were engaged in the Judean struggle against Roman tyranny, endeavored to distance his brand of Christianity from them, in favor of a more pro-Roman variety that enabled him to spread his message throughout the empire without hindrance, and in so doing might have set the tone for the anti-Semitic elements of Christianity that have persisted even into the modern age. George Washington championed an isolationism for America that did not serve it well in the early 20th Century – particularly during the opening years of World War II. Winston Churchill’s dogged stand against tyranny and oppression could be rather myopic when it was applied to subjects of the British Empire, particularly India. Clarence Darrow apparently resorted to extra-legal means to advance his goals at times, which nearly ruined his career, and he also failed to appreciate, in his crusade against the abuses of religious fundamentalism, that an equally slavish devotion to an amoral scientism could lead to societal outcomes at least as pernicious as those produced by religion. And Saul Alinsky, who believed that activism was most effective when it was directed against a perceived common enemy, perhaps did not appreciate the fact that such methods – when indiscriminately applied – can sometimes become indistinguishable from the tactics of fascism.

But can we ever demand perfection of our heroes? We are, after all, a race of human beings, not gods. And if, at the end of our lives, we can say that we have improved, even in some small way, the lot of those who follow us, then that is no mean accomplishment.

I invite you to do the same exercise and determine who your own life counselors would be, and ask yourself what it was about their lives and accomplishments that resonated with you. I suspect that your life will be greatly enriched as a result of the exercise.

Thursday, February 27, 2014

Through a Glass Darkly

Is the world really as it seems? How can I know that what’s “out there” (reality) is the same as – or even similar to – what’s “in here” (my mind)? Many wise persons have stated that there is a difference, and that the difference has consequences. In a famous parable, the philosopher Socrates (as related in Plato’s Republic) compares our experiences to those of prisoners bound in a dark cave, who can only see shadows moving about on the wall in front of them, cast there by a fire that is blazing behind them. These unfortunate souls have no idea of what is really producing the shadows on the wall, and so must content themselves with trying to make some sense of these fleeting images. Occasionally, according to Socrates, one of these prisoners might have the good fortune to break free, and emerge from the dark cave into the light of the surrounding world. At first, the escapee will be overwhelmed by what he sees, and will find it impossible to comprehend it. But after he finally becomes accustomed to the light, and familiar with his new surroundings, he returns to the cave to describe his enlightening experiences to his comrades still bound within. However, returning into the darkness, he is disoriented, and finds it difficult to make his way about. Even worse, his description of what he has seen and discovered is incomprehensible to the other prisoners, and in fact his words, along with his faltering steps, lead them to conclude that he has gone mad. Socrates suggests that this is what happens to anyone in our world who chances to escape the bounds of our limited experience and gets a taste of the greater reality. We think them mad, delusional, or impaired in some other way. St. Paul, too, spoke of our experiences during this lifetime as like “looking through a glass darkly”, and said that only after we shake the bonds of this mortal existence can we see “face to face”.

This question, about the correspondence between what we think we experience and know, and what actually exists and can be known, is one that philosophers have contended with for centuries. They have given a name to its study: epistemology. The ancients believed that we experienced only one stratum of reality, and that other intelligent beings coexisted with us among the other strata, as dimly aware of our existence as we were of theirs. Science has given us a modern version of the same idea: we can see only a limited range in the spectrum of light, and can hear only a limited range of sound frequencies. Our sense perceptions, in fact, really do provide us with only a limited sampling of the exterior world. And while the elves, fairies, angels, and demons of the ancients are no longer a part of this world view, we are told instead that there are microorganisms of which we have no direct perception but which, like these mythical entities, can from time to time disrupt our lives in very tangible and even calamitous ways.

But science on its own cannot provide a satisfactory explanation of why what’s really “out there” has any sort of meaningful correspondence with what is inside of our minds. After all, machines can be made to register light frequencies, or sound waves, or pulsations of pressure, but these “perceptions”, either singly or in combination, do not produce anything like an experience – let alone an experience that corresponds to the machine’s external environment. Even if we concede that the complexity of the living, organic body and brain has somehow managed this feat – translating perceptions into authentic experience – the fact still remains that the perceptions are limited ones, and so the experience itself must be only an approximation, rather than a reflection, of reality. Philosophers continue to struggle to find a suitable explanation for why there can be any sort of correspondence between the contents of our minds and the reality that comprises the world around us.

The philosopher Robert Nozick provided an interesting insight – though, as he admits, it constitutes less than a concrete explanation, let alone a proof, for why such a correspondence might exist – based on the theory of evolution. Living creatures, in their struggle to survive and reproduce, would have to evolve some mechanism for perceiving their environment in a meaningful way, and would do so parsimoniously: developing just enough of a sensory apparatus – along with an ability to process these sensations – to be able to sustain themselves and defend themselves against predators. This would explain why only certain wavelengths of light are perceived, and certain bandwidths of sound, along with a limited attentive focus, and a pragmatic, selective retention of memories. We don’t see, hear, comprehend, and remember everything, because the brainpower required, and the energy required to sustain such a brain, would simply be inefficient. Hence, all living creatures get by through “sampling” their surroundings, leaving out much more than they take in. We are all, in a sense, partially blind, fumbling about just well enough to get by. It is a practical explanation, though, as Nozick admits, it undermines the popular truism that we all use only a fraction of our total mental capacity. If Nozick is right, we are pretty much using all of the parsimonious capacity that nature has allotted to us.

One wonders, if this is true, how we can communicate with our fellow beings at all. The answer is that we all tend to be blind to the same things: within species – and, for that matter, within cultures – we share identical limitations, so that the fragments of reality that we do take in, we tend to share in common with those other beings with which we are most in contact. And conversely, we have a tendency, when part of a group, to block out the same things, and perhaps even collectively to forget the same things as well.

But “seeing through a glass darkly” still presents its problems, and these are not inconsequential. Like the viruses and bacteria that can make us miserable, without having even a dim awareness of the consequences of their actions, we, too, through our activities, often do things that have profound – and even devastating – impacts of which we are completely unaware. I remember once, while living in an apartment, the misery of having a next-door neighbor who seemed oblivious to the fact that the loud music which he often played traveled easily through our common wall, regularly disrupting my life. Even when I complained of the noise, he seemed unable or unwilling to believe that what he was doing was causing displeasure to somebody else. To my great embarrassment, I discovered that I was guilty of the same lapse in sensitivity years later, when a neighbor who lived in the apartment below mine complained that the music that I was playing traveled through the floor of my apartment, into her own, disrupting her life.

Our limited awareness often blinds us to the consequences of our actions. I have even wondered if that might be the real “judgment” that we will face after death, if we truly move into a state of being that is liberated from the shackles of a limited consciousness. Perhaps, after death, we will be able to perceive and experience in a very real and compelling way the impact that every action we ever took in life had on other beings: feeling their grief, their pain, and their bitterness over wrongs we had committed against them. If many or most of these impacts are negative, and we truly can feel the weight of their consequences on others, this might constitute a “hell” that is unbearable to experience, and which actually compels us to want to somehow atone for or correct our negative actions. And if, on the other hand, like the character George Bailey in the movie It’s a Wonderful Life, we touched many lives in positive, compassionate, and loving ways, then the lifting of our blinders after death will allow us to fully experience the joy that we were responsible for: a sort of “heaven”. Of course, for most of us, if this actually does happen, we will have a mix of both heaven and hell. If the “negative” side of the balance sheet is significant, would we yearn for some sort of tangible way to correct it? Would we be given the opportunity to do so, by being reborn into the world as a new human being?

This, as I understand it, is actually something of the rationale behind the Hindu concept of reincarnation: we go through a chain of several lives, fixing ourselves and any wrongs that we committed in past lives, until we finally liberate ourselves from the law of “karma”, or cause and effect. If I caused harm to you in a past life, I might be given the opportunity to redress the wrong when I encounter you again in a future life, or perhaps atone for it in a less direct way, if our chains of lives never actually intersect again. There seems to be a problem, however, with this mechanism, if it does exist, since most of us are born with no recollections of past lives. It would seem then, that without the benefit of remembering our past sins and mistakes, we will be doomed to repeat them: slipping on the same banana peel over and over and over again. Perhaps we remember them at some subconscious or preconscious level, so that there are karmic motivations in the actions and choices that we make in our present lifetimes that we are completely unaware of.

And how culpable are we anyway, for things that we have done that have caused harm to others, if we were unaware – or imperfectly aware – of the consequences of our actions? After all, we are all doomed, in our mortal lives, to “see through a glass darkly” and can never fully perceive or comprehend what the implications are of everything we do. Doesn’t this absolve us of most of the negative consequences of our actions?

I remember reading, as a young man, the autobiography of Albert Speer, who was the armaments minister in Nazi Germany. Albert Speer was, by his own account, a loving husband and father; he was not a Nazi, and he was not even an anti-Semite. His job, and his sole focus of attention, was armaments production, and he was a diligent, hard-working, and industrious manager. He probably resembled, in personality, lifestyle, and demeanor, a successful executive in any Fortune 500 company today. And because of his managerial effectiveness, he rose to become, for a time, the second most powerful man in Germany. Can he be absolved of the Nazi crimes of genocide because, as he claimed, he had no knowledge that the large-scale, systematic murders were taking place? Speer, in his autobiography, says that he began to hear rumors of the death camps, and was on the verge of investigating the rumors, until a friend and colleague warned him, earnestly, that this was something that he didn’t want to know about. And so he abandoned his plan to learn more about it.

I wonder, sometimes, if the artful avoidance of certain questions or investigations has allowed me to shield myself from any terrible things that I – or my country – may have been responsible for. There are the seemingly little things, like the kinds of foods that I choose to buy and eat, or the companies I support with my purchasing dollar. But there are larger things as well. I remember having a conversation with a colleague a few years ago, in which the subject of Africa came up. I expressed regret over the fact that Africa seemed incapable of entering onto a path of genuine economic development, and wondered if this was a lingering effect of colonialism. My colleague, with some irritation, retorted that apologists have trotted out the “colonialism excuse” for Africa’s continued stagnation for too long, and that it was time for Africans to finally take responsibility for their own destiny. I must confess that at the time I was inclined to agree with her. But recently I happened to attend a conference addressing the subject of “conflict minerals”: metals that, like conflict diamonds, are extracted in Africa under brutal conditions, as rival militias in certain areas subject the locals to virtual slavery in order to mine these materials and profit from them. The country where this is occurring is the Democratic Republic of the Congo. And I learned, at this conference, that the United States had an active hand – as recently as the late 20th century – both in propping up dictators there who didn’t have the best interests of their people at heart, and in toppling governments that did. The resources there, after all, are quite valuable, and any interruption in their flow might have threatened “the American way of life”.
Happily, America’s policies there are more enlightened now, but the electronics companies – such as cell phone manufacturers – that use metals refined from conflict minerals in their products are only just beginning to investigate their supply chains, to determine whether they are supporting inhuman enslavement and brutality elsewhere in the world.

It may be true that we all must resign ourselves to “looking through a glass darkly”, but I suspect that sometimes, with just a little effort, we can clear the glass – at least a bit. If only this were done more often, and more diligently, during one’s lifetime, perhaps that final meeting, “face to face”, would not be such an unpleasant one.

Tuesday, January 28, 2014

The Past Imperfect

There was an item in the news earlier this month that two of the world’s most powerful telescopes, the Hubble and the Spitzer, are operating in tandem to gather images of the universe in its relative infancy, by focusing on galaxies more than 12 billion light-years away (and hence sending us images that are more than 12 billion years old, in a universe currently estimated to be about 13.7 billion years old), and that there are plans for another telescope to gather even older images in 2018, corresponding to events that occurred mere hundreds of millions of years after the Big Bang.


This is just the most recent example of an interesting phenomenon that occurs as our civilization continues to evolve: we develop greater and greater capabilities for recapturing our past.  In 1993, moviegoers were entertained by Jurassic Park, about an enterprising group of scientists who were able to resurrect extinct dinosaurs through DNA sequencing and cloning technology, and in the years since, there has been serious discussion about doing exactly that – at least with more recent species lost to extinction, such as the woolly mammoth.  And DNA sequencing has allowed us to better understand both how species have evolved and how our own human ancestors diversified and migrated, forming the races, tribes, and nations of modern times.

Even in our personal lives, modernity has been giving us an increasing capability to retain and capture our earliest past.  The field of psychiatry known as psychoanalysis, when it came into vogue at the end of the Victorian era, suggested that we might resolve our most serious psychological issues and lead more productive, happy lives if we delved deeply enough, and far enough back, into our life histories, unearthing and resolving conflicts involving our relationships as young children with our parents, and its practitioners developed techniques that made it possible for us to do so.  Technology has certainly helped us to preserve more of our personal and social history, with the evolution of photography, sound recording, and now both sound and video recording with the simple use of a smart phone.  The capability for recording and storing records of our individual and collective lives has increased immensely in just the past generation.

Why is it, as we mature and move forward in time, that we have a growing desire to recapture the past?  The desire to preserve can certainly become pathological, as currently illustrated in the American television program Hoarders, about persons who retain nearly everything, and throw little if anything away.  They seem to be desperate to hold onto anything that has ever come into their lives.  I must confess that when I hear of stories like this, I look at my own life and say “There but for the Grace of God go I,” because there are some things that I have been very reluctant to throw away.  It has been almost impossible for me to let go of any book that I have ever owned, and so I find myself having to put an additional bookshelf into my home about once every five years.  (Perhaps Kindle will now save me from eventually walling myself in with bookshelves, while at the same time making it even easier for me to retain every book that I have ever read.) 

In many, if not most, cases, I think that the physical objects we hold onto provide tangible counterparts to important events in our lives.  Clearly this is the case with wedding rings, or college diplomas, or birth certificates of children.  They give our memories of these events substance: something that we can look at, and reach out and touch, so that they are not merely thoughts in our minds – thoughts which will pass away when we pass away.  Of course, the meaningfulness of these physical objects is far from universal, and their value is often completely lost on others, even those close to us.  (Hence the ordeal of having to sit through a presentation of somebody else’s stack of vacation photographs.)  Many years ago, during an unhappy period of my life, I was driving one morning to a workshop that I had to attend, and stopped at a fast food restaurant for breakfast.  The restaurant happened to be giving away stuffed animals as part of a promotion for a new movie, and so I took one before resuming my trip.  And because it was right around that time that the circumstances of my life improved rather dramatically, the stuffed animal came to be permanently linked with a happy memory for me.  So to this day, the tiny, smiling “lucky Simba” sits on a shelf in my bedroom.  It will probably still be there on the day that I die, and when the “junk” in my home is committed to the flames, like the “Rosebud” sled in Citizen Kane, the stuffed toy will be cast away without the slightest suspicion that it meant anything to anybody.  When I was a boy, attending with my parents a holiday party at my grandfather’s house, I noticed a large Bible sitting in a prominent place on a shelf in his living room.  It just so happened that I had embarked on an ambitious project that year to read the Bible from cover to cover, and so, in order to impress my grandfather, I asked him if I could pick it up and read it.
To my shock (as well as that of my parents, and the others in attendance), he angrily shouted at me not to touch it.  A while later, it was explained that this Bible had been a prized possession of my grandmother, who had recently passed away, and my grief-stricken grandfather had never wanted it moved from the place where she had kept it.  Of course I didn’t understand his feelings then . . . but I do now.

What then, is it that compels us to capture more of the past, and to retain it, through material objects?  I think that we are always endeavoring to give our individual and collective pasts a more enduring existence that we hope will survive us, somehow, after the ephemeral imprints of our memories fade away.  And, by capturing more of our pasts, we hope to compile a meaningful story of our existence, with a beginning, middle, and end, which will endow it with a significance that will transcend the transitory nature of our time on earth.  Individually, and collectively, as nations and as a species, we want to believe that we are part of a drama that has an ultimate purpose – a destiny to be fulfilled, and by better understanding the most distant reaches of our past, we hope to be better able to trace out the trajectory of that drama.

In the Japanese film After Life, recently deceased persons are directed to find a single happy memory, which they will then be able to re-experience for eternity.  Is that what a real heaven might be: to collect a sort of “greatest hits” compilation of our memories, and be able to relive them for eternity?  For the German philosopher Nietzsche, such a prospect, “eternal recurrence”, presented an ongoing challenge to live a meaningful life:
What, if some day or night a demon were to steal after you into your loneliest loneliness and say to you: 'This life as you now live it and have lived it, you will have to live once more and innumerable times more' ... Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: 'You are a god and never have I heard anything more divine.' [The Gay Science, §341]
In the same vein, might we be forced to undergo a sort of trial in the afterlife, as in the American film Defending Your Life, and learn through this that the real secret of fulfillment had been to overcome one’s fears, and live life to the fullest?  Will we have second, third, and multiple chances to do so, as in the movie Groundhog Day?


Perhaps, with the continuing advance of technology, we will someday be able to memorialize everything that passed through our minds in a more permanent, substantial way.  And then it will be possible for others to recall each and every one of our lives, and review and examine them completely.  But even if this comes about, what would compel anyone to do so?  The sheer number of individual human existences seems to undermine the special value of what each of them had lived and experienced.  Still, there is something precious about every human existence, and perhaps when the capability is realized to see each one in its fullness, then future lives will be enriched by reviewing them, examining them, and drawing tangible lessons about how they spent their limited spans of time on this planet.  Maybe, in this manner, future human beings will find the blueprint for living lives that are truly worth preserving in memory, and even reliving, over and over and over again.

Tuesday, December 24, 2013

Three Books

Recently, I had the opportunity to see again the 1960 film version of H.G. Wells’ classic science fiction work, The Time Machine. The film starred Rod Taylor as the time traveler, H. George Wells, and in the final scene, after George disappears from his home and returns to an era in the distant future where he had discovered that civilization had lapsed, one of the friends whom he left behind discovers that he has taken three books with him. It is unclear which books he has chosen, and the friend wonders aloud to the housekeeper which three books they would have selected, had the choice been theirs to make.

It is an interesting question, not unlike the one that I posed in my very first blog entry one year ago, when I wondered what lessons our own civilization might like to leave behind to some other civilization in the distant future – one perhaps coming out of a dark age, having either just a dim memory, or no recollection at all, of this one that preceded it.

As I watched that concluding scene of The Time Machine, I couldn’t resist wondering what three books I might have chosen, to serve as a legacy and a lesson to a people whose civilization had lapsed. It brought to mind a life project that I embarked upon as a young man when, while in college, I came across a list of the two hundred greatest philosophy books ever written. I kept that list, and set for myself the goal of reading those books during the course of my remaining life, believing that if anything came close to comprising the collected wisdom of our civilization, then this must be it.

I must confess that, in the decades since setting that goal, I have fallen far short of it, having only read fifty-one of the two hundred books on that list. And I also have to confess that I don’t think that the ones that I have read have made me a better, or even a wiser, person than anyone who may have never read a philosophy book in his or her entire life. Still, the experience, at times, was an exhilarating one, and I’ve come to the conclusion that a really great work of philosophy is one that quickens the mind of the reader, enticing it to consider new and different ways of looking at the world, and existence, and of one’s role and place in the universe. Sadly, only a small number of the books that I encountered had this effect, while many had the opposite effect, with their pedantry actually dulling the mind, rather than exciting it. But the good ones made the entire venture worthwhile, and I have never regretted the time that I devoted to it.

Of the great ones that I encountered – the ones that quickened the mind, and opened entirely new vistas – I would include the following: Plato’s Republic, in which the legendary Socrates attempts to make a case for why a person should act justly, rather than otherwise, that is not based on fear of divine or human punishment. Although he was not completely successful in this attempt, the questions that he poses (in classic Socratic fashion) to his youthful audience, and the stories that he weaves, are profound and enlightening. The Republic also touched on the issue of how a society and government should be ordered, and Aristotle, in his Politics, addresses this issue as well, in a more systematic, but equally illuminating, manner. The works of Plato and Aristotle really do constitute a golden age of philosophy, and I have never come across any work by these authors that is not worth reading and contemplating. It seems that, in the centuries following theirs, philosophy descended into a sort of dark ages of its own, with writers engaging in tendentious debates about inconsequential things, until its revival at the dawn of the modern era. To me it is the works of Rene Descartes – his Discourse on Method and Meditations on First Philosophy – that kick-started philosophy in the modern age in a very exciting and refreshing way, as he attempted to resurrect the search for ultimate truth from the ground up, relying upon first principles derived from reason and simple, direct introspection; writers like Berkeley and Hume then carried this renaissance forward. The rebirth of philosophy in the modern age found its greatest light, however, in the German philosopher Immanuel Kant, a man who has been called – and very deservedly so, I think – the greatest philosopher since Aristotle.
In the debate that had been raging in his time over whether ultimate reality rested in mind (idealism) or matter (materialism), Kant’s unique and revolutionary insight – in his Critique of Pure Reason – was that while there might be an ultimate “something” out there, we can never know what that “something” is, since our minds play an active role in mediating how external reality is presented to us and becomes a part of our perceived awareness. Reading Kant’s Critique was a dizzying experience: I – like most readers of it, I suspect, in my time as well as his – was not able to completely comprehend it, but still had a sense on every single page that something very profound, very important, and very exciting was being presented. Someone once said that all philosophy is a commentary on Plato; it seems to me that all philosophy of the past two centuries has been a commentary on Kant. Building on Kant’s insights, Arthur Schopenhauer, in his World as Will and Idea, attempted to build a bridge between them and the wisdom of Eastern schools of thought. Schopenhauer is often branded as a philosopher of pessimism, but his pessimism is really no different from that embraced in the first of the Buddha’s Four Noble Truths: that in existence there is suffering.

Sadly, it seems that since that second golden age of the nineteenth century, philosophy has been descending again into pedantic, arid controversies that dull the mind rather than quicken it, but a few twentieth-century works were a joy to read, or at least inspired awe. One such awe-inspiring work was Alfred North Whitehead’s Process and Reality, which represented a herculean attempt, and perhaps a successful one (as with Kant’s Critique, I have to admit that my capacity for understanding the work was limited), to create a systematic, holistic model of existence that incorporated the most important insights of all of the great philosophers who preceded him. A more accessible, but equally inspiring, writer of the twentieth century was the French philosopher Albert Camus, who in such works as The Rebel, The Stranger, The Plague, and The Myth of Sisyphus (it was the first of these that had been on the list, but all are worthy rivals for a place on it), addressed the challenge, the burden, and the tragedy of contending with existential freedom. And, finally, the most recent of the works that made it onto that list of two hundred, John Rawls’ A Theory of Justice, presented a novel approach to designing a just society, by envisioning a thought experiment in which its architects, while crafting its rules, were unaware of what their stations in life (rich, poor, male, female, etc.) would be after the project was completed. A book that did not make the list, but which constitutes an ingenious critique of and counterpoint to Rawls’ conclusions, is Robert Nozick’s Anarchy, State, and Utopia. The two should really be read together. I am glad that I did. (And I believe that Nozick’s Philosophical Explanations really deserved a place on the list of two hundred as well. Its final section, on the meaning of life, is one of the most insightful, profound, and provocative treatments of that subject that I have ever read.)

These are just some of the more memorable works that I encountered as I worked my way through the list, randomly selecting titles, and I am sure that there are many others – which I may someday read, or may never get to – that have the same potential to awe and to enlighten. But would any of these be included among the three books that I would select, to be preserved after the memory of this civilization has faded?

There are other books that would be of more practical value. As an extreme case, I think of the many books that have been written to provide advice on personal success. I have a shelf full of these, such as Napoleon Hill’s Think and Grow Rich, Dale Carnegie’s How to Win Friends and Influence People, and Robert Collier’s The Secret of the Ages. Most of these books begin with the promise that by following the insights and principles contained within them, one can transform one’s life, and find success in both the personal and professional spheres. But I cannot think of a single one of them that affected my life in a profound, transformational, and permanent way, with perhaps one exception. This was a little book entitled The Richest Man in Babylon, by George Clason. Written in the form of an extended parable, it contains some very simple maxims on how to manage one’s money. I took them to heart, and put them into practice, and have always appreciated the wisdom embodied in them. Still, I couldn’t possibly imagine including even this book as one of the three written legacies to be left behind by this civilization to serve as a guiding light to another.

There are so many other types of works to consider – great novels, romances, poetry, works of religion – which might serve as a testament to our civilization, and provide an echo of its greatest moments. But in the end, my choices were still guided by pragmatism, more than anything else. What, I asked myself, would be of most practical benefit to some future age, where our own civilization had been forgotten?

My first selection would be a book on general science that contains the foundational principles and discoveries of biology and physics, and, ideally, some rudimentary mathematics as well. Now I am cheating here a bit, since I don’t actually have such a book, and so wouldn’t be able to take it off my shelf if, like George Wells in the movie, I were about to embark on my final one-way trip to the distant future in a time machine. I believe I still have my college physics textbook, which was pretty comprehensive in scope, so in a pinch I would probably take that. But a quick search on Amazon.com tells me that I could buy a book on general science and have it delivered to me in three days, so only a moderate delay would be required to have this book available.

My second selection would be a one-volume edition of world history, because in my opinion one of the most important legacies to be left behind by any civilization is a complete record of both its triumphs and its failures. There is much truth, I believe, in the familiar quotation that those who forget the lessons of history are doomed to repeat them. Here, I would be better prepared, as in my personal collection I have at least two one-volume histories of the world: one that was published in 1906 (which I referred to in my last blog entry, “Time’s Arrow”), and a Columbia History of the World published in 1981, which, while more recent, is by now also a little dated. I even have a book conveniently titled The Lessons of History by Will and Ariel Durant, but it is rather short in length, and therefore light on actual history. And so here, too, I might be tempted to delay my selection of an actual book until I can find one that brings the story of world history a little more up to date.

My third and final selection, and one that I actually have in my possession, is a one-volume collection of the complete works of Plato. This would seem to be the least practical of my three choices, and, for that reason, might appear to be the weakest. But I believe that there is a need for philosophy in civilization, and that it is essential that certain fundamental questions about the nature and purpose of our existence be asked. Plato, and in Plato’s works, Socrates, raised the most important of these questions, and addressed each of them with a depth of insight and lack of prejudice that continues to be unrivaled by any other great thinker before or since. And because of the foundational nature of the questions addressed, a study of Plato, in some distant future civilization, would provide fertile soil for the growth of other great ideas, germinating in future great minds, perhaps rivaling or even surpassing those that had graced our own civilization.

One book that many – at least many living in the western hemisphere – might find to be conspicuous in its absence is the Bible. I know that others would consider this to be an essential – perhaps the most essential – book to be included as part of our legacy to the future. I disagree. And I won’t defend my omission by resorting to the charge made by many agnostics and atheists: that religion has done more harm than good in the world, or that, at the very least, it has been responsible for much of the mischief (wars, pogroms, repressions, and resistance to scientific advances) that has permeated our history. Rather, I contend that the search for God, and for a relationship with God, is one that is dynamic, and defined by the person, or the culture, that is engaged in it. If our Bible, and the other great religious works that have appeared among the extant and recorded civilizations of our planet, truly represent the inspired word of God, then I have to believe that any future civilization, with no memory of ours, will have its own prophets and channels for receiving God’s inspired words. And these words will be expressed in its language, and in the context of its own unique history, culture, and development. They will speak to its people in ways that our own never possibly could. Similarly, if the great poems, dramas, romances, songs, and collective dreams of our people must someday be forgotten, we can take some consolation in the fact that if there is a future age, then it will produce great poets, dramatists, composers, dreamers, and prophets who will move and inspire their audiences in ways that our own works perhaps never could.

It would be a great consolation to know that we will leave at least some of our words as a legacy for that future age. But whatever words we leave behind for the inhabitants of that civilization, I have every confidence that they will be able to provide the music: music at least as unique, as inspired, and as beautiful as any that we ever produced.