Tuesday, October 24, 2017

The Man Who Saved the World

As I write this, another “doomsday” has come and gone, this one predicted to occur on September 23 by a man who calls himself a “Christian numerologist”.  But also in the news was the belated announcement of the passing of a former Soviet military officer named Stanislav Petrov.  Stanislav did not make a name for himself by predicting doomsday.  He actually prevented one from occurring.




It happened during a very dark time in our history, when relations between the U.S. and the Soviet Union were probably at their worst since the Cuban Missile Crisis.  It was September 26, 1983, nearly four years after the Soviet Union had begun its invasion of Afghanistan, and less than a month after a Korean Airlines passenger jet had been shot down by Soviet missiles – an act that President Reagan declared to be a “massacre” and a “crime against humanity”.  (Among the passengers on this airliner was a U.S. congressman: Representative Larry McDonald of Georgia.)  Tensions were very high on both sides of the Cold War, and it was in this charged atmosphere, on the night of September 26, that Stanislav Petrov, a Soviet lieutenant colonel who was manning the nuclear early warning system that evening, heard the alarm indicating that a U.S. nuclear strike against the Soviet Union had just been initiated.  A Soviet satellite reported that five ballistic missiles carrying nuclear warheads were headed for the Soviet Union at that moment.



What Stanislav was supposed to do was immediately notify his superior officers, who would probably have ordered a counterstrike, which in turn would have been the actual opening salvo of a nuclear war.  What he chose to do instead went against all of his military training, and against all of the expectations about how he was to perform his duties in a situation such as this.  He did nothing.  Or rather, he called his superiors and reported that the system was malfunctioning.  He did this, not out of fear (although he was of course terrified), or because he harbored some defeatist death wish against the Soviet regime, but because something about this alarm just did not feel right to him, and he realized that the costs of acting in error would be devastating to the world.  He also knew, however, that if the attack was genuine, then the missiles would be arriving in twelve minutes, and any Soviet counterattack would have to be initiated within this narrow timeframe.  Stanislav waited, in agonized silence, for those minutes to pass.



He was right, of course – there had been no U.S. nuclear attack in progress.  Later it was determined that the early warning system had erroneously identified sunlight reflecting off clouds over North Dakota (where some of America’s nuclear arsenal is housed) as a launch of missiles.  Ironically, Stanislav was not lauded by the Soviet military command for his actions.  Instead, he was reprimanded for not properly filling out the operations log that night.



If ever there were a moment in which the history of our civilization could have followed two entirely different courses, based upon one single act, this was it.  I was in college at the University of Illinois when this event occurred, and of course my fellow students and I were oblivious to the fact that a nuclear war had been narrowly averted, as everyone (except Stanislav, his closest friends, and the Russian military) would be for many, many years.  Two months later, on November 20, 1983, ABC television aired a movie titled The Day After, about a fictional nuclear war fought between the United States and the Soviet Union (which in the film was started, coincidentally, in September).  The movie showed, in horrific and explicit detail (while never actually explaining who initiated the nuclear strike), that there would be no “winners” after such a conflict, because the ecological devastation to the planet would be immense and long-lasting.

 

But actual events followed an entirely different course.  Just two years later, in 1985, Mikhail Gorbachev was appointed General Secretary of the Communist Party of the Soviet Union, and in 1986 Gorbachev was holding serious discussions with U.S. President Ronald Reagan about the mutual elimination of all nuclear weapons within the next ten years.  (Ironically, President Reagan’s movement away from a hawkish stance toward the “evil empire” had been influenced at least in part by The Day After, which he had watched, commenting that the movie was “very effective and left me greatly depressed”.  It is unknown whether Gorbachev also saw the film, but it was allowed to be shown in the Soviet Union in 1987.)  Under Gorbachev’s leadership, radical programs were initiated that eventually led to the introduction of market reforms, a greater tolerance for freedom of speech and domestic dissidents, an end to the war in Afghanistan, the fall of the Berlin Wall, the breakup of the U.S.S.R., and the end of formal hostilities between Russia and the West.  It is hard to remember, now, the sense of general elation back then: that the decades of two enemy superpowers poised to destroy each other with massive arsenals of nuclear weapons had finally come to an end, and that the new issue of concern was how best to spend the “peace dividend” which no longer had to be used for weapons of war.  This euphoric period of beating swords into plowshares proved to be all too brief and, in retrospect, only a respite from other hostilities and threats that would emerge in the ensuing decades, but it would never have happened at all had it not been for that Russian officer making a terrifying judgment call.



It is a stark object lesson that while rash, foolish or willfully evil acts of persons can produce devastating consequences on a large scale (as evidenced in the recent mass shooting tragedy in Las Vegas), courageous and sensible persons, too, can make a difference in the world in significant ways.  These are the persons who have the temerity to assert themselves in meeting rooms when a majority of seemingly intelligent and rational people endorse irrational courses of action.  Roger Boisjoly, an engineer who worked on the American space shuttle program, exhibited this kind of temerity when, on January 27, 1986, he argued on a conference call with NASA management that the Challenger mission scheduled for launch the next day was in danger of catastrophic failure.  Boisjoly explained that he and his fellow engineers had discovered a problem with a component on the shuttle’s booster rockets that could catastrophically fail at low temperatures.  (The expected temperature at the time of launch was 30℉.)  Sadly, his arguments went unheeded, and the Challenger exploded shortly after lift-off.  Sometimes it is not people that the courageously sensible have to resist, but processes, procedures, customs, and laws that have been deemed to be necessary and appropriate, like that early warning system in the Soviet Union, or, decades earlier, the signs in the United States prohibiting black people from using certain restrooms or drinking fountains.



That it is difficult to be courageously sensible was proven in a very stark way by psychologist Stanley Milgram in the early 1960s, when he designed and conducted a series of experiments to test the extent to which a person would act in an injurious way, simply because another person in authority directed that it be done.  The general format of the experiment involved inviting a person to assist in conducting a learning experiment.  The person was shown a test subject in an adjacent room, strapped to a chair with electrical wires attached to his body.  The “assistant” was instructed to ask the subject a series of questions via microphone, and, whenever an incorrect response was received, to press a button which administered an electrical shock.  After each incorrect response, the voltage increased, until a maximum shock of 450 volts – which was clearly indicated as a danger level – was reached.  What the assistant did not know was that he or she was actually the subject of the experiment, and that the person in the other room was merely an actor, pretending to be shocked.  This actor would do a number of things to attempt to dissuade the assistant from administering further shocks, such as yelling out in pain, banging on the wall, and, eventually, not responding to the questions at all.  (Inactivity on the part of the subject being questioned, the director of the experiment explained to the assistant, was to be treated as an incorrect response.)  While many if not most of the subjects of this experiment showed clear signs of being uncomfortable when administering higher voltage shocks, 65% of them continued (at the urging of the person in authority) until they reached the maximum, potentially lethal level of 450 volts, even though by this point the “learner” appeared to be totally unconscious and unresponsive.



Sadly, it appears to be a common human trait to obey persons in authority, and to conform to procedures and customs, even when it is evident that these could produce – or are producing – terrible outcomes.  It is a somber spectacle of civilization that persons who are rational and compassionate in ordinary circumstances will allow themselves to be complicit in activities that are injurious to others, and even disastrous to many.  An example that has been playing itself out in recent years is the series of revelations that several men in positions of power or prominence had been sexually exploiting and abusing women (or, in cases involving clergymen and coaches, boys) who had been in their employ or who were otherwise vulnerable to their depredations.  What is most shocking about these revelations is how long the behavior of these abusers had been going on – in some cases, for decades.  Such a thing was only possible because men and women who were aware of the abusive behavior did nothing to oppose it, or even bring it to light, thereby becoming accomplices to the crimes in varying degrees.  For many of these enablers, the personal cost or risk of adverse consequences for exposing the crimes would have been very slight.  But even this was too high a price for them to step forward and challenge the behavior, or even report it.  One can only imagine how many of the great evils of our civilization would have been prevented if more people had been willing to pay such a small price, when they had the opportunity to do so.  It only highlights the heroism of that apparent minority of persons who resist the tide, go against the grain, question and even resist authority, and chart an unpopular course that follows the higher dictates of their consciences, or just their common sense.



And so, I honor the memory of Stanislav Petrov, the man who was willing to pay the price of potential ostracism or retaliation in order to save the world from an irremediable catastrophe.  Thirty-four years later, as the world watches two narcissistic, egomaniacal leaders taunting each other with the threat of nuclear warfare, Petrov’s heroism provides a striking historical contrast.  We can only hope that there are others like him, who will do the right thing at the vital moment when it counts.  We all must aspire to exhibit Petrov’s courage to swim against the tide, when doing so might prevent an unpleasant fate for others.  Now more than ever, it seems, human civilization requires many Stanislav Petrovs.

Monday, July 24, 2017

The Fool's Library

Books . . . they are the lifeblood of our civilization.  They form the topics of our conversation, the inspiration for public policy, the source of education, and the link to our collective past.  I have always been fascinated with what constitutes a “great book” and have compiled, during my lifetime, a number of lists of books that are suggested as “must reads” before one reaches the end of one’s lifetime.  I’m sure that I will never get to more than a fraction of these, but I take some consolation in the fact that many books which have been touted as “classics” – even during my lifetime – were subsequently relegated to the Purgatory of irrelevance.  This phenomenon is even more pronounced when one observes the books that are immensely popular when they are released, such as those which make the “New York Times Bestseller List”, but fade, within a few short years, from the public consciousness.  In recent years, during my daily commute on the subway, I have often seen the phenomenon of several people reading the same book, such as The Girl with the Dragon Tattoo, or Fifty Shades of Grey.  I am fairly certain that, if I were to be riding on the subway twenty years from now, I would not see anybody reading these particular books, and would probably not encounter anyone who was even aware of their existence.  Similarly, when I think back to several popular psychology books in my youth, such as The Peter Principle and Games People Play, I doubt that such books have had many readers in recent decades, and have probably all but vanished from collective memory.  (Other popular books from that era, such as Jaws, The Exorcist, and The Godfather, have admittedly not seen much circulation in the ensuing decades, but only because the very successful movie adaptations overshadowed the original stories that inspired them.)


One could argue whether at least some of these books have been unfairly relegated to the dustbin of forgotten history.  But there are other books which more than deserve their place there.  In fact, I suspect that their authors even hope that these books are long forgotten.  These books constitute what I call a “fool’s library”: books that should never have been written in the first place.  Here are my leading candidates for the “fool’s library”: books that – to put it very politely – probably have an extremely low resale value.

  •  Kohoutek: Comet of the Century (1973):  Discovered by Czech astronomer Luboš Kohoutek in March of 1973, the comet which took his name gained the immediate enthusiastic attention of other scientists, who predicted that it would be the brightest comet in centuries.  This book was rushed into print to broadcast the claims of these scientists and stoke up excitement for the comet’s imminent approach to the sun in December of that year.  And there was definitely a stir of excitement about this much anticipated celestial event.  Even the ocean liner Queen Elizabeth 2 arranged a special cruise with nearly 2,000 passengers for the express purpose of watching the spectacular comet.  Unfortunately, the “comet of the century” turned out to be a colossal non-event, “a celestial box office dud” in the words of The Wall Street Journal, as even on its best night, it appeared as just a tiny fuzzy object, and that only with the aid of binoculars.  The sad irony is that a genuinely impressive comet, Comet West, appeared in the night sky less than three years later, in March of 1976, but was virtually ignored by the general public because scientists, having been embarrassed by their trumpeting over Kohoutek, decided to play down the arrival of this other one.  The Kohoutek craze had other ramifications:  David Berg, the leader of a religious cult called the Children of God, claimed that the comet was a harbinger of a doomsday event which would occur in the United States in early 1974, prompting many of his followers to flee to communes.  On a more positive note, the Kohoutek phenomenon inspired songs by a number of popular bands, including Kraftwerk, Argent, Journey, R.E.M., and even Pink Floyd.


  • The Jupiter Effect (1974):  Another celestial faux pas:  The authors, John Gribbin and Stephen Plagemann, observed that a rare alignment of the (then) nine planets was going to occur in 1982, and predicted that the combined gravitational pull of the planets would produce a number of devastating geologic disasters on the earth, including a massive earthquake along the San Andreas Fault in California.  I happened to be taking a course in college physics not long before this doomsday year approached, and was surprised when I calculated that the combined gravitational effect of these planets upon the earth was just a fraction of that of the moon.  The moon’s gravitational pull does have an effect on the earth of course, by producing the tides in oceans and lakes, but no one has ever blamed the moon for causing earthquakes or volcanic eruptions.  Needless to say, I was not surprised when 1982 came and went uneventfully, and only hope that nobody sold their homes and other personal belongings in preparation for Armageddon as a result of reading this book.
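For the curious, here is a rough version of that back-of-the-envelope calculation – my own sketch in Python, not anything from the book, and the masses and distances are approximate textbook values.  It suggests that Jupiter, by far the heaviest planet, pulls on the Earth with only about one percent of the Moon’s force, and that its tidal (earthquake-relevant) influence is smaller by several more orders of magnitude.

    # A rough back-of-the-envelope check (my own sketch, not from The Jupiter
    # Effect), using approximate textbook values for masses and distances.
    G = 6.674e-11                                  # gravitational constant (m^3 kg^-1 s^-2)
    moon_mass, moon_dist = 7.35e22, 3.84e8         # kg, m
    jupiter_mass, jupiter_dist = 1.90e27, 6.3e11   # kg, m (Jupiter near its closest approach)

    def pull(mass, dist):
        """Gravitational acceleration (m/s^2) exerted on the Earth."""
        return G * mass / dist ** 2

    def tidal(mass, dist):
        """Tidal influence scales as mass / distance^3 (relative units)."""
        return mass / dist ** 3

    print(pull(jupiter_mass, jupiter_dist) / pull(moon_mass, moon_dist))    # ~0.01
    print(tidal(jupiter_mass, jupiter_dist) / tidal(moon_mass, moon_dist))  # ~6e-6

Even summing over all the planets, the combined effect remains a tiny fraction of the Moon’s – which is the point of the calculation described above.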




  • The Beardstown Ladies’ Common-Sense Investment Guide: How We Beat the Stock Market and How You Can Too (1994):  Here was a “David and Goliath” story that resonated with the masses.  What if a group of older women in a small Midwestern town formed an investment club, and then based their investment strategies on simple homespun wisdom and common sense?  And what if, over the ensuing decade, their investments then produced outstanding returns, far above those obtained by the stock market and many professional fund managers during the same time period?  This was what was claimed by members of the Beardstown Business and Professional Women’s Investment Club.  Their overall annualized investment return from the time they formed their investment club in 1983, through 1994, was calculated by them to be a staggering 23.4%, which would mean that the value of their investments was doubling roughly every three years.  The book became a best-seller, and was followed by four others, as well as a program about them and their investment strategies on public television.  It was a great story, and an impressive achievement.  If only it were actually true.  The club’s reported results had never been rigorously audited, and when an outside auditor finally examined them, it was determined that the annualized return rate of the club’s investments from 1984 to 1993 was actually about 9.1%, significantly lower than the S&P 500’s annualized return of 14.9% over the same period.  Their results were better when calculated over the longer time frame of 1983 to 1997, at 15.3%, but again this was lower than the comparable S&P 500 return of 17.2%.  The bottom line: contrary to the title of their book, the stock market “beat” them, and, rather than following their advice, one would have done better by simply investing in an S&P 500 index fund.  (In fairness to the ladies, research has shown again and again that the market can rarely if ever be consistently outperformed by brokers, fund managers, or expert stock pickers.  In general, the best investment strategy is to put money into an index fund, thereby eliminating the cost of “expert” advice or fund management.)
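As a side note on the arithmetic of annualized returns, here is a small sketch of my own (the 23.4%, 9.1%, and 14.9% figures are the ones cited above; the code itself is purely illustrative) showing what those rates imply for doubling times and for the growth of a single dollar over a decade.

    # A quick illustration (my own, not from the book) of what different
    # annualized returns imply.  The rates are the ones quoted above.
    import math

    def doubling_time(rate):
        """Years for an investment to double at a given annual return."""
        return math.log(2) / math.log(1 + rate)

    def growth_of_dollar(rate, years):
        """Value of $1 after compounding at `rate` for `years` years."""
        return (1 + rate) ** years

    print(round(doubling_time(0.234), 1))         # ~3.3 years at the claimed 23.4%
    print(round(doubling_time(0.091), 1))         # ~8.0 years at the audited 9.1%
    print(round(growth_of_dollar(0.234, 10), 2))  # ~8.19 -- what the claim implied
    print(round(growth_of_dollar(0.091, 10), 2))  # ~2.39 -- the audited reality
    print(round(growth_of_dollar(0.149, 10), 2))  # ~4.01 -- the S&P 500 at 14.9%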

  • The Bible Code (1997):  This bestseller announced a tantalizing discovery:  Carefully hidden in the books of the Bible were (and are) predictions of major world historical events.  These predictions, it was claimed, can be discovered by looking at sequences of letters that appear interspersed throughout the Bible, such as, say, every fifth letter in a passage in Genesis.  Using this method, one can find in the Bible predictions of the Great Depression, World War II and Nazism, the moon landing, the Kennedy assassinations (both John and Robert), and even the Watergate scandal.  The book caused quite a stir when it first appeared:  The Los Angeles Times declared that it was “a certifiable phenomenon,” and the Baltimore Sun went even further, averring that it constituted “the biggest news of the millennium – maybe of all human history even.”  It is rather embarrassing, in retrospect, how many people who should have known better, including scientists, gave credence to these claims, and even endorsed them, saying that the odds of these historical predictions appearing naturally were beyond astronomical.  The claims are actually based on a rather simple statistical fallacy.  Suppose, for example, that my sister calls me on the telephone just at the moment that I happen to be thinking about her.  Now the odds of such a coincidence, I might say, are probably more than a million to one.  But during the course of any day, which includes an uncountable number of experiences and situations, the probability that something will happen which I will consider to be an unusual coincidence is actually pretty good.  First-year statistics students receive a demonstration of this when they are presented with the “birthday surprise” problem:  “How many people,” they are asked, “must be in a room so that there is a greater than 50/50 chance that two of them will have the same birthday?”  When first confronted with this puzzle, many of these students, realizing that there are 365 days in the year, conclude that at least 183 people must be in the room for this to be true.  But that is not the case.  That would only be necessary if I were looking for somebody else in the room who shared my birthday.  But the odds that any two people in a group have the same birthday, without specifying what that birthday is in advance, are greater than 50% when there are only 23 people in the group.  (Try it yourself sometime, if you find yourself in a group of at least this size.  Ask everyone to write their birthdays on a card, and then compare the cards.  More often than not, you’ll find two people sharing the same birthday.  And if the group is just a little larger – say 30 people – you almost certainly will.  A short calculation illustrating this appears at the end of this entry.)  By the same token, if I take a large book – say, for example, the collected works of William Shakespeare – and am then allowed to systematically search sequences of letters separated by a fixed width of my choosing (and to experiment with different such widths), I will inevitably come across something, somewhere, that will spell a word or series of words that I think are significant.
And Hebrew letters make this particularly easy to do, since these letters can also be read as numbers (hence allowing me to identify years that go along with predictions) and also consist of consonants only (so that if I did a similar activity using only English consonants, and came across “m-s-s-l-n”, for example, I could read this as “Mussolini” or, ignoring the final “n” – because I can start and end the sequence anywhere I want – “missile”).  The “Bible Code” researchers gave themselves an additional degree of latitude, by checking for sequences both forwards and backwards (so that, in my previous example, if I came across “n-l-s-s-m”, I could still read it as “Mussolini” or “missile”).  And, sure enough, most of the predictions that are “discovered” in the Bible Code, rather than being complete sentences, are bits of “caveman speak”, such as “man on moon”, “great earthquake”, and “world war”.  The book really comes undone, however, with its collection of predictions about the future, including that of another world war, along with an atomic holocaust, in either 2000 or 2006, and the earth’s catastrophic collision with a comet in 2006, 2010, or 2012.  The author very cautiously remarks that these events only constitute possible futures, and that the Bible Code might present us with a key to avoiding them.  Apparently we should all be grateful that somebody must have discovered that key and used it on our behalf.
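Here is the short sketch promised above – my own toy code in Python, not anything from the book.  It first computes the “birthday surprise” probabilities exactly, and then shows how easy it is to hunt for equidistant letter sequences of the kind the Bible Code relies on (the function names and the 50-letter skip limit are arbitrary choices of mine).

    # Part 1: the "birthday surprise" probability, computed exactly
    # (assuming 365 equally likely birthdays and ignoring leap years).
    def prob_shared_birthday(n):
        """Probability that at least two of n people share a birthday."""
        p_all_distinct = 1.0
        for k in range(n):
            p_all_distinct *= (365 - k) / 365
        return 1.0 - p_all_distinct

    print(round(prob_shared_birthday(23), 3))   # ~0.507 -- just past 50/50
    print(round(prob_shared_birthday(30), 3))   # ~0.706
    print(round(prob_shared_birthday(183), 3))  # ~1.0

    # Part 2: a toy equidistant-letter-sequence search of the kind described
    # above -- try every starting point, every skip width up to a limit,
    # and both reading directions, and collect whatever "hidden" words turn up.
    def find_skip_sequences(text, target, max_skip=50):
        letters = [c for c in text.lower() if c.isalpha()]
        hits = []
        for word in (target.lower(), target.lower()[::-1]):
            for skip in range(1, max_skip + 1):
                for start in range(len(letters)):
                    if "".join(letters[start::skip][:len(word)]) == word:
                        hits.append((word, start, skip))
        return hits

Given a long enough text and enough freedom over the skip width and direction, a “hit” for almost any short word becomes close to inevitable – which is the whole problem.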




  • Surviving the Computer Time Bomb: How to Prepare for and Recover from the Y2K Explosion (1999):  Actually, at least an entire shelf of the fool’s library, if not many shelves, could have been filled with doomsday books about the year 2000, and in particular the Y2K apocalypse.  There just seems to be something about the end of a millennium that attracts doomsday predictors of many stripes.  To put it indelicately, the year 2000, for many, was an apropos time for the “shit to hit the fan”.  As I mentioned in my previous blog entry (“House of Cards”, May 2017), many predictions were made around the time of the first Earth Day in 1970 about imminent ecological and sociological disasters stemming from overpopulation, and the year 2000 was often used as the target date for the culmination of these disasters.  “Millennials” (a word that now refers to one of our younger generations, but decades ago designated people who anticipated an apocalypse around the year 2000) sprang up from among the religious, who believed that the Creator finds the end of a millennium to be a particularly suitable time to do a final reckoning.  This is not new: similar millennial movements occurred in Western Europe around a thousand years ago.  (I discussed this phenomenon in “The Secret Doctrine”, December 2015.)  But as the year 2000 approached, the crisis that most captured the popular imagination was a technological one.  It was discovered that much of the software used in computer applications throughout the globe stored and processed dates in a relatively primitive way – recording years with only their final two digits – so that when the year 2000 came, computers would be unable to distinguish it from the year 1900; in many cases, it was feared, this would cause systems to fail entirely, potentially crippling entire sectors of the economy, such as banks and power utilities.  There was something sadly comical about New Year’s Eve 1999, when roving television news crews searched for something – anything – to fail after midnight, thereby giving them a story to lead with that morning.  Perhaps we can credit the “Y2K” hysteria with actually resulting in widespread solutions to this problem, because the year 2000 came and went uneventfully, with no significant shutdowns of any computer software, anywhere.  Ironically, there might have been one genuine negative consequence of “Y2K”: a mild recession occurred in 2001, blamed in part on a significant decline in infrastructure spending, which was in turn blamed on the glut of spending on computer hardware in the years leading up to 2000 – spending that significantly reduced the need for further such purchases in 2000 and 2001.  And, the “Y2K” hysteria inspired at least one disaster movie, which of course quickly faded from memory after the year passed without incident.
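For readers too young to remember the panic, here is a minimal sketch of my own (the function names and the “pivot year” are illustrative, not taken from any particular system) of why two-digit years caused trouble, and of the “windowing” fix that much of the remediation work amounted to.

    # The problem: legacy code stored years as two digits, so "00" was
    # indistinguishable from 1900 and simple date arithmetic broke.
    def age_in_years(birth_yy, current_yy):
        """Naive age calculation on two-digit years, as much legacy code did it."""
        return current_yy - birth_yy

    print(age_in_years(65, 99))  # 34  -- fine while "99" still means 1999
    print(age_in_years(65, 0))   # -65 -- nonsense once "00" means 2000

    # One common fix ("windowing"): treat small two-digit years as 20xx.
    def expand_year(yy, pivot=50):
        return 2000 + yy if yy < pivot else 1900 + yy

    print(expand_year(0) - expand_year(65))  # 35 -- sensible again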




  • Apocalypse 2012: A Scientific Investigation into Civilization’s End (2007):  Like the last subject, this one, too, could probably fill a shelf full of books, not to mention a DVR-full of television programs about it.  The idea that civilization would face some kind of apocalypse in 2012, perhaps even total extinction, came about from a study of the calendar used by the ancient Mayan civilization, which reached its zenith in Central America around the middle of the first millennium.  The Mayan calendar is impressive in its accuracy: it has a cycle of about 5,125 years, and the Mayans believed that they were living in the fourth cycle, with each of the three previous cycles corresponding to failed attempts by the gods to create a viable world.  The final date of this fourth cycle, which began in 3114 BC, was calculated by many scholars to be December 21, 2012.  But some scholars went even further, contending that this date was actually a prediction by the Mayans that the cycle would end with the annihilation of the earth, if not the entire universe.  As 2012 approached, the idea of a “Mayan apocalypse” got wider attention, fueled by pseudo-scientists and crackpots of just about every stripe.  Like the authors of The Bible Code, some claimed that the Mayan symbols that corresponded to dates on the calendar contained hidden meanings, which, if properly interpreted, revealed predictions of events that would occur on those dates.  And as with the authors of The Jupiter Effect, some claimed that on the end date of the Mayan calendar, the sun would actually be in a rare alignment with the center of our galaxy, and this is what would produce the destabilizing forces that would tear our world apart.  Finally, like the year 2000 doomsayers, those who predicted apocalypse in 2012 at least managed to inspire an entertaining disaster movie, and this one was a genuine blockbuster.  (I should add that I was personally inspired by this particular doomsday date, because it was the date that I chose to publish my very first blog entry: titled, appropriately enough, “An Emerald Tablet for the Mayan Apocalypse”.)
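As a quick sanity check of that date arithmetic – my own calculation, using the convention that there is no year zero, so 3114 BC is astronomical year -3113 – the numbers do line up:

    # A roughly 5,125-year cycle beginning in 3114 BC (astronomical year -3113)
    # does indeed end around 2012 AD.
    start_year = -3113      # 3114 BC, with no year 0
    cycle_length = 5125     # approximate length of the Maya Long Count "Great Cycle"
    print(start_year + cycle_length)  # 2012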




  • Condi vs. Hillary: The Next Great Presidential Race (2005):  Oops!

These then are some of my nominees for the book equivalent of the “Razzies”, the awards that are given for very bad movies.  I am sure that many if not all of their authors would like to forget that they ever wrote them (and hope that their friends and colleagues have forgotten as well).


But if you do happen to have any of these books, or others like them, on your own shelf – don’t be ashamed or even embarrassed.  (After all, who among us hasn't, from time to time, succumbed to the temptation to buy a book simply to see if its contents live up to the bold claims or promises on the cover?)  These books may actually be collectors’ items, like badly minted coins.  And if you have your own nominees – I’d love to hear them.  I'll be happy to share notable rivals to my own list in a future blog.

Monday, May 29, 2017

House of Cards


In the den where I am writing this, I have an old bookcase that I acquired many years ago.  Recently I discovered that most of its shelves are starting to give way, with the rear supports failing so that the backs of the shelves are sagging.  The bookcase as a whole is still standing, and none of the individual shelves has yet failed to support the books that are stacked on them.  But it is clear that in the not-too-distant future, if nothing is done, then the shelves will probably collapse like a house of cards, with each falling stack of books on a higher shelf pushing the one beneath it to the breaking point, and so on until the entire bookcase collapses.

I have had the repair of this bookcase on my “To Do” list for several weeks now, with a plan to go to a local hardware store, buy a sufficient number of metal brackets and screws, and properly reinforce each of the shelves so that the impending crisis will never come to pass.  Of course, this entails, in addition to visiting the hardware store, the removal of all of the many books from all of the shelves, and then the repair effort itself.  I have concluded that I will need to set aside at least a half day on this project, and yet, it seems that whenever I have at least a half day of free time, I always find something of more immediate importance to do.  I console myself with the thought that the bookcase has continued holding the books up until now, and so an additional delay of a day, a week, or several weeks will probably not be catastrophic.  But I know that if I continue in this vein, and the inevitable catastrophe comes, I will castigate myself for not having acted in a sufficiently timely manner to fend off the disaster, which would have been so much easier to resolve if I had attended to it before the collapse.

What is particularly disquieting to me, however, is that this personal example of courting disaster through procrastination seems to be playing itself out on a grander scale, with problems of much more serious consequence.  As an economist, I think immediately of the massive accumulation of debt that has occurred in this country – and the rest of the world – over the past several decades.  The U.S. national government debt alone, as a percentage of GDP (gross domestic product – a measure of the goods and services produced in the economy), topped 100% in 2012, and continues to grow, approaching the record level of 122% that was reached right after the end of World War II.  (This wartime run-up in government debt declined in the ensuing decades, falling to below 40% by the late 1960s, and didn’t start rising again until the Reagan era in the 1980s.)  And the U.S. is not alone: most of the developed economies of the world are near the 100% mark, and Japan is currently above 250%.  What is even more sobering, however, is the ratio of total debt – public and private – to GDP, which is well over 300%: a level not seen since the years leading up to the Great Depression.  People in those days fell into a borrowing frenzy, living way beyond their means, confident in the belief that the booming stock market would continue to buoy their artificially lavish lifestyles.  Many even borrowed money to invest in the stock market.  When the Crash came, this mountain of debt collapsed, but so did the livelihood of much of the population.  There had been hopes that the Great Recession of 2007-2009 would produce a similar deleveraging, and that because an economic catastrophe had been narrowly averted, the “cure” this time would be much less painful than the one in the 1930s.  But while private debt levels did decline somewhat (government borrowing continued to grow, to cover such things as unemployment insurance and an $800 billion stimulus program), the decline was nowhere close to the massive deleveraging that happened in the 1930s, in spite of the rash of mortgage defaults, and when the recession ended, borrowing levels resumed their upward trend.  U.S. household debt has returned to the level it was at just before the Great Recession.  Economists have been warning for decades that this pile-up of debt is unsustainable, and very dangerous, but the very fact that they have been sounding the alarm for so long has made them seem to the general populace (not to mention politicians) like the boy who cried wolf.  Like my rickety bookcase, the economy seems to be hobbling along, and so why not continue borrowing more and more – this year, and next year, and the year after that?

Of course, the real house of cards that we have been stacking as a civilization is the strain on our environment.  I will not even address the most prominent one – the accumulation of greenhouse gases that threatens to raise the temperature of our planet to an unsustainable level.  Nor will I talk about the pollution of our oceans with plastic debris, or the fact that overfishing has depleted one-third of the world’s fish stocks beyond levels at which they can replenish themselves.  Instead, I would like to call attention to the more subtle signs that we are at risk of creating irreversible damage to the ecosystem.

There is, for example, the phenomenon of habitat destruction.  In the neighborhood where I grew up, we children had merely to pass through the row of houses that were on the other side of the street where I lived to move into a tract of undisturbed nature.  There were woodlands, swamps, and extensive fields, populated with snakes, frogs, salamanders, finches, pheasants, bats, and other wildlife.  It was not uncommon to find many of these in one’s own yard, since the boundary between neighborhood and woodland was so close to us.  And the tract itself was far from diminutive in size.  I remember once trekking through those woods and fields as a child in the company of two friends, and actually reaching a neighboring city without ever having to pass through somebody’s yard.

It’s all gone now, or nearly all gone.  Today, when I walk past that row of houses on the other side of the street where my childhood home is, I see – instead of a vast field bordered by patches of woodlands – a school playground, surrounded by houses.  It brings to mind another memory, back when I visited Orlando, Florida about twenty years ago.  The shuttle driver who was taking me and others back to the airport from our hotel pointed to an area in the distance, where we could see cranes and other large, mechanical equipment.  “Just a couple of years ago,” he said, “that was undisturbed wilderness.  Now it’s being leveled so that new residential neighborhoods can be built there.  This is happening all over.  The encroachment continues year after year, and there is no end in sight.”  I laugh (but not really) when I pass by the “nature preserve” near my present home in Maryland.  It is really not much larger than a public playground for children.

As undisturbed natural habitat diminishes, so does biodiversity.  The populations of species diminish in number, leaving pockets of small, isolated breeding groups.  Anecdotal news stories, such as a recent one reporting that there is only one male northern white rhinoceros remaining in the world, are only the proverbial tip of the iceberg.  Smaller populations provide more limited genetic pools that make the survivors of a species less adaptable to changes in the environment – both natural and civilization-induced.  When species of plants and animals are bred and cultivated for commercial purposes, this reduction in genetic diversity can be even more pronounced.  An extreme case was in the news recently when it was reported that a deadly fungus known as Tropical Race 4 was threatening the world’s commercial banana crops.  What makes these crops vulnerable is that nearly all of them (99%) are, quite literally, copies of a single banana plant – cloned from one specimen that possessed the qualities most appealing to consumers.  With a complete absence of genetic diversity among them, if one of the banana plants succumbs to the disease, then it is a certainty that all of them will.  An earlier commercial variety of banana known as the Gros Michel nearly went extinct in the mid-20th century – from an earlier strain of the same fungus, known as Panama Disease – because it was a victim of its own universal popularity.  The Irish potato famine of the mid-19th century, too, was caused by a deadly fungus preying upon the one single variety of potato that was relied upon by Irish farmers.

Human population growth may be slowing, but it is continuing nonetheless.  And it is a simple fact that if there are more people, then more free space has to be used up to accommodate them, and more resources have to be committed to providing food for them.  In America, the growth in demand for more space exceeds the growth in population, because the average size of residential homes has been increasing over time, and there has been a general trend of de-urbanization, with migration away from concentrated urban areas into the surrounding suburbs.  Growth in population also means that there is a growth in waste products and other pollutants, which risk poisoning the shrinking habitat that remains available for the rest of the species on this planet, not to mention for the humans themselves.

Ironically, the perennial argument that is made for keeping on doing what we’re doing is that it hasn’t resulted in catastrophe so far.  Last month, right around Earth Day, the American Enterprise Institute published an article entitled “18 Spectacularly Wrong Predictions Made Around the Time of the First Earth Day, 1970, Expect More This Year”.  Most of these involved predictions of various environmental catastrophes, including mass species extinctions, suffocating smog, and even a significant die-off of the human population due to widespread famine and other causes.  Articles such as this one by the American Enterprise Institute appear periodically, with the same air of smugness and confident underlying message that what made these doomsayers so foolishly wrong was their ignorance of the power of economic forces and basic human ingenuity to support the continued unchecked growth of civilization and population.  While economics and ingenuity have done much to stave off disaster, what these articles generally neglect to mention is that another very significant countermeasure has been a heightened awareness of environmental degradation, and a commitment to reduce its causes.  It was in 1970, in fact, that the U.S. Environmental Protection Agency came into being, and Congress enacted the Clean Air Act.  Subsequent legislation, along with the activities of EPA, have done much in the past half century to rein in pollution and other environmental hazards in the U.S.  The problem of acid rain, for example, which was beginning to decimate certain fish populations in North American lakes late in the 20th century, has been neutralized as a result of legislation which targeted the main source of this pollutant: sulfur dioxide and nitrogen oxide emissions by coal-burning electric power plants.  Since 1990, sulfur dioxide emissions have decreased by nearly 90% and nitrogen oxide emissions by nearly 80%; and this in spite of the fact that total electricity generation has increased by 36% during the same time frame.

It is certainly true that the dynamism inherent in economic forces has created – time and time again – suitable incentives for innovation and adaptation that have enabled us to defy the predictions – also made around 1970 – that we would run out of vital raw materials and energy resources within a generation.  And there is no doubt that this process will continue to make us resilient as a species and as a civilization.  But for how long?

There is one macabre fact about economics as we practice it.  Our entire system of economic growth is premised on the fact that the population will – or rather, must – continue to grow.  Economists regard declining population growth as problematic, and an actual decline in population as an economic crisis.  A large part of the reason for this is that as the working age population becomes a smaller share of the general population, it is called upon to support a proportionately larger share of the population that has retired from the workforce.  In the U.S., the main cause of the projected unsustainable expansion in national debt over the next few decades is the growing cost of Social Security and Medicare.  A growing share of the population will be drawing upon these services while a shrinking share of the population will be available to pay for them.  That the U.S. government is paying for them instead by simply borrowing more money is not solving the problem, of course, but postponing a calamity that becomes only more serious with every year that it is not addressed.

And that is exactly the same general behavior that is playing itself out along several dimensions.  We continue practices which we know must be unsustainable in the long run, but convince ourselves that “the long run” is very far into our future.  And, we tell ourselves, weren’t we able to rise up to crises in the past that had been the result of an extended period of ignorant behavior, and ultimately resolve them?  Didn’t this happen with the Great Depression?  And didn’t this happen in England when, after years of ignoring the growing Nazi menace, under Winston Churchill’s dynamic leadership the nation was able to re-arm itself and ultimately defeat the Nazis?  Isn’t it, after all, a crisis that often brings out the best in human resolve and ingenuity?

Perhaps.  But I have an uneasy sense that the crises which are now looming before us due to our collective apathy are much, much more serious than any we have contended with in the past.  I fear that these may be beyond the capabilities of human resolve and ingenuity to overcome them.  And, what is particularly troublesome is that if my pessimistic fears come to pass, then these particular crises may be severe enough to bring down the entire global economy, if not human civilization, or even the human species.  But such things are unpleasant for us to ponder, and so we continue on our way, with a tentative resolve to give these problems more serious consideration at some undetermined future date, when circumstances are more accommodating to do so.


As I finish this entry, it is on the third day of a three-day holiday weekend.  It was just such a day that I had hoped I might be able to devote to repairing my teetering bookcase.  But I frittered away most of the day on distractions and easier chores, and left the bookcase ignored.  Meanwhile, the shelves continue to buckle, and I only take consolation in the fact that they have held out so far, and so maybe they will hold out until the next time that I have a free day to consider devoting to the bookcase’s repair.  But I know that with each passing day that I procrastinate, the looming disaster grows larger.  And I can only hope that before it happens, I actually do find the time and the dedication to prevent its occurrence.  As for that larger house of cards, I can only hope that we find the leadership and/or the collective will to stave off its catastrophic collapse before it’s too late.


Postscript (February 28, 2018):

After posting this blog, at the end of May 2017, I continued to avoid the task of fixing my bookcase - always finding something more important and definitely more interesting to do on my days off - until finally, during an extended holiday vacation at the end of the year, I addressed myself to making the necessary repairs.  The result was catastrophic: the more I tried to repair certain shelves, the more rickety and unstable the bookcase seemed to become, until finally it literally collapsed into a pile of boards.  My first reaction was to throw up my hands in despair.  I immediately started shopping on Amazon.com for a new bookcase, but couldn't find any as large as the one I had, and the ones that came close were very expensive.  Meanwhile I dreaded having to arrange to have the pile of wood littering my room hauled away as garbage.  But then, when I looked a little closer at what used to be the shelves of the old bookcase, I realized that they had actually been individually quite sturdy: they showed none of the telltale signs of bowing that happen when shelves are made of inferior material.  And so I resolved to rebuild the bookcase from scratch.  It was a slow and arduous process at first, as I shopped around for the right tools and metal brackets to put the thing together, and then tried to construct a sturdy frame.  But once I managed to accomplish this, things got progressively easier.  I even added some enhancements to make the new bookcase superior to the old one, such as adding a plywood back to cover the flimsy cardboard one that had originally been the sole means of stabilizing the frame.  And now, today, two months after I started, I finally put the last shelf in place.  The bookcase hardly looks like new: it bears many visible scars from its prior incarnation.  But it is sturdy now, and I suspect that it will actually outlast me.  As I was going through this restoration effort, I couldn't help but imagine that this, too, is a metaphor for the "house of cards" phenomenon, because often when a long-festering problem is finally addressed, the full measure of its toxicity is finally exposed, and a greater crisis occurs.  This, for example, is what happened after England and France had ignored the military buildup of Nazi Germany for many years.  When war was finally declared, France fell to the Nazi invaders, and England was on the brink of succumbing to the same fate.  But with Churchill's persistent tenacity in facing and then contending with the problem, slowly but surely, the menace was halted and eventually overcome.  If and when the longstanding problems that I referred to in this blog finally do erupt into full-blown crises, I will look to my bookcase experience as a source of optimism that the darkest hour, if faced with courage and tenacity, will also give way to a better, sounder future.

Sunday, March 26, 2017

A Conversation Across the Gender (and Generational) Divide

Earlier this year, a couple of friends of mine and I decided to start a book club.  These are both coworkers, though one of them retired at the end of last year.  I think that part of the inspiration for starting the club, in fact, was that this particular friend knew that he would now have much more free time on his hands, and so he could address himself to doing things that he had always wanted to spend more time on, such as reading books of personal interest.  In our inaugural meeting, we compiled a list of candidate books.  We wanted to read something of substance, but not something that was too lengthy, since our plan was to read and discuss one book a month.

The Second Sex, by Simone de Beauvoir, was one of my suggestions.  This is one of those books, like Adam Smith’s The Wealth of Nations, which is often quoted and cited as a landmark, foundational work in its particular field (The Wealth of Nations as a pioneering work on economics; The Second Sex on feminism), but which very few people – even people who claim to be well-versed in the particular subject area – seem to have actually read.  I had fallen under the mistaken impression that The Second Sex (unlike The Wealth of Nations) was actually a short book, because when I looked it up on Amazon.com, the length was reported as only a little over a hundred pages.  It was only after I had persuaded my comrades to make this our first book selection that I discovered that what I had actually been looking at online was an abridged version of the book, and that the complete work was actually over 800 pages long.  We were reluctant to give up our collective first choice, but realizing that reading such a tome in a month’s time would be an almost insurmountable challenge (given that two of us still had full-time jobs), we decided to lower our sights and settle for reading the abridged version of the book, which included just the opening and closing chapters of the complete work.

I should mention that our third comrade added some immediate diversity to our fledgling group, because she is a very young woman, perhaps still in her twenties.  I anticipated that this diversity would add suitable color to our discussion.  But when I arrived for our meeting, on a Sunday morning, at the home of our young friend, I immediately saw that there would be even more color to the discussion than I had hoped for.  She had invited two of her friends to join us, both women, and both probably in their twenties as well.  Here were the makings of either low comedy or high drama: two middle-aged men and three women – all of whom were young enough to be our daughters – talking about a work on feminism.  Our host had laid out a variety of drinks and hors d’oeuvres for us to enjoy: I poured myself a glass of whiskey, straight, and prepared for the conversation.

As the person who had suggested our book selection, it had fallen upon me to guide the discussion with a set of questions about the book.  Most of these were fairly generic (“What arguments in Simone de Beauvoir’s book are still relevant?”  “Which are outdated?”  “What was your greatest revelation in reading the book?” and the like), and while we generally addressed them, the conversation often branched off into the general subject of feminism.  My first lesson that day dealt with the concept of “microaggression” (a term not to be found in the book).  One of the young women explained that this is the subtle way that men try to oppress women and keep them in stereotyped roles and modes of behavior.  A prominent example of this, she said, is when men talk over women who are in the middle of a sentence.  The comedic effect that this remark had upon me, and I suspect upon the other male member of our group as well, was that for the remainder of the meeting he and I were especially diligent about not speaking up unless we were certain that nobody else – particularly any of the women – was talking.  I must say that while I saw some substance to this charge, when I reflected upon my own life experiences, I felt that I have probably been a victim of this behavior myself at least as often as I have been guilty of it.  I remembered that while growing up in my household, for example, it was usually difficult for me to get a word in edgewise whenever I was talking with my sister.  But another example that was presented was in the workplace, in meetings, when the men present assume that the women in the meeting will handle the more menial chores, such as taking notes.  I saw substance in this charge as well, and in fact my female coworker brought up a recent example when her boss, a woman, was expected to do that very thing.  I weakly protested, however, that at least things have improved:  I still remember a time, I said, when women were expected to make coffee in the office.  (Of course, part of the reason that they have been relieved of this burden might be because most offices now have single-cup automatic coffeemakers.)

Continuing in this same vein, I remarked that there has certainly been much improvement in the general perception of women’s capabilities.  When I was a child, it was common for men to complain about women drivers, but insurance companies have long since exposed the lie underlying this complaint, since rates tend to be higher for male drivers, based upon their higher accident rates.  A generation ago (and probably before any of the female members of our group were even born), a commercial featuring a popular actor and actress touted the benefits of an automatic camera.  The actor in the commercial declared that the camera was so simple to operate that even a woman could use it.  Now this was intended as a joke (as evidenced in the actress’s exasperated but bemused reaction to his remark), but the basis of the joke was that there was a time well within the memory of most of the viewers of that commercial when such a claim would have been made in all seriousness.  I noted that, in fact, there seemed to have been a retrogression that occurred in the 20th century, because in the first half of that century, women commonly shared occupations with men: Amelia Earhart, for example, made a name for herself piloting airplanes, and many women took on assembly and other production jobs during World War II.  But during the ensuing “baby boom” years, when the majority of women returned to traditional homemaking roles and abandoned these other occupations, a popular mindset eventually arose that they were not in those occupations because they were incapable of performing them.  This, I think, was the toxic mythology that had prompted Betty Friedan to write The Feminine Mystique in the 1960s.  Ironically, in recent years there seems to have been a complete reversal of that phenomenon.  There is much talk of a “boy crisis”: referring to the fact that boys are not performing as well as girls in grade school, and are not advancing on to higher education at the same rate that girls are.  Law schools and medical schools – institutions that once catered to a predominantly male student body – now see more female students than male.

The discussion now turned to women’s career and life choices, and whether they should be judged based upon how “traditional” these choices are.  One of the female members of our group referred to a pair of existentialist expressions that were used by Simone de Beauvoir: “good faith” and “bad faith”.  A decision made in “good faith” is one that was not the result of social coercion – either overt or subtle – or even social conditioning, while one made in “bad faith” had been shaped by one or more of these forces.  By this criterion, she said, if a woman freely chooses to be a housewife and homemaker, then that woman is above reproach in her choice.  I was intrigued by this distinction, and wondered aloud if it applied as well to popular female music entertainers, and their tendency in recent decades to use sexually titillating costumes and dances in their performances.  Is this a form of sexual exploitation, or, as many of them seem to contend (along with their feminist fans), their own way of expressing female empowerment?  I referred to the case of Britney Spears, a female performer who started her career when she was a blonde, innocent-looking child as a “Mouseketeer”: one of a group of entertainers in a wholesome kids’ program.  But at a time in her life when she had barely passed through puberty, she suddenly became a pop star, with a show that featured her as a sexually voracious nymphet dancing in risqué clothing and singing songs with suggestive lyrics.  To me, it seemed that she was acting out some middle-aged man’s pedophiliac sexual fantasy (her manager’s or some record company executive’s, I suspected), rather than making some sexually-charged statement of independence, and, based upon news accounts about her, it does appear that she has had little control over her career and the direction it has taken, or even the other personal details of her life.  Contrast this, however, with other popular female entertainers, such as Madonna, or – more recently – Beyoncé and Lady Gaga, who do seem to have more control over their careers.  Their acts are just as sexually provocative.  Were Britney Spears’ performances a tragic example of female exploitation, while very similar performances by these other entertainers were examples of female empowerment?  The female members of our discussion group contended that this was the case, and I found it difficult to disagree with them.  After all, male musical entertainers with sexually suggestive costumes and dances, such as Elvis “the Pelvis” Presley, were never pitied as victims of sexual exploitation.  I had to concede, as well, that one definitely sees the same spectrum from good faith to bad faith in occupations that are explicitly sexual in nature, such as pornography and prostitution.  At the one extreme, there are many women who have clearly been exploited in these occupations, and are victims.  And yet, at the opposite extreme, there are also many women who have entered these occupations voluntarily, have done so in a manner in which they retain control of their careers and have even profited from them handsomely, and, in many cases, even vociferously defend their choices and oppose those who feel that these careers should be outlawed.

And yet, one apparent consequence of the trend among popular female entertainers to package themselves in very sexually expressive ways – whether as a result of “good faith” or “bad faith” – is that there seems to be a tangible influence on young girls in our culture to present themselves as objectified sexual beings, at least in certain circumstances.  What particularly comes to mind is Halloween costumes.  While boys’ costumes run the gamut from monsters to superheroes, it seems that nearly all girls’ costumes these days have a burlesque element to them, whether it be a sexy nurse, or a sexy vampire, or a sexy heroine.  It is hard for me to imagine that this has a positive effect on how they perceive themselves and their future roles in the world.  Is a choice made in “good faith” sufficient unto itself to justify that choice, or must the consequences of the choice for others be taken into account?

Simone de Beauvoir discussed Marxism and Freudianism in her work, and how each had fallen short in addressing feminist issues.  The other male discussant in our group noted that the Marxist revolution in Russia had conspicuously failed women.  I commented that I had read a book in my youth titled The Dialectic of Sex, in which the author, a radical feminist named Shulamith Firestone, contended that no Marxist revolution would ever be truly successful unless it was a Marxist feminist revolution.  I added that Firestone went so far as to aver that sexual oppression will never be completely eliminated until all babies are produced in birthing machines.  I had expected that this remark would produce a round of laughter, and was shocked when, instead, I saw that all three of the young women in our group were nodding earnestly in agreement.  One of them even remarked that she is irritated by the rather militant stance that many mothers are now taking about the right to breastfeed in public, because, she said, this is one of the most conspicuous ways that a woman’s role in society tends to be circumscribed, in the public consciousness, by her anatomy.

I put one final question to the group for discussion:  In spite of Simone de Beauvoir’s iconic status as a feminist theorist, she has not been without her critics.  There are two particular strains of criticism that have been leveled against her: the first that she is actually a misogynist herself, and the second that her vision of feminism was limited by her own circumstances: those of a white, bourgeois woman living in an advanced European country.  Our group was unanimous in concluding that the charge of misogyny against her was baseless, but the second criticism drew some support from one of our female discussants.  She had introduced me to another new term during our discussion, “intersectional feminism”, which refers to the recognition that sexist oppression takes different forms among different races, ethnic groups, economic classes, political systems, and nations.  The issues of social justice – particularly those involving gender – that a white, middle class woman is contending with will be markedly different from those of a black woman living in the same country, or of a woman in India, or in Russia, or of destitute women in any country.  It is undoubtedly true that Simone de Beauvoir’s writing did not span the conditions of every woman, in every culture and social stratum.  But I don’t think that any of the members of our little group felt that this was a significant flaw in her book, or in her thinking.  Our reading of selected passages of The Second Sex left us feeling that its reputation is well-deserved and, in my case anyway, kindled a desire in me to someday get through the rest of those 800 pages.


As I finished my whiskey, I was relieved that our discussion had never segued into either high drama or low comedy.  In fact, I was rather impressed by the gentility of our conversation.  There were no mutual recriminations, or expressions of sarcasm, or bemusement at the views expressed by others.  I can’t help but wonder if the timbre of this conversation would have been the same if this discussion had taken place back when I was the age of our female members, a few decades ago.  Nevertheless, we selected, for our next reading, a book that would be less inclined to separate us along such clearly defined lines: Sigmund Freud’s Civilization and Its Discontents.  And, too, we resolved to expand our group so that there would be a broader spectrum of views, blurring the lines of generation and gender.  (The other male member of our group, for example, said that he is going to try to persuade his wife to join us.)  All in all, it was an enlightening experience, and one that I am grateful to have been a part of.  If only the two political parties of the U.S. Congress could explore their different views and perspectives in such a genteel fashion.  One can only hope that they someday learn how to do so.  Perhaps the field of political science has yet to produce its own Adam Smith, or Simone de Beauvoir, to show the way.

Monday, February 27, 2017

Blue Collar Elegy

I recently finished Hillbilly Elegy: A Memoir of a Family and Culture in Crisis, by J. D. Vance.  It is the autobiographical tale of a man who was raised in a poor, working class environment that was racked with unemployment, broken homes, alcoholism, and drug abuse.  The extended family that he was a part of had migrated to Middletown, Ohio from the Appalachian hill country in Kentucky in search of work.  They found it, for a while, until the industries that had supported their area downsized, closed down, or relocated to foreign countries in search of cheaper labor.  What these companies left behind were shells of what had once been vibrant communities, with many of the former workers now desperately searching for some kind of employment to keep their heads above water.  Families started breaking apart, as alcoholism and drug abuse set in, along with often violent conflicts in the home.  The author’s own parents were divorced, and his mother, a nurse, suffered from a string of addictions, including prescription pills and, at times, even heroin.  He hardly knew his real father, and had to live with a succession of “stepfathers” during his childhood.  He and his sister preferred living with their grandparents, who provided the closest thing to stability and a supportive environment that they could find.  The author’s story has a happy ending – for him – as he fights his way out of his dead-end world, first by joining the Marine Corps, and, after finishing his tours of duty, enrolling in college.  He eventually finds his way into a top-flight law school, earns a law degree, and embarks on a professional career, marrying one of his fellow law students along the way.  His sister, too, escapes from their traumatic childhood environment – even before he does – through a happy marriage.

But the real intent of Vance’s book, I think, was to shine a spotlight on an entire segment – and a growing one – of American culture which is increasingly finding itself in trouble.  It is the segment of white, blue-collar workers who had once been able to adequately provide for their families in the factory towns across America.  They were patriotic, hardworking, and religious, with strong family values.  They generally mistrusted government intrusion, and particularly resented those who seemed to eschew their work ethic and instead depended upon the largess of government spending to sustain themselves.  The paradox, of course, is that as the factory jobs which provided employment for these blue-collar workers began to disappear, they found themselves increasingly reliant upon government aid to get by.  And, as unemployment and underemployment became more rampant among them, alcoholism and drug use became more widespread, and in their wake, the structure of the nuclear family began to unravel.  Broken or abusive marriages, unwed mothers, and criminality became pervasive.

J. D. Vance is much younger than I am, and perhaps it is for this reason that my own family background, which is in many ways similar to his own, was not nearly as traumatic.  The decline of factory employment had not begun to play itself out nearly so dramatically during my own childhood, and my father, who was also a transplant from the South into the Midwest, had a job up until the day he died.  My family, too, remained intact during that time, although my father’s tendency to indulge in alcohol created occasional turbulence in the home.  Still, our family got by rather well, as I never remember any of my siblings or me ever going to bed hungry, or ever even remotely facing the risk of being removed from our household because of domestic upheavals.  Ours was a relatively stable home environment, and in fact I think that for this reason we were actually envied by the children in some of our neighboring households.

But much of what Vance described in his book was evident in our own neighborhood.  There were families that suffered from domestic abuse at the hands of their fathers.  There was drug abuse and juvenile delinquency.  I remember one neighbor girl who was disowned by her parents when she dated and eventually married somebody of another race.  There were a couple of suicides as well, by children who would never live to see their twenty-first birthdays.  The author says that kids in his neighborhood could be stigmatized if they did too well in school – boys in particular, who would be branded as sissies.  While I don’t remember such extreme stigmatization in my own schools, I do remember that it generally wasn’t “cool” to be a good student.  We felt that if we read what we were told to read, and did our homework, we were “collaborating” with the “establishment”, as if education itself was a sort of brainwashing that would rob us of our native intelligence and special identity (an ironic position, since we all came from white Christian families – hardly a minority group at the time).  This attitude stuck with me all the way into my first two semesters of college, each of which I flunked out of.  (Mercifully, this institution – which was a local community college – marked each of these semesters as “incomplete”, which means that they did not count in my grade point average after I eventually decided to return to college in earnest.)

And like the author, I was able to turn my own life around in large part because of encountering positive role models outside of the old neighborhood, while also getting support from people within my sphere of friends and family who believed in me and encouraged me.  The principal role model was a man whom I ended up working for when I took a factory job: he had a PhD in ceramic engineering.  While I had always talked about going back to college in earnest some day, I had never really felt a powerful incentive to do so.  He gave that to me.  Because here was a man who loved what he did for a living (my own work history up to then had been in low wage, mind-numbing jobs that made me feel like I was just passing time), and who was paid very well for it.  I almost immediately returned to that community college that I had flunked out of a few years before, but this time I made straight A’s in all of my classes, and even eventually took on a full-time course load while still working at that factory.  And when I had finished enough classes to be able to transfer to a four-year school, it was that same boss who helped me to take the final leap.  I had been hesitant to do so, because I was afraid that I couldn’t afford to pay for two years with the savings I had accumulated up to that time, but he set me on my way with a stern lecture which was as powerful a push as any I’ve ever received.

As I began my climb into the white collar world of engineers, lawyers, accountants, and managers, I had an experience that was also described in J. D. Vance’s book.  This was the experience of positive networking.  People actually helped me along my way, opening doors for me, and assisting me in navigating the various hurdles that had to be cleared on the way to attaining the brass ring.  And, what’s more, many if not most of these people belonged to groups that I had been conditioned, in my childhood, to mistrust, as those who might prevent me from getting a slice of the pie.  Because in my youth I had been indoctrinated in the belief that just about everybody who was different from me – in race, religious belief, ethnic background, income level, and even gender – was out to take from me the meager opportunities that might be available.  Around the time that I set out to attend a four-year university, the “boat people” – refugees from Vietnam – were making the news, as they began to arrive on American shores in large numbers.  I remember somebody from the old neighborhood joking that the first thing these people learned, when they got here, was to ask how to get on welfare.  But when I transferred to a four-year college – the University of Illinois at Champaign-Urbana – my roommate turned out to be one of these refugees from Vietnam, and I have always, in retrospect, been grateful for it.  We were both in the Electrical Engineering program at that university, and while I struggled through the classes in that curriculum, he demonstrated a natural facility in mastering the subject matter.  In part, this was because of his exemplary study habits, which were so much superior to my own, and which stemmed in turn from his very powerful motivation to succeed in his adopted country.  I really believe that I owe much of my own success in getting through that program to the happy circumstance of my having him for a roommate.  He both inspired me and helped me directly in comprehending what sometimes seemed to me to be an overwhelmingly complex subject.  And when it came time to interview for jobs in our senior year, I lamented the fact that he was at a disadvantage due to English being his second language, because he was a far better engineer than I could ever hope to be.  Here was an immigrant who, far from stealing a job from me, did much to make it possible for me to land my first job out of college.

This experience of being helped, rather than hindered, by others different than myself continued after I entered the professional workforce.  Women began entering the U.S. labor force in record numbers in the late 1960s, and so naturally many men began to worry then that these women would take their jobs, or at least make it harder for them to find a job.  This fear lingered on into the 1980s, when I got my engineering degree, but as I look over my career since then, I have to say that it has been women more than men who have helped me to advance – providing opportunities for promotion, for new positions, and even for taking on special projects that I found personally rewarding.  Members of racial and religious minorities were also regarded as threats in the old neighborhood: potentially taking – or stealing – the little piece of the pie that we imagined ourselves fighting to retain.  But two of my favorite, most respected, and most supportive bosses were black men.  And it was an Israeli finance professor who wrote the personal letter of recommendation that helped to get me into graduate school.  Whatever success that I have had in my career, I credit much if not most of it to the very people that I had been raised to fear and/or mistrust.

I laughed, when reading Hillbilly Elegy, about how the author had to learn subtle forms of etiquette that he never learned at home when he moved into the professional workforce, because that, too, struck a familiar chord with me.  He talked about being overwhelmed, while having dinner with a prospective employer, by the large number of silverware pieces that were in front of him, as he struggled to figure out which ones to use first.  A friend who happened to be at the same dinner whispered to him that he simply had to start with the pieces on the outside and work his way in.  I think I learned that lesson from a movie.  But I never realized that one is supposed to balance one’s used pieces of silverware on the plate until I chanced to pick up and purchase a book called A Gentleman at the Table which I came across many years ago when buying a suit.

I never finished that book, and maybe I should have, because there is apparently one other piece of table etiquette that I never learned.  I’m sure it must be some type of etiquette, because I’ve seen this particular behavior at the table practiced almost universally, and especially among the well-to-do.  This is the habit of leaving the meal that was served partially uneaten.  And usually this doesn’t mean just a scrap or two, or maybe a particular food item – like a vegetable – that the eater might have disliked.  Often a sizable chunk of the main course – the beef, or chicken, or fish – is left on the plate, and I have seen half the meal abandoned.  If this behavior were only practiced by petite women, I could understand it, but I have seen tall and burly men exhibiting the same bird-like mannerisms.  At any rate, this is a practice that goes so against the grain of my upbringing that I have never been able to mimic it: I’ve never even wanted to attempt to do so.  I have memories, as a child, of being told by my mother to finish everything on my plate, because there is a starving kid somewhere in Korea who would give or do just about anything for what I was eating.  (I’ve wondered, in recent years, if affluent parents in South Korea tell their kids that they should eat everything in front of them because some starving American kid in Appalachia would be overjoyed to have what they had.)  And I remember one time in particular when my mother forced me to stay at the dinner table for well over an hour after everyone else had left, because I refused to eat one of the things on the plate.  (I think the offending item was a vegetable – probably spinach or Brussels sprouts.)  We never went hungry in my parents’ household, but still the idea that a regular meal was something to be grateful for – and something never to waste – was ingrained in me.  And so, to this day, regardless of what social setting I’m in – a fancy restaurant, a lunch or dinner at a conference, or just a dinner at the house of a friend – I never let anything remain behind on my plate.

I can only assume that such behavior is a social “no-no” because I have found myself ostracized for it on many occasions, ranging from a mild rebuke (a fellow diner saying “Wow, you must have really enjoyed it”) to something a little more pointed (“My, my, you must have been very hungry”) to a blatant criticism.  The most memorable instance of this more extreme variety of criticism happened when I was having dinner with several colleagues while we were preparing to participate in a regulatory hearing in our state capital the following day.  After I had finished my meal, the attorney who was supervising our testimony remarked, in reference to me, “I’m surprised he didn’t lick the plate clean.”  But no matter.  Even if this is a social faux pas, I adamantly refuse to correct it.  I can only assume that leaving a meal partially uneaten is supposed to serve as some sort of courteous compliment to the host, signaling that the host has been more than generous in the portions provided.  But I can hardly imagine that the owners of a restaurant take note of such behavior, and I am sure that the wait staff don’t appreciate having to collect and dump out all of that leftover food.  The only beneficiaries of this custom, as far as I can see, are the rats that get their meals from the restaurant dumpster outside.  And even in the case of a private host, wouldn’t they be far more flattered if they saw that their guests had eaten everything that had been provided to them?  I know that if I had gone to great lengths to prepare a good meal, I would not feel complimented at all if half of it was left over by those who had been at my dinner table.

Another familiar chord struck by the author was the description of his problems with anger management.  While I was growing up, arguments were never simply disagreements.  One didn’t argue well with somebody, in my household, and in my neighborhood, unless one did tangible and lasting damage to the other’s self-esteem.  There was no such thing as a verbal “fair fight”.  I laughed when the author described how his horrified wife (who had been brought up in a much different social environment) once had to restrain him from getting out of his car during a fit of road rage.  There have been many times in my own life when just a little bit less self-restraint while encountering bad driving behavior might have landed me behind bars.  I’ve often wondered if my career suffered because I treated every disagreement, no matter how minor, as if it were mortal combat, and often fell into the mental trap of regarding my coworkers as rivals fighting for slices of that limited pie rather than collaborators who together could make the pie grow larger.  Of course, as is so evident in the higher echelons of American business and politics today, not all bad and even boorish behavior can be attributed to a modest upbringing.  I have seen, time and time again, that the rat race runs all the way up the ranks to the boardroom.

While J. D. Vance finds some humor in the tales of his upbringing, he always makes it very plain that there was much more tragedy than comedy in the old neighborhood.  Here I think is where our backgrounds differed, because things have definitely gotten worse in the world of the white working class between the decades when I grew up in it and those when he did.  All of the families in my neighborhood had both fathers and mothers, living together, and all of the fathers had jobs.  We never really lacked anything – at least anything important – particularly when it came to food and clothing.  And although most of my friends from the old neighborhood never went to college, they all landed jobs that could support them and their families.  There were drugs, and alcohol, but addiction was still a relatively rare phenomenon.  In the world of Vance’s childhood, alcohol and drug abuse were much more rampant, as were families without fathers, and widespread, habitual unemployment.  Someone growing up in that environment could not count on finding decent work when they got out of high school (if they even got that far in their education), let alone aspire to have a standard of living better than that of their parents.  And this has created a paradoxical source of resentment and psychological conflict among those living in this environment:  For generations, the white working class had been proudly individualistic: openly resenting others in society who they felt were living on the government dole.  And yet, in recent decades, they have found themselves increasingly dependent on government programs of one form or another just to get by.  It grates on them, and drives them to search about for a scapegoat.

I have always been a strong believer in the positive benefits that immigration confers upon the American economy, and upon American culture itself.  It seems that a drawback of being a “native” American, which means being somebody whose immediate ancestors did not come from a foreign country, is that it fosters a sense of entitlement, which grows with the number of generations that have elapsed since one’s immigrant ancestors first arrived in this country.  How quickly we forget that we are a nation of immigrants.  In a presentation that I gave a few years ago, I pointed out that over 40% of Fortune 500 companies, which collectively employ over ten million workers, were founded by immigrants or the children of immigrants, and that 12% of all businesses in this country with employees are owned by immigrants.  But this message tends to fall on deaf ears among white working class Americans today, when they see their children struggling to find a job.  It particularly grates on them to see Hispanic workers doing work that they imagine their own children could be doing.  They look back to a happier time, when it was so easy for them to find work, and when all signs were posted in English, and compare it to the present day, when every sign and every product label is now written in both English and Spanish, and other foreign languages as well.  In the region of the country where I was born and raised, northern Illinois, even the grocery stores now post all of their signs in English and Spanish: something that would have been unheard of when I was a child.  This, to many, is the “smoking gun” that something has happened which has robbed native white working class Americans of their livelihood.  But it is not just the perceived threat within that is robbing our working class youth of jobs, but the perceived external threat as well: the “off-shoring” of jobs to factories in other countries where people are willing to work at assembly lines for much lower wages.

As an economist, when I try to argue against this logic, I am rarely successful.  Another slide that I liked to show in some of my past presentations is that the total wages paid to American workers in support of producing the Apple iPhone exceed the total wages paid to workers in all other countries in the world combined.  Yet if you count the total number of employees who are involved in the production of this product, there are far, far more of them in foreign countries than in America.  The reason for this is that the American workers tend to be more skilled, and are paid higher wages.  The American workers are engineers, computer programmers, and designers, while those in foreign countries such as China tend to be unskilled assembly line workers who are simply putting the components together.  If the working class in our country is being left behind, it is because they are not seeking, or getting, the kind of training that they would require to get the more lucrative jobs that are not being “off-shored”.

And, of course, many of the jobs that our fathers worked in are no longer available because they no longer exist.  They were not “off-shored”, or taken over by eager immigrants willing to work at any price, but were simply eliminated due to automation, and the continuing advances of information technology.  No government policy short of opposing science itself will ever bring these jobs back.

But the reason that these arguments ultimately prove hollow is that they ignore the fundamental truth that white, working class America is currently in a state of crisis, as is much of black America, and even the Latino-American community.  When I look to history for a parallel, the one that comes uncomfortably to mind is the last days of the Roman Republic.  Ironically, the societal breakdown that led to the fall of the republic was caused by the very success of that republic itself.  For as it grew militarily powerful and brought more and more of its neighbors under annexation, subjugation, or submission, one consequence was a large influx of slaves.  This free labor eliminated the need for many of the jobs that had been held by Rome’s working class, and the result was a growing mass of the population that was unemployed or underemployed.  The political leaders of Rome attempted to placate this increasingly restive segment by distributing free food and by distracting them with extravagant public entertainments and spectacles (“bread and circuses”).  But as the gap between rich and poor widened, each of these two classes actively sought out champions among the politicians and, ultimately, the military, until finally the military dictatorship of Julius Caesar led to the birth of imperial Rome (and the death of the republic).

America’s political system seems to be hopelessly deadlocked, and the frustration of the working class can be seen in how its loyalties shift across both ends of the political spectrum, from avowed socialists to big business plutocrats, simply based upon how bold their promises are to upend the current state of government paralysis and effect real change.  I feel a sense of despair, because even as an economist, I don’t know what policies would restore the ability of my younger contemporaries in the old neighborhood to find good work, and to rescue themselves and their children from the debilitating effects of broken families, domestic and urban violence, and drug and alcohol abuse.  One thing is certain, however.  If this condition continues to expand and worsen, it will ultimately prove to be the undoing of the American Dream.  America cannot be sustained simply by noble ideas and military might.  It only works if it is supported by the working men and women who have always made up the foundation of its very existence.