One
of my all-time favorite books is titled Extraordinary Popular Delusions and
the Madness of Crowds. Written by
Charles Mackay in 1841, it chronicles some of the worst, the silliest, and the most
notorious cases of mob psychology and mass hysteria of earlier centuries, including
the witch-hunting mania, the Crusades, the popularization of bogus holy relics,
and a variety of economic bubbles and crashes.
Of the bubbles, the most blatantly silly was the tulipomania craze in 17th century Holland, when otherwise rational people spent small fortunes on single tulip bulbs,
simply because it was the fashion of the time to do so. My copy of the book has just a single endorsement
on the back cover, by Bernard Baruch, who wrote, “This book has saved me millions
of dollars.” Bernard Baruch was an early
20th century financier, investor, and philanthropist who served as
an adviser to U.S. Presidents Woodrow Wilson and Franklin D. Roosevelt. He was also one of the few investors who
survived the Wall Street Crash of 1929 and the Great Depression with his wealth
intact.
Sadly, Mackay’s work proved to be
not only an interesting piece of history but prophetic as well, as the century and a half
that followed its publication would see an economic collapse that would make
the ones that he described pale in comparison, and movements led by political
demagogues that resulted in mass murders, wars, and other abuses on a scale far
beyond that exhibited by the witch-hunting craze and the Crusades. But what is even more intriguing is that if
one surveys the cultural landscape in which we currently live, one can see that
it doesn’t require a demagogue, religious fanaticism, greed, racism, or mass hysteria
to produce general insanity. On any
calm, clear day, we’re practicing absurdity on a large scale, in a variety of
interesting, though mind-numbing, ways. In
fact, an outsider who looked in on our culture would probably be tempted to declare,
as one U.S. national politician recently said about another, that we are “not
playing with a full deck”. Here are some
of my favorite examples of “mass absurdity”.
Daylight Saving Time
Twice a year,
people in America, Europe, and other parts of the world change their
clocks. On some early morning in the
springtime, at 2:00, we all start pretending that it is 3:00 instead, and, later,
in the fall, we abandon the pretense, so that 3:00 becomes 2:00 again. (It’s a little like collectively deciding that we will call things of a certain color, like roses, “red” for part of the year and “blue” for the rest.) Daylight Saving Time, which most of us now take
for granted, has only been in existence for about one hundred years. It was first seriously proposed in 1895 by George Hudson, a New Zealand entomologist who valued the extra after-work daylight for collecting insects during the summer months. The proposal was later championed by an Englishman,
William Willett, who also had his own personal reasons for wishing to see it adopted:
he was an avid golfer and wanted more time for his golf game in the evening. But it was during World War I that the idea
was finally embraced and enacted into law by the combatants (first Germany and
its allies, and then Britain and its allies) in order to conserve energy. The practice continued in many of these
countries after the war ended, and most of those who abandoned it reinstituted
it in World War II. Energy conservation,
in fact, continues to be one of the main justifications for keeping Daylight Saving Time, but studies over the years have shown that the actual savings are negligible, and may be more than offset by the costs and inconveniences of changing the clocks twice a year and adjusting to the change. Nevertheless,
while most of us grumble about it (particularly in the springtime), we have
come to accept it as “normal”, and meekly submit to the practice.
The Thirteenth Floor
Several years
ago, when I was attending an industry conference at a large hotel in another
city, I was surprised to get a room at a rate that seemed unusually low for a
hotel of that caliber. I had not used
any of the discount services now so widely available, although I do think that
I was late in making my reservation.
The hotel clerk at the registration desk seemed nervous as she handed me
my room key. I looked at the room
number. It was 1305. When I looked back up at the clerk, she
seemed to be preparing herself for an angry reaction. She was probably shocked when she saw me
break out into a smile and happily accept the key. In fact, I had to restrain myself from
shaking her hand, and even giving her a hug.
Here, in 21st century America, I was actually overjoyed to
find a hotel that was not afraid to have a 13th floor.
It always sickens
me to get onto the elevator of a modern high-rise hotel – or any building for
that matter – and see the button for the 12th floor followed by the
button for the 14th floor.
The incongruity of all of the contemporary high-tech electric conveniences
and other marks of ultra-modernity in a building that still caters to this
silly superstition is stark and unsettling.
What exactly are the owners afraid of?
Do they really believe that some terrible thing – like a fire or an
explosion – will more likely occur on a floor with the number 13, and perhaps
only on that floor? Do they think that the
entire hotel will be cursed? Or that
something terrible will befall the hapless occupants who take a room on that floor,
either during their stay or immediately afterward? It might be a more pragmatic concern, with hotel
management simply afraid that an entire floor will go unused, because
few people would want to occupy it. I
hope that they’re wrong about that. I would
like to think that most of the people I know would be just as happy to stay on
the 13th floor as I was, and find the absence of that floor just as
silly. I suppose we can take some
consolation in the fact that we in the West are not the only ones who harbor such superstitions. China, Japan, and other Asian nations have a
similar irrational terror of the number 4.
In fact, it is not uncommon for hotels in Asia to be missing the 4th,
13th, and 14th floors!
Restaurants with numbered tables there also often omit a Table #4.
I should mention that I had a very pleasant time at that
conference, and enjoyed a safe and comfortable airplane flight there and back. I don’t know if the hotel management ever
sent me a survey about my stay, but if they did, I hope that I congratulated them sincerely on their decision to be boldly – and defiantly – rational.
Grass
No, I’m not referring to the plant
that is now legal for recreational use in eleven U.S. states, but to simple, common
lawn grass. Most people don’t know that this
type of grass is an invasive species, which hails mainly from Europe and the Middle
East, and was brought over by colonists to North America in the 17th
century. But unlike most of the invasive
species in the U.S. that we read about, like the pythons lurking in the Florida
Everglades, kudzu vines smothering trees and bushes in the South, tree-killing
emerald ash borers, and of course the infamously ubiquitous Norway rat, grass
would probably not have thrived or even survived were it not for the constant
and fanatical mollycoddling that it receives from landowners. With its shallow root system, this pathetically
tender little plant would just not be hardy enough to handle the extremes of
temperature and wide variations in precipitation that are characteristic of the
North American climate. As a consequence,
more than $40 billion is spent each year on pesticides, weed killers, and
fertilizer to keep grass healthy and alive (killing about 7 million birds a year
in the process), and the average homeowner spends 150 hours annually to maintain
their lawn. Water requirements are just
as steep, averaging about 200 gallons per person per day. And the real absurdity behind all of these
massive expenditures is that they are made in the service of the blandest sort
of social conformity, so that homeowners can have perfect, rectangular patches
of green that match those of their neighbors.
And Heaven forbid that these patches might be “infested” with things
like clover or flowering native plants, which of course sustain the local population
of bees and other wildlife. But the perfect
lawn must not only be pure and untainted with local flora: the blades must never
exceed an inch and a half in length, and the lawn itself must be neatly edged. Consequently, on a typical summer day, one can hear, in one’s own neighborhood alone, the cacophonous roar of dozens of gas-guzzling lawn mowers and other equipment spewing carbon dioxide and noxious fumes
into the air. (Lawn mowers cause some 74,000 injuries a year: about the same number as are caused by firearm accidents.) To borrow (and mangle) a famous line of
Churchill’s: “Never in the fields of
human endeavor have so many spent so much, with such toxic consequences, for
something so stupid.”
Green Bottle Beer
I can’t remember when I first discovered
Heineken beer. It was in an era long
before craft beers and microbreweries. I
just remember what a refreshing change it was from the mainstream beers that
dominated the market at the time: Budweiser, and Miller, and Pabst, and Hamm's. As an import, it cost more than the domestics,
of course, but the taste made it all worth it.
Or so I thought. But then
something terrible happened. This memory,
unfortunately, is more vivid, because it has happened so many times since. It is the memory of uncapping a green bottle
of Heineken, and, instead of experiencing that refreshing taste of a premium
beer, confronting what can only be described as a putrid skunk-like assault on
the nose and the palate – what I would come to recognize as the characteristic evidence
of spoilage. I would eventually learn that
this is symptomatic of beers that are sold in green and clear bottles, because
these allow ultraviolet (UV) light to pass through and cause a chemical
reaction that produces sulfur-based compounds not unlike those which a skunk
sprays on its would-be attackers. Time
and time again I would open a bottle of Heineken, with great anticipation, and,
more often than not, it seems, I would find myself drinking a foul-tasting
liquid. I felt like the comic strip character
Charlie Brown, whenever he would run up to kick a football, only to have it
lifted up at the last minute by his perennial tormenter, Lucy, leaving him to
fall flat on his back. Lucy always promised
that this time would be different – that this time she would not pull
the prank – but every time that Charlie Brown believed her, he would end up
flat on his back again. Why would a
company sell its product in a container that ruins its taste? And, more to the point, why do consumers,
like me, continue to buy the product, in spite of being regularly disappointed? Sadly, I think that there is actually a
psychological explanation for both of these mysteries. In the early 20th century, the behavioral psychologist Ivan Pavlov discovered that an animal could be conditioned to respond not only to a treat itself, but to some other stimulus that regularly accompanied the treat, like the ringing of a bell; eventually, the bell alone could produce the response. Later behaviorists, most famously B. F. Skinner, showed that such conditioned behavior is especially persistent when the reward arrives only intermittently. Like Pavlov’s dogs, fans of beers that are
packaged in green and clear bottles will still spend money on these products,
apparently even after many bad experiences with them, as long as they get the taste
reward that they are seeking once in a while.
And the sellers of these beers, who believe that green and clear bottles
are more visually appealing (as opposed to brown glass, which, ironically, shields
beer from the effects of UV light), can rely on the fact that only the
occasional reward of a good taste will keep their consumers coming back. I am ashamed to say how many years I
continued to succumb to the temptation to buy Heineken beer in green bottles
until I finally said, “Enough!”
But even then, I could not completely give it up. A few years ago, I returned to it, but now I
only buy it on tap or in cans.
Circumcision
This next one will generate a little
controversy, I suspect, but it is still a worthy member of this “gallery of the absurd”. In recent decades, there has been a growing movement
to ban the practice of female genital mutilation that is prevalent in many
African nations. The practice was formerly referred to as “female circumcision”; critics adopted the phrase “genital mutilation” in order to distinguish it from what was apparently regarded as a more benign practice involving males.
But what is often ignored or overlooked is that even male circumcision
is a form of genital mutilation, and while the particular procedure that is
performed on females is admittedly more pernicious in its consequences, this is
a difference in degree and not in kind.
One cannot help but suspect that the extreme (although justifiable) vitriol directed against the female version of the practice, in sharp contrast to the general apathy toward, if not outright endorsement of, its male counterpart, is fueled in part by the fact that the former is common in African nations and among Muslims (although the procedure is not required by Islam), inviting condescension from white Europeans and Americans, who can dismiss it as the product of outmoded primitive traditions and religious extremism.
There are three general schools of thought regarding male circumcision:
1) that it is actually a healthy practice, as the male foreskin serves no useful
purpose and may even present health risks, 2) that while conferring no health
benefits, circumcision is harmless, and 3) that it does present some impairment
to male sexual response, but this is relatively minor, and is certainly not
missed by the male if he is circumcised as an infant, which is the standard
practice. But even the first, most
benign interpretation of the practice invites criticism, since there are other
organs of the human body which are regarded as serving no useful function while
presenting potential (even fatal) health risks, such as the tonsils and the
appendix, that are never removed unless these risks actually materialize. One suspects that the medical justifications for male circumcision are after-the-fact rationalizations of a practice that is really grounded in religious and cultural traditions. I for one would find the argument
for health benefits far more convincing if I knew of an uncircumcised adult physician
who voluntarily underwent the procedure after having been convinced of these
benefits. And as regards the religious grounds
for circumcision, as the physical manifestation of a special covenant with the Creator, it would seem (to me, anyway) more appropriate to postpone this procedure
until the time when a person can enter into this covenant voluntarily as a
mature adult. The usual explanation for why this is not done is that the
procedure is much more painful to undergo as an adult. I wonder, though, how much less painful it is
for an infant, and of course we will never know since an infant cannot
communicate the extent of their suffering, distress, and possible trauma. And those of us who have undergone the
procedure as infants will never know what we have lost. In any case, I find the general acceptance of
male infant genital mutilation in the West, existing side by side with the general abhorrence of female genital mutilation in other cultures, to be one of the most egregious examples of cultural relativism among us today.
High Heels
Doctors are pretty unanimous that the
wearing of high heels is detrimental to a woman’s health, and in a variety of
ways. There is the obvious risk of
developing hammertoes, calluses, and bunions, not to mention ankle sprains and weakened
calf muscles, but high heels can also cause lower back pain, knee pain, and
even nerve compression from a condition called foraminal stenosis. They contribute to improper breathing, which in turn can cause damage to the vocal cords.
Some doctors have even gone so far as to suggest a link between wearing
high heels and the incidences of certain types of cancers. Clearly, this social custom is not as
barbaric as the old Chinese custom of foot-binding, but one wonders if here,
too, it is merely a difference in degree rather than in kind. Ironically, when high heels were first invented,
they were designed for men, and actually had a practical application. Persian warriors on horseback in the 10th
century wore them for stability so that they could easily stand up in their
stirrups while shooting their bows and arrows at enemies. (The modern cowboy boot is a direct descendant
of this design.) When Persians brought them
to Europe in the early 17th century, male aristocrats and even
members of royalty wore them as a symbol of their “upper” status, with the height of the heel literally corresponding to a man’s place in the social
hierarchy (“1-and-a-half inches for knights, 2 inches for nobles, and 2-and-a-half
inches for princes”, according to one prescription). Women gradually adopted the style for
themselves, and by the early 19th century, high heels were worn exclusively by women. But women today make no secret of the fact that wearing these things tends to be painful, so why do
it? One possible reason is the effect that
they have upon men. They really are a
sort of visual catnip to most men: a woman who might ordinarily not get a
second look from a particular man will often get an admiring gaze from the same
man simply by wearing high heels. And the
“catnip” effect is not only visual: even the loud “clickety-click” sound of a woman
walking in heels will tend to turn male heads.
I experienced embarrassing direct evidence of this years ago, when I was
wearing shoes with hard soles. Because I
walk with a very brisk gait, the sound that I made in these shoes was very
similar to that of a woman walking in high heels. And I found, when walking into my office from
the parking lot each morning, that it was not unusual for any man walking ahead
of me, within earshot, to come to a complete stop, turn around, and look at me. I got so tired of being a regular source of disappointment
to so many of my male coworkers that I adopted the practice of wearing rubber-soled
shoes. Perhaps, then, this social custom
is not so “absurd” after all. Even women
often confess to a sort of fetish in their zeal to shop for and collect high
heels of various colors and designs. Hence,
high heels might actually represent a vice (but only when they are worn voluntarily, and not in compulsive conformity with outmoded dress codes), rather than a pointless practice,
and the indulgence in vices, while often risky and even toxic, is grounded in
the logical behavior of self-gratification.
And I should add that I would be a terrible hypocrite if I did not admit
that the day high heels are banned, or women universally decide to abandon them,
will be a very sad day in my life.
These, then, are my top choices
for the “gallery of the absurd”. In a
world that so often seems to succumb to lunacy, in so many different ways, and
which leaves us perennially baffled, these examples provide a constant reminder
that evidence of our capacity for irrationality is always out there, right in front of us, if we only open our eyes to see it.