One of the
principal clichés of our age, endlessly repeated, is that our ability to look
into the future and control our fate has been growing. So much
so that, in the words of Yuval Harari, we are about to transform ourselves
from Homo Sapiens, originally a small, weak and vulnerable
creature constantly buffeted by his surroundings, into a quasi-omnipotent Homo Deus. The
main engine behind this process, we are told, is represented by
fast-accumulating developments in science and technology. Those developments in
turn, are both cause and consequence of the kind of education that helped us cast
off superstitions of every kind and, in the words of Immanuel Kant (1724-1804),
“dare to know.” Some would go further still and argue that, if such were not the case,
there might be little point in pursuing any kind of learning in the first
place.
For a
long time, this line of thought was closely related to belief in progress.
Today it is shared both by those who are optimistic in regard to the future and
by those who, like Harari, keep warning against the disastrous consequences
that our very successes may bring down upon our heads. As by changing the
climate, destroying the environment, running out of drinking water, covering
the planet with plastic, breeding antibiotic-resistant superbugs—vide the
coronavirus outbreak—and being enslaved, perhaps even exterminated, by some
self-seeking supercomputer out on a roll. But is it really true that we are better at looking into
the future, and consequently more able to control it, than our ancestors were?
And that, as a result, the human condition has fundamentally changed? For some
kind of answer, consider the following.
1. The Demise of Determinacy
In
Virgil’s words, “Felix, qui potuit rerum cognoscere causas” (happy is he who can discern the causes of things). For millennia on end, though, so
deficient was our understanding of the future that almost the only way to get a
handle on it was by enlisting some kind of supernatural aid. As by invoking the
spirits, consulting with the gods (or God), tracing the movements of the stars,
watching omens and portents of every kind, and, in quite a few places, visiting
or raising the dead and talking to them.
Come the seventeenth century, many of these methods were finally
discarded. If not completely so, at any rate to some extent among the West’s
intellectual elite. Their place was taken by the kind of mechanistic science
advocated by Galileo Galilei, Isaac Newton, and others. Nor was this the end of
the matter. Many nineteenth-century scientists in particular believed not just
that the world is deterministic but that, such being the case, they would one
day be able to predict whatever was about to take place in it. One of the
best-known statements to that effect came from the polymath Pierre-Simon
Laplace (1749-1827). It went as follows:
An
intellect [not a demon, which was substituted later for effect] which at a
certain moment would know all forces that set nature in motion, and all
positions of all items of which nature is composed, if this intellect were also
vast enough to submit these data to analysis, it would embrace in a single
formula the movements of the greatest bodies of the universe and those of the
tiniest atom; for such an intellect nothing would be uncertain and the future
just like the past would be present before its eyes.
In such a
world not only God but chance, randomness, probability and the unexpected would
be eliminated, leaving only sheer causality to rule supreme. Other scientists,
such as William Thomson, Lord Kelvin, took matters further still, claiming that
science had advanced to the point where there only remained a few minor gaps to
be closed. No less a figure than Stephen Hawking, in his last work, Brief Answers to the Big Questions, admitted to having made just such a claim. However, the very scientific progress that gave rise to this kind of optimism also ensured that it would not last for long; multiply zero by whatever number you like, and zero is still what you get in the end.
Starting with the discovery of radioactivity in 1896, it has
become increasingly evident that some of nature’s most basic processes,
specifically the decay of atoms and the emission of particles, are not
deterministic but random. For each radioactive material, we know what
percentage of atoms will decay within a given amount of time. But not whether
atom A is going to break up before (or after) atom B, or why. Subsequent discoveries such as quantum mechanics (Max Planck), relativity (Albert Einstein), the uncertainty principle (Werner Heisenberg), the incompleteness theorem (Kurt Gödel), and chaos theory (Edward Lorenz) all helped extend the idea of incalculability into additional fields.
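To put that statement in its standard textbook form (the formula is a commonplace of physics rather than something the author cites): if a sample starts out with $N_0$ atoms of a material whose half-life is $t_{1/2}$, the number expected to remain intact after a time $t$ is

\[
N(t) = N_0 \cdot 2^{-t/t_{1/2}},
\]

a prediction that holds for the sample as a whole with extraordinary precision, yet says nothing whatever about which individual atoms will be among the survivors.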
To be specific, quantum mechanics started life as a theoretical
construct that could only be applied to the world of subatomic particles, hence
could be more or less ignored by everyone but a very small number of nuclear
scientists. However, since then it has been climbing out of the basement, so to
speak. As it did so it acquired a growing practical significance in the form of
such devices as ultra-accurate clocks, superfast computers, quantum radio (a
device that enables scientists to listen to the weakest signal allowed by
quantum mechanics), lasers, unbreakable codes, and tremendously improved
microscopes.
At the heart of relativity lies the belief that, in the entire physical universe, the only absolute is the speed of light. Taken
separately, both quantum mechanics and relativity are marvels of human wisdom
and ingenuity. The problem is that, since they directly contradict one another,
in some ways they leave us less certain of the way the world works than we were
before they were first put on paper. The uncertainty principle means that, even
as we do our best to observe nature as closely as we can, we inevitably cause
some of the observed things to change. It may even mean that time and space are themselves illusions, mental constructs we have created in an effort to impose order on our surroundings but which have no reality outside our own minds. The
incompleteness theorem put an end to the age-old dream—it goes back at least as
far as Pythagoras in the sixth century BCE—of one day building an unassailable
mathematical foundation on which to base our understanding of reality. Finally,
chaos theory explains why, even if we assume the universe to be deterministic,
predicting its future development may not be possible in a great many cases.
Including, to cite but one well-known example, whether a butterfly flapping its wings in Beijing will or will not cause a hurricane in Texas.
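What chaos theorists mean can be sketched with the logistic map, a standard toy model; the example and the numbers below are illustrative only, not drawn from the text. Two starting points that agree to eight decimal places end up bearing no resemblance to one another within a few dozen steps, even though the rule that drives them is perfectly deterministic.

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r * x * (1 - x) in its chaotic regime (r = 4.0).
# The model and the numbers are illustrative, not taken from the text.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)   # the "measured" starting value
b = logistic_trajectory(0.200000001)   # the same, off by one part in 200 million

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}:  a = {a[step]:.6f}   b = {b[step]:.6f}")
# By roughly step 30 the two trajectories have diverged completely.
```

Since no real instrument measures anything to infinite precision, the deterministic rule buys us very little predictive power beyond the short term, which is precisely the butterfly’s point.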
2. Tripping Over One’s Own Robe
So far, then, the tendency of post-1900 science has been to become not more deterministic but less so. As a result, no longer can we expect the responsible experts to tell us what the future will bring and whether to go ahead and follow this or that course.
Instead, all they can do is calculate the probability of X
taking place and, by turning the equation around, the risk we
take in doing (or not doing) so. However, knowledge also presents additional
problems of its own. Like a robe that is too long for us, the more of it we
have the greater the likelihood that it will trip us up.
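Put in the most elementary decision-theoretic terms (the notation is mine, not the author’s), the “turning around” of the equation amounts to weighting each possible outcome of a course of action by its probability,

\[
\text{risk} \;=\; \sum_i p_i \times (\text{harm of outcome } i),
\]

where the probabilities $p_i$ are the best that science can supply and the harms are left for us to judge.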
First, no knowledge can be better than the instruments used to
measure the parameters of which it consists. Be they size, mass, temperature,
rigidity, speed, duration, or whatever. And no instrument that physicists use
is, or can be, perfectly precise and perfectly accurate. Even the most recent,
strontium-based, clocks are expected to be off by one second every 138 million years,
a fact which, chaos theory says, can make a critical difference to our
calculations. The more accurate our instruments, moreover, the more likely they
are to interfere with each other. The situation in the social sciences is much
worse still, given that both the numbers on which most researchers base their
conclusions and the methods they use to select and manipulate those numbers are
often extremely inaccurate and extremely slanted. So much so as to render any
meeting between them and “the truth” more or less accidental in many cases.
Second,
there is far too much knowledge for any individual to master. Modern authors,
seeking to impress their readers with the speed at which knowledge expands,
often leave the impression that this problem is new. In fact, however, it is as
old as history. In China, the Sui-era imperial library was supposed to contain
300,000 volumes. That of the Ptolemies in Alexandria held as many as half a
million. And this is to assume that knowledge was concentrated inside libraries—whereas
in fact the vast majority of it was diffused in the heads of countless people,
most of them illiterate, who left no record of any kind. Since then the problem
has only been getting worse. Today, anyone seriously claiming to have written a
book containing “all that is most wonderful in history and philosophy and the
marvels of science, the wonders of animal life revealed by the glass of the
optician, or the labors of the chemist” (The World of Wonders, London,
1869) would be quickly dismissed as either a featherweight or a charlatan.
Third, not only is there too much knowledge for anyone to master
but in many cases it keeps developing so fast as to suggest that much of it is
mere froth. Whether this development is linear and cumulative, as most people
believe, or proceeds in cycles, as was suggested by Thomas Kuhn, is, in this
context, immaterial. One of the latest examples I have seen is the possibility,
raised by some Hungarian scientists just a few days before these words were
written in November 2019, that the world is governed not by the
long-established four forces—gravity, the electromagnetic, the strong and the
weak—but by five (and perhaps more). Should the existence of the so-called protophobic, or proton-avoiding, force be confirmed, it has the potential to blow all existing theories of the world’s behavior at the sub-atomic, hence
probably not only at the sub-atomic, level to smithereens.
Fourth,
we may often have a reasonably accurate idea of what the consequences of event
A, or B, or C, may be. However, working out all of
those consequences is much more difficult. The more so because they may (and
are likely to) have consequences; and so on in an expanding cascade that, in
theory and sometimes in practice as well, does not have a clear end. Some of
the consequences may be intended (in which case, if everything goes right, they
are foreseeable), others not. Some may be beneficial, others harmful. Some may
bend backwards, so to speak, turning around and impacting on C, or B, or A, which in turn have consequences of their own, and so on until the cascade turns into an
entire series of interrelated cascades. That is particularly true in the social
sciences where the very concepts of cause and consequence may be out of place;
and reality, either reciprocal or circular.
Some consequences may even be perverse, meaning that they lead
to the opposite of what was intended. For example, when the scientists employed
on the Manhattan Project worked on a weapon to be used in war—there hardly ever
was any doubt that it would be—they could not know that, to the contrary, it
would render the kind of war in which their country was then engaged
impossible. Both the Chernobyl and the Fukushima reactors were provided with
elaborate, highly redundant, safety systems; but when the time came those
systems, rather than preventing the accidents, only made them worse.
In brief, a simple, elegant “theory of everything” of the kind
that, starting with Laplace, scientists have been chasing for two centuries
remains out of sight. What we have instead is what we have always had: namely, a
seething cauldron of hypotheses, many of them conflicting. Even when we limit
ourselves to the natural sciences, where some kind of progress is undeniable,
and ignore the social ones, where it is anything but, each question answered
and problem resolved only seems to lead to ten additional ones. Having
discovered the existence of X, inevitably we want to know where it comes from,
what it is made of, and how it behaves with respect to A and B and C. Not to mention
what, if any, uses it can be put to.
The
philosopher Karl Raimund Popper went further still. Scientific knowledge, he
argued, is absolutely dependent on observations and experiments. However, since
one can always add 1 to n, no
number of observations and experiments can definitively confirm that a scientific
theory is correct. Conversely, a single contradictory observation or experiment
can provide sufficient proof that it is wrong. Science proceeds, not by adding
knowledge but by first doubting that which already exists (or is thought to
exist) and then falsifying it. Knowledge that cannot, at any rate in principle, be shown to be false is not scientific. From this it is a small step towards
arguing that the true objective of science, indeed all it can really do, is not
so much to provide definite answers to old questions as to raise new ones. It
is as if we are chasing a mirage; considering our experience so far, probably
we are.
3. The Drunk at the Party
If all this were not enough, the problem of free will persists.
In the words of the French anthropologist Claude Levi-Strauss, it is the
drunken guest who, uninvited, breaks up the party, upsetting tables and
spreading confusion. Much as scientists may claim that it is simply a
delusion—even to the point of showing that our bodies order us to raise our
hands as much as ten seconds before we make a conscious decision to do so—our
entire social life, specifically including such domains as education and
justice, continues to rest on the assumption that we do in fact have a choice.
As between action and inaction; the serious and the playful; the good and the
evil; the permissible and the prohibited; that for which a person deserves to
be praised, and that for which he deserves to be punished. Long before King
Hammurabi had the first known code of law carved in stone almost four millennia
ago, a society that did not draw such distinctions could not even be conceived
of.
So far, neither physicists nor computer experts nor brain
scientists, working from the bottom up, have been able to close the gap between
matter and spirit in such a way as to endow the former with a consciousness and
a will. Economists, sociologists and psychologists, working their way from the
top down, have not been able to anchor the emotions and ideas they observe (or
assume) people to have in underlying physical reality. Whichever route we take,
the complete understanding of everything that would be necessary for prediction
to be possible is as remote as it has always been. In no field is the crisis worse
than in psychology; precisely the science (if one it is) that, one day, will
hopefully explain the behavior of each and every one of us at all times and
under all circumstances. Its claim to scientific validity notwithstanding, only
25-50 percent of its experimental results can be replicated.
Given the
inability of science to provide us with objective and reliable visions of the
future, those we have, as well as the courses of action we derive from them,
depend as much on us—our ever-fluid, often capricious mindset, our ira and our studium, our anger and our partiality—as they have ever done. Elation, depression, love, euphoria,
envy, rage, fear, optimism, pessimism, wishful thinking, disappointment, and a
host of other mental states form a true witches’ brew. Not only does that brew
differ from one person to another, but its various ingredients keep interacting
with each other, leading to a different mixture each time. Each and every one
of them helps shape our vision today as much as they did, say, in the Rome of
the Emperor Caligula; the more so because many of them are not even conscious,
at any rate not continuously so. In the process they keep driving us in
directions that may or may not have anything to do with whatever reality the
physicists’ instruments are designed to discover and measure.
4. The Persistence of Ignorance
To conclude, in proposing that knowledge is power Francis Bacon
was undoubtedly right. It is, however, equally true that, our scientific and
technological prowess notwithstanding, we today, in our tiny but incredibly
complex corner of the universe, are as far from gaining complete knowledge of
everything, hence from being able to look into the future and control it, as we
have ever been.
Furthermore, surely no one in his right mind, looking around,
would suggest that the number of glitches we all experience in everyday life
has been declining. Nor are such glitches always minor matters, e.g. a punctured tire that causes us to arrive late at a meeting. Some of them, known as black swans, are so huge that they can have a catastrophic effect not just on individuals but on entire societies: as, for example, happened in 2008, when the world was struck by the worst economic crisis in eighty years, and as the coronavirus is doing right now. All this reminds me of the time when, as a university professor, I was repeatedly asked by my young students how they could ever hope to match my knowledge of the fields we were studying. In response, I used
to point to the blackboard, quite a large one, and say: “Imagine this is the
sum of all available knowledge. In that case, your knowledge could be
represented by this tiny little square I’ve drawn here in the corner. And mine,
by this slightly—but only slightly—larger one right next to it.” “My job,” I
would add, “is to help you first to assimilate my square and then to transcend
it.” They got the message.
There is thus every reason to believe that the role ignorance
concerning the future, both individual and collective, plays in shaping human
life is as great today as it has ever been. It is probably a major reason why,
even in a country such as France where logic and lucidity are considered
national virtues and three out of four people claim they are not superstitious,
almost half touch wood, and about one third say they believe in astrology. Nor
are the believers necessarily illiterate old peasants. Most young people (55
percent) say they believe in the paranormal. So do many graduates in the
liberal arts and 69 percent of ecologists. As if to add insult to injury,
France now has twice as many professional astrologers and fortune tellers as it
does priests. Both black masses and Satan worship have been on the rise. The situation in the U.S. is hardly any different.
How did old Mark Twain (supposedly) put the matter? Prediction
is difficult, especially of the future.