In an earlier post, I examined what would happen ‘If All the Books Disappeared.’ Ricky Gervais pointed out that science is the axiom of the universe, an unchanging constant that would be discovered again and again should we lose all knowledge and records of learning. He contrasted this with religion, which would reappear in a different form because it is couched in culture, language, and context.
For Gervais, science is worth believing in; religion is not.
In contrast, C. S. Lewis, an atheist until his early 30s, described himself as a “reluctant convert” to Christianity because, as an intellectual, he found he had no choice but to accept what he clearly saw to be true.
In his essay ‘Is Theology Poetry?’ he mused:
I believe in Christianity as I believe that the Sun has risen: not only because I see it, but because by it I see everything else.
This little comic articulates the importance of ideas in shaping the way we see the world. Should we lose all books, humanity would have to rebuild the fundamentals of ‘knowing’ and ‘seeing’ the world in order to test, examine, and rediscover science.
Without ideas of being, and notions of truth and identity, we would in fact ‘see’ the world very differently. Science would not only have to be relearned; it would in fact have to be ‘re-seen.’
This process of ‘knowing’ — epistemology — is philosophical and tied to notions of belief, truth, and identity. This is why humans are storytellers, and why our narratives of identity, which form the basis of religious beliefs, run parallel to, and are indeed fundamental to, the scientific process.
Without inner narratives we would be lost in a chaotic world.
We are all storytellers; we make sense out of the world by telling stories. And science is a great source of stories.
Not so, you might argue. Science is an objective collection and interpretation of data. I completely agree. At the level of the study of purely physical phenomena, science is the only reliable method for establishing the facts of the world.
But when we use data of the physical world to explain phenomena that cannot be reduced to physical facts, or when we extend incomplete data to draw general conclusions, we are telling stories. Knowing the atomic weight of carbon and oxygen cannot tell us what life is. There are no naked facts that completely explain why animals sacrifice themselves for the good of their kin, why we fall in love, the meaning and purpose of existence, or why we kill each other.
Science is not at fault. On the contrary, science can save us from false stories. It is an irreplaceable means of understanding our world. But despite the verities of science, many of our most important questions compel us to tell stories that venture beyond the facts. For all of the sophisticated methodologies in science, we have not moved beyond the story as the primary way that we make sense of our lives.
To see where science and story meet, let’s take a look at how story is created in the brain. Let’s begin with an utterly simple example of a story, offered by E. M. Forster in his classic book on writing, Aspects of the Novel:
The king died and then the queen died.
It is nearly impossible to read this juxtaposition of events without wondering why the queen died. Even with a minimum of description, the construction of the sentence makes us guess at a pattern. Why would the author mention both events in the same sentence if he didn’t mean to imply a causal relationship?
Once a relationship has been suggested, we feel obliged to come up with an explanation. This makes us turn to what we know, to our storehouse of facts. It is general knowledge that a spouse can die of grief. Did the queen then die of heartbreak? This possibility draws on the science of human behavior, which competes with other, more traditional narratives. A high school student who has been studying Hamlet, for instance, might read the story as a microsynopsis of the play.
The pleasurable feeling that our explanation is the right one—ranging from a modest sense of familiarity to the powerful and sublime “a-ha!”—is meted out by the same reward system in the brain integral to drug, alcohol, and gambling addictions. The reward system extends from the limbic area of the brain, vital to the expression of emotion, to the prefrontal cortex, critical to executive thought. Though still imperfectly understood, it is generally thought that the reward system plays a central role in the promotion and reinforcement of learning. Key to the system, and found primarily within its brain cells, is dopamine, a neurotransmitter that carries and modulates signals among brain cells. Studies consistently show that feeling rewarded is accompanied by a rise in dopamine levels.
This reward system was first noted in the 1950s by two McGill University researchers, James Olds and Peter Milner. Stimulating electrodes were placed in presumed brain reward areas of rats. When allowed full unrestricted access to a lever that, when depressed, would cause the electrodes to fire, the rats quickly learned to repeatedly depress the lever, often to the exclusion of food and water. Realizing that our brains are capable of producing feelings so intense that we choose to ignore such basic drives as hunger and thirst was a first step toward understanding the enormous power of the brain’s reward circuitry.
Critical to understanding how stories spark the brain’s reward system is the theory known as pattern recognition—the brain’s way of piecing together a number of separate components of an image into a coherent picture. The first time you see a lion, for instance, you have to figure out what you’re seeing. At least 30 separate areas of the brain’s visual cortex pitch in, each processing an aspect of the overall image—from the detection of motion and edges, to the register of color and facial features. Collectively they form an overall image of a lion.
Each subsequent exposure to a lion enhances your neural circuitry; the connections among processing regions become more robust and efficient. (This theory, based on the research of Canadian psychologist Donald O. Hebb, a pioneer in studying how people learn, is often stated as “cells that fire together wire together.”) Soon, less input is necessary to recognize the lion. A fleeting glimpse of a partial picture is sufficient for recognition, which occurs via positive feedback from your reward system. Yes, you are assured by your brain, that is a lion.
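The “cells that fire together wire together” idea can be sketched in a few lines. This is a deliberately minimal illustration of Hebbian strengthening; the starting weight, learning rate, and number of exposures are arbitrary assumptions, not figures from Hebb’s research.

```python
# Minimal Hebbian-learning sketch ("cells that fire together wire together").
# All numbers here are illustrative assumptions.
def hebbian_step(weight, pre_active, post_active, rate=0.2):
    """Strengthen a connection whenever the two cells are active together."""
    if pre_active and post_active:
        weight += rate
    return weight

w = 0.1
for _ in range(5):              # five co-activations ("exposures to the lion")
    w = hebbian_step(w, True, True)
print(round(w, 1))  # → 1.1  (the connection is now far stronger)
```

After repeated co-activation the connection is stronger, so less input is needed to trigger recognition — the “fleeting glimpse” effect described above.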
An efficient pattern recognition of a lion makes perfect evolutionary sense. If you see a large feline shape moving in some nearby brush, it is unwise to wait until you see the yellows of the lion’s eyes before starting to run up the nearest tree. You need a brain that quickly detects entire shapes from fragments of the total picture and provides you with a powerful sense of the accuracy of this recognition.
One need only think of the recognition of a new pattern that is so profound that it triggers an involuntary “a-ha!” to understand the degree of pleasure that can be associated with learning. It’s no wonder that once a particular pattern-recognition-reward relationship is well grooved into our circuitry, it is hard to shake. In general—outside of addiction, that is—this “stickiness” of a correlation is a good thing. It is through repetition and the sense of familiarity and “rightness” of a correlation that we learn to navigate our way in the world.
Science is in the business of making up stories called hypotheses and testing them, then trying its best to make up better ones. Thought-experiments can be compared to storytelling exercises using well-known characters. What would Sherlock Holmes do if he found a body suspended in a tree with a note strapped to its ankle? What would a light ray being bounced between two mirrors look like to an observer sitting on a train? Once done with their story, scientists go to the lab to test it; writers call editors to see if they will buy it.
People and science are like bread and butter. We are hardwired to need stories; science has storytelling buried deep in its nature. But there is also a problem. We can get our dopamine reward, and walk away with a story in hand, before science has finished testing it. This problem is exacerbated by the fact that the brain, hungry for its pattern-matching dopamine reward, overlooks contradictory or conflicting information whenever possible. A fundamental prerequisite for pattern recognition is the ability to quickly distinguish between similar but not identical inputs. Not being able to pigeonhole an event or idea makes it much more difficult for the brain to label and store it as a discrete memory. Neat and tidy promotes learning; loose ends lead to the “yes, but” of indecision and inability to draw a precise conclusion.
Just as proper pattern recognition results in the reward of an increased release of dopamine, faulty pattern recognition is associated with decreased dopamine release. In monkeys, the failure to make a successful prediction (correlation between expected and actual outcome) characteristically diminishes dopamine release exactly at the time that the predicted event is anticipated but fails to occur. Just as accurate correlations are pleasurable, lack of correlation produces the neurotransmitter equivalent of thwarted expectation (or worse).
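The prediction-error dynamic described above can be pictured with a toy Rescorla-Wagner-style update — a standard textbook model offered here purely as an illustration, not something drawn from the monkey studies themselves; the learning rate and reward values are arbitrary assumptions.

```python
# Toy reward-prediction-error model (a Rescorla-Wagner-style update).
# The learning rate and reward values are arbitrary illustrative assumptions.
def update_expectation(expected, actual, learning_rate=0.1):
    """Positive error ~ dopamine burst; negative error (reward omitted) ~ dip."""
    error = actual - expected
    return expected + learning_rate * error, error

expectation = 0.0
for _ in range(50):                     # reward delivered repeatedly
    expectation, error = update_expectation(expectation, 1.0)
# expectation has climbed close to 1.0, and the "surprise" has shrunk

expectation, error = update_expectation(expectation, 0.0)  # reward omitted
print(error)  # a large negative prediction error: thwarted expectation
```

The negative error on the final step mirrors the dopamine dip seen when a predicted reward fails to arrive.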
Once we see that stories are the narrative equivalent of correlation, it is easy to understand why our brains seek out stories (patterns) whenever and wherever possible. You may have read or heard about the famous experiment in which University of Illinois psychology professor Daniel Simons asked subjects to watch a video and count the number of passes made by a basketball team. When focused on counting, the majority of viewers failed to see a woman in a gorilla suit walk across the playing area. In effect, well-oiled patterns of observation encourage our brains to compose a story that we expect to hear.
Because we are compelled to make stories, we are often compelled to take incomplete stories and run with them. With a half-story from science in our minds, we earn a dopamine “reward” every time it helps us understand something in our world—even if that explanation is incomplete or wrong.
Following the Newtown massacre, some experts commented on the killer having Asperger’s syndrome, as though that might at least partially explain his behavior. Though Asperger’s syndrome feels like a specific diagnosis, it is, by definition, nothing more than a constellation of symptoms common to a group of people. In the 1940s, Austrian pediatrician Hans Asperger noted that a number of patients had similar problems with social skills, eccentric or repetitive actions, unusual preoccupations and rituals, and communication difficulties, including lack of eye contact and trouble understanding facial expressions and gestures. The 2013 decision by the American Psychiatric Association to remove the diagnosis of Asperger’s syndrome from its guidebook for clinicians, the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), for failing to conform to any specific neuropathology, underscores the all-too-common problem of accepting a clustering of symptoms as synonymous with a specific disease. Syndromes are stories in search of underlying causes.
Similarly, studies of psychopaths have shown a diminished volume of gray matter in specific regions of the prefrontal cortex. But these findings aren’t the sole explanation for violent acts. Because it is impossible to stimulate a specific brain region to produce complex and premeditated acts, we are left to conclude that while certain brain conditions can be correlated with a complex act, they are not necessarily causing it. Likewise, brain scans that reveal abnormalities in mass murderers may help us understand what might have contributed to their behavior. But the abnormalities are no more the sole explanation for violence than childhood neglect or poor nutrition are. They are stories, albeit with a detailed neurophysiological component, but stories nonetheless.
When we make and take incomplete stories from science, there are often moral consequences. How much personal responsibility should we assign to an individual with a damaged or malfunctioning brain? What is the appropriate punishment and possibility of rehabilitation for such a person? Only when we openly acknowledge the degree to which science is presenting its observations in the form of story can we address this moral dimension. We must each work out our own guidelines for when we think scientific data has exceeded its bounds and has morphed into the agenda and bias of story. Of course this is always going to be a challenge in the absence of a full array of scientific data.
But we can begin by being aware of the various ways that storytelling can insinuate itself into the presentation and interpretation of data. Good science is a combination of meticulously obtained and analyzed data, a restriction of the conclusions to those interpretations that are explicitly reflected in the data, and an honest and humble recognition of the limits of what this data can say about the world.
When reading science reports, we should also search for information on the limits of the data. Were assumptions made? What do the “error bars,” or graphic representations of variable data, say? We may not always understand the data limits, but we should be worried when some discussion of them is completely absent.
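As a concrete picture of what an error bar typically summarizes, here is a short sketch computing a sample mean and its standard error. The data values are invented for illustration, and mean ± SEM is only one common convention among several (standard deviation and confidence intervals are others).

```python
import statistics

# Sketch: an "error bar" around a sample mean, computed as mean ± standard
# error of the mean (SEM). The data values below are invented for illustration.
data = [9.8, 10.1, 10.4, 9.9, 10.2, 10.0, 9.7, 10.3]
mean = statistics.mean(data)
sem = statistics.stdev(data) / len(data) ** 0.5   # standard error of the mean
print(f"{mean:.2f} ± {sem:.2f}")  # the narrower the interval, the tighter the data
```

A report that shows only the mean, with no such measure of spread, is hiding exactly the limits we should be looking for.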
In the end, scientists have the tools, language, and experience to tell us informed, engaging, and powerful stories. In turn, we should judge their studies in the same light in which we judge other artistic forms. Like a literary critic, we should assess the preciseness of language, the tightness of structure, the clarity and originality of vision, the overall elegance and grace of the study, the restraint with which they present moral issues, how they place their studies in historical, cultural, and personal context, and their willingness to entertain alternative opinions and interpretations.
The methodology of science remains one of the great advances of humankind. Its stories, properly told, are epic poems in progress, and deserve to stand alongside the great stories of history.
Thales of Miletus (c. 624 – c. 546 BC) was a Greek philosopher, mathematician, and astronomer who influenced much of later classical Greek and western thought.
He was one of the pre-Socratic philosophers, who were concerned with “the essence of things.” They were named physiologoi (φυσιολόγοι), physical or natural philosophers, or physikoi (physicists), because they sought natural explanations for phenomena, as opposed to the earlier theologoi (theologians), whose explanations looked to the supernatural.
The pre-Socratic philosophers were asking:
From where does everything come?
From what is everything created?
How do we explain the plurality of things found in nature?
How might we describe nature mathematically?
Thales hypothesised that the originating principle of nature and matter was a single substance: water. Moreover, rather than assuming that earthquakes were the result of the whims of divine beings, Thales explained them by theorising that the Earth was a large disc which floated on water, and that earthquakes occurred when the Earth was rocked by waves.
Thales used geometry to calculate the heights of pyramids and the distance of ships from the shore.
Placing your stick at the end of the shadow of the pyramid, you made by the sun’s rays two triangles, and so proved that the pyramid [height] was to the stick [height] as the shadow of the pyramid to the shadow of the stick.
W. W. Rouse Ball, A Short Account of the History of Mathematics (1893, 1925)
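Thales’ trick is simple proportional reasoning with similar triangles, and can be sketched in a few lines. The stick and shadow lengths below are invented for illustration, not historical measurements.

```python
def height_from_shadow(stick_height, stick_shadow, object_shadow):
    """Similar triangles: object height / object shadow = stick height / stick shadow."""
    return object_shadow * stick_height / stick_shadow

# Invented example numbers: a 2 m stick casts a 3 m shadow while the
# pyramid casts a 219 m shadow (measured from the centre of its base).
print(height_from_shadow(2.0, 3.0, 219.0))  # → 146.0
```

The same proportion gives the distance of a ship from shore: sight it from two points and compare the triangles.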
He is the first known individual to apply deductive reasoning to geometry, deriving Thales’ theorem: any triangle inscribed in a circle with one side lying along the diameter is a right-angled triangle.
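The theorem is easy to check numerically: for any point on a circle, the two lines drawn from it to the ends of a diameter meet at 90 degrees. A small sketch (the circle radius and sample angles are illustrative choices):

```python
import math

def angle_at_point_on_circle(theta, r=1.0):
    """Angle (in degrees) at a point P on the circle, subtended by a diameter."""
    ax, ay = -r, 0.0                          # one end of the diameter
    bx, by = r, 0.0                           # the other end
    px, py = r * math.cos(theta), r * math.sin(theta)
    v1 = (ax - px, ay - py)                   # vector P -> A
    v2 = (bx - px, by - py)                   # vector P -> B
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# Any point on the circle (away from the diameter's own endpoints) gives 90°:
for theta in (0.3, 1.0, 2.0, 2.8):
    assert abs(angle_at_point_on_circle(theta) - 90.0) < 1e-9
```

Thales, of course, proved it deductively rather than by checking cases — which is precisely why the result bears his name.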
Thales was one of the Seven Sages of Greece, hoi hepta sophoi (οἱ ἑπτὰ σοφοί), alongside Solon of Athens and Periander of Corinth. These sages were known for pithy sayings, including the inscription [attributed to Thales] at the Oracle of Delphi.
The Seven Sages of Greece were not only philosophers, scientists, and teachers but were also involved in political life. Thales’ political involvement mainly concerned the defence of his region, Ionia in Anatolia [Asia Minor], against the growing power of the Persians. The neighbouring king of Lydia, Croesus, had conquered many of the coastal cities of the Ionians, and he engaged Thales’ support in his war against the Medes. The war endured for five years, but in the sixth an eclipse of the Sun spontaneously halted a battle in progress (the Battle of Halys). It seems that Thales had predicted this solar eclipse, and on the strength of it the Lydians and Medes made peace immediately, swearing a blood oath.
The Medes were vassals of the Persians under Cyrus. Croesus now sided with the Medes against the Persians and marched in the direction of Persia, stopping by the river Halys, then unbridged. The king gave the problem to Thales, who got the army across by digging a diversion upstream so as to reduce the flow, making it possible to ford the river. When Croesus was unsuccessful against the Persian armies in Cappadocia, he marched home and summoned his dependents and allies to send fresh troops to Sardis. The Persian army surrounded the armies of Croesus, trapping them within the walls of Sardis. This time, Thales used his fame as a counsellor to advise the Milesians not to engage in “fighting together” with the Lydians against the Persians.
Croesus was defeated before the city of Sardis by Cyrus, and Miletus was subsequently spared because it had taken no action. Cyrus was so impressed by Croesus’ wisdom and his connection with the sages that he spared him and took his advice on various matters. The Ionians were now free, and Miletus received favourable terms from Cyrus, including amnesty.
It was Thales’ wisdom in science, philosophy, and politics which led to the rise of the Milesian school of philosophy, a school influenced by both Egyptian and Babylonian mathematics and astronomy. It was Anaxagoras (c. 510 – c. 428 BC) of the Milesian school who later brought its teaching to Athens, influencing Socrates and Pericles during the Golden Age of Greece.
Although Socrates, born about a century and a half later (c. 470 – 399 BC), is more famously remembered as the ‘father of western philosophy’, it is Thales’ earlier wisdom and scientific endeavours that have led to him being credited with fathering western philosophy.
Damien Shalley is someone who randomly came across Bear Skin, several pages deep in Google search listings, and subsequently submitted feedback. Considering we gain hits from all around the globe, with a following which interestingly comes largely from North America, it was a surprise to find out he lived in the same city as me, in a corner of the antipodes. Since then he has submitted various guest posts to Bear Skin on themes of interest – art, music, and even creative originals. His latest piece is a reflection on everyone’s friend, Richard Dawkins.
The Dawkins Dilemma
And there he is again, right on schedule, evolutionary biologist and social commentator Richard Dawkins. Perhaps best known for his “evangelical atheism” and his very public position that any form of religious belief is patently absurd, Dawkins loves to express his point of view during traditional Christian religious holidays such as Christmas. One seemingly cannot turn on a television during the festive season without being subjected to his anti-deist opinions. His annual analysis of why belief in God is foolhardy turned up as pre-Christmas viewing on both the BBC and the ABC in 2015, and his previous four-part analysis of why religious faith is antithetical to scientific endeavour also got a repeat airing. (He saves his strongest criticism for the Catholics in the final instalment, in case you hadn’t already guessed). Strangely enough he also resorted to spreading his views via Al Jazeera television last year. (Al Jazeera is funded by the Islamic government of Qatar). Make of this what you will.
Professionally, Dawkins is an esteemed evolutionary biologist with a knack for clearly and accurately explaining biological and evolutionary processes. For this, he has my admiration. He may well be peerless in his capacity to disseminate this knowledge in an understandable way. I have often marvelled at how well he describes processes such as natural selection, the driver of evolution, and felt awed by his dedication to the advancement of human knowledge.
But Dawkins insists that anyone who adheres to a religious faith or spiritual beliefs of any kind – his most famous target being Christianity – is deluded and foolish. In his publicly-stated view, religious belief is not worthy of serious consideration. His primary argument against it is simple – it is unscientific. God cannot be observed directly and “belief” cannot be quantified or measured. As such, religious belief systems defy the kind of objective analysis that a scientist like Dawkins requires and must be rejected outright. Yet one only has to scratch the surface of Dawkins’ primary argument to reveal a universe of questions for which he has no answer.
Dawkins himself is a polite and erudite man in his mid-seventies. He is impeccably well-qualified and any attempts to question the scientific basis of his arguments are quickly and skilfully shut down during debates. His primary weakness, it seems, is his intolerance of alternative points of view. In his opinion, God is a delusion, Christians are fools and forms of belief that he does not understand are equally foolish.
Dawkins is an adherent of rationalism and empirical analysis. He espouses a well-known and scientifically well-accepted view that our universe came into existence after a massive cosmic detonation. This explosion spread atoms from an unimaginably dense ball of matter approximately the size of a melon to the farthest reaches of space. Elements created in this “big bang” formed the building blocks of organic life. Carbon-based life forms were created on earth when water, amino acids (proteins) and electricity combined to kick-start primordial existence. Human life subsequently came into being after billions of years of evolution.
This is fine as far as it goes. It is a scientifically sound premise and there is a significant amount of evidence to support aspects of this theory. We live in an expanding universe (consistent with an explosive genesis), we are carbon-based life forms, we can observe primitive aerobic organisms living in hot springs in parts of the world today that might well be our primitive precursors, and we can see evidence of evolution in the form of prehistoric fossils and observe natural selection processes in wild environments. The Dawkins position looks pretty strong. And yet, it isn’t.
Can matter originate from nothing? Can nothingness ever be the originator of “somethingness” (for want of a better word)? If our universe began when a massive accumulation of cosmic energy caused a concentrated ball of matter to explode, what was this cosmic energy, and where did this ball of matter come from? Cosmologists have recently posited that in space, matter might accumulate in concentrated forms due to inversions wherein space folds in on itself in a cyclical manner. (This has been described as similar to the way in which warm air and low pressure systems create cyclones.) This is another scientifically sound theory. But what is this matter which is accumulating? What is this “essence” of the universe – this foundation of creation, so to speak – and where did it come from? And why is the vociferous Richard Dawkins so strangely silent about this topic? Put simply, why can’t Richard Dawkins explain this in the same way that he so easily explains the known and understandable aspects of biology?
Because he can’t, that’s why. (Also, he doesn’t want to).
Dawkins has been at pains in the past to inform us that scientists cannot seek to explain phenomena starting from a “supernatural” standpoint. A premise such as the creation of matter by God bears no “internal consistency” to a scientist seeking a rational explanation. He cannot countenance this theological option, and within the boundaries of his scientific analysis, he doesn’t have to. But that still leaves a major hole in his analysis, as well as his conclusions about those who choose to seek additional answers elsewhere.
The fact that television programmers choose to allow Dawkins to stick his head above the parapet during the holiday season is probably more a function of their search for an audience than anything else. (He regularly attracts both supporters and critics and they all watch his shows – including me). In his latest outing, Dawkins interviews cloistered monks, an American Catholic priest and comedian Ricky Gervais, as well as looking to astronomy and classic English literature to help explain his position. He concludes after 50 minutes that there is no God and that belief in a deity is facile, that we are an accident of the cosmos, that people should live as though death will render everything in their lives utterly redundant and that Christian celebratory holidays are cultural norms of the delusional. (Incidentally, it hardly seems unusual to me that a society with a Christian history has public holidays linked to this heritage. But hey, show some respect, Richard Dawkins is speaking).
I’m not convinced of Dawkins’ argument and never have been, although I’m no opponent of science (or of free thought either). I believe that science is an invaluable tool for the betterment of the human race, and I believe that it has delivered the foundational understanding for our modern lives. Medicine, engineering, communications, education: these and virtually every other aspect of human existence have been improved by scientific advances. But I also believe that science does not, indeed cannot, answer any and all questions about human existence. And if people wish to seek answers in religious belief and social structures based on religious principles, it is not Richard Dawkins’ place to tell them that they shouldn’t.
If you would like to submit a guest post to Bear Skin, please feel free to email me at email@example.com
In the west, we espouse a liberal tolerance of all points of view – asserting there is no such thing as “ultimate truth.” This is itself a truth claim, but we deem it a valid one because it supports freedom of thought. So we believe in individual freedom.
We don’t believe in any overarching system of ethics or truth – until another culture contravenes our ideas of what is right and wrong. Case in point: what greater evil is there than the censorship of free speech? Right?
In western nations, we believe in the power of forgiveness but not in oppressive views or regulations about sexuality. Other cultures believe in conservative sexual values, but not necessarily in our liberal notions of forgiveness – in an honour-shame society, for example.
What is right and what is wrong? Our bias tells us our ways are right and others’ are wrong. Others’ truth claims lead to violence and hate; our truth claims are valid because they endorse freedom and life.
In western nations, we hold dearly to notions of liberal individualism, yet imposing such notions on developing communities – essentially divorcing the individual as an entity from their community – wreaks havoc both for the individual and for the community in question. Well-meaning help, offered from the vantage point of what we value highly, can actually do violence to a community.
This raises the question of whether there is an ultimate narrative to aspire to understand – an ultimate hero-journey, an ultimate discovery of “what is” that will guide our way – or whether we simply impose order and narrative onto life. This quote caught my eye recently in the Huffington Post.
In 2009, Julianne Moore’s mother, Anne Smith, died suddenly of septic shock. She was 68, and Moore was devastated. After that, she stopped believing in God. “I learned when my mother died five years ago that there is no ‘there’ there,” Moore, 54, told the Hollywood Reporter.
“Structure, it’s all imposed. We impose order and narrative on everything in order to understand it. Otherwise, there’s nothing but chaos.”
Do we impose a narrative on life, or is there a narrative there to discover? Ultimately, what is truth?
Interestingly, Pilate asked the same question of Christ. John 18 recounts:
“What is truth?” retorted Pilate. With this he went out again to the Jews gathered there and said, “I find no basis for a charge against him. But it is your custom for me to release to you one prisoner at the time of the Passover. Do you want me to release ‘the king of the Jews’?”
They shouted back, “No, not him! Give us Barabbas!”
In John’s account, Jesus makes the startling claim not to “speak the truth” but to “be the truth” that all truth-tellers speak of.
In our understanding, the teachings of Christ are good and moral. He taught us to forgive, to show mercy, to love our enemies. He gave up his life for these values. He was an iconoclast, a prophet not unlike Gandhi or Siddhartha.
His audacious claims tell us a few things:
He never wished to be merely a good teacher pointing to the truth. This claim means he cannot simply be ranked among good teachers.
In the words of C. S. Lewis, “He is either a lunatic, a liar or …”
So, what do we do with his claim to BE the truth? If he claimed to embody the truth, this truth must be something of ultimate value and not relative worth – something like freedom or life.
Science makes truth claims, but science is a system of empirical, testable claims. Scientific claims don’t seek to control us, but rather support our understanding of the reality we live in. Moreover, the claims of science are ultimately falsifiable, and the next test or proof can totally shift our understanding of reality to a new and deeper truth.
C S Lewis explained his belief in God:
I believe in Christianity as I believe that the Sun has risen: not only because I see it, but because by it I see everything else.
So Christ claimed to be the light by which we would see the world and reality.
In narrative terms, Christ claimed to be the ultimate narrative to aspire to, the ultimate meaning in the universe. He stated that we do not simply “impose order and narrative” onto everything, but that his IS the grand narrative.
This week the Wall Street Journal and The Australian both ran an interesting article on the scientific evidence for the existence of a creator. Written by Eric Metaxas, biographer and journalist, the article raises the question of God using scientific arguments.
Metaxas recalls the 1966 Time magazine cover story “Is God Dead?”; that same year, the astronomer Carl Sagan announced that there were two important criteria for a planet to support life:
The right kind of star, and
a planet the right distance from that star.
He goes on to point out that, given the roughly octillion — 1 followed by 27 zeros — planets in the universe, there should have been about a septillion — 1 followed by 24 zeros — planets capable of supporting life. As of 2014, researchers had discovered precisely zero.
Metaxas continues: as knowledge of the universe increased, it became clear that there were far more factors necessary for life than Sagan supposed. His two parameters grew to 10, then 20, then 50, and the number of potentially life-supporting planets decreased accordingly. The number dropped to a few thousand planets and kept on plummeting.
As factors continued to be discovered, the number of possible planets plummeted to zero. In other words, the odds turned against any planet in the universe supporting life, including this one. Probability says that even we shouldn’t be here. Today there are more than 200 known parameters necessary for a planet to support life — every single one of which must be perfectly met, or the whole thing falls apart. Without a massive planet like Jupiter nearby, whose gravity draws away asteroids, a thousand times as many would hit Earth’s surface.
Metaxas concludes that the fine-tuning necessary for life to exist on a planet is nothing compared with the fine-tuning required for the universe to exist at all. Alter any one value and the universe could not exist. For instance, if the ratio between the nuclear strong force and the electromagnetic force had been off by the tiniest fraction — by even one part in 100,000,000,000,000,000 — then no stars could have formed after the Big Bang at all.
“Multiply that single parameter by all the other necessary conditions, and the odds against the universe existing are so heart-stoppingly astronomical that the notion that it all ‘just happened’ defies common sense.”
Several points about this dialogue are curious to me:
Western tradition, stemming from the Enlightenment, has placed a sharp divide between discussion of faith or spirituality and discussion of science. The religious wars of Europe at the time resulted in an uneasy truce based on the determination to separate church from state, and science from religion. To this day there is almost a ban on public discussion combining these ideas.
However, scientists making strongly atheistic statements of the ilk of “God is dead” re-enter this debate as guiltily as any churchman, imam, or Hindu priest.
Since the Romantic period of the 1800s, art and culture have moved strongly towards a more spiritual dialogue, integrating what was denied during the rationalistic Enlightenment debates. This re-ignited stories of spirits, other worlds, magic, time travel, and dreams, and re-opened questions of origin and being.
Science fiction is a descendent of the romantic tradition, combining scientific knowledge with permission to wonder and imagine.
Science fiction, not unlike ancient myth and legend, has long asked these questions with absolute permission. Unencumbered by the rhetoric required to separate rationalism and spiritualism, it has freely explored questions of being, life, and existence.
Moreover, the harsh modern and pre-modern debates are largely out of date in contemporary society – a society in which most people and cultures acknowledge a spiritual realm, even if they do not agree on the nature or name of that realm. Such an article, outside the close circles of academia still bound to the strict mores of generations past, will not seem surprising at all.
In fact, I believe most people breathe a sigh of relief to hear that science is gradually catching up with the zeitgeist and acknowledging that it’s okay to discuss spirituality in the public realm again.