Game of Thrones, in its eighth and final season, enjoyed audiences of more than 17 million people per week. However, fan and critic reaction throughout the season indicated that many of those millions loathed it.
Her thesis was simple: the original narrative created by George R. R. Martin struck a chord with audiences because of its unique subtlety as a sociological story, which stood out among Hollywood narratives characterised by psychological, individually motivated storytelling.
The problem was not just bad storytelling; it was that the storytelling style changed from sociological to psychological.
She explains that in sociological narratives, characters evolve in response to the broader social, political, economic and cultural incentives and norms that surround them. Author George R. R. Martin drew from medieval and Renaissance history for his characters and plot devices, as well as from European myth and legend. In doing so, he specialized in having characters evolve in response to the broader social fabric and beliefs within which they were placed.
Psychological narratives, on the other hand, feature characters driven by much more individual quests and motives. The preference for this narrative style in Hollywood is understandable: the story is easier to tell, and we gravitate toward identifying with the hero or hating the antihero at the personal level. The hallmark of sociological storytelling, however, is that it can encourage us to put ourselves in the place of any character, not just the main hero or heroine, and imagine ourselves making similar choices. That complexity made the story much richer than a simplistic morality tale in which unadulterated good fights evil.
An example of the power of Martin’s sociological storytelling was his willingness to kill off major characters frequently without losing the thread of the story. Narratives driven by psychological and individual motives rarely do that because the main characters are the key tools with which the story is built. Given the dearth of such narratives in fiction and in TV, this approach clearly resonated with a large fan base that latched on to the show.
Showrunners D. B. Weiss and David Benioff took the narrative beyond Martin’s books and turned the later seasons into Hollywood psychological narratives. In the final season, none of the main characters were killed early or unexpectedly, and the motives and movements of the protagonists and antagonists became ever more internally wrought. What resulted in season 8 was a ‘deus ex machina’-style defeat of the forces of the dead and a simplistic dissolution into good-vs.-evil interplay between the main characters.
And it was the story’s richness that was lost in season 8, moving fans and critics to openly pan the final episodes online. Memes abounded, like the image below featuring a young woman’s Halloween costume literally ‘trashing’ season 8.
Without inner narratives we would be lost in a chaotic world.
We are all storytellers; we make sense out of the world by telling stories. And science is a great source of stories.
Not so, you might argue. Science is an objective collection and interpretation of data. I completely agree. At the level of the study of purely physical phenomena, science is the only reliable method for establishing the facts of the world.
But when we use data of the physical world to explain phenomena that cannot be reduced to physical facts, or when we extend incomplete data to draw general conclusions, we are telling stories. Knowing the atomic weight of carbon and oxygen cannot tell us what life is. There are no naked facts that completely explain why animals sacrifice themselves for the good of their kin, why we fall in love, the meaning and purpose of existence, or why we kill each other.
Science is not at fault. On the contrary, science can save us from false stories. It is an irreplaceable means of understanding our world. But despite the verities of science, many of our most important questions compel us to tell stories that venture beyond the facts. For all of the sophisticated methodologies in science, we have not moved beyond the story as the primary way that we make sense of our lives.
To see where science and story meet, let’s take a look at how story is created in the brain. Let’s begin with an utterly simple example of a story, offered by E. M. Forster in his classic book on writing, Aspects of the Novel:
The king died and then the queen died.
It is nearly impossible to read this juxtaposition of events without wondering why the queen died. Even with a minimum of description, the construction of the sentence makes us guess at a pattern. Why would the author mention both events in the same sentence if he didn’t mean to imply a causal relationship?
Once a relationship has been suggested, we feel obliged to come up with an explanation. This makes us turn to what we know, to our storehouse of facts. It is general knowledge that a spouse can die of grief. Did the queen then die of heartbreak? This possibility draws on the science of human behavior, which competes with other, more traditional narratives. A high school student who has been studying Hamlet, for instance, might read the story as a microsynopsis of the play.
Despite the verities of science, we are compelled to tell stories that venture beyond the facts.
The pleasurable feeling that our explanation is the right one—ranging from a modest sense of familiarity to the powerful and sublime “a-ha!”—is meted out by the same reward system in the brain integral to drug, alcohol, and gambling addictions. The reward system extends from the limbic area of the brain, vital to the expression of emotion, to the prefrontal cortex, critical to executive thought. Though still imperfectly understood, it is generally thought that the reward system plays a central role in the promotion and reinforcement of learning. Key to the system, and found primarily within its brain cells, is dopamine, a neurotransmitter that carries and modulates signals among brain cells. Studies consistently show that feeling rewarded is accompanied by a rise in dopamine levels.
This reward system was first noted in the 1950s by two McGill University researchers, James Olds and Peter Milner. Stimulating electrodes were placed in presumed brain reward areas of rats. When allowed full unrestricted access to a lever that, when depressed, would cause the electrodes to fire, the rats quickly learned to repeatedly depress the lever, often to the exclusion of food and water. Realizing that our brains are capable of producing feelings so intense that we choose to ignore such basic drives as hunger and thirst was a first step toward understanding the enormous power of the brain’s reward circuitry.
Critical to understanding how stories spark the brain’s reward system is the theory known as pattern recognition—the brain’s way of piecing together a number of separate components of an image into a coherent picture. The first time you see a lion, for instance, you have to figure out what you’re seeing. At least 30 separate areas of the brain’s visual cortex pitch in, each processing an aspect of the overall image—from the detection of motion and edges, to the register of color and facial features. Collectively they form an overall image of a lion.
Each subsequent exposure to a lion enhances your neural circuitry; the connections among processing regions become more robust and efficient. (This theory, based on the research of Canadian psychologist Donald O. Hebb, a pioneer in studying how people learn, is often stated as “cells that fire together wire together.”) Soon, less input is necessary to recognize the lion. A fleeting glimpse of a partial picture is sufficient for recognition, which occurs via positive feedback from your reward system. Yes, you are assured by your brain, that is a lion.
An efficient pattern recognition of a lion makes perfect evolutionary sense. If you see a large feline shape moving in some nearby brush, it is unwise to wait until you see the yellows of the lion’s eyes before starting to run up the nearest tree. You need a brain that quickly detects entire shapes from fragments of the total picture and provides you with a powerful sense of the accuracy of this recognition.
One need only think of the recognition of a new pattern that is so profound that it triggers an involuntary “a-ha!” to understand the degree of pleasure that can be associated with learning. It’s no wonder that once a particular pattern-recognition-reward relationship is well grooved into our circuitry, it is hard to shake. In general—outside of addiction, that is—this “stickiness” of a correlation is a good thing. It is through repetition and the sense of familiarity and “rightness” of a correlation that we learn to navigate our way in the world.
Science is in the business of making up stories called hypotheses and testing them, then trying its best to make up better ones. Thought-experiments can be compared to storytelling exercises using well-known characters. What would Sherlock Holmes do if he found a body suspended in a tree with a note strapped to its ankle? What would a light ray being bounced between two mirrors look like to an observer sitting on a train? Once done with their story, scientists go to the lab to test it; writers call editors to see if they will buy it.
People and science are like bread and butter. We are hardwired to need stories; science has storytelling buried deep in its nature. But there is also a problem. We can get our dopamine reward, and walk away with a story in hand, before science has finished testing it. This problem is exacerbated by the fact that the brain, hungry for its pattern-matching dopamine reward, overlooks contradictory or conflicting information whenever possible. A fundamental prerequisite for pattern recognition is the ability to quickly distinguish between similar but not identical inputs. Not being able to pigeonhole an event or idea makes it much more difficult for the brain to label and store it as a discrete memory. Neat and tidy promotes learning; loose ends lead to the “yes, but” of indecision and inability to draw a precise conclusion.
When we make and take incomplete stories from science, there are moral consequences.
Just as proper pattern recognition results in the reward of an increased release of dopamine, faulty pattern recognition is associated with decreased dopamine release. In monkeys, the failure to make a successful prediction (correlation between expected and actual outcome) characteristically diminishes dopamine release exactly at the time that the predicted event is anticipated but fails to occur. Just as accurate correlations are pleasurable, lack of correlation produces the neurotransmitter equivalent of thwarted expectation (or worse).
Once we see that stories are the narrative equivalent of correlation, it is easy to understand why our brains seek out stories (patterns) whenever and wherever possible. You may have read or heard about the famous experiment in which University of Illinois psychology professor Daniel Simons asked subjects to watch a video and count the number of times a ball is dribbled by a basketball team. When focused on counting, the majority of viewers failed to see a woman in a gorilla suit walk across the playing area. In effect, well-oiled patterns of observation encourage our brains to compose a story that we expect to hear.
Because we are compelled to make stories, we are often compelled to take incomplete stories and run with them. With a half-story from science in our minds, we earn a dopamine “reward” every time it helps us understand something in our world—even if that explanation is incomplete or wrong.
Following the Newtown massacre, some experts commented on the killer having Asperger’s syndrome, as though that might at least partially explain his behavior. Though Asperger’s syndrome feels like a specific diagnosis, it is, by definition, nothing more than a constellation of symptoms common to a group of people. In the 1940s, Austrian pediatrician Hans Asperger noted that a number of patients had similar problems with social skills, eccentric or repetitive actions, unusual preoccupation rituals, and communication difficulties, including lack of eye contact and trouble understanding facial expressions and gestures. The 2013 decision by the American Psychiatric Association to remove the diagnosis of Asperger’s syndrome from its guidebook for clinicians, the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), for failing to conform to any specific neuropathology, underscores the all-too-common problem of accepting a clustering of symptoms as synonymous with a specific disease. Syndromes are stories in search of underlying causes.
Similarly, studies of psychopaths have shown a diminished volume of gray matter in specific regions of the prefrontal cortex. But these findings aren’t the sole explanation for violent acts. Because it is impossible to stimulate a specific brain region to produce complex and premeditated acts, we are left to conclude that while certain brain conditions can be correlated with a complex act, they are not necessarily causing it. Likewise, brain scans that reveal abnormalities in mass murderers may help us understand what might have contributed to their behavior. But the abnormalities are no more the sole explanation for violence than childhood neglect or poor nutrition are. They are stories, albeit with a detailed neurophysiological component, but stories nonetheless.
When we make and take incomplete stories from science, there are often moral consequences. How much personal responsibility should we assign to an individual with a damaged or malfunctioning brain? What is the appropriate punishment and possibility of rehabilitation for such a person? Only when we openly acknowledge the degree to which science is presenting its observations in the form of story can we address this moral dimension. We must each work out our own guidelines for when we think scientific data has exceeded its bounds and has morphed into the agenda and bias of story. Of course this is always going to be a challenge in the absence of a full array of scientific data.
But we can begin by being aware of the various ways that storytelling can insinuate itself into the presentation and interpretation of data. Good science is a combination of meticulously obtained and analyzed data, a restriction of the conclusions to those interpretations that are explicitly reflected in the data, and an honest and humble recognition of the limits of what this data can say about the world.
Loose ends lead to the “yes, but” of indecision and inability to draw a precise conclusion.
When reading science reports, we should also search for information on the limits of the data. Were assumptions made? What do the “error bars,” or graphic representations of variable data, say? We may not always understand the data limits, but we should be worried when some discussion of them is completely absent.
In the end, scientists have the tools, language, and experience to tell us informed, engaging, and powerful stories. In turn, we should judge their studies in the same light in which we judge other artistic forms. Like a literary critic, we should assess the precision of language, the tightness of structure, the clarity and originality of vision, the overall elegance and grace of the study, the restraint with which they present moral issues, how they place their studies in historical, cultural, and personal context, and their willingness to entertain alternative opinions and interpretations.
The methodology of science remains one of the great advances of humankind. Its stories, properly told, are epic poems in progress, and deserve to stand alongside the great stories of history.
The article below by Andrew Simmons was published in The Atlantic on April 8, 2014. It is reproduced below verbatim.
The oft-neglected literary form can help students learn in ways that prose can’t.
Sixteen years after enjoying a high school literary education rich in poetry, I am a literature teacher who barely teaches it. So far this year, my 12th grade literature students have read nearly 200,000 words for my class. Poems have accounted for no more than 100.
This is a shame—not just because poetry is important to teach, but also because poetry is important for the teaching of writing and reading.
High school poetry suffers from an image problem. Think of Dead Poets Society‘s scenes of red-cheeked lads standing on desks and reciting verse, or of dowdy Dickinson imitators mooning on park benches, filling up journals with noxious chapbook fodder. There are also the tired lessons about iambic pentameter and teachers wringing interpretations from cryptic stanzas, their students bewildered and chuckling.
Reading poetry is impractical, even frivolous. High school poets are antisocial and effete.
I have always rejected these clichéd mischaracterizations born of ignorance, bad movies, and uninspired teaching. Yet I haven’t been stirred to fill my lessons with Pound and Eliot as my 11th grade teacher did. I loved poetry in high school. I wrote it. I read it. Today, I slip scripture into an analysis of The Day of the Locust. A Nikki Giovanni piece appears in The Bluest Eye unit. Poetry has become an afterthought, a supplement, not something to study on its own.
In an education landscape that dramatically deemphasizes creative expression in favor of expository writing and prioritizes the analysis of non-literary texts, high school literature teachers have to negotiate between their preferences and the way the wind is blowing. That sometimes means sacrifice, and poetry is often the first head to roll.
Yet poetry enables teachers to teach their students how to write, read, and understand any text.
Poetry can give students a healthy outlet for surging emotions.
Reading original poetry aloud in class can foster trust and empathy in the classroom community, while also emphasizing speaking and listening skills that are often neglected in high school literature classes.
Students who don’t like writing essays may like poetry, with its dearth of fixed rules and its kinship with rap. For these students, poetry can become a gateway to other forms of writing. It can help teach skills that come in handy with other kinds of writing—like precise, economical diction, for example. When Carl Sandburg writes, “The fog comes/on little cat feet,” in just six words, he endows a natural phenomenon with character, a pace, and a spirit.
All forms of writing benefit from the powerful and concise phrases found in poems.
I have used cut-up poetry (a variation on the sort “popularized” by William Burroughs and Brion Gysin) to teach 9th grade students, most of whom learned English as a second language, about grammar and literary devices. They made collages after slicing up dozens of “sources,” identifying the adjectives and adverbs, utilizing parallel structure, alliteration, assonance, and other figures of speech. Short poems make a complete textual analysis more manageable for English language learners. When teaching students to read and evaluate every single word of a text, it makes sense to demonstrate the practice with a brief poem—like Gwendolyn Brooks’s “We Real Cool.”
Students can learn how to utilize grammar in their own writing by studying how poets do—and do not—abide by traditional writing rules in their work. Poetry can teach writing and grammar conventions by showing what happens when poets strip them away or pervert them for effect. Dickinson often capitalizes common nouns and uses dashes instead of commas to note sudden shifts in focus. Agee uses colons to create dramatic, speech-like pauses. Cummings of course rebels completely. He usually eschews capitalization in his proto-text message poetry, wrapping frequent asides in parentheses and leaving last lines dangling on their pages, period-less. In “next to of course god america i,” Cummings strings together, in the first 13 lines, a cavalcade of jingoistic catch-phrases a politician might utter, and the lack of punctuation slowing down and organizing the assault accentuates their unintelligibility and banality and heightens the satire. The abuse of conventions helps make the point. In class, it can help a teacher explain the exhausting effect of run-on sentences—or illustrate how clichés weaken an argument.
Yet, despite all of the benefits poetry brings to the classroom, I have been hesitant to use poems as a mere tool for teaching grammar conventions. Even the in-class disembowelment of a poem’s meaning can diminish the personal, even transcendent, experience of reading a poem. Billy Collins characterizes the latter as a “deadening” act that obscures the poem beneath the puffed-up importance of its interpretation. In his poem “Introduction to Poetry,” he writes: “all they want to do is tie the poem to a chair with rope/and torture a confession out of it./They begin beating it with a hose/to find out what it really means.”
The point of reading a poem is not to try to “solve” it. Still, that quantifiable process of demystification is precisely what teachers are encouraged to teach students, often in lieu of curating a powerful experience through literature. The literature itself becomes secondary, boiled down to its Cliff’s Notes demi-glace. I haven’t wanted to risk that with the poems that enchanted me in my youth.
Teachers should produce literature lovers as well as keen critics, striking a balance between teaching writing, grammar, and analytical strategies while also helping students to see that,
…literature should be mystifying.
It should resist easy interpretation and beg for return visits. Poetry serves this purpose perfectly. I am confident my 12th graders know how to write essays. I know they can mine a text for subtle messages. But I worry sometimes if they’ve learned this lesson. In May, a month before they graduate, I may read some poetry with my seniors—to drive home that and nothing more.
ABOUT THE AUTHOR:
Andrew Simmons is a writer, teacher, and musician based in California. He has written for The New York Times, Slate, and The Believer.
This post is a summary of a great article by Maria Popova of Brain Pickings. You can read the full post here.
She summarises the work of Martha Nussbaum, who shows that storytelling belongs to the realm of moral philosophy. Both narrative and play deepen the inner world; it becomes a place for individual creative effort and for the trusting differentiation of self from the world.
“The power of ‘the Eye of the Heart,’ which produces insight, is vastly superior to the power of thought, which produces opinions,”
the great British economic theorist and philosopher E.F. Schumacher wrote in his 1973 meditation on how we know what we know. He was responding to the Persian poet and philosopher Rumi who, seven centuries earlier, extolled “the eye of the heart” as seventy-fold more seeing than the “sensible eyes” of the intellect.
To the intellectually ambitious, this might sound like a squishy notion. But as contemporary scientists continue to shed light on how our emotions affect our susceptibility to disease, it is becoming increasingly clear that our emotional lives are equipped with a special and non-negligible kind of bodily and cognitive intelligence.
The nature of that intelligence and how we can harness its power is what Martha Nussbaum examines in her magnificent 2001 book Upheavals of Thought: The Intelligence of Emotions. Nussbaum’s treatise offers a lucid counterpoint to the old idea that our emotions are merely animal energies or primal impulses wholly separate from our cognition. Instead, she argues that they are a centerpiece of moral philosophy and that any substantive theory of ethics necessitates a substantive understanding of the emotions.
One of Nussbaum’s central points is that the complex cognitive structure of the emotions has a narrative form — that is, the stories we tell ourselves about who we are and what we feel shape our emotional and ethical reality, which of course is the great psychological function of literature and the reason why art can function as a form of therapy. What emerges is an intelligent manifesto for including the storytelling arts in moral philosophy.
We cannot understand [a person’s] love … without knowing a great deal about the history of patterns of attachment that extend back into [the person’s] childhood. Past loves shadow present attachments, and take up residence within them. This, in turn, suggests that in order to talk well about them we will need to turn to texts that contain a narrative dimension, thus deepening and refining our grasp of ourselves as beings with a complicated temporal history.
She revisits the rationale behind the book’s title:
Emotions should be understood as “geological upheavals of thought”: as judgments in which people acknowledge the great importance, for their own flourishing, of things that they do not fully control — and acknowledge thereby their neediness before the world and its events.
But this neediness — a notion invariably shrouded in negative judgment and shame, for it connotes an admission of our lack of command — is one of the essential features that make us human. Nussbaum writes:
Human beings appear to be the only mortal finite beings who wish to transcend their finitude. Thus they are the only emotional beings who wish not to be emotional, who wish to withhold these acknowledgments of neediness and to design for themselves a life in which these acknowledgments have no place. This means that they frequently learn to reject their own vulnerability and to suppress awareness of the attachments that entail it. We might also say … that they are the only animals for whom neediness is a source of shame, and who take pride in themselves to the extent to which they have allegedly gotten clear of vulnerability.
And yet neediness, Nussbaum argues, is central to our developmental process as human beings. Much like frustration is essential for satisfaction, neediness becomes essential for our sense of control:
The process of development entails many moments of discomfort and frustration. Indeed, some frustration of the infant’s wants by the caretaker’s separate comings and goings is essential to development — for if everything were always simply given in advance of discomfort, the child would never try out its own projects of control.
The child’s evolving recognition that the caretaker sometimes fails to bring it what it wants gives rise to an anger that is closely linked to its emerging love. Indeed, the very recognition that both good things and their absence have an external source guarantees the presence of both of these emotions — although the infant has not yet recognized that both take a single person as their object.
This interplay of two imperfect beings is, as Joseph Campbell memorably observed, the essence of romantic love. An intolerance for imperfection and for the basic humanity of our own neediness, Nussbaum notes, can impede our very capacity for connection and make our emotions appear as blindsiding, incomprehensible events that befall us rather than a singular form of our natural intelligence:
The emotions of the adult life sometimes feel as if they flood up out of nowhere, in ways that don’t match our present view of our objects or their value. This will be especially true of the person who maintains some kind of false self-defense, and who is in consequence out of touch with the emotions of neediness and dependence, or of anger and aggression, that characterize the true self.
Nussbaum returns to the narrative structure of the emotions and how storytelling can help us rewire our relationship to neediness:
The understanding of any single emotion is incomplete unless its narrative history is grasped and studied for the light it sheds on the present response. This already suggests a central role for the arts in human self-understanding: for narrative artworks of various kinds (whether musical or visual or literary) give us information about these emotion-histories that we could not easily get otherwise. This is what Proust meant when he claimed that certain truths about the human emotions can be best conveyed, in verbal and textual form, only by a narrative work of art: only such a work will accurately and fully show the interrelated temporal structure of emotional “thoughts,” prominently including the heart’s intermittences between recognition and denial of neediness.
Narrative artworks are important for what they show the person who is eager to understand the emotions; they are also important because of what they do in the emotional life. They do not simply represent that history, they enter into it. Storytelling and narrative play are essential in cultivating the child’s sense of her own aloneness, her inner world. Her capacity to be alone is supported by the ability to imagine the good object’s presence when the object is not present, and to play at presence and absence using toys that serve the function of “transitional objects.” As time goes on, this play deepens the inner world; it becomes a place for individual creative effort and hence for trusting differentiation of self from world.
In the remainder of Upheavals of Thought, which remains a revelatory read in its hefty totality, Nussbaum goes on to explore how the narrative arts can reshape our psychoemotional constitution and how understanding the intelligence of the emotions can help us navigate the messiness of grief, love, anger, and fear.
Beneath the entertainment and diversion of narrative and art lies a great power: the power to tell a truth or truths. Many of us watch a film or read a book to escape from the real world, but there is in fact a much greater and deeper purpose to story and art.
Post-enlightenment theory and post-modern philosophy would have us believe that “there can be no certainty in an objective reality or morality.” The only certainty we can have is our own existence and experience.
In contrast, for narrative to work, a story must exist within a world built upon various rules: a historical, political, geographical, and moral framework, one fitted with religions, fate and destiny for the characters, and a trajectory and denouement for the plot. The hero belongs to this world and explores it, constrained by its rules and limitations, struggling against foes therein, travelling through its landscape and striving to find catharsis and resolution.
Immersion in a narrative world, for the contemporary reader, is immersion in a world of objective meaning, against which the protagonist can struggle to find themselves. By following the protagonist through this world, the reader can find some foundation from which to understand their own world, a world often too terrible and great to understand alone through one’s subjective lens.
Andrew Chan and Myuran Sukumaran inspired other inmates to get involved in meaningful pursuits rather than waste their lives.
A young girl is guided to place a candle on a flower wall that reads “#keephopealive” as part of an Amnesty International vigil for the Bali nine duo, Andrew Chan and Myuran Sukumaran. Photo: Getty Images
It is hard to imagine the final thoughts of Andrew Chan as he waited, tied to a stake, for bullets to tear into his flesh and swiftly bring his life to a violent, inglorious end. If you contemplate this even briefly you can sense something of the chilling terror of that moment. It’s a tragedy when any life ends prematurely, but somehow the cold mechanical intentionality of execution carries an especially grave weight. Australians have been feeling that weight of late, despite the fact that executions are routine in various parts of the world, including in the United States.
Chan famously became a Christian in jail, studied theology, and right up until his transportation to Nusakambangan Island led the English worship service in Kerobokan prison. In recent weeks he was ordained as a minister of the church. He attributed his radically changed life entirely to his religious conversion.
Not everyone buys it. “Jailhouse religion” is a pejorative term for crims finding God in the slammer in the hope of a reprieve or better treatment. There have been some notorious cases. Charles “Tex” Watson, Charles Manson’s right-hand man, has been a Christian since 1975 and these days, despite being unlikely ever to be released, is an ordained minister. Even serial killer Ted Bundy claimed a dramatic conversion on his way to the electric chair.
But as his appointment with death loomed, Australians became familiar with the stories of Chan and fellow convicted drug trafficker Myuran Sukumaran becoming model prisoners and an inspiration to other inmates to clean up their acts and get involved in meaningful pursuits rather than waste away.
And while myriad voices, from politicians to celebrities of various stripes, have “stood for mercy” and begged for forgiveness, others have resolutely focused on the victims of the drug trade and the many lives that stood to suffer irreparable damage had the infamous shipment of heroin made it through customs in Australia a decade ago. These people find it hard to feel much sympathy for the condemned pair, believing Chan and Sukumaran’s perilous venture was utterly selfish. They knew the risks and simply had to accept the punishment.
Chan certainly faced up to his crime and accepted that he was profoundly in the wrong when he embarked on that ill-advised venture as a 21-year-old. The Christian faith that he claimed as his own has never been about a gathering of the “good people”, rather it is more like the “league of the guilty”, as Francis Spufford called it – the solidarity of those who know they need forgiveness and redemption.
Chan considered his former life a waste and of no benefit to anyone, and it was in reading the Bible that he came to see the value of being a blessing to others. Despite his incarceration he interpreted his conversion as being set free from the inside. He is now famous for being such a calming and life-affirming presence among the other prisoners that even the Kerobokan prison governor was one of those appealing for a last-minute reprieve.
Close observers have noted how calm and positive the 31-year-old Chan – who married his fiancée on the eve of his execution – remained right to the end. Jeff Hammond was for four years a spiritual counsellor and pastor to Chan. After visiting him for the last time Hammond told Fran Kelly at ABC Radio National: “He’s got a peace within his own heart … his hope whether he lives or whether he dies [is] that the fruit that he’s been able to produce will continue to be a blessing to other people.”
What do we make of this hope that Chan carried with him right up to his death? It’s fair to say that if there is no God and his radically changed life was all the result of some grand illusion, then the whole sorry episode does begin to look like a meaningless waste for all involved. More broadly, such a lack must mean that there is no hope of ultimate justice or mercy for any of us. And perhaps that’s right and we need to face it. Religion as a crutch is a familiar trope for those who aren’t convinced of its substance.
But if Chan was onto something all those years ago when, in solitary confinement, he first sensed God alongside him as he read the gospel accounts of Jesus’ life, then he joins many others who have gone to their deaths in similar circumstances, not glad of the fate that awaited them, but hopeful, even confident that through their death they were in fact being ushered into life; that they were in a mysterious but profound sense going home.
Dietrich Bonhoeffer, the famous German theologian executed by the Nazis for his part in a plot to kill Hitler, died when he was 39 years old and engaged to be married. The Nazis sent him to the gallows at Flossenbürg concentration camp two weeks before it was liberated. As he was led away from his cell he is reported to have said to another prisoner, “This is the end … For me, the beginning of life.” It is the very idea that gave Chan a reason not to utterly despair as he stood to face the dreaded line of rifles pointed cruelly in his direction.
Given the torment and unimaginable stress of recent months, when any night a knock on his cell door could signal the end, perhaps he also felt the release of a weight built up over a decade of anguished expectation, expressed immortally by Dickens’ Sydney Carton on his way to the guillotine in A Tale of Two Cities: “It is a far, far better rest that I go to than I have ever known.”
Simon Smart is director of the Centre for Public Christianity. He is the co-author with Jane Caro, Antony Loewenstein, and Rachel Woodlock of For God’s Sake – an atheist, a Jew, a Christian and a Muslim Debate Religion.
The cave you fear to enter holds the treasure you seek. – Joseph Campbell
With this one line, Joseph Campbell captures the power and significance of narrative to our lives. Campbell identified the archetype of the Hero’s Journey and its presence in the myths and legends of every culture.
In the first chapter of his work “The Hero with a Thousand Faces,” he writes:
It has always been the prime function of mythology and rite to supply the symbols that carry the human spirit forward, in counteraction to those that tend to tie it back. In fact, it may very well be that the very high incidence of neuroticism among ourselves follows the decline among us of such effective spiritual aid.
The first work of the hero is to retreat from the world scene of secondary effects to those causal zones of the psyche where the difficulties really reside, and there to clarify the difficulties, eradicate them in his own case (i.e., give battle to the nursery demons of his local culture) and break through to the undistorted, direct experience and assimilation of what [Carl] Jung called “the archetypal images.”
Thanks again to the marvellous Brain Pickings and TED-Ed, this video tells of Joseph Campbell’s “monomyth”, or hero’s journey, and its timeless significance to our lives.
For him atheism is a much more internally consistent belief system.
It avoids the prickly internal contradiction of maintaining that there is an all-knowing, all-good and all-powerful God responsible for this world who is also desiring of our unending gratitude and praise.
Cultural commentator Russell Brand, mouthpiece for the spiritual awakening pervasive in western culture, had his reply on The Trews.
The debate is interesting because it drills down beyond dogma into the narrative of belief systems. Every world view has a story at its heart and from this core narrative we draw the meaning of our existence.
The narrative of Buddhism says suffering is an illusion tied to desire. If we achieve detachment from desire we can escape the world of suffering and so the world of rebirth.
The narrative of Hinduism says suffering is merited, and karmic cycles deliver suffering upon us for past misdemeanours.
The narrative of Islam says God is far greater than humanity, and God’s greater wisdom means humans cannot understand the meaning of their suffering.
The narrative of atheism says suffering is entirely meaningless [as is joy or evil]. The locus of reality lies in existential being.
What all these narratives agree on is that suffering incites in us a sense of justice. From it we gain a sense of meaning outside of our own experiences, a solidarity with others who suffer. Suffering gives us a knowledge that all is not right with this world and that suffering is inherently wrong for the human condition.
To me, the Hebrew understanding of suffering finds its most profound illustration in the Book of Job.
The narrative of Job shows that suffering is real and often unmerited. Job chooses not to resign himself to God’s mystery.
His suffering presses him to go beyond religion.
Job then has the choice to turn from God to nihilism but instead he turns TO God with a daring challenge. “Show yourself.”
God created this mess and so only God can stand between an imperfect humanity and a perfect God and arbitrate.
In doing so, Job is declared righteous, as righteous as any of the covenant. It’s not blood sacrifice, circumcision, baptism, church attendance, meditation, renunciation, humility, penance, piety or prayers that God smiles upon. From the very beginning it’s faith.
It’s the vision of God standing between us and Godself, a God-man ultimately carrying our suffering.
This redemption gives ultimate meaning to our suffering, not removing it but bearing with us, walking with us, taking away our tears with a glorious future hope.
In the west, we espouse a liberal tolerance of all points of view, asserting there is no such thing as “ultimate truth.” This is itself a truth claim, but we count it a valid one because it supports freedom of thought. So we believe in individual freedom.
We don’t believe in any overarching system of ethics or truth, until another culture contravenes our ideas of what is right and wrong. Case in point: what greater evil is there than the censorship of free speech? Right?
In western nations, we believe in the power of forgiveness but not in oppressive views or regulations about sexuality. Other cultures believe in conservative sexual values, but not necessarily in our liberal notions of forgiveness; an honour-shame society, for example, does not.
What is right and what is wrong? Our bias tells us our ways are right and others’ are wrong. Others’ truth claims lead to violence and hate; our truth claims are valid because they endorse freedom and life.
In western nations, we hold dearly to notions of liberal individualism, yet imposing such notions on developing communities, essentially divorcing the individual as an entity from their community, wreaks havoc both for the individual and for the community in question. So well-meaning help, offered from the vantage point of what we value highly, can actually be a violence to a community.
This raises the question of whether there is an ultimate narrative to aspire to understanding – an ultimate hero-journey, an ultimate discovery of “what is” that will guide our way. Or do we simply impose order and narrative onto life? This quote caught my eye recently in the Huffington Post.
In 2009, Julianne Moore’s mother, Anne Smith, died suddenly of septic shock. She was 68, and Moore was devastated. After that, she stopped believing in God. “I learned when my mother died five years ago that there is no ‘there’ there,” Moore, 54, told the Hollywood Reporter.
“Structure, it’s all imposed. We impose order and narrative on everything in order to understand it. Otherwise, there’s nothing but chaos.”
Do we impose a narrative on life – or is there a narrative there to discover? Ultimately, what is truth?
Interestingly, Pilate asked the same question of Christ. John 18 recounts:
With this he went out again to the Jews gathered there and said, “I find no basis for a charge against him. But it is your custom for me to release to you one prisoner at the time of the Passover. Do you want me to release ‘the king of the Jews’?”
They shouted back, “No, not him! Give us Barabbas!”
In John’s account, Jesus makes the startling claim not merely to “speak the truth” but to “be the truth” that all truth-tellers speak of.
In our understanding, the teachings of Christ are good and moral. He taught us to forgive, to show mercy, to love our enemies. He gave up his life for these values. He was an iconoclast, a prophet not unlike Gandhi or Siddhartha.
His audacious claims tell us a few things:
He never wished merely to be a good teacher pointing to the truth. Because of this claim, he cannot simply be ranked among the good teachers.
In the words of C S Lewis, “He is either a lunatic, a liar or …”
So, what do we do with his claim to BE the truth? If he claimed to embody the truth, this truth must be something like freedom or life, the only things that are of ultimate value rather than relative worth.
Science makes truth claims, but science is a system of falsifiable empirical tests. Scientific claims don’t seek to control us, but rather support our understanding of the reality we live in. Moreover, the claims of science are ultimately disprovable, and the next test or proof can totally shift our understanding of reality to a new and deeper truth claim.
C S Lewis explained his belief in God:
I believe Christianity just as I believe the sun rises, not only because I see it, but because by it I see everything else.
So Christ claimed to be the light by which we would see the world and reality.
In narrative terms, Christ claimed to be the ultimate narrative to aspire to, the ultimate meaning in the universe. He stated that we do not simply “impose order and narrative” onto everything, but that his IS the grand narrative.