Page 2804 – Christianity Today (2024)

by Matthew Sleeth

Good Germs, Bad Germs.


When my son Clark was little, he was prone to upper respiratory infections. I used to call him over, pull out my handkerchief, and tell him to blow. Now and then, I commented, “Wow, you’re leaking a lot of brain lubricant.” The poor guy. Years later, he told me he had taken me at my word. As a result, he’d gone around sniffing to keep his brain from losing all its lubricant. I wonder what would have become of him if I’d told the truth about the teeming masses of bacteria in his runny nose.


Good Germs, Bad Germs: Health and Survival in a Bacterial World

Jessica Snyder Sachs

Hill and Wang

304 pages

$23.53

Clark, now 19, long ago forgave, if he never quite forgot, my doctor humor. He was recently home on his college break when I received for review Jessica Snyder Sachs’ Good Germs, Bad Germs: Health and Survival in a Bacterial World. Before I get to the book, I’ll digress a bit more. Over Clark’s school break, he asked me to go see the movie I Am Legend, starring Will Smith. As it turns out, Clark’s movie choice was rather providential. Next fall my son will start medical school. I’m not sure whether his generation of doctors has more to learn from Good Germs, Bad Germs or from I Am Legend.

The film begins with a scene from a TV newsroom. Karen at the health desk is interviewing Dr. Alice Krippin about her medical invention. “Give it to me in a nutshell,” Karen prompts. “The premise is quite simple,” Dr. Krippin begins. “Take something that is designed by nature and reprogram it to make it work for the body rather than against it.” We learn that Dr. Krippin’s team has done clinical trials on 10,009 patients using a genetically altered measles virus. In follow-ups, all are cancer free. The doctor is asked if she has found the cure for cancer. “Yes. Yes. Yes, we have,” she says—as the scene shifts to a post-apocalyptic world a few years later.

Will Smith, in the role of virologist Robert Neville, is the last human inhabitant of Manhattan Island, except for a bunch of ghoulish, bloodthirsty cancer-vaccine “survivors.” It seems the cancer cure has left everybody dead—or “undead”—except for Neville, who is immune to the vaccine’s side effects. About thirty minutes into the film, Neville injects a captive, unconscious vampire girl. She lunges toward him, and that’s when I told Clark I had to leave the theater.

In my first year of residency, I’d been surgically inserting a right subclavian line on a comatose, near-death patient when my needle must have hit a nerve. The unconscious patient sat bolt upright, opened his eyes, and stared at me just like the zombie Neville injects. Decades later, the memory of this patient still undoes my composure.

Now to Sachs’ fine book. It begins with a real-life prologue about a college student who is well one day, and the next day rapidly goes into septic shock and dies. Throughout her narrative, Sachs interjects stories such as this, and herein lies much of the book’s hold on the reader.

In Part 1, “The War on Germs,” we meet the Renaissance physician Girolamo Fracastoro, whose 1530 text on “the French disease” was composed in Latin hexameter. His poetic treatise on syphilis was ahead of its time, correctly postulating a microbial vector and setting the stage for a branch of modern medicine.

Sachs does not mention that one of the early cures for syphilis was to have patients contract malaria—the subsequent high fever proved too much for the pesky spirochete—but she does trace other toxic cures, and then pauses at Paul Ehrlich’s 1908 introduction of Salvarsan, which was effective against syphilis. From there she moves to the modern antibiotic era with Alexander Fleming’s 1928 serendipitous observation of Penicillium mold, which had contaminated and thus inhibited the growth of colonies of Staphylococcus aureus.

In the chapter “Life on Man,” Sachs provides a fascinating description of the bacterial colonization of the human landscape. Just 24 hours after birth, our skin sports one thousand bacteria per square centimeter. At 48 hours, the number jumps to ten thousand. We hit the hundred thousand mark by six weeks. It is this dense forest of one hundred billion friendly bacteria on our skin that guards us from the rare, unfriendly sorts. Fifteen trillion essential bacteria line and protect our empty digestive tracts. We learn that the type and count of bacteria are affected by emotional states and, even more intriguing, that the bacteria can, and do, signal our cells to enhance these symbiotic relationships.

One of the book’s strong points is its blend of the highly technical with the everyday. There is enough of the nonscientific to keep all but the most unrepentant technophobes slogging along. Hang on through some subjects that just cannot be made any simpler, and you will be rewarded with stories that no one taught us in med school. For example, in 1959, while filming Cleopatra, Liz Taylor fell ill with a deadly, resistant form of staph pneumonia. An experimental batch of methicillin saved her life. Thousands of her fans and dozens of her husbands owe a debt of thanks to the antibiotic maker. If that’s not enough to pique your interest, there is even a lurid description of how bacteria, once thought to be asexual in their reproductive life, have sex. This is one of the mechanisms whereby bacteria transfer antibiotic resistance from one to another and— shockingly—from one species to another.

Transferring and developing resistance to antibiotics is what much of Sachs’ book is about. It is a frightening subject that has made many a headline. But the untold side of the story is that many bacteria simply stop being harmful. Strep throat no longer carries the death sentence of resulting rheumatic heart disease and glomerulonephritis that it once did. Smallpox has been eradicated and, for the most part, tuberculosis is no longer the scourge of European cities. For unknown reasons, the plague ceased to be the threat it was even before the advent of antibiotics. There is some good news.

Sherlock Holmes, the fictional invention of a physician, was a clever investigator. He taught his pupils to look for clues; he also taught them that some clues were telling by their absence. If I have one criticism of Good Germs, Bad Germs, it is that one of the great infectious disease tales is missing—that of HIV/AIDS. The disease, the introduction of antiretroviral drugs, emergent resistance to them, and the use of antibiotics in treating immunocompromised patients: this story, so apt for Sachs’ theme, is mysteriously absent from her book.

We come now to what I believe is the single most important story in Good Germs, Bad Germs. In 1986, Michael Zasloff, a researcher at the NIH, stumbled upon a chemical that helps frogs fight off bacteria. The substance consists of short chains of amino acids. These antimicrobial peptides also are made by humans. They bathe our eyes and skin with their protective activity. Zasloff realized that the amphibian version of these chemicals was particularly potent.

It seemed that Zasloff had found a safe new form of antibiotic that bacteria could not adapt or mutate to resist. The New York Times lauded the discovery and pronounced, “Dr. Zasloff will have produced a fine successor to penicillin.” Zasloff and investors rushed to license the new wonder drug. Despite the approval of the Times editorial staff, the FDA demanded more clinical trials.

Enter two heroes: biologists Graham Bell and Pierre-Henri Gouyon. They published an opinion piece calling for restraint. Bacteria have the habit of becoming resistant to antibiotics once those drugs are in widespread use. They reasoned that, even though antimicrobial peptides operate differently than penicillin or other antibiotics, resistance could happen again. (In the United States alone, 25 million pounds of antibiotics are given to animals and three million pounds to humans annually.)

Zasloff replied in the press, calling Bell and Gouyon’s logic “fundamentally wrong.” What ensued was the equivalent of a wrestling match. Zasloff dared Bell to grow bacteria that could develop resistance to his patent medicine. Bell took up the challenge and, with his tag-team assistant, grew 22 colonies of resistant E. coli and pseudomonas.

What is so significant about this? If Zasloff, SmithKline Beecham, or others interested in the peptides had brought the drugs to market, the result might well have been bacteria resistant to our natural lines of defense. A “boo-boo” on the knee could have become an almost certain “bye-bye.” To his credit, Zasloff admitted the error of his own thinking and the validity of Bell and Gouyon’s.

Zasloff meant well. But, as my maternal grandmother was fond of saying, “The road to hell is paved with good intentions.” Today’s technology has the capacity to do great harm. Genetic engineering and the development of microbial, antimicrobial, and chemotherapeutic agents already have met with disaster and near disaster. If the past has anything to teach us, it is this: “First, do no harm.”

I’ve passed out many prescriptions for antibiotics. Some, I’m sure, were not needed, but in the ER setting, I could not be confident that the infection would go away without medicine, and I worked for individual patients. The role of those who regulate new therapies is to protect society in general. They cannot be swayed by anecdotal sentiment, no matter how compelling.

Good Germs, Bad Germs and books like it have something to teach a society dizzy with the hubris of science. I was able to walk out on I Am Legend when it got too scary, but if a Pandora escapes from the gene-splicing lab, it will not be so simple. May God grant the next generation of doctors and scientists—including my son—a greater wisdom than ours.

Matthew Sleeth, a physician, is director of Blessed Earth (www.servegodsavetheplanet.org). He is the author of Serve God, Save the Planet (Chelsea Green/Zondervan).

Copyright © 2008 by the author or Christianity Today/Books & Culture magazine.


by Randall Balmer

Ron Sider’s Scandal of Evangelical Politics.


At a time when evangelical leaders were slobbering over Richard Nixon, Ronald Sider’s voice was tonic—especially for a college student still puzzling over how a tradition once identified with social justice could have negotiated such a radical right turn. By the time Rich Christians in an Age of Hunger appeared in 1978, Sider had emerged as one of my evangelical heroes. Here was a man who had organized Evangelicals for McGovern in 1972 (whose entire caucus, I suspect, could be tallied on two hands), and who had been the guiding force behind the Chicago Declaration of Evangelical Social Concern the ensuing year.

In light of the rise and the eventual dominance of the Religious Right later that same decade, the sentiments expressed in the Chicago Declaration seem quaint now. But it was a remarkable statement. “We deplore the historic involvement of the church in America with racism,” the declaration read, adding that evangelicals must “challenge the misplaced trust of the nation in economic and military might.” At the instigation of Nancy Hardesty, then an English professor at my Christian college, the Chicago Declaration included a passage that, harking back to the rich tradition of evangelical feminism in the 19th century, rebuked evangelicals for having “encouraged men to prideful domination and women to irresponsible passivity” and called “both men and women to mutual submission and active discipleship.”

Following the Chicago Declaration, Sider went on to form Evangelicals for Social Action and to write a number of books (including Rich Christians), which generally fall under the rubric of evangelical social ethics. His latest contribution is The Scandal of Evangelical Politics: Why Are Christians Missing the Chance to Really Change the World?—a book that, on the whole, is as disappointing as Rich Christians was bracing.

Sider notes that the “absence of any widely accepted, systematic evangelical reflection on politics leads to contradiction, confusion, ineffectiveness, even biblical unfaithfulness, in our political work.” Reviewing the political ideologies of various Christian thinkers through the centuries, from Augustine and Aquinas to Martin Luther, John Calvin, and the Anabaptists, Sider makes the point, echoing Luther, that the primary function of the state is the restraint of evil so that the gospel can flourish. Sider also considers, and finds wanting, the ideas of John Rawls, though I wish he had spent some time—any time at all!—on Jeffrey Stout, especially his Democracy and Tradition.

All of this is useful, and the author renders his thoughts cogently and persuasively. But as Sider moves from what he calls a “solid framework” to an “evangelical political philosophy,” he suffers a disheartening—and uncharacteristic—failure of nerve.

Sider speaks eloquently about the possibilities of peacemaking and invokes the “just war” tradition, but he neglects to mention that the invasion of Iraq meets few, if any, of these criteria. He rails against no-fault divorce, which is a defensible argument, though it ignores the fact that vindictive spouses can “game” the system to punish entire families with protracted divorce proceedings. He asserts that a “strong evangelical support for global human rights (especially religious freedom) led to what some have called a new evangelical ‘internationalism,’ ” but he fails to note the current “evangelical” president’s demonstrated disregard for human rights.

On homosexuality, Sider unblinkingly employs the Religious Right’s preferred incendiary term, “gay lifestyle,” implying that sexual orientation is simply a matter of volition. (As a gay friend of mine once asked, incredulously: “Why would anyone choose to be gay?”) On the separation of church and state, Sider dithers before finally lending his endorsement to the disestablishment clause of the First Amendment. But he misses his best arguments: Religion has flourished here in the United States as nowhere else precisely because the government—for the most part, at least—has stayed out of the religion business, and the collusion between church and state ultimately trivializes the faith. Sider can’t bring himself to take a position on taxpayer-supported vouchers for religious schools, however, and at times his ducking and weaving borders on comical. What about “In God We Trust” emblazoned on our currency, government-supported chaplains, or references to the Deity in the pledge of allegiance? “I doubt that either retaining or abandoning these practices would be very significant,” Sider concludes, “although the debates will undoubtedly continue.”

Sider does make some good points. He argues that the importance once ascribed to the holding of property should be reconfigured as equal access to education; knowledge, he writes, “is the primary source of wealth creation.” He also warns, in a distant echo of the Chicago Declaration, that “Christians must be extremely vigilant against the ongoing temptations of idolatrous nationalism.”

By the time I finished reading The Scandal of Evangelical Politics, however, I was scratching my head. Where’s the scandal? Sider’s criticisms are so measured and his proposals so tepid that the book reads more like an endorsement of evangelical political behavior over the last several decades than a critique. What happened to the author of Rich Christians in an Age of Hunger, who boldly summoned us to heed Jesus’ injunctions to care for “the least of these”? Surely, those of us who profess allegiance to the scandal of the gospel cannot simply accede to the status quo or the tired playbook coming out of Colorado Springs.

So what is the scandal of evangelical politics? The persistence of hunger in a land of plenty? The fundamental contradiction between pressing for “intelligent design” in public school curricula and utter indifference to the handiwork of the Intelligent Designer? The failure to purge misogynists and white supremacists from the highest echelons of evangelical leadership? The failure of evangelicals to rise up in collective moral outrage over the present administration’s persistent and systematic use of torture?

I returned to Sider’s preface in search of a scandal. The best that I could determine was that evangelicals had failed to “move from a commitment to Jesus Christ and biblical authority to concrete political decisions that lead us to support or oppose specific laws and candidates.” Fair enough, though it’s not clear how the book helps us address that scandal.

What made me even more uneasy was the triumphalism that tinges the conclusion to Sider’s preface. “All around the world,” he writes, “evangelical thinkers and politicians are wrestling at a deeper level with how to act politically in faithfulness to Christ.” And the payoff? “If even a modest fraction of that rapidly growing number of 500 million evangelicals and Pentecostals would develop a commonly embraced, biblically grounded framework for doing politics, they would change the world.”

Change the world by “doing politics”? That’s a remarkable statement, especially from someone who hails from the Anabaptist tradition. Anabaptists understand better than most Jesus’ renunciation of earthly power and his declaration that his kingdom was not of this world. The cautionary lesson from the sorry saga of the Religious Right lies not in the movement’s political ineptitude, egregious as that has been, but in its devaluing of the gospel in the quest for political influence. The New Testament suggests that religion always functions best from the margins of society and not in the councils of power—a principle strongly reinforced by an overview of American history. Whenever people of faith begin grasping after power, they lose their prophetic voice. This was no less true of mainline Protestantism in the 1950s, tethered as it was to white, middle-class Eisenhower suburbanism, than it has been of the Religious Right in the decades surrounding the turn of the 21st century.

Am I arguing that people of faith should not make their voices heard in the arena of public discourse? On the contrary: I believe that public discourse would be impoverished without those voices. But we should never delude ourselves into thinking that “doing politics,” to use Sider’s phrase, represents the highest or the best or even a proximate expression of our prophetic mission. A prophet always stands at the margins, calling the powerful to account. Misplaced allegiance to political power represents a form of idolatry, and the failure of evangelicals generally and the Religious Right in particular to call politicians to account, especially those politicians they propelled into office, is the stuff of, well, scandal.

In a very real sense, albeit in a backhanded way, The Scandal of Evangelical Politics attests to the frightful potency of the Religious Right. The fact that one of our clearest, most prophetic voices has been reduced to equivocation may not rise to the level of scandal. But it is a tragedy.

Randall Balmer, an Episcopal priest, is professor of American religious history at Barnard College, Columbia University, and a visiting professor at Yale Divinity School. He is the author most recently of God in the White House: A History: How Faith Shaped the Presidency from John F. Kennedy to George W. Bush (HarperOne).



by Gary Scott Smith

Faith and the presidency from JFK to George W. Bush.


During the last five years numerous books and articles have analyzed the faith of American presidents, focusing on one or several chief executives or considering the broad sweep of the presidency. Randall Balmer’s God in the White House: How Faith Shaped the Presidency from John F. Kennedy to George W. Bush is a welcome addition to this literature. Balmer, a professor of American religious history at Barnard College, Columbia University and a leading scholar of American evangelicalism, traces how Americans moved from disregarding religion as a principal consideration for voting in 1960 to expecting candidates to reveal their religious convictions and explain their relationship to God by 2004. He analyzes and deplores both the “politicalization of religion” and the “‘religionization’ of politics” during these years.

Balmer labels himself an evangelical “whose understanding of the teachings of Jesus points him toward the left of the political spectrum.” He censures the leaders of the Religious Right for distorting the gospel and defaulting “on the noble legacy” of 19th-century evangelical activists who worked to help the less fortunate. When faith is “aligned too closely with a particular political movement or political party,” Balmer argues, it loses its integrity and prophetic voice. Religion plays a more positive role in society when it operates from “the margins of society,” not the centers of power.

These presuppositions guide Balmer’s thoughtful analysis of the nine presidents from Kennedy to George W. Bush. Balmer assesses the personal faith of these presidents and evaluates how it affected their work in the Oval Office; in a series of appendices he includes a major speech by each president to illustrate their religious convictions. Those presidents who strove to separate their faith from policymaking or used it to pursue liberal political ends are evaluated more positively.

Kennedy’s pledge to divorce his religious commitments from political considerations helped him win the closely contested 1960 election and “demolish the shibboleth that no Roman Catholic could ever be elected president.” Kennedy argued compellingly during the 1960 campaign that a president’s religion should not affect how he performed his duties. This conviction, coupled with the negative reaction of many Protestants to Kennedy’s Catholicism, led him to rely little on his faith in making decisions and formulating policies. Lyndon Johnson, despite exhibiting only “perfunctory, even performative” piety, was nevertheless inspired by the Golden Rule to develop his Great Society programs to help the poor and supply medical care for the elderly.

Balmer faults Richard Nixon for misusing religion, especially by holding worship services in the White House, and for hypocrisy. The stain of the Watergate scandal, the disgrace of the Vietnam War, and Gerald Ford’s pardon of Nixon enabled Jimmy Carter to appear as “a kind of savior” who could lead Americans “out of the wilderness of shame and corruption to the promised land of redemption and rehabilitation.” Intrigued and inspired by Carter’s claim that he was a “born again” Christian, many evangelicals voted for Carter in 1976. Despite Carter’s genuine piety and pursuit of numerous policies that reflected biblical priorities, most evangelicals deserted him in 1980. Initially galvanized by their desire to defend “the integrity of evangelical institutions against governmental interference,” Balmer argues—rather than by opposition to abortion—evangelicals, who had generally been politically disengaged, created the Moral Majority and similar organizations in the late 1970s to support candidates and policies consistent with their values. Upset by Carter’s refusal to try to outlaw abortion and his promotion of politically liberal policies, the Religious Right played an active role in helping elect Ronald Reagan, a divorced and remarried man who “had the weakest claim to evangelical faith” of the three major candidates. Preoccupied with the economy and the Soviets, Reagan neglected many key aspects of the Religious Right’s agenda. Nevertheless, most evangelicals loyally supported Reagan in the 1984 election and throughout the turmoil and scandals of his second term.

In 1988, evangelicals helped Episcopalian George H. W. Bush defeat Michael Dukakis, “the first truly secular major-party candidate for president,” but they embraced him less enthusiastically than Reagan. Although Bill Clinton professed to be a Christian, attended church regularly, and used evangelical rhetoric, his personal traits—especially his sexual infidelity—and liberal political policies irritated and offended many members of the Religious Right.

Evangelicals were attracted to George W. Bush’s Christian testimony, “compassionate conservatism,” and pledge to “restore decency and honor to the White House.” The 2000 election demonstrated that candidates’ faith had become important to many Americans, but voters were more concerned with the candidates’ sincerity than with the particularities of their religious commitments. Aided by John Kerry’s refusal to openly discuss his faith and his own frank professions of faith, Bush captured a large percentage of the votes of regular church attenders, enabling him to narrowly win reelection in 2004.

Based on this analysis, Balmer contends that no clear connection exists between a president’s faith and personal morality and his policies. The record of the last four and a half decades suggests that candidates’ professions of faith are “a fairly poor indicator of how they govern.” Although Balmer finds fault with all nine presidents, he is most critical of Republican chief executives. Reagan failed to deliver on his campaign promises that he claimed were inspired by his religious commitments, Balmer says. He excoriates George W. Bush, denouncing the “radical disjunction” between “Bush’s claims of moral rectitude and his indifference to the moral ramifications of his policies,” especially his “aggressive” military campaign in Iraq, which flouted Christian just-war criteria.

In light of the record of the past 45 years, Balmer concludes, it is unfortunate that Americans focus more on whether presidential candidates “pass some sort of catechetical test” than on whether they possess charisma, political skills, substantial foreign and domestic policy experience, and administrative expertise. Although voters should consider a candidate’s faith because it provides insight into his character, Balmer maintains, it should only be one of numerous factors they take into account. He faults Americans for expecting the president to be “the sum total of our projections about the supposed goodness and honor and moral superiority of America” and politicians for encouraging us to “see them as embodiments of our supposed virtue.”

While the presidency has been damaged by injecting religious considerations into it, Balmer insists, faith has been harmed by politicizing it. The reputation of Quakerism was not improved by its connection with Nixon, nor was that of the Disciples of Christ aided by its association with Johnson or Reagan. Moreover, Balmer asserts, the Religious Right gained very little from its active participation in the political process. Once a faith is identified “with a particular candidate or party or with the quest for political influence,” it suffers.

Balmer encourages prospective voters to ask candidates how their faith affects their views of economics, social issues, and foreign policy. He protests that candidates’ professions of faith are often pious platitudes or window-dressing, which provide little insight into how they will govern and divert our attention from more important considerations. Balmer complains that we permit “politicians to hypnotize us with lullabies about faith and morality” and fail to hold them accountable for the principles they profess. But the blame can’t be reserved for politicians: Americans’ “collective affirmations of faith are no more sincere than those of our politicians.” Balmer challenges Americans to reject “the false gospel of America’s moral superiority,” ensure that candidates’ actions are consistent with their religious rhetoric, and live by the ideals we profess.

Balmer’s critique of American Christians’ self-delusion and hubris is commendable. Certainly he is right to insist that the faith of candidates should only be one consideration in the electoral process. Throughout American history presidents who have claimed to be Christians have sometimes violated biblical morality and pursued policies that contradicted scriptural teaching. On the other hand, in many instances, the faith of presidents has strengthened their character, increased their courage and confidence, helped them deal with the immense challenges of their office, inspired them to exhort Americans to live up to their best ideals, and encouraged citizens to promote policies that truly embody biblical teaching.

Indeed, although the politicizing of religion involves dangers, and though presidents have often misused religious rhetoric to woo voters, win support for policies, and please various constituencies, their personal faith has generally helped them perform their duties more effectively. Moreover, at times in American history the participation of religious groups in the political process has helped make our nation more compassionate and just, as in the abolition of slavery, the promotion of civil rights, and various policies to aid the poor. Therefore, while criticizing the political misuse of religion by politicians, religious groups, and voters, we should encourage all three groups to consider carefully how biblical values and personal faith can help shape and direct the political process in ways that benefit our nation and the world.

Gary Scott Smith chairs the History Department at Grove City College. His latest book is Faith and the Presidency: From George Washington to George W. Bush (Oxford Univ. Press).



by Mark Noll

The Canadians, of course.


It is fitting that Preston Jones’ book on the public use of the Bible in Canada comes in an unassuming package. The volume is slim and marketed with the standard-issue cover that the University Press of America puts on most of its publications. It thus embodies the characteristic Canadian aversion to gaudiness, pretension, and hype, qualities that citizens to the North tend to see as characterizing their neighbors to the South. Jones, who teaches at John Brown University and who has published discerning works on subjects as diverse as Alaskan history and the East Asian sex trade, is not a Canadian himself, but long-time residence in Canada has allowed its culture to seep in.


A Highly Favored Nation: The Bible and Canadian Meaning, 1860–1900

Preston Jones

University Press of America

130 pages

$50.99

The book is, therefore, surprising for reaching much the same conclusion about the Bible in Canadian public life that others have reached for the United States. The Scriptures, that is, were ubiquitous at almost all levels of public discourse in the second half of the 19th century. But that very ubiquity revealed more about the Canadians’ skill at using the Bible for their own purposes than about any discernible sway the Bible exerted on their actions. In Jones’ words, “Canadian nationalists wrenched Bible verses out of context and arrived at implausible parallels between biblical history and their own and they waged rhetorical war against others with the words of the Bible.” As in the United States, especially before and during the Civil War, Canadians used Scripture to “promote opposing visions of the Canadian nation.” The result was that “the Bible’s status as something to be revered was diminished.”

This conclusion leads Jones to challenge historians (including myself) who have argued that Canada in the late 19th century came closer to the ideals of genuinely Christian civilization than did the United States. To the claim I once made that Canada had not thrown its weight around internationally as much as the United States, Jones' reply is peremptory: "If the language of Canada's … English-speaking nationalists is taken seriously, one can conclude that had Canada ever acquired any such weight, it would have been thrown."

Jones may not be entirely convincing in this comparative blame game. The ethnocentrism of Canada's British population was indeed deep and wide, but whether it reached the same depths or extended as broadly as American prejudice against colored peoples can be questioned. There was a great deal of discrimination in turn-of-the-century Canada that was supported by Bible-quoting Protestants, but no lynchings sanctioned by conservative evangelical clergy. Nevertheless, in a well-documented account of Canada's leading statesmen and Protestant ministers, Jones demonstrates that Canadian use of the Bible could be just as formulaic and just as merely rhetorical as the American.

The Dominion of Canada, which was established in 1867, took its name from Psalm 72:8: “And he shall have dominion also from sea to sea and from the river even unto the ends of the earth.” This ascription came after the Fathers of Confederation decided not to call Canada a “Kingdom” for fear of offending the triumphant republicanism of the victorious Union armies. Yet if pious Canadians were more successful at sneaking a few open biblical references into Canada’s founding documents than had been their American counterparts ninety years earlier, it did not mean, according to Jones’ convincing argument, that Canada’s scripturalism went deeper or was freer from hypocrisy than the American case.

The one possible exception to this sober conclusion is Quebec. The irony here is that Protestant denunciation of the province as a priest-ridden domain where laypeople were barred from opening the Bible actually spurred Quebec Catholic promotion of Scripture (just so long as it was an authorized French translation of the church's official Vulgate). One of the most interesting of the ironies concerned Charles Chiniquy, whose violently polemical Fifty Years in the Church of Rome, published first in 1886, remains in print to this day. Chiniquy began as a priest in Quebec who was favored by the hierarchy because of his able promotion of temperance. But repeated immoralities cost him the favor of his superiors, his headstrong independence led to excommunication in 1856, and his animus against the church then grounded a long career as an anti-Roman crusader. In the era's standard Protestant propaganda, the anti-Catholic Chiniquy repeated the charge that Catholics were prohibited from reading the Bible. In response, Quebec Catholics unearthed the records of a debate between Father Chiniquy and a Protestant pastor from 1841, in which Chiniquy had displayed a New Testament he called his constant companion and repeated the church's injunctions urging lay attention to Scripture.

More edifying than the period's Catholic-Protestant polemics is Jones' account of how leading Quebec figures used the Bible for outlining Quebec's unique status as a people "under God."[1] To them, by contrast, it was French and Catholic Canada that deserved to be considered the antitype of biblical narrative. For over fifty years, as Quebec suffered growing economic subservience to English Canada, and as up to half a million Québécois immigrated to the United States, a number of prominent clerics promoted the view that French Canada was nothing less than God's new Israel. Most prominent in this number was the third bishop of Trois-Rivières, Louis-François Laflèche (1818-1898), who in 1866 published a major work, Some Considerations on the Connections between Civil Society and Religion and the Family, that led on to a lifetime of further exploration of French Canada's unique place in God's design for the world as a whole.

Laflèche proved a master in using the Bible to defend his conviction that “our mission and our national destiny are the work of native missions and the extension of the Kingdom of God by the formation of a Catholic people in the Valley of the St. Lawrence.”[2] In making these arguments Laflèche called upon more Scripture than almost any of his Canadian or American contemporaries, and he actually developed a theology to provide not only an inspiring vision for Quebec nationalism, but also a practical antidote against “la fièvre de l’émigration.”[3]

Laflèche’s theology was based on a general conception of nationhood as part of God’s design first expressed to Abraham in Genesis chapters 12 and 13. Careful study of Scripture as well as careful attention to history proved to him that God judged the nations depending on how they fulfilled their missions under God. In Laflèche’s understanding, the exemplary record of the founders and early martyrs of French Canada verified the sacredness of Quebec’s destiny. Laflèche was not as sophisticated in his use of Scripture as, for example, Steven Keillor in his recently published case for deploying the category of divine judgment in assessing the United States in the wake of the terrorist attacks of September 2001.[4] But compared to others who were proposing scriptural accounts of Canadian or American nationhood during his own age, Laflèche stood out.

Preston Jones is not persuaded that Bishop Laflèche was using the Bible correctly by interpreting it as a Quebec-ocentric text. Nor does he romanticize Quebec Catholicism; he shows that even with serious encouragement from the Bishops, most Québécois were not going to be reading the Bible since the province’s rate of literacy was so low. Yet he does find in spokespersons like Laflèche, along with the biblical representations in the art and sculpture of Quebec churches, a public use of Scripture at least somewhat less superficial than in English Canada and the United States.

Canadian history deserves more attention than it receives from Americans because the northern neighbor’s differences from the United States are so small, yet also so strategic. In this case, paying attention to Jones’ carefully researched study would show breast-beaters (like myself) that American malfeasance is perhaps not as singular or as egregious as we sometimes think. It might encourage American filiopietists to pause when tempted to treat the United States as a unique object of God’s concern—if Canadians thought the same about their land (and if Russians, Germans, English, Dutch, South Africans, and Poles have done so too), then maybe no modern nation deserves such distinction. Most cogently, Jones shows that Canadian experience illustrates just as much as American experience the beatitude arising from treating God-given “holy things” as gifts from a merciful Sovereign, but also the enervating peril when these “holy things” are mishandled as objects of partisan advantage.

Mark Noll is Francis A. McAnaney Professor of History at the University of Notre Dame.

1. An earlier article in Books & Culture, which also benefited from some of Jones’ work, has already mentioned some of these figures: Mark Noll, “The Bible in American Public Life, 1860-2005,” September/October 2005, pp. 7, 46-50.

2. Louis-François Laflèche, Quelques Considérations sur les Rapports de la Société Civile avec la Religion et la Famille (Saint-Jacques, Quebec: Éditions du Pot de Fer, 1991 [orig. 1866]), p. 71: "notre mission et notre destinée nationale sont l'œuvre des missions sauvages, et l'extension du royaume de Dieu par la formation d'un peuple catholique dans la vallée du St.-Laurent."

3. Ibid., p. 25.

4. Steven Keillor, God’s Judgments: Interpreting History and the Christian Faith (InterVarsity Press, 2007); Brad Gregory discussed this work in Books & Culture, July/August 2007, pp. 18-19.



by Ken Stewart

Secularization and the university.


By common reckoning, Canada experienced secularization more rapidly in the 20th century than did the United States. Indeed, it is frequently remarked that as to manifestations of religious faith in public life, Canada more resembles the nations of Western Europe than she does the United States. It is the merit of Catherine Gidney’s A Long Eclipse that it calls into question the application of this broad-brush interpretation of comparative secularization to the unfolding direction of Canadian university life.


C. Stacey Woods and the Evangelical Rediscovery of the University

A. Donald MacLeod (Author)

IVP Academic

283 pages

$3.62


Conversant with the literature that suggests an extensive secularization of American universities by the 1920s, Gidney probed the institutional histories of five Canadian schools and found intriguing differences. According to her findings, Canadian universities, whether founded originally as church-backed or as government-funded institutions, reflected the ethos of mainline Protestantism into the late 1950s.

Canada’s universities (excepting Roman Catholic institutions) existed to provide training in arts and sciences for a populace deemed essentially Protestant. Presidents of such schools were men who had been church leaders, and they, with their university faculties, affirmed the indebtedness of the arts and sciences to the classical and Christian past. They understood their work to include the moral as well as the intellectual formation of their students. Not until 1960 did this world vanish.

Gidney, having gathered impressive data, is not loath to explore why the changes came, and when. In the period surveyed, she notes that Canadian university education ceased to be the privilege of the professional classes. Also, these institutions increasingly reflected the cultural pluralism which followed on Canada’s open immigration policies. Expanding enrollments required a proliferation of faculty members; these also were now more diverse. No longer could presidents hire only those whose academic credentials were augmented by loyalty to Christianity. Collectively these changes meant that Canada’s Protestant hegemony was diminished and that long-established university deference to Christianity had ended. The issue was ultimately forced when the still Christian-oriented universities could not fund the level of technological research necessitated by the Cold War era; they similarly lacked the resources to fund the postwar faculty call for an expansion of scarce graduate programs. At this point, even Canada’s church-backed universities were driven into the arms of their respective regional governments.

But the white flag of surrender had not been raised all at once. Gidney finds evidence that into the 1950s, Canada’s universities were still trying to maintain distinctively Christian content through core curriculum in Scripture and theology. Presidents who were rightly concerned about secularizing tendencies on their campuses gave backing to Christian student ministries such as the Student Christian Movement and, after some initial reluctance, to InterVarsity Christian Fellowship (IVCF). Gidney looks at Canada’s mainline Protestant churches and observes that these, which had founded and supported many of Canada’s universities, had by their own embrace of a liberalized Protestantism grown ambivalent in upholding Christianity’s uniqueness. The spiritual malaise of Canada’s universities was thus rooted in the failure of Canada’s mainline denominations to evangelize a secularizing culture. Here then are lessons to be pondered both by those who have lived to see their formerly Christian-oriented universities re-directed to other ends and by those who have strained to create new Christian colleges and universities since 1960.

This instructive volume provides us with a context for tracing two other trajectories: the expansion of a struggling new Christian initiative toward the university (IVCF) undertaken in Canada (and later in the United States) by an expatriate Australian, Stacey Woods (1909-1983); and the career of a Canadian determined to make a difference in Canada’s universities as a Christian historian, W. Stanford Reid (1913-1996). For the probing portrayal of these trajectories—so nearly parallel, yet so utterly distinguishable—we are the debtors of the Canadian church historian A. Donald MacLeod; he knew both his subjects well over decades.

That Woods was selected in 1934 to stabilize the fledgling Canadian InterVarsity movement—begun in a 1929 visit of British medical student Howard Guinness—was itself both a marvel and a parable. By what standard of reckoning was an Australian graduate of what is now Dallas Theological Seminary (then Evangelical Theological College), whose only prior experience of Canada was leading summer beach missions for adolescents, the natural choice for such a role? It was that both Woods and the major Canadian backers of the struggling student movement shared a Plymouth Brethren tie; their network had taken note of his beach ministry; and they could think of no one better suited to the task of rescuing the movement (of which they were the substantial backers) from its Depression-era jeopardy.

That was the marvel. The parable had to do with the fact that InterVarsity—which at this stage existed (on this side of the Atlantic) only in Canada—was turning for leadership to a person who was himself no product of the public university system. Credits from Woods’ first degree (in theology at Dallas) had been taken north to Wheaton, where after additional coursework, an arts degree had been conferred. No one would maintain that Woods was ill- or miseducated by this process; yet it provides some insight into the state of evangelicalism in 1934 that a fledgling work taking aim at the public university looked for leadership to one who would engage it only when he took this post.

Woods, something of a human dynamo at Toronto from 1934, was also in demand in Chicago by 1939. The movement that had been so near disintegration in 1934 had taken on new life in conjunction with Woods’ leadership. Entrepreneurial, innovative, yet also seemingly incapable of delegating responsibility to associates, Woods for a time directed InterVarsity in both countries. Important Chicago connections were formed with Christian philanthropists (notably Herbert W. Taylor), and soon the InterVarsity movement possessed camping properties in the Michigan peninsula, a student magazine (HIS), and a publishing arm (IVP). Before the 1940s were out, Woods—representing InterVarsity—had come to play a significant role in the young National Association of Evangelicals, and spent time at an evangelical think-tank at Plymouth, Massachusetts, in which names such as Harold Ockenga and Carl F. H. Henry figured prominently. By this time, Woods was also in the orbit of London minister (and British IVCF pillar) D. Martyn Lloyd-Jones, and with him helped to launch the global version of InterVarsity: the International Fellowship of Evangelical Students (IFES).

For a time, Woods divided his time between American InterVarsity and the IFES secretariat, on behalf of which he regularly crisscrossed the globe; from 1960 he gave himself wholly to IFES. By then, his close American associate, Charles Troutman, so often driven to distraction by his colleague’s failure to delegate responsibility, had left the scene to direct the movement in—of all places—Woods’ native Australia. It becomes clear that Woods’ genius was for launching new initiatives and scouting new territory for student ministry around the globe. The “perfecting” of student ministry, in the sense of advancing the Christian understanding of learning, was a task pursued more resolutely by those who followed Woods.

The “recovery of the university” in conjunction with this foundational era was in truth the pursuit of Christ’s Great Commission in the universities. A key figure in this enterprise was the prominent Canadian Christian scholar, W. S. Reid. A historian at McGill (Quebec) and Guelph (Ontario) universities, Reid would eventually be known as a contributing editor to Christianity Today magazine and as a frequent Staley lecturer on U.S. Christian college campuses.

Those who knew Reid as the historian and churchman that he was might—if they viewed the early parts of his life from the vantage point of the last—suppose that it was bound to unfold just as it did. How easy it would be to suppose that his early academic prowess, the influence of a father and uncles (all ministers), and a tendency to be combative, were seeds that had merely to follow their natural development. The value of this biography, I suggest, is that it enables us to view contingent factors in Reid’s formation—circumstances which, had they unfolded differently, would have made for a very different story.

Born in 1913, the son of a Presbyterian minister, Reid spent his youth in Montreal and, in time, at its Anglophone university, McGill. When, after the Great War of 1914-18, advocacy for the union of Canada’s Presbyterian, Congregational and Methodist churches resumed, Reid’s uncles and father were all caught up in the debate. What would have been the effect on young Stanford if in 1925 his own father had followed one brother into the new “United Church”? Stanford, whatever his temperament and intelligence, would then never have become a standard-bearer for conservative Presbyterianism; contingencies were in play.

After his McGill graduation in 1934, Reid, a pietistic student with roots in the InterVarsity movement, might have persisted through disconcerting experiences at Montreal’s Presbyterian College, a seminary of his denomination. As it was, pronouncements about the assured results of the higher criticism of the Bible and skepticism regarding the bodily resurrection of Christ provoked his withdrawal; he instead obtained a McGill M.A. in history. Had he remained, he would never have been faulted, subsequently, for snubbing the school; yet had he stayed, he might—as a young preacher of ability—never have ventured beyond the pastorate.

Reid did again take up the study of theology, and his choice was fraught with far-reaching implications. At Westminster Seminary, Philadelphia, Reid was exposed to neo-Calvinism, which provoked him to re-evaluate the evangelical pietism of his family and the early InterVarsity movement. His completion of a Ph.D. in history at the University of Pennsylvania raised the possibility that he might never work in Canada, never be associated with the church of his upbringing, never reconnect with McGill. And yet all three transpired. Though his sudden exit from Presbyterian College was remembered upon his return to Canada in 1941, attempts to obstruct his entry to pastoral ministry failed; he also gained a part-time lectureship at McGill University.

As a Montreal pastor, Reid might have been less critical of ecumenical missionary policies focused on pre-communist China. He might also have refrained from opposing the absorption of the Presbyterian College into a joint McGill Faculty of Theology. The cost of such opposition appeared in 1949-50, when he was passed over in that college’s search for a church historian, a position for which Reid, an experienced pastor and McGill history lecturer, was eminently qualified. Officials remembered his “disloyalty” in 1934 and his strenuous conservatism since.

Reid might well have ended his academic career at McGill University, where by the 1960s he was a full professor. Yet the rise of Quebec nationalism provoked fears about the future of Anglophone Quebec higher education and culture. In 1965 therefore, he accepted the invitation to found a history department in the fledgling University of Guelph, Ontario. Just then, Reid might instead have become the president of Westminster Seminary, Philadelphia. In sum, this fine biography opens for us complexities about Reid that would never be appreciated if we took the shape of the man’s life to have been merely determined by his roots.

In the larger picture sketched out by Gidney, Reid was a Christian rowing “against the stream” in Canadian higher education during an era in which university life underwent rapid secularization. He did this concurrently with the InterVarsity movement’s expanding program of witness to the Gospel on campuses under identical influences. Reid persevered in the face of closed doors, some controlled by his denomination, some controlled by public universities which had steadily less room for the anachronism of the minister-scholar. Yet his career as a Christian academic illustrated the combination of faith and learning which the wider evangelical movement was increasingly struggling to recover.

Ken Stewart, a Canadian, is professor of Theological Studies at Covenant College, Lookout Mountain, Georgia. He is the author of Restoring the Reformation: British Evangelicalism and the Francophone Réveil (Paternoster, 2006) and co-editor of The Emergence of Evangelicalism (IVP-UK, 2008).



by Terence Halliday

Three scenarios.


Like a giddy debutante ball, the Olympic Games mark China’s long-delayed coming out into Global Society. At once a moment of international recognition where China can display its modernity and maturity, 2008 will be the symbolic event when the humiliations of 19th-century Western tutelage, the slaughter of millions by the Japanese, the Cold War isolation of China from the non-Communist world, the chaos of the Cultural Revolution, the bloody pavements of Tiananmen Square, will all be forgotten in a blaze of national glory and international acclamation. The world’s eyes will be on Beijing, and neither China nor the rest of the world will be the same again. Or that ostensibly is the hope of China’s leaders and the aspiration of its people.

In the aftermath of the catastrophic Sichuan earthquake, there will be considerable international sympathy for China, perhaps defusing some of the criticism that built in the months leading up to the games, as the Olympic torchbearers ran gauntlets of foreign protesters. But which China will follow the Beijing Olympics? The hot China of spectacular economic growth or virulent anti-Japanese demonstrations? The warm China of pandas and cultural exchanges? The cool China of military build-up and hard-headed Communist Party rule? Or the cold China of Tiananmen Square and support for the genocidal Sudanese government?

Answers to these questions diverge sharply. Three significant books display strikingly incompatible interpretations of China’s present and prognostications about its future. From their respective angles of orientation, these China-hands position themselves along a rough continuum from bright optimism to dark skepticism. In so doing, they effectively caution that this vast and exceedingly diverse country belies any naïve characterizations or glossy snapshots. They also exemplify how easy it is to allow faulty methodology and incomplete theory to produce flawed historical extrapolations.

Certain facts about China are unassailable. Over thirty years China has sustained annual economic growth of around 8-10 percent, lifted hundreds of millions of people out of poverty, become the world’s industrial factory, enacted hundreds of new laws, moved from a command economy to a predominantly private market, graduated from amongst the poorest countries in the world to a mid-level developing country, risen from a country of bicycles to only the third nation in history to put a man in space. China now pronounces itself committed to the rule of law and to a “peaceful rise.” The China of Mao jackets and Little Red Books is a distant memory, displaced by ubiquitous Western fashions and technology of every kind. By any standard these are extraordinary accomplishments.

For Randall Peerenboom, an authority on China’s legal system, the straight line of rising economic growth will likely continue at a gallop towards full modernization. China’s rise offers a paradigm of development, emulating the East Asia Model (EAM) that he finds in Japan, South Korea, Hong Kong, Singapore, and Taiwan. “I argue,” Peerenboom says, “that China is now following the same general path—modified slightly in light of the realities of the 21st century—of other East Asian countries that have achieved sustained economic growth, established rule of law, and usually developed constitutional democracies, albeit not necessarily liberal democracies.”

Peerenboom celebrates each of China’s “four main pillars of modernity,” as he styles them. The economic pillar surely merits applause. Few countries have managed to compress so much growth in a scant three decades. To achieve this feat, China’s leaders have prioritized economic growth and taken a pragmatic rather than ideological path to reforms. As a tradeoff, however, they have postponed democracy, settled for a “thin” rule of law, delayed constraining constitutionalism, and left to the future possible civil and political rights. This is the EAM, says Peerenboom, that distinguishes the Asian Tigers from the relative sluggards: Thailand, the Philippines, Cambodia, and India.

He endeavors to set the story straight on the second pillar—human rights. China, he avers, has done extremely well on social and economic rights. On the UN Human Development Index, China has made steady progress. More than 150 million people have been lifted from poverty in ten years. Adult literacy is up. Diet is improved. Infant mortality is down. Life expectancy has lengthened. Women’s rights are at a similar level to other nations at a similar income level, though serious problems remain. Most rights for its fifty-five ethnic groups (about 8-9 percent of the population) are reasonably protected. Accusations of cultural genocide in Tibet are overstated.

Civil and political rights are another matter. On “physical integrity rights,” Peerenboom disputes China’s low ranking on Amnesty International’s Political Terror Scale, a rank signifying that “murders, disappearances, and torture are a common part of life.” China’s critics, he says, seize unfairly on dramatic stories about torture of Falun Gong adherents or police brutality. He accepts government statistics on rates of police torture and asserts there are few “extra-judicial killings,” though he does acknowledge that China is ranked in the bottom 10 percent of Asian countries on civil and political rights and deservedly so.

On political rights—freedom of speech, freedom of thought, freedom of assembly—the government takes a harder line. Here social stability is its touchstone. The Communist Party will brook no rivals. That includes religion, because of a “long history of religious movements toppling dynasties in the past.” The Propaganda Department and State Security Ministry tightly control discussion of politically sensitive topics. Domestic debate and overseas news daren’t touch Tiananmen Square, Falun Gong, attacks on the Party, Taiwan, criticism of top leaders, or loose talk about democracy. Yet, says Peerenboom, while not defensible, such stifling of freedom is understandable. China has chosen “economics first,” not “freedom first.” If it follows the EAM trajectory, freedom will come. In any event, he protests, China is subject to a double standard on rights, unfairly criticized when other nations get handled with kid gloves.

On the third pillar of modernity, the legal system, Peerenboom is an eminent specialist and, perhaps not coincidentally, his optimism is tempered. China has come a tremendous distance since the Cultural Revolution as it pushes toward a “socialist rule of law state.” But despite clear advances in the prominence, efficiency, and fairness of the legal system, “the assumption that China is moving toward a liberal democratic conception of the rule of law is unfounded,” at least in the short term. Criminal law reforms, which are most salient to human rights, have largely failed. China has taken enormous strides to implant a commercial law regime. But progress is slowing as “reform fatigue” sets in with diminishing returns. A competent, strong, independent judiciary is a distant dream, and without decisive movement towards a “thick” rule of law the government’s own goals won’t be realized, let alone those of western optimists.

And democracy, the fourth pillar? Decidedly downbeat, Peerenboom says democracy in Asia disappoints. Indeed, progress toward democracy for “Third Wave” countries worldwide with low levels of wealth has been “stunningly disappointing.” China’s leaders have essentially postponed it until later—when the “country is richer and more stable.” Given that “most Chinese citizens are happy with their lives, optimistic about the future, and relatively satisfied with the government as a whole,” he sides with the decision of China’s leaders “to put democracy on the back burner.”

This rose-hued portrait of an inexorable march to prosperity and freedom would have us sit back, defer to the wisdom of China’s leaders and the supposed choice of its people, and allow events to take their course. If all goes well China will be a South Korean success story—a rich, stable, democratic, open society—in a few decades. It might even become, as the book’s subtitle provocatively suggests, “a model for the rest.”

This is not the China that Susan Shirk observes. A distinguished China scholar and former Deputy Assistant Secretary with responsibility for China in the Clinton Administration, Shirk has been closely following Chinese politics for three decades. As her 1971 photo with Zhou Enlai signifies, she has been meeting with and writing on China’s leaders since she was a young political scientist. In China: Fragile Superpower, what we find is less a nascent superpower than a fragile society, teetering on the brink of domestic chaos that could lead to war. Yes, war with the United States.

Shirk doesn’t dispute the remarkable economic progress made by China, nor its increasingly symbiotic economic relationship with the United States and its integral place in the world economy more generally. But, in contrast to Peerenboom, she argues that emerging economic problems augur badly for social and political stability. The social security of the “iron rice bowl” has gone, and with it guaranteed health care, permanent employment, and assured retirement pensions. Tens of millions of workers have lost their jobs, especially in China’s northeast rustbelt. China’s west and hundreds of millions of its rural population are being left behind in a widening inequality that could trigger “massive unrest.” Opportunistic speculators, often in complicity with local officials, seize land without adequate compensation. Corruption is rampant among officials. Environmental problems make domestic headlines daily.

Add these together—rising mass protests, ethnic unrest in Tibet and Xinjiang, labor unrest, rural unrest, student unrest over international incidents, social unrest—mix them with flammable nationalism, and the paranoia of China’s authoritarian leaders intensifies. From this vantage point, a white-hot economy merely buys time for China’s vulnerable leaders who can barely stay in the saddle of their writhing dragon.

China’s Communist leaders, Shirk believes, have made a Faustian bargain. Above all, they strive to stay in power. Yet they are “haunted by the fear that their days in power are numbered.” They struggle to maintain political control, fear their own citizens, and exude “a deep sense of domestic insecurity.” They look back to Tiananmen Square, where they came within a hairsbreadth of losing the country. They look across their long borders to Communist regimes that cracked and crumbled with stunning speed. They look outside their cloistered redoubt alongside the Forbidden City and see a population that has abandoned the very ideology that defines the Party.

To maintain their “brittle authoritarian regime” over a public that finds Communist ideology bankrupt, they have stoked the fires of nationalism. China’s new ideology whips restive publics into support of its grey, “colorless, cautious” technocratic leaders by turning their emotions outside—to chest-thumping against real and supposed offenses to China’s pride by the United States, Taiwan, and, above all, Japan. But this bargain—beating the nationalist drum and keeping the economy going in exchange for keeping the Party in power—may end up driving China into the very fate it should avoid, a war that will derail China’s rise and plunge the country into a maelstrom.

Nationalism can explode in the hands of the Party leaders who wield it. Headlines grow ever more incendiary as papers competing for survival in the market find common cause with propagandists. Despite a massive apparatus of media censorship, however, the Propaganda Department and public security find that the internet and cell-phones in the hands of adept youngsters can spill protesters into the streets with little or no warning.

While the Party-state security apparatus spreads its tentacles widely to contain political speech, the media paradoxically tie the hands of cool-headed leaders. Oddly enough, the very media that Party censors tightly control are also a primary source of information for Party cadres and even top leaders. Without the varied outlets that democracies have to inform leaders of strong public sentiment, top Party officials gauge public opinion by relying too uncritically on their own censored publications.

How might the domestic fragility Shirk describes lead to war? China takes the line that in international relations it is “a responsible power.” It seeks friendship with its neighbors in Asia. It conciliates potential rivals, like India. It prides itself as a team player in multilateral organizations. It joined six-party talks to help resolve North Korea’s nuclear ambitions. It participates in UN peacekeeping operations. It has used its economic ties to make friends. Joining the WTO has brought it into the world’s dominant trading regime. But while such gestures may have convinced most of the world that it is “a benign and peaceful rising power,” a central contradiction remains in its foreign policy. Can China resolve the contradiction between its public opinion and a constructive foreign policy?

To solidify its nationalist support, says Shirk, the Party has used the United States, Taiwan, and Japan as triggers to arouse passion. Japan’s brutal occupation of eastern China during the 1930s and 1940s remains fresh. For many Chinese, Japan compounds its perfidy by refusing to acknowledge honestly the measure of its atrocities, from the Nanking Massacre to the approximately 10 million Chinese war dead. When Japan approved a new textbook in 2005 that played down its wartime culpability, 10,000 students demonstrated in Beijing, smashed Japanese storefronts, overturned Japanese cars, bombarded the Japanese embassy with bottles, stones, and eggs, and called for a boycott of Japanese goods. Possibly 100,000 demonstrators turned out in Shanghai and many more elsewhere. Each time Japan’s leaders visit Tokyo’s Yasukuni Shrine, where war criminals are interred, tensions re-ignite. Chinese popular sentiment, fanned by Party leaders, is inflamed by any slights to national “face,” including competing claims to oil and gas fields in the East China Sea, and, not least, Japan’s support of Taiwan.

It is this last flashpoint, this potential affront to China’s national honor, that is most likely to lead to war. Said a senior People’s Army officer to Shirk, “If the leaders stand by and do nothing while Taiwan declares independence, the Communist Party will fall.” To Chinese media and the publics they inform, Taiwan’s leaders seem intent on provoking China to an armed response. In the 1995-1996 Taiwan Strait Crisis, the United States gave a visa to Taiwan’s president to attend a ceremony at his alma mater, Cornell University. The incident escalated when China fired missiles in the direction of Taiwan and the United States responded by sending in two battle fleets. Periodically since the late 1990s, pro-independence leaders in Taiwan have issued statements that engender heated Chinese reactions. Shirk believes China’s current leaders are too weak to tone down shrill reactions and engage in meaningful negotiations to produce a long-term solution. In the meantime, Taiwan could precipitate an armed response from China that would pull in Japan and the United States.

The Chinese public, Shirk contends, is “highly mistrustful of the U.S. government,” while top leaders believe that the United States wants to slow China’s aspirations to become a world power. Periodic incidents reinforce this outlook. On top of the spy-plane confrontation in 2001 and the bombing of the Belgrade Embassy, the Chinese point to U.S. criticism of China’s human rights record, not to mention steady U.S. support for Taiwan and, possibly, the re-armament of Japan. The invasion of Iraq demonstrated how far the United States will go to project its power.

Here again Shirk sees a leadership hard-pressed to pursue China’s long-term interests. While top leaders have worked at improving Sino-American relations, and fully recognize that a peaceful rise—and their own power—depends upon U.S. cooperation and comity, they confront a fractious public which demands they stand up to the U.S. Their anti-American Propaganda Department is not entirely under top leaders’ control. Their crisis-management machinery works too slowly to defuse explosive incidents.

Not least, says Shirk, ultimately the Party relies on the People’s Liberation Army to keep it in power, as Tiananmen bloodily revealed. Leaders who lack the gravitas of Mao or Deng Xiaoping have bought loyalty by spending heavily on military modernization. The military is projecting its naval power farther and farther from China’s coasts; its rockets can destroy satellites in space; its missiles become ever more accurate and farther reaching. A stronger military tolerates slights less willingly. On a future hot-button issue, hard-line military leaders may demand action, not diplomacy. Thus Shirk confronts us with her worst-case scenario: “A future crisis with the U.S., especially one involving Taiwan or Japan, could arouse the public’s ire to the degree that China’s leaders might believe that the regime would fall unless they respond militarily to the insult to national honor.”

For James Mann, former Beijing Bureau Chief for the Los Angeles Times, neither Peerenboom’s relentless optimism nor Shirk’s sober realism hits the right note. Mann tilts his lance toward The China Fantasy, the predilection of U.S. policymakers and opinion-leaders to engage in massive collusion with China’s Party leaders to pretend all is well on the China front, both domestic and international. Their “Soothing Scenario,” as he styles it, insists that China is heading in the right direction. The economy booms. People are getting richer. Eventually a big middle class will demand more political voice. Authoritarianism sooner or later will yield to liberal democracy. In short, Peerenboom’s East Asia Model.

American business leaders and the foreign policy establishment buy into this sort of pollyannaish thinking because it suits their interests. Corporate CEOs can concentrate on profits while blithely assuming that democracy will follow. China experts get bought as expensive private consultants to tell politicians what they want to hear. And China élites in the United States insist that “the good guys in America and the good guys in China” have to team up and not rock the boat.

To rock the boat would be to tell the truth, Mann says, and the truth is ugly. China is a repressive state run by the Party (7-8 percent of the population) for its narrow interests. He concurs with Shirk that the Party will do anything to stay in power—mow down weaponless protesters with tanks, spirit away tens of thousands of political prisoners to remote camps, use torture and executions to silence dissidents. One way or another, political dissent is ruthlessly silenced. A peaceful demonstration, such as Falun Gong’s brilliant organizational feat of ringing the entire leadership compound of China’s leaders, was met with mass deportations, incarceration without legal redress, torture, and death. China’s former Premier, Zhao Ziyang, was held under house arrest for fifteen years—from 1989 until his death—for being on the right side of Tiananmen. The United States, Mann charges, legitimates Chinese anti-terrorist programs that lock up Tibetan and Uighur activists.

Of course, China’s leaders skillfully disguise their repression. Except for bank notes and the huge portrait of Mao at the entrance to the Forbidden City, visitors to Beijing would be hard-pressed to know that China is a one-party authoritarian state. Tourists and even business people do not see online bulletin boards shut down whenever their exchanges become too wide-ranging and thereby too appealing; they know nothing of arbitrary detention of unknown numbers in labor camps; they cannot observe lawyers who are intimidated and occasionally imprisoned if they defend their clients too vigorously; they are scarcely aware of surveillance cameras flowering in public meeting sites all over the country—a fitting symbol for a political system that fears its own people and stands ready to crush swiftly any seeds of dissent.

Champions of the “Soothing Scenario” explain all this away, says Mann. Jailing of dissidents is ignored. New headlines are treated as old news. China’s leaders are excused for taking two steps forward and one step back, or by suggestions that leaders miscalculated. If evidence of China’s authoritarianism is repressed, positive developments are over-hyped. Village “elections” become harbingers of state-wide democracy. Rule of law in business, to the extent it exists, gets generalized to basic freedoms. China’s lapses are compared to those of India or, even more convincingly, of the United States. If critics talk of repression, they are “China Bashers,” “anti-Chinese,” tainted with a “Cold War mentality.” They are “troublemakers” who are “ideological” and “provocative.”

In Mann’s view, purveyors of the “Soothing Scenario” subscribe to the “Starbucks Fallacy”: more middle-class consumers will eventually lead to more political choice. In fact, China’s population, it is said, is pretty happy. “People in China don’t care about politics,” they just care about “making money.”

In response, Mann doesn’t fall back on a fragility analysis, à la Shirk. He acknowledges that there is an “Upheaval Scenario” in which disaster looms through economic downturns and political disintegration in response to inequality, corruption, rural protests, land seizures, and ethnic struggles. But he cautions that China is a big and surprisingly resilient country that can bounce back under extreme domestic and international pressures.

A more plausible path, he proposes, is “The Third Scenario.” The current economic trajectory is maintained. The middle class thrives and is contented. Rather than mobilizing against Party dominance, it accepts ongoing repression as a tradeoff. So long as material benefits improve, Party leadership will be accepted. No political opposition, no freedom of the press, no religious freedom, no elections beyond the local level, no substantive rule of law but a persistence of repression, a tightening of the security noose, and a non-democratic recasting of “democracy with Chinese characteristics.”

It is sobering for foreigners to be reminded by Shirk and Mann that all is not as it seems. Tourists shuttling among five-star hotels, the Great Wall, Xian’s terracotta soldiers, Tibet’s monasteries, the Three Gorges, Hangzhou’s lovely West Lake, and Shanghai’s bustling cosmopolitanism will never see the China described by Mann and Shirk. Since westerners are not well trained to recognize state-directed propaganda, and face formidable language and cultural barriers, too often they fail to observe the social unrest, stirrings of discontent, poverty, inequality, anger at official corruption, and persecution of minority races and religions that have been papered over. Mann properly advises us to sharpen our critical faculties, to be open to the diversity of opinions on China, even by specialists.

But specialists themselves are not immune from methodological lapses that undermine their premises and evidence for China’s alternative futures. Peerenboom, for instance, consistently and properly urges readers to appraise China not only by some absolute standard or by those of advanced or modern countries but by its peers. Yet how those peers are selected substantially determines what conclusions result. It is conventional to compare China to Korea, Taiwan, Japan, and Singapore, all countries that experienced extraordinary economic development over fifty years. Except for Singapore, their economic growth led to increasingly open societies with vigorous multi-party liberal politics. But Korea, Taiwan, and Japan are small and homogeneous compared to China, and they all benefited economically and politically from shelter under the U.S. security umbrella as close U.S. allies. Korea and Taiwan would never have liberalized politically without significant pressure from the United States, particularly on repressive military leaders in the 1980s, pressure that helped widen the democratic opening that sprang from domestic reformers. Moreover, none of these countries had the scope and complexity of the fragilities portrayed by Shirk. And as for a hope that China will become a 21st-century version of late 20th-century Singapore—rich but authoritarian—the differences in history, size, law, and territory are so great as to render any extrapolation very doubtful. If there is an East Asia Model, China may not share its fundamental attributes.

False historical comparisons can also bedevil China predictions. If the end of the Cultural Revolution constitutes the baseline for contemporary comparisons, then conveniently the worst chapters of China’s modern history get excised from the narrative. But this is like talking about American race relations beginning in 1867, without slavery or the Civil War. China’s Communist Party rule looks benign if we are able to forget that under the rule of the CCP, China’s leaders managed to kill tens of millions of their own people—many more than the Japanese. By pitching a thesis based on China only after the Cultural Revolution, it is possible for Peerenboom to compare China favorably to India, a country with an exceedingly diverse population speaking more than a dozen languages where hundreds of millions are poor, but which has nonetheless maintained a robust democracy and open society for a half-century and avoided a Great Famine in the meantime.

In China studies as elsewhere it is too easy to settle for straight-line projection from some series of points aligned in the same direction. For instance, observers look back over a period of 20 or 30 years, discover a steady line of growth and development, and simply extend it into the future as if history brings no surprises. But one does not need to be a historian to recall the society-transforming shocks of the 1929 market crash, Pearl Harbor, the crumbling of the Berlin Wall, the Asian Financial Crisis, 9/11, or Tiananmen Square. Who expected them?

Shirk skillfully points to contingencies for China’s future, to pressure points and faultlines in Chinese society and politics from which seismic shocks might abruptly alter the course of China’s economy and position in the world. An international political incident could escalate out of control, shattering the fragile porcelain that is China’s present creation. Or an economic shock—contamination of Chinese food, an international backlash against Chinese competition, an over-reaction to Chinese product safety—could precipitate a crash “that throws millions of workers out of their jobs or sends millions of depositors to withdraw their savings from the shaky banking system.” The threat of such an event, Shirk warns, is the “greatest political risk” facing China’s leaders.

Not only is history fraught with contingency, unexpected turns, and sudden jolts, but social and economic theories of democracy and markets cannot naively assume that one necessarily or inevitably accompanies the other. Mann does us the service of calling into question the widely held assumption that democracy in China is just over the horizon if we only wait long enough and don’t interfere. As he rightly observes, another model altogether is possible—an economically developed country that is also politically repressive. Some recent empirical research on Latin America lends support to this argument. As countries get richer they don’t necessarily get democratic. That research indicates that citizens tend to support the kind of regime that brought them material benefits. If the quality of life improved under an authoritarian regime, they are likely to continue to support it. China may get richer and use its wealth to clamp down on basic legal and political freedoms.

What then to do? From each diagnosis follows a prescription. Peerenboom’s optimism leads to implicit counsel that China should be left alone to succeed on its own terms. In his defense, when Peerenboom speaks behind closed doors to China’s senior officials, he takes a more contingent line. China’s leaders should recognize, he says, that many countries stall somewhere along the upward climb to economic success. To break through requires hard and wise decisions, which include stronger rule of law. But his emphasis falls much more strongly on material than political values, on property rights than basic legal freedoms.

Shirk’s counsel vividly illustrates Mann’s complaint. After showing that China’s domestic fragility could propel its weak leaders into dangerous military overreactions to an international incident, she might serve as an exemplar of Mann’s “Soothing Scenario” on how to mitigate impending disaster. Like Peerenboom she urges U.S. leaders, whether politicians or monitors of human rights, to exercise restraint. The noisier the foreign pressures, the more muscle China’s weak leaders will need to flex. Dramatizing human rights abuses merely raises the hackles of China’s leaders and inflames their publics.

Mann will have none of this. The United States must care about democracy in China. American citizens cannot turn their backs on the fate of 1.3 billion fellow humans. Moreover, contra Peerenboom, an undemocratic political system is unstable because it provides no way to resolve high level disputes, a judgment likely shared by Shirk. And an undemocratic China clearly poses problems for the rest of the world. If China’s rise manages to combine wealth with repression, this will become a perverse “model for the rest.” Indeed, Mann warns, in such circumstances “China will serve as an exemplar for dictators, juntas, and other undemocratic governments throughout the world.” The Burmas, Zimbabwes, and Sudans among nations will gain solace from a paradigm that combines wealth with secret police. They will also find an ideological compatriot to stand against pressures for human rights and democracy. Finally, Mann notes, a politically liberal regime in China would lower the threat of war.

It follows that we must not accept clichés of exotic China, or Panda China, or Olympics China. Compare, Mann says, the Rome Olympics of 1960, Tokyo in 1964, Seoul in 1988—all celebrations of countries that had emerged from authoritarianism—with the Berlin Olympics of 1936, which hoodwinked the credulous into believing that all was well in the Third Reich. All is not well in China, and U.S. leaders should not collude with China’s Party hierarchy to pretend it is. Now is the time to forestall the installation of a permanent Chinese authoritarianism, not in those roseate decades ahead when it may be too late.

Although a mere coda, Mann’s bottom line deserves serious reflection. He calls for a vigorous domestic debate over what the United States should do about human rights and democracy in China. Such a debate might conclude, with Shirk, that direct public pressures on China’s leaders harm democratic prospects. Or the debate might arrive at Peerenboom’s position—that nothing can be done or should be done, since history will take care of itself. But the debate could conclude that the fate of China and its citizens requires action. Mann gives us few clues about what kind of action, but presumably prudence would lead in directions that would at least keep Shirk’s cautions in mind.

Curiously, none of these authors has much to say about religion in either China or the United States, now or in the future. Since Peerenboom emphasizes property rights over human rights, it is not surprising that more is not said about basic freedoms. But Christians in house churches and even official churches across China would be startled to read “that freedom of religion exists side by side with state-endorsed atheism in China” and that “despite the official endorsement of atheism, China tolerates religious practice subject to concerns about social stability.” So, too, would Buddhists, some of whose holiest sites are devoid of monks and guarded by uniformed soldiers. Peerenboom surely is correct that Christianity in China could be de-stabilizing, if by this he means that China’s Christians will inexorably—some quietly, others more vocally—press for conditions under which their faith and witness can thrive, conditions that cannot exist alongside a one-Party state intolerant of competing ideologies. Shirk by contrast attributes none of China’s fragility to religious restiveness, although she hints that rights-champions in the United States, some of whom are religious activists, might be among those whom China’s leaders and publics find confrontational. Neither Christians in China nor their counterparts in the United States find their way into Mann’s critique, though the former would be prime beneficiaries of the democratic China he advocates, while the latter could emerge as their international vanguard.

Friends of China rightly applaud its tremendous strides over the past twenty-five years, the achievements sympathetically documented by Peerenboom. Yet we do well to heed the cautionary voices of Mann and Shirk as well. The hand of friendship means little if China’s people are abandoned to repression once the world’s television cameras leave Beijing in August 2008. Then we will discover which of Beijing’s Olympic predecessors Party leaders have chosen to follow.

Terence Halliday is Co-Director, Center on Law and Globalization, American Bar Foundation and University of Illinois College of Law. He writes on commercial law-making and the criminal justice system in China. He has consulted with the World Bank, OECD, and State Council Office on Restructuring the Economic System, PRC.

Books discussed in this essay:

Randall Peerenboom, China Modernizes: Threat to the West or Model for the Rest? (Oxford Univ. Press, 2007).

Susan L. Shirk, China: Fragile Superpower (Oxford Univ. Press, 2007).

James Mann, The China Fantasy (Viking, 2007).

Copyright © 2008 by the author or Christianity Today/Books & Culture magazine.


by Jean Bethke Elshtain

Meet The Savages.


Are The Savages, brother and sister, really that savage? Wonderfully played by Laura Linney (who received an Academy Award nomination) and the inimitable Philip Seymour Hoffman (an Academy Award-winner two years ago for his uncanny embodiment of Truman Capote), Wendy and Jon Savage are alienated from their gruff and abusive father, Leonard, played by the superb Philip Bosco. The film opens on scenes from one of those awful retirement communities in the Sun Belt, age-segregated and devoted to rather peculiar activities designed to keep retirees young at heart. We see a group of women, heavily made-up, the youngest of whom is likely 70, attired in flouncy mini-skirts, tap-dancing to some oldie but goodie. We enter the interior of one of the units, where a Home Health Care Professional named Eduardo is chastising an elderly man who is munching a bowl of Wheat Chex. The elderly muncher, Leonard Savage, has failed to flush the toilet. Eduardo is incensed. “I’m not paid to take care of your shit,” he announces, ordering Lenny to flush the toilet and snagging his bowl of cereal, promising to return it once the deed is done. “I’m not responsible for you, only Doris,” Eduardo proclaims for good measure.

Eduardo proceeds to turn his attention to Doris, Lenny’s elderly female companion, fawning over her and dolling her up as if she were a 16-year-old being readied for a date. When Lenny fails to reappear and/or respond to calls from outside the bathroom, Eduardo bursts in to find him smearing a nasty five-letter word on the bathroom wall, using his own fecal matter as fingerpaint. A call goes out to Leonard’s daughter.

We have learned a few things about Wendy, a temp worker and playwright wannabe. We know she has submitted applications for fellowships and grants to fund her “subversive autobiographical” play, Wake Me When It’s Over. In the play, two children, a brother and a sister, are abandoned by an abusive father. Then their mother goes out on a date, never to return. The children must fend for themselves. We also learn that Wendy is unmarried; she has a cat; she also has a kind married lover with a big dog, Marley, that goes everywhere with him. Wendy is bored with her lover and troubled about everything.

When Wendy receives word of “the toileting incident,” she places a hysterical call to her brother, Jon, a professor of English at an upstate New York college; his specialty is the theater of cruelty and the absurd. Jon tries to calm his sister down. “We’re not in a Sam Shepard play,” he intones. Jon is 42 years old; he has a Polish girlfriend whose visa has expired and who must return to Poland. He will not commit and marry her—”we’re not ready”—although this would forestall her departure. Jon agrees to meet Wendy for the flight to Arizona after arranging for a colleague to take over his class on “Oedipal Rage in Brecht.”

We next see them, brother and sister, awkward in each other’s presence, as they drive through the retirement community with its identikit homes, trees, and streets. They enter the dwelling where their father spent twenty unmarried years with Doris—who, shortly after the “toileting incident,” keeled over and died as she was having her fingernails painted a bright red. Jon assumes that Lenny has some rights—surely twenty years counts as a common law marriage. But Doris’ daughter and her husband assure the Savage siblings that this is not the case; that Lenny must leave the home because of the “toileting incident”—in fact the place has already been put on the market—and, besides, Doris’ family has no legal obligation of any kind to Lenny. “We love Lenny, but … .”

Wendy and Jon are now responsible for their failing father. They decide to get a full medical evaluation, and the results are not good: Lenny has Parkinson’s disease; he has cardio-respiratory failure; and, to top it all off, vascular dementia. The film follows Wendy as she shepherds their increasingly bewildered father onto a flight—as Jon has gone ahead to find a place for Lenny in a rest home in Buffalo. The awkwardness and embarrassment on the plane when Lenny barks his need to go to the toilet and stands up—only for his pants to fall down to his knees, revealing his adult Pampers—is painful to witness. And we can’t help but wonder if we will wind up in the same condition some day.

At the “Rehabilitation Center” in Buffalo—director Tamara Jenkins, who also wrote the screenplay, has a keen ear for creepy euphemisms—Jon and Wendy are instructed by staff not to “make a big deal” out of leaving their father. Once the deed is done, Wendy says, “He didn’t even know where we were taking him. We’re horrible people. Horrible.” One is struck at this point by the isolation and loneliness of Jon and Wendy’s lives. They are not members of any sustaining community. Apart from Wendy’s married lover and Jon’s soon-to-depart Polish girlfriend, they appear to lack close friendships. And clearly they have not, as adults, been close to one another.

Wendy is to stay with Jon, to sleep on the couch in his apartment. Entering that apartment for the first time, seeing the piles of papers and books that cover every surface and overflow the interior space, Wendy covers her unease with a wisecrack, as if she were composing a bit of smart-mouthed dialogue for the stage: “It looks like the Unabomber lives here.” Jon nervously shifts piles of books and papers around, warning Wendy that “actually there’s a system.” Her presence is temporary. His books and papers are forever. Jon reassures his guilt-ridden sister that “we’re taking better care of the old man than he ever did of us.” Wendy, unassuaged, pages through slick brochures advertising places with names like “Greenhill Manor.” Jon explodes: all the propaganda about wonderful activities and beautiful grounds “isn’t for him, Wendy, it’s for us—it’s to make us feel better. What happens in these places is that people die.” And they stink. And it is awful. Watching the film, we may be uncomfortably aware that many of us are part of the “guilty demographic,” but there is no way around “the miserable fact that people die.”

And so Jon returns to work—he has a steady job, he reminds Wendy, and her life is more portable. Stung, Wendy proclaims that she, too, has work to do; indeed, she has received a Guggenheim to write a play. Jon, befuddled, can scarcely believe his ears—and for good reason, we later learn. He has applied for a Guggenheim six times, to no avail. Wendy accuses Jon of jealousy. “No, I’m not jealous. Just surprised.” Adding, “I’m really proud of you, it’s amazing.” They ride out the holidays together with a little help from some Vicodin Wendy purloined from the Arizona home’s bathroom cabinet—a prescription for the deceased Doris. It’s not long, though, before Jon learns that Wendy has lied about her Guggenheim. A blow-up ensues. It turns out Wendy did indeed receive a grant—from FEMA, the Federal Emergency Management Agency, which offered aid to anyone affected by 9/11 who made a successful application. (Your tax dollars at work!) Jon can’t believe it. His sister is “defrauding the federal government.”

Brother and sister negotiate an uneasy truce. When their father finally dies, both children are in the room but have fallen asleep. Wendy awakes and is the first to recognize that Lenny is gone. She awakens her brother. “That’s it?” “Yeah.”

The death of Leonard Savage is attended by two emotionally starved and bewildered adult children. That’s it. No extended family. No friends. No clergy. This is what death looks like absent a transcendent framework of meaning. Dazed and empty, Wendy takes the train back to the city. Jon returns to his study. Wendy encounters her lover, Larry, who, as always, is kind and solicitous. He tells her his much-beloved dog, Marley, will be put down the next day. Marley is old and in pain and it doesn’t make sense to keep her going. Larry apologizes for caring so much about the dog, given that Wendy’s father has died. Abrupt fadeout.

The next scenes take place six months later. Wendy has written a play that is being produced. Jon has journeyed from Buffalo for the preview. He tells his sister that her “combination of naturalism and magical realism is very effective. It’s good. It’s really good.” He’s off to a conference in Poland: maybe, just maybe … . The film’s final scene features Wendy jogging. Alongside her is the once-doomed Marley, whom Wendy has adopted and fitted with a special device with wheels that enables dogs whose hind legs have given out to continue to walk and even to run. Wendy has saved a life.

But why?

Did the death of Lenny liberate her from the psychic burden he embodied? Did going through her father’s death liberate her for creativity, for getting out of her self-encased and wounded narcissism? Have Jon and Wendy become less savage, more human? The film suggests yes, but in an understated way. The Savages is directed competently, not brilliantly, but that’s okay for a “little film” of this sort. And the screenplay is a cut above: it isn’t brilliant, but it is intelligent, with an eye for the ridiculous if not the sublime. What one takes away from The Savages is a sinking sense of human disconnectedness, loneliness, the stripping of human beings down to the bare reality of “the self.” We are reminded, quite viscerally at times, that selves cannot go it alone. In fact, the self cannot be fully a self in isolation. A small film but a big theme.

Jean Bethke Elshtain’s Gifford Lectures have just been published as Sovereignty: God, State, and Self (Basic Books).

Copyright © 2008 by the author or Christianity Today/Books & Culture magazine.


by Matt Jenson

Alan Jacobs on Adam’s curse.


It’s a funny thing when an idea becomes at once singularly despised and surprisingly fascinating, simultaneously passé and sexy. Take the doctrine of original sin—that complex of theological and biological commitments developed and coordinated to make sense of our sense (and Scripture’s) that we are dead ends, all of us. One wonders, though, whether it is our sense these days. Fifty years ago, evangelistic tracts did their Lutheran thing to great effect: Law, then Gospel. Evangelists established points of contact by reminding listeners that they were all sinners—who could deny it?—then moved from problem to solution and invitation. And it worked, more or less.

But things are different now. The contemporary American landscape features a striking coincidence of blatant brokenness and robust self-esteem. We know we’re broke, but we don’t think we need any fixin’. In fact, we resent the suggestion, and we chafe at the occasional attempt to rehabilitate notions of innate sinfulness, dismissing them as world-denying, repressive, and death-dealing.

Whence, then, the recent rash of books on sin? We might expect that from academic monographs. After all, sin used to matter. Its historical fascination is patent, not least because we delight in figuring out what was wrong with our parents. But a series of wryly written and deftly marketed books on the seven deadly sins, selling for $9.95 a pop? I suspect that sin’s reemergence into the limelight is directly, if inversely, related to its perceived claim on our lives. Now that we can breezily laugh it off, sin has become interesting (if only quaintly so).

There is always more to the story, of course. Even as our moral grammar hobbles along with its emaciated spouse, our moral sense, we navigate a world in which events (take your pick: genocide, pandemics, economic stratification, moral relativism, environmental anarchy) desperately call for both sense and grammar. At home, we go for drab colors, wearing a bland combination of moral grays. Flip on the news, though, and all we see and hear screams primary colors—moral indignation, often enough moral indigestion. A strange cultural moment, this, one in which we continue to jettison the language of sin even as we scrabble for something, anything, with which to fight the bad guys.

And that leads me to Alan Jacobs’ splendid Original Sin: A Cultural History, a book endeavoring to help us say and do something about the sin which so easily ensnares (even if we aren’t sure it really exists). Jacobs’ is not an easy task. Part apologist, part peddler of cultural curiosities, part champion of the doctrinal underdog, he aims to win another hearing for original sin. Moving back and forth in history, he details commendations and dismissals of the doctrine, beginning—where else?—with Augustine, its most influential expositor. Haven’t we all, with Augustine, experienced what Jacobs nicely dubs a “forking and branching” of the will?

Jonathan Edwards argues from the way we infer that dice are loaded (how many double sixes in a row does it take?) to the common sense of the doctrine of original sin (who hasn’t shown himself to be a creep?). In Edwards’ eyes, children arrive in the world nasty, brutish, and, well, short. Despite their seeming innocence, if children are “out of Christ” they are “more hateful than vipers.” John Wesley agrees, though his theology of love holds him back from the brazen pronouncements of the Augustinian tradition. Wesley sets up a reparative pedagogy in which “the education of children consists primarily, if not exclusively, in discerning these sins and rooting them out as aggressively as possible.”

Then there’s Rousseau, whose Émile begins with an axiom: “that the first movements of nature are always right; there is no original perversity in the human heart.” Learning from nature is at the heart of his curriculum. But to do that, children must be kept free from the complicating variables of human society. The ironies of the highly artificial environment in which Rousseau’s natural pedagogy operates are not lost on Jacobs. We have, then, an education founded on the belief that our first instincts are right and good, but one that takes great pains to keep children from other natively good people out of fear that interacting with those good people will make children bad. Curious.

This should all sound fairly familiar. Intellectual history has circled around the nature/nurture debate for the last couple of centuries, and to ask about original sin is to suggest that nature remains a meaningful category. To hard-core social constructionists, Jacobs puts the question of “why the social construction of selves is so limited in its range, so unimaginatively and repetitively attached to making us cruel and selfish.” You’d think we’d come up with something a bit more interesting to be and do.

The nature/nurture debate leads us inexorably to the ancient question that haunts this book: Unde hoc malum? Whence sin? So John Milton struggled to imagine sin’s point of entry, given its (originally) utter novelty. The question rings existentially, too, often enough in less articulate forms—something like, “What the hell is wrong with me?” Pretty good question, that. It recognizes the labyrinthine character of sin, the sense of being caught, sin’s power over us, its systemic implications, and, of course, the infernal connection. Sin is hellish.

Another answer to the “Whence?” question takes up categories of internal and external. Did the Devil make me do it? Or is it that I, myself, am a bit of a devil? Jacobs sums up a long tradition, which moves from demonology to pathology: “For if it was the genius of Prudentius and his followers [in medieval morality plays] to reach into the divided self and pull out its voices, giving them bodily substance and individual identity, it was the genius of Freud and his followers to stuff them all back into the box.” Freud’s move, then, is less an evasion of biblical accounts of evil than it is a rebuke of another kind of evasion, the sinfully clever attempt to get myself off the hook in the refuge of a devil who made me do it. (Don’t miss Jacobs’ analysis of the Tom and Jerry cartoons in which a little Tom angel and devil perch atop big Tom’s shoulders.)

Hence Pascal’s comment that original sin is necessary for self-knowledge. Original sin’s deniers like to claim that the doctrine does bad things, or at least discourages us from doing good things. It deals death. So they tell us. But over and over in Jacobs’ account, we meet well-intentioned characters, only to find their happier, gentler anthropologies turning sour, leading to (or at least abetting) anarchy, eugenics, despair. Perhaps the greatest irony in this history is the discovery that knowledge of original sin gives life—by revealing us to ourselves, yes, but also by grounding a sense of universal human kinship.

As Jacobs notes, “To identify someone as kin is to grant that person a claim upon us.” Strikingly, Jacobs argues that the “confraternity” of humanity is best grounded not in our being made in the image of God but in our being made sinful in Adam: “If misery does not always love company, it surely tolerates it quite well, whereas pride demands distinction and hierarchy, and is ultimately willing to pay for those in the coin of isolation.” The history of the deployment of the imago Dei is riddled with attempts to limit its application to less than the sum of humanity—further evidence for original sin.

A final story, on the origin of the Feast of All Souls. Just before the end of the first millennium after Christ, the abbey of Cluny received a visitor. Returning from the Holy Land, this pilgrim had been shipwrecked on an island whose sole inhabitant was a hermit. The hermit told him about a hole in a rock formation from which he could hear souls groaning—and their tormentors complaining about the prayers of the living on behalf of these dead. Inspired by this tale, the young abbot, Odilo, transformed Cluny into a place devoted entirely to prayer and added what became known as All Souls’ Day, “to pray for the souls who, while not damned, had not entered into blessedness.” What marked All Souls’ with a certain égalité et fraternité was its placement in the church calendar: on November 2, the day after All Saints’ Day. All Saints’ arose out of the cult of the saints, in which the dead served as spiritual patrons. That made perfect sense in the hierarchies of medieval society. But with the institution of All Souls’, the living could reciprocate. It was, then, a robust doctrine of sin and judgment leading to earnest intercession that created what Eugen Rosenstock-Huessy called “the Christian democracy of the dead and the dying.”

Truly a revolutionary thought—that the roots of our common humanity might be found, not in our dignity or even our potential, but in our depravity.

Matt Jenson is a theologian teaching great books in the Torrey Honors Institute at Biola University. He is the author of The Gravity of Sin: Augustine, Luther and Barth on ‘Homo incurvatus in se’ (T&T Clark). He lives with his sister in Fullerton.

Copyright © 2008 by the author or Christianity Today/Books & Culture magazine.


by Alvin Plantinga

Why they are like oil and water.


As everyone knows, there has been a recent spate of books attacking Christian belief and religion in general. Some of these books are little more than screeds, long on vituperation but short on reasoning, long on name-calling but short on competence, long on righteous indignation but short on good sense; for the most part they are driven by hatred rather than logic. Of course there are others that are intellectually more respectable—for example, Walter Sinnott-Armstrong’s contribution to God? A Debate Between a Christian and an Atheist[1] and Michael Tooley’s contribution to Knowledge of God.[2] Nearly all of these books have been written by philosophical naturalists. I believe it’s extremely important to see that naturalism itself, despite the smug and arrogant tone of the so-called New Atheists, is in very serious philosophical hot water: one can’t sensibly believe it.

Naturalism is the idea that there is no such person as God or anything like God; we might think of it as high-octane atheism or perhaps atheism-plus. It is possible to be an atheist without rising to the lofty heights (or descending to the murky depths) of naturalism. Aristotle, the ancient Stoics, and Hegel (in at least certain stages) could properly claim to be atheists, but they couldn’t properly claim to be naturalists: each endorses something (Aristotle’s Prime Mover, the Stoics’ Nous, Hegel’s Absolute) no self-respecting naturalist could tolerate.

These days naturalism is extremely fashionable in the academy; some say it is contemporary academic orthodoxy. Given the vogue for various forms of postmodern anti-realism and relativism, that may be a bit strong. Still, naturalism is certainly widespread, and it is set forth in such recent popular books as Richard Dawkins’ The Blind Watchmaker, Daniel Dennett’s Darwin’s Dangerous Idea, and many others. Naturalists like to wrap themselves in the mantle of science, as if science in some way supports, endorses, underwrites, implies, or anyway is unusually friendly to naturalism. In particular, they often appeal to the modern theory of evolution as a reason for embracing naturalism; indeed, the subtitle of Dawkins’ Watchmaker is Why the Evidence of Evolution Reveals a Universe Without Design. Many seem to think that evolution is one of the pillars in the temple of naturalism (and “temple” is the right word: contemporary naturalism has certainly taken on a religious cast, with a secular priesthood as zealous to stamp out opposing views as any mullah). I propose to argue that naturalism and evolution are in conflict with each other.

I said naturalism is in philosophical hot water; this is true on several counts, but here I want to concentrate on just one—one connected with the thought that evolution supports or endorses or is in some way evidence for naturalism. As I see it, this is a whopping error: evolution and naturalism are not merely uneasy bedfellows; they are more like belligerent combatants. One can’t rationally accept both evolution and naturalism; one can’t rationally be an evolutionary naturalist. The problem, as several thinkers (C. S. Lewis, for example) have seen, is that naturalism, or evolutionary naturalism, seems to lead to a deep and pervasive skepticism. It leads to the conclusion that our cognitive or belief-producing faculties—memory, perception, logical insight, etc.—are unreliable and cannot be trusted to produce a preponderance of true beliefs over false. Darwin himself had worries along these lines: “With me,” says Darwin, “the horrid doubt always arises whether the convictions of man’s mind, which has been developed from the mind of the lower animals, are of any value or at all trustworthy. Would any one trust in the convictions of a monkey’s mind, if there are any convictions in such a mind?”[3]

Clearly this doubt arises for naturalists or atheists, but not for those who believe in God. That is because if God has created us in his image, then even if he fashioned us by some evolutionary means, he would presumably want us to resemble him in being able to know; but then most of what we believe might be true even if our minds have developed from those of the lower animals. On the other hand, there is a real problem here for the evolutionary naturalist. Richard Dawkins once claimed that evolution made it possible to be an intellectually fulfilled atheist. I believe he is dead wrong: I don’t think it’s possible at all to be an intellectually fulfilled atheist; but in any event you can’t rationally accept both evolution and naturalism.

Why not? How does the argument go?[4] The first thing to see is that naturalists are also always or almost always materialists: they think human beings are material objects, with no immaterial or spiritual soul, or self. We just are our bodies, or perhaps some part of our bodies, such as our nervous systems, or brains, or perhaps part of our brains (the right or left hemisphere, for example), or perhaps some still smaller part. So let’s think of naturalism as including materialism.[5] And now let’s think about beliefs from a materialist perspective. According to materialists, beliefs, along with the rest of mental life, are caused or determined by neurophysiology, by what goes on in the brain and nervous system. Neurophysiology, furthermore, also causes behavior. According to the usual story, electrical signals proceed via afferent nerves from the sense organs to the brain; there some processing goes on; then electrical impulses go via efferent nerves from the brain to other organs including muscles; in response to these signals, certain muscles contract, thus causing movement and behavior.

Now what evolution tells us (supposing it tells us the truth) is that our behavior (perhaps more exactly, the behavior of our ancestors) is adaptive; since the members of our species have survived and reproduced, the behavior of our ancestors was conducive, in their environment, to survival and reproduction. Therefore the neurophysiology that caused that behavior was also adaptive; we can sensibly suppose that it is still adaptive. What evolution tells us, therefore, is that our kind of neurophysiology promotes or causes adaptive behavior, the kind of behavior that issues in survival and reproduction.

Now this same neurophysiology, according to the materialist, also causes belief. But while evolution, natural selection, rewards adaptive behavior (rewards it with survival and reproduction) and penalizes maladaptive behavior, it doesn’t, as such, care a fig about true belief. As Francis Crick, the co-discoverer of the genetic code, writes in The Astonishing Hypothesis, “Our highly developed brains, after all, were not evolved under the pressure of discovering scientific truth, but only to enable us to be clever enough to survive and leave descendants.” Taking up this theme, naturalist philosopher Patricia Churchland declares that the most important thing about the human brain is that it has evolved; hence, she says, its principal function is to enable the organism to move appropriately:

Boiled down to essentials, a nervous system enables the organism to succeed in the four F’s: feeding, fleeing, fighting and reproducing. The principal chore of nervous systems is to get the body parts where they should be in order that the organism may survive … . Improvements in sensorimotor control confer an evolutionary advantage: a fancier style of representing is advantageous so long as it is geared to the organism’s way of life and enhances the organism’s chances of survival [Churchland’s emphasis]. Truth, whatever that is, definitely takes the hindmost.[6]

What she means is that natural selection doesn’t care about the truth or falsehood of your beliefs; it cares only about adaptive behavior. Your beliefs may all be false, ridiculously false; if your behavior is adaptive, you will survive and reproduce. Consider a frog sitting on a lily pad. A fly passes by; the frog flicks out its tongue to capture it. Perhaps the neurophysiology that causes it to do so, also causes beliefs. As far as survival and reproduction is concerned, it won’t matter at all what these beliefs are: if that adaptive neurophysiology causes true belief (e.g., those little black things are good to eat), fine. But if it causes false belief (e.g., if I catch the right one, I’ll turn into a prince), that’s fine too. Indeed, the neurophysiology in question might cause beliefs that have nothing to do with the creature’s current circumstances (as in the case of our dreams); that’s also fine, as long as the neurophysiology causes adaptive behavior. All that really matters, as far as survival and reproduction is concerned, is that the neurophysiology cause the right kind of behavior; whether it also causes true belief (rather than false belief) is irrelevant.

Next, to avoid interspecies chauvinism, let’s not think about ourselves, but instead about a hypothetical population of creatures a lot like us, perhaps living on a distant planet. Like us, these creatures enjoy perception, memory, and reason; they form beliefs on many topics, they reason and change belief, and so on. Let’s suppose, furthermore, that naturalistic evolution holds for them; that is, suppose they live in a naturalistic universe and have come to be by way of the processes postulated by contemporary evolutionary theory. What we know about these creatures, then, is that they have survived; their neurophysiology has produced adaptive behavior. But what about the truth of their beliefs? What about the reliability of their belief-producing or cognitive faculties?

What we learn from Crick and Churchland (and what is in any event obvious) is this: the fact that our hypothetical creatures have survived doesn’t tell us anything at all about the truth of their beliefs or the reliability of their cognitive faculties. What it tells us is that the neurophysiology that produces those beliefs is adaptive, as is the behavior caused by that neurophysiology. But it simply doesn’t matter whether the beliefs also caused by that neurophysiology are true. If they are true, excellent; but if they are false, that’s fine too, provided the neurophysiology produces adaptive behavior.

So consider any particular belief on the part of one of those creatures: what is the probability that it is true? Well, what we know is that the belief in question was produced by adaptive neurophysiology, neurophysiology that produces adaptive behavior. But as we’ve seen, that gives us no reason to think the belief true (and none to think it false). We must suppose, therefore, that the belief in question is about as likely to be false as to be true; the probability of any particular belief’s being true is in the neighborhood of 1/2. But then it is massively unlikely that the cognitive faculties of these creatures produce the preponderance of true beliefs over false required by reliability. If I have 1,000 independent beliefs, for example, and the probability of any particular belief’s being true is 1/2, then the probability that 3/4 or more of these beliefs are true (certainly a modest enough requirement for reliability) will be less than 10^-58. And even if I am running a modest epistemic establishment of only 100 beliefs, the probability that 3/4 of them are true, given that the probability of any one’s being true is 1/2, is very low, something like .000001.[7] So the chances that these creatures’ true beliefs substantially outnumber their false beliefs (even in a particular area) are small. The conclusion to be drawn is that it is exceedingly unlikely that their cognitive faculties are reliable.
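Plantinga’s figures are easy to check. If each belief is modeled as an independent fair coin flip, the chance that at least three-quarters of them come out true is the upper tail of a binomial distribution, which can be computed exactly. A minimal Python sketch (the function name is mine, not the essay’s):

```python
# Exact upper-tail probabilities for Binomial(n, 1/2), modeling each
# belief as an independent 50/50 chance of being true.
from fractions import Fraction
from math import comb

def prob_at_least(n: int, k: int) -> Fraction:
    """Exact probability that at least k of n independent
    fair-coin beliefs come out true."""
    return Fraction(sum(comb(n, i) for i in range(k, n + 1)), 2 ** n)

p_1000 = prob_at_least(1000, 750)  # 1,000 beliefs, at least 3/4 true
p_100 = prob_at_least(100, 75)     # 100 beliefs, at least 3/4 true
```

Computed this way, the 1,000-belief tail does fall below 10^-58, and the 100-belief tail comes out well below one in a million, consistent with the figures quoted in the text.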

But of course this same argument will also hold for us. If evolutionary naturalism is true, then the probability that our cognitive faculties are reliable is also very low. And that means that one who accepts evolutionary naturalism has a defeater for the belief that her cognitive faculties are reliable: a reason for giving up that belief, for rejecting it, for no longer holding it. If there isn’t a defeater for that defeater—a defeater-defeater, we could say—she can’t rationally believe that her cognitive faculties are reliable. No doubt she can’t help believing that they are; no doubt she will in fact continue to believe it; but that belief will be irrational. And if she has a defeater for the reliability of her cognitive faculties, she also has a defeater for any belief she takes to be produced by those faculties—which, of course, is all of her beliefs. If she can’t trust her cognitive faculties, she has a reason, with respect to each of her beliefs, to give it up. She is therefore enmeshed in a deep and bottomless skepticism. One of her beliefs, however, is her belief in evolutionary naturalism itself; so then she also has a defeater for that belief. Evolutionary naturalism, therefore—the belief in the combination of naturalism and evolution—is self-refuting, self-destructive, shoots itself in the foot. Therefore you can’t rationally accept it. For all this argument shows, it may be true; but it is irrational to hold it. So the argument isn’t an argument for the falsehood of evolutionary naturalism; it is instead for the conclusion that one cannot rationally believe that proposition. Evolution, therefore, far from supporting naturalism, is incompatible with it, in the sense that you can’t rationally believe them both.

What sort of reception has this argument had? As you might expect, naturalists tend to be less than wholly enthusiastic about it, and many objections have been brought against it. In my opinion (which of course some people might claim is biased), none of these objections is successful.[8] Perhaps the most natural and intuitive objection goes as follows. Return to that hypothetical population of a few paragraphs back. Granted, it could be that their behavior is adaptive even though their beliefs are false; but wouldn’t it be much more likely that their behavior is adaptive if their beliefs are true? And doesn’t that mean that, since their behavior is in fact adaptive, their beliefs are probably true and their cognitive faculties probably reliable?

This is indeed a natural objection, in particular given the way we think about our own mental life. Of course you are more likely to achieve your goals, and of course you are more likely to survive and reproduce if your beliefs are mostly true. You are a prehistoric hominid living on the plains of the Serengeti; clearly you won’t last long if you believe lions are lovable overgrown pussycats who like nothing better than to be petted. So, if we assume that these hypothetical creatures are in the same kind of cognitive situation we ordinarily think we are, then certainly they would have been much more likely to survive if their cognitive faculties were reliable than if they were not.

But of course we can’t just assume that they are in the same cognitive situation we think we are in. For example, we assume that our cognitive faculties are reliable. We can’t sensibly assume that about this population; after all, the whole point of the argument is to show that if evolutionary naturalism is true, then very likely we and our cognitive faculties are not reliable. So reflect once more on what we know about these creatures. They live in a world in which evolutionary naturalism is true. Therefore, since they have survived and reproduced, their behavior has been adaptive. This means that the neurophysiology that caused or produced that behavior has also been adaptive: it has enabled them to survive and reproduce. But what about their beliefs? These beliefs have been produced or caused by that adaptive neurophysiology; fair enough. But that gives us no reason for supposing those beliefs true. So far as adaptiveness of their behavior goes, it doesn’t matter whether those beliefs are true or false.

Suppose the adaptive neurophysiology produces true beliefs: fine; it also produces adaptive behavior, and that’s what counts for survival and reproduction. Suppose on the other hand that the neurophysiology produces false beliefs: again fine; it produces false beliefs but adaptive behavior. It really doesn’t matter what kind of beliefs the neurophysiology produces; what matters is that it cause adaptive behavior; and this it clearly does, no matter what sort of beliefs it also produces. Therefore there is no reason to think that if their behavior is adaptive, then it is likely that their cognitive faculties are reliable.

The obvious conclusion, so it seems to me, is that evolutionary naturalism can’t sensibly be accepted. The high priests of evolutionary naturalism loudly proclaim that Christian and even theistic belief is bankrupt and foolish. The fact, however, is that the shoe is on the other foot. It is evolutionary naturalism, not Christian belief, that can’t rationally be accepted.

Alvin Plantinga is John A. O’Brien Professor of Philosophy at the University of Notre Dame.

1. Reviewed elsewhere in this issue by Douglas Groothuis, in a piece covering four books dealing with atheism in one fashion or another.

2. Coauthored with Alvin Plantinga in Blackwell’s Great Debates in Philosophy series (Blackwell, 2008).

3. Letter to William Graham (Down, July 3, 1881), in The Life and Letters of Charles Darwin, ed. Francis Darwin (London: John Murray, 1887), Volume 1, pp. 315-16.

4. Here I’ll just give the bare essentials of the argument; for fuller statements, see my Warranted Christian Belief (Oxford Univ. Press, 2000), chap. 7; or my contribution to Knowledge of God (Blackwell, 2008); or Natural Selection and the Problem of Evil (The Great Debate), edited by Paul Draper, www.infidels.org/library/modern/paul_draper/evil.html.

5. If you don’t think naturalism does include materialism, then take my argument as for the conclusion that you can’t sensibly accept the tripartite conjunction of naturalism, evolution, and materialism.

6. “Epistemology in the Age of Neuroscience,” Journal of Philosophy, Vol. 84 (October 1987), pp. 548-49.

7. My thanks to Paul Zwier, who performed the calculations.

8. See, e.g., Naturalism Defeated?, ed. James Beilby (Cornell Univ. Press, 2002), which contains some ten essays by critics of the argument, together with my replies to their objections.

Copyright © 2008 by the author or Christianity Today/Books & Culture magazine.


by Alan Jacobs

Their “most simple and beautiful oneness.”


I have come late to the knowledge of trees, and while I would like to think that I have loved them all my life, that's probably not really true. Had I loved them all along I would know more about them by now. The most enlightening and attractive writers about trees seem to have been lifelong aficionados—one book I recently read begins, "Having been partly arboreal since the age of eight, I … "—and the ease with which they describe their old friends shames me a bit. Reading them, I feel much the same envy I feel when watching an experienced skater flow across an iced-over pond.

In the preface to his first collection of essays, Happy To Be Here, Garrison Keillor explains how he came to realize that the years he spent, at the outset of his career, trying to write a big novel were just wasted. Looking back on that fruitless time, when piles of typed pages grew on his desk without amounting to anything more than piles of typed pages, he came to see that his ignorance of trees was emblematic of his difficulties. The novel-in-progress itself

lay on a shelf over the radiator, and next to it stood the typewriter stand, up against a window that looked out on an elm tree and a yellow bungalow with blue trim, across the street. I assume it was an elm because it died that spring during an elm epidemic and the city foresters cut it down, but in fact there are only four or five plants I can identify with certainty and the elm is not one of them. I regret this but there it is: plant life has never been more to me than a sort of canvas backdrop. There was a houseplant in that bedroom too, some type of vine or vine-related plant, and it also died.

The characters in his novel, he says, spent a lot of time smoking while propped against trees; but what kind of trees he did not say. Nor did he care. In retrospect Keillor saw that the story grew dull and lifeless because its fictional world was so skimpily furnished; characters who devoted so much time to "leaning against vague vegetation" could scarcely expect to be worthy of a reader's time.

I have spent much of my own life surrounded by vegetation equally vague, though I rarely lean on any of it and haven't smoked since I was about sixteen. For one thing, as a child I was anything but arboreal: my fear of heights confined my tree-climbing to the apple and peach trees in my neighbor's garden, where I could barely get six feet off the ground, and while I could identify those trees when fruit was hanging from them, in other seasons I would have been out of luck. Almost the only tree I could name with confidence was the pecan, because our yard was full of mature, heavily bearing pecan trees that dumped thousands of nuts on the ground every fall. (I was distinctly shocked when we moved from that house and I discovered that people paid large sums of money for pecans. I thought of them primarily as a nuisance: one of my jobs every fall was to gather up paper grocery bags full of nuts and deliver them to the neighbors, since otherwise crossing our lawn would have been like walking on ball bearings. I figured that the neighbors were doing us a favor by taking the things off our hands.) And I am not sure that I could have identified a pecan tree if I came upon it in the springtime and if it were surrounded by other kinds of tree, or tree-related plants.

Yet the very form of Tree was endlessly fascinating to me. We lived in an old ramshackle house which had the single virtue of a large L-shaped porch, and in the frequent afternoon thunderstorms of my Alabama childhood I would park myself in a dry spot on the porch and watch, almost literally mesmerized, the tall trees' dialogue with the wind. I never tired of this spectacle, nor did I ever miss an opportunity to encounter it again. The enormous creatures really did seem almost to talk to one another, and perhaps to me. Just a few weeks ago, when powerful southerly winds rushed into my part of Illinois, I was walking across the wide front lawn of the Wheaton College campus, and when I passed under an enormous oak I heard that same language and felt transported to that porch in Alabama and our cluster of pecan trees. But I didn't pause in reverie; instead I quickened my pace, because in winds so fierce that old oak could easily have dropped a branch big enough to kill me.

That trees strike us as human-like is an essential element of their fascination but is also part of the fear they can inspire. Their proportions resemble ours; their crowns are like heads, their branches arms—no wonder so many of the myths Ovid records in the Metamorphoses have people turned into them. They are the visually dominant figures of the plant kingdom, as we fancy ourselves the monarchs of the animal realm. Like us, they can in their solitude seem welcoming and friendly, though sometimes imposing; also like us, in mass they can terrify. Who has understood better than Tolkien the terrors and the companionable appeal of trees, and the way those traits are mixed imperceptibly together? In Fangorn Forest we see the first tempered by the second; in Treebeard and the other Ents the second tempered by the first. Yet in depicting these creatures of the woods Tolkien seems to many of us to have created nothing, but rather to have read our minds, and sometimes our nightmares.

On the east side of the house I now live in we have a little sunporch or Florida room where I camp out whenever the weather allows it. From my usual seat I look out across our back yard, which is open and flat but bordered by trees. An enormous twin-trunked honey locust dominates the far side of the lawn; in the back is a tall Norway spruce and a small redbud which seems to be thriving since the recent death of a crabapple that had partially blocked its sunshine. Nearest to me, and most often in my sight and mind, is a maple—but what kind of maple? The shape of the leaves is unmistakable, so that determines the genus; and everything about the tree, from the texture of the bark, to its delightful helicopterish "keys" with their cargo of seed, to its droopy smaller branches and its tendency to drop lots of twigs, fairly shouts that it's a silver maple. Except for one thing: the undersides of the leaves, the very feature that gives the silver maple its name, aren't silver at all. I sometimes tell myself that they're grayish-green, but really they aren't: they're just a pale green with a matte surface. There are other silver maples in my neighborhood that anyone could recognize immediately by those highly distinctive leaves.

Individual trees within a species, and even within a distinct variety, can vary tremendously (just think about the many sizes, shapes, and colors of people), so it's perfectly possible that this lack of silveriness is well within the bounds of ordinary variation; but nevertheless it remains a source of annoyance to me that I can't confidently name this most familiar tree. It is very familiar to me, and beautiful. I have simply stared at it for many hours when I was supposed to be grading papers or writing essays for Books & Culture, and even when I have set myself the task of figuring out what kind of maple it is. Its architecture endlessly delights my eye. About twelve feet off the ground its trunk divides into three distinct sub-trunks, and from them stem, at pleasing intervals that are only slightly irregular, thick branches that extend horizontally for unusually long distances. The effect is one of elegant complexity, and different aspects of this architecture attract my attention at different seasons, in the dead of winter almost as much as in the season of full leaf or in the time when the keys spin comically through the air and crash-land on my lawn and driveway.

It's when I'm in one of my tree-reveries that I best understand what the poet Gerard Manley Hopkins had in mind when he coined the terms "instress" and "inscape." By "inscape" Hopkins meant something like the unique form or structure of a particular thing; by "instress" something like an energy or resonance—a divine energy—which binds the object to its perceiver. A thing's inscape is always there; instress is discernible only by certain people at certain times. To see it is a kind of gift of the Holy Spirit. Hopkins uses these terms to describe trees more than any other thing: "There is one notable dead tree … the inscape markedly holding its most simple and beautiful oneness up from the ground through a graceful swerve below (I think) the spring of the branches up to the tops of the timber. I saw the inscape freshly, as if my mind were still growing, though with a companion the eye and the ear are for the most part shut and instress cannot come." I feel much the same way about my tree, my silver maple—if that's what it is.

British folk write well about trees, I find, and I have a theory to explain this, one which, like most of my theories, is virtually unencumbered by evidence. Britons aren't alone in this fascination, but it takes different forms elsewhere. Americans, for instance, tend to be fascinated by notable individual trees—the Oldest Tree in the World (a bristlecone pine), the Most Massive Tree in the World (a giant sequoia), the Tallest Tree in the World (a coast redwood), all of which are in California—while Germans love and tend to mythologize whole forests. A German might have come up with Fangorn Forest, but not Treebeard; an American, vice versa. The British, however, maintain the proper balance. I think this is because they live on an island which was once heavily forested, and retains many ancient and beautiful trees, but which people over the centuries have transformed into field and pasture and meadow. Looking at the forbidding moors of Scotland one can scarcely believe that most of that country was once densely forested; yet it is so. And the trees are missing simply because humans cut them down. So some Scots are taking pains to restore at least some of the ancient Caledonian pines; and old trees there are revered, none more so than a yew tree in Fortingall under which, it is said, Pontius Pilate once sat and thought. Similar stories can be told about Ireland and England, too.

Two recent books uphold, and extend, this great tradition: The Tree, by Colin Tudge, and Woodlands, by Oliver Rackham. Tudge is one of the best science writers I have come across—his The Engineer in the Garden remains, a decade after its publication, an exceptionally valuable book about biotechnology and genetic manipulation—while Rackham, a fellow of Corpus Christi College, Cambridge, is a near-legendary botanist and historical ecologist. Both men write vividly and charmingly, largely because they take such pleasure in their subjects. Woodlands would seem to have a more limited range than The Tree: after all, one of the four major sections of Tudge's book is called "All the Trees in the World," and you can't get much more ambitious than that, while Rackham's task is to describe not trees in general, but the various ways in which trees are found in groups and in relation to other creatures, particularly in Britain. (The book appears in the U.K. as a Collins New Naturalist field guide, and its historical sections in particular treat the British context exclusively.) But scattered throughout the 600 pages of Woodlands is an education in the biology of trees about as thorough as what Tudge offers, though in a less methodical form. There is more history in Rackham, who, because of his narrower geographical compass, can show how the woodlands of Britain have waxed and waned over the centuries, either because of changing human practices or because of their relations with other creatures. Both books are delightful, and I am very glad that I read both, but if I had to recommend just one, it would have to be Tudge. Rackham only makes it to his second chapter before introducing, with evident enthusiasm, a multi-page chart accounting for "Associations between mycorrhizal agarics and trees." Tudge does not do this kind of thing at all. Thus my choice.

It is almost impossible to describe these books without falling into a recitation of Fun Facts to Know and Tell. Some trees (mangroves and their relatives) can live with their roots in ocean water because they have developed bark that filters out the salt. Coast redwoods get about a third of their water from the fogs that roll in off the Pacific—good thing, because it is no easy trick to lift water three hundred and fifty feet in the air, which is what some of these titans do. Many botanists understand a grove of aspens as one enormous organism, among the largest found in nature, though not as large as the vast fungi that can run for dozens of acres underground, providing minerals to thousands of trees. A tree endemic to the island of New Caledonia (Sebertia acuminata, if you must know) absorbs so much nickel that its rubbery sap runs bright blue. Many trees survive and even thrive after having been blown over in storms: they just need to keep a small portion of their root system in place. And cows—this is a typical Rackham comment—cows prefer tree leaves to almost any other food, but just can't reach many of them. Sad, really.

But perhaps the most interesting fact to be gleaned from these books—and from Richard Preston's The Wild Trees—is this: much of our knowledge about trees is of recent vintage, and there is still a great deal about these creatures that we do not know. Rackham points out that two great storms that swept across Britain in 1987 and 1990 and uprooted thousands of old trees created surprise and consternation in many botanists: all along they had been describing the long taproots that anchored such trees deep in the ground, but the storms revealed that the taproots didn't exist. Even the largest trees can have roots just a couple of feet deep: they extend horizontally vast distances, but the taproots that saplings (especially oaks) send down are soon supplanted. Preston describes the work of Steve Sillett, of Humboldt State University in California, and a small group of other scientists who in the past fifteen years have discovered what really goes on in the canopies of our tallest trees—something which earlier botanists had tried, with limited success, to explore by floating above the forests in balloons. Sillett and company simply climb the trees, risking life and limb every time they do it, and in the process are discovering the phenomenally complex ecosystem flourishing in those heights. Preston, who became a climber himself and joined Sillett on some of his expeditions, found in the crowns of some Eastern trees flying squirrels so unfamiliar with human beings that they allowed him to scratch their heads, and life two hundred feet farther up, in those California redwoods, is even stranger. As one scientist vividly remarked, atop some of the tallest redwoods, with their dense and interlocking multiple crowns, you could put snowshoes on and throw a Frisbee around. O brave new world indeed.

I have been able to give the merest glimpse here of how fascinating trees are in themselves—even the most cursory description of their ingenious methods of feeding and growing themselves is beyond this essay's scope—but equally fascinating, perhaps, is the story of their role in human culture. This essay appears in a magazine made of paper; I wrote much of it sitting at a wooden desk, from which I arose occasionally to get an apple—an apple I bought at the local grocery after driving there in a rubber-tired vehicle. On such jaunts I may have occasionally worn a rayon shirt (rayon is made from cellulose), and I might also have picked up a bottle of olive oil, or some cinnamon sticks, or bay leaves, or a few avocados for my justly famed guacamole.

One could be forgiven for thinking that trees are co-extensive with culture itself. In his two-volume historical masterwork The Mediterranean and the Mediterranean World in the Age of Philip II, Fernand Braudel identifies "the Mediterranean world" with the domain of the olive tree, and any reader of the Bible or of Homer will know why he says so. One of the most powerful images in literature comes near the end of the Odyssey, when Odysseus describes the marriage-bed he and Penelope shared, a bed carved from the trunk of a living olive tree. For Homer this could have been nothing less than an image of the human world, emerging from and revering the natural world as it is exemplified in the tree from which Homer's people and their descendants took the most: fruit to eat, wood for fire or furniture, oil for cooking and light and the anointing of faces.

Yet, as the aforementioned denuding of Britain suggests, humans have not always appreciated trees or our debts to them. For those who make their living from herding animals, every tree represents so many fewer square feet of pasturage; it is an impediment to life itself, or can seem so. (Often erroneously, of course.) In heavily forested areas, trees must often be banished to the periphery of human settlement in order to make that settlement possible—and to open it to sunshine that is especially welcome in cooler climates. For these reasons and others, Henry W. Lawrence explains in his City Trees, it was not until the 18th century that trees became a common and expected feature of European urban landscapes. Treeless urbanity seems horrible to us—the elimination of greenery is a key feature of almost all our dystopian images—but it must be remembered that in the Middle Ages cities were very small places indeed. Paris was probably the largest European city of that period, and you could walk from any one of its walled boundaries to any other in half an hour. So, though there was an absolute divide between the treeless city and the forested countryside, marked by any given city's walls, the countryside could be almost instantly reached by anyone ambulatory.

The practice of planting trees in European cities only began to grow once cities got larger and the countryside grew correspondingly more distant. In the 17th century the great diarist, gardener, and arboriphile John Evelyn visited Antwerp, whose leaders had, half-a-century earlier, planted trees along the whole length of the elevated city walls. "There was nothing about this City," Evelyn rhapsodized, "which more ravished me than those delicious shades and walks of stately Trees, which render the incomparably fortified Works of the Town one of the sweetest places in Europe." He was equally ravished by Amsterdam, where lindens had been planted along the length of the city's canals: of one canalside street he exclaimed, "It appears to be a City in a Wood"—the exact phrase that another traveler of the time used to describe the English town of Norwich. So the presence of many trees in an urban environment was still, then, a source of wonderment.

(Evelyn published in 1664 a compendious tome called Sylva, or a Discourse of Forest Trees. This became an enormously popular book in England, and for several generations the definitive guide to native trees. Maggie Campbell-Culver's A Passion for Trees: The Legacy of John Evelyn is a beautifully illustrated revisiting of Evelyn's famous guide. But Evelyn was not just interested in native plants: on his travels to the Netherlands he noticed a curious and beautiful flower and picked up a few bulbs to bring back to the great garden he was building at Sayes Court, his estate in Deptford, Kent, on the south bank of the Thames. Evelyn was therefore, more than any other single person, responsible for introducing tulips to England, where they soon created a kind of mania, with tulip societies springing up all over the country. Campbell-Culver reports that Evelyn's great garden fell into disrepair soon after his death, and that nothing of it remains today with the possible exception of a single mulberry tree. This is very sad, but there is consolation from Anna Pavord's remarkable work of social history, The Tulip: of the hundreds of tulip societies that once dotted England, only one remains, the Wakefield and North of England Tulip Society, in Yorkshire, and some of the marvelous specimens grown by those gifted amateurs even today are descended from the very bulbs that Evelyn brought from Holland three hundred and fifty years ago.)

Lawrence shows how different cities in Europe—and, later, in America—incorporated trees into their plans. Such plans varied greatly, from the grand boulevards of Paris to the tree-filled residential squares of London. (I am particularly fond of the latter model, which you can see followed in a lovely way in Chicago's Washington Square Park, the city's oldest. The Newberry Library sits on the north side of the square, and one of the great delights of using that excellent library involves sitting at a table and gazing through tall windows at the park's trees. Of course, this means that you don't get much work done and feel guilty later, but life consists mainly of such tradeoffs.) But it took a surprisingly long time to achieve consensus on the validity of tree-planting in cities. As late as 1771, after many of the great London squares had already been built, the anonymous author of a polemic called Critical Observations on the Buildings and Improvements of London wrote, icily, "A garden in a street is not less absurd than a street in a garden; and he that wishes to have a row of trees before his door in town, betrays almost as false a taste as he that would build a row of houses for an avenue"—that is, instead of an avenue of trees—"to his seat in the country."

But this poor critic was fighting a losing, indeed a lost, battle. By the nineteenth century it had been agreed, in most cities of the world, that trees are both beautiful and health-giving, and that therefore trees should be planted anywhere in our cities where it is possible to plant them. As we still do.

London's arboriphobic pundit was concerned that the presence of trees interferes with the well-being of people—their aesthetic well-being, anyway—but the modern conservationist takes the opposite position: that the presence of people interferes with the well-being of trees. As Oliver Rackham notes, much conservationist thinking takes as its starting-point an idealized image of woodlands untouched by humanity—the true "wildwood." This ideal is especially problematic, Rackham argues, in places like Europe where human habitation goes back a long way. He quotes from a conservationist who lamented, early in the 20th century, that by the 15th century human beings had cut down most of the primeval forests of Britain; which is true, Rackham says, if he meant the 15th century BC. It is not clear to Rackham why a state of affairs that pertained three or four thousand years ago should become the norm against which all other times are measured. Why not—this is my thought, not Rackham's—why not long for a still earlier time, the last great period of global cooling, when much of what would later be covered by trees was covered instead by ice?

Rackham's second complaint about modern conservationism stems from his first. If having more trees is always better, then, so the logic goes, they should be planted everywhere. Only by creating vast forests to replace the natural forests we have cut down can we compensate for our previous foolishness. Rackham quotes one of the earliest proponents of this view: "Truly, the waste, and destruction of our Woods, has been so universal, that I conceive nothing less than an universal Plantation of all the sorts of trees will supply, and well encounter [that is, remedy] the defect." Who was this pioneering reformer? Why, John Evelyn, of course—who else? Evelyn seems to have known enough about trees to carry out his scheme, insofar as he could, in thoughtful and reasonable ways, which is more than Rackham can say for many modern conservationists. The problem is that some patches of open ground that look like ideal sites for plantations are poor environments for trees of any kind; or it happens that the trees human beings tend to enjoy are poor choices for the environments in which we place them. Rackham takes a kind of ironic satisfaction in seeing these plantations fail, especially since when they come to be neglected or forgotten, as often happens, the various species that truly belong there gradually drift in and make themselves at home.

"Conservationists," says Rackham, "have a record of trying to play God and rectifying God's mistakes as well as humanity's. Often they make woods fit a predetermined theory (which theory depends on how long ago they were at college) rather than listening to the woods and discovering what each wood has to contribute to conservation as a whole." It's now well-understood that the most catastrophic of these attempts at God-playing was the practice—very common throughout the 20th century, especially in North America and in Brazil, and not yet everywhere rejected—of trying to eradicate forest fires. This overzealousness deprives woodland ecosystems of the vital benefits of occasional burning, and, worse, ensures that when fires do start they find so much combustible material that they become superfires, with dire consequences for forests and people alike.

It's interesting to see that people who love trees and know them intimately, as opposed to those who have merely general instincts for conservation, tend not to erect ideological barriers between the human world and "Nature." Rackham's deeply committed but pragmatic and nonideological approach credits woodlands with a remarkable ability to manage themselves, and sees a great deal of wisdom in many of our ancient practices of woodcraft—practices formulated when we couldn't dominate our environment and so had to learn to be stewards of it. (There's a picture in Woodlands of Rackham slicing a length of oak into radial planks with a froe. Don't know what a froe is? Join the club.) But stewardship of an environment, let us make no mistake, is use—respectful use, with a view toward leaving something for our children to use, and to teach their children to use in turn. So also Colin Tudge, who regrets careless and ruthless exploitation of woodlands as much as anyone could, rejects the hands-off approach as an alternative. He would like to see, for instance, a far greater reliance on wood as a building material, and not just for residential purposes: "although it requires energy to turn a tree trunk into a finished beam, … it takes roughly twelve times as much to make a steel girder that is functionally equivalent." And while "timber burns, of course," it's also true that "steel, when overheated, buckles." In just a few pages Tudge makes a surprisingly strong case for a greener architecture, even for commercial buildings, based on timber.

And his thoughts go far beyond this. For instance, Tudge imagines trees as a much greater source of food than they are commonly thought to be—an especially attractive thought given trees' ability to hold soil in place and to moderate climate. In the final pages of his book Tudge grows rhapsodic in an almost Evelyn-like way: it is "marvelously and encouragingly" true that "societies can build their entire economies around trees: economies that are much better for people at large, and infinitely more sustainable, than anything we have at present. Trees could indeed stand at the heart of all the world's economics and politics, just as they are at the center of all terrestrial ecology." I'm not sure whether I believe fully in Tudge's visionary ideal, but I want to. It's a beautiful thing.

Meanwhile, back on my sunporch, I continue to be blessed by the trees around me—even if some of them are probably not ideally suited to the local soil and climate. Maples tend to do very well, though, including the one I spend much of my time staring at. And just the other day, when I was going through some old bills and receipts, I found the report of an arborist we had hired a few years ago to take down a couple of dead trees and trim some others. The crew chief had, helpfully, listed each of our trees by species, and with a slightly accelerated heartbeat I sought the answer to my old question. Turns out that my Tree of Mystery is … a silver maple. Oh, like I needed him to tell me that.

Alan Jacobs is professor of English at Wheaton College. He is the author most recently of Original Sin: A Cultural History (HarperOne) and Looking Before and After: Testimony and the Christian Life (Eerdmans).

Books discussed in this essay:

Maggie Campbell-Culver, A Passion for Trees: The Legacy of John Evelyn (Transworld, 2006)

Henry W. Lawrence, City Trees: A Historical Geography from the Renaissance Through the Nineteenth Century (Univ. of Virginia Press, 2006)

Richard Preston, The Wild Trees: A Story of Passion and Daring (Random House, 2007)

Oliver Rackham, Woodlands (HarperCollins UK, 2006)

Colin Tudge, The Tree: A Natural History of What Trees Are, How They Live, and Why They Matter (Crown, 2006)

Copyright © 2008 by the author or Christianity Today/Books & Culture magazine.
