
The Virtues of Mary Wollstonecraft

In 1978, the University of Chicago Press journal Signs published a short essay introducing Mary Wollstonecraft’s lost anthology of prose and poetry, which she had published “for the improvement of young women.” Wollstonecraft’s anthology reproduced edifying fables and poetry; excerpted from the Bible, Shakespeare, and Milton; and included four Christian prayers Wollstonecraft had authored herself. Among these is a lengthy “Private Morning Prayer,” which reads, in part:

Thou knowest whereof I am made, and rememberest that I am but dust: self-convicted I prostrate myself before thy throne of grace, and seek not to hide or palliate my faults; be not extreme to mark what I have done amiss—still allow me to call thee Father, and rejoice in my existence, since I can trace thy goodness and truth on earth, and feel myself allied to that glorious Being who breathed into me the breath of life, and gave me a capacity to know and to serve him.

Though Wollstonecraft is now regarded as a canonical thinker in the fields of history, political science, and gender studies, secular feminist scholars still struggle to make sense of her religiosity. Many suggest, as Moira Ferguson did in her 1978 Signs essay, that Wollstonecraft’s own skepticism grew as she crafted her more influential political work. Meanwhile, religious thinkers tend to ignore her religiosity, subscribing to the selfsame interpretation of her as a duly secular proto-feminist.

Enter Modern Virtue: Mary Wollstonecraft and a Tradition of Dissent (Oxford) by Emily Dumler-Winckler, the first adequately theological treatment of Wollstonecraft’s pedagogical, social, and political thought. In a beautifully written, deeply learned, and insightful book, the St. Louis University theologian maintains that there is far greater continuity throughout Wollstonecraft’s work than scholars realize. The imitatio Christi commended in her early publications, including the Female Reader, is the key that unlocks her entire corpus. Deftly employing knowledge of Plato, Aristotle, Augustine, and Aquinas, but also Burke, Rousseau, Kant, Hume, and Paine, Dumler-Winckler situates Wollstonecraft in the Western canon as the seminal philosophical and theological thinker she rightly is.

Dumler-Winckler is at her best in revealing the kinship between Wollstonecraft and the pre-moderns, and distinguishing her from contemporaries like Kant. Such scholarly care helps Dumler-Winckler show how Wollstonecraft’s thinking about virtue, justice, and friendship can inform those who resist various racial and economic injustices today.

However, Dumler-Winckler is less convincing in her efforts to distinguish Wollstonecraft’s pre-modern insights from those she dubs (and derides) “virtues’ defenders,” namely, Alasdair MacIntyre, Stanley Hauerwas, and Brad Gregory. Certainly, their thought—like that of the “virtue despisers” she more charitably challenges—would be well served by the rich encounter with Wollstonecraft that her book offers. But given Dumler-Winckler’s own judicious critiques of our modern ills, these “defenders” (especially Gregory) may be more friends than the foes she imagines.

Cultivating the Virtues through the Moral Imagination

Given her sharp critique of Edmund Burke’s (prescient) condemnation of the French Revolution—as well as her Enlightenment-era claim of “natural rights” for women—Wollstonecraft has long been regarded as a paradigmatically liberal thinker: akin to a female Thomas Paine, a feminist John Locke, a proto-Kantian. However, some scholars in recent decades have rejected these inapt comparisons, with most now placing her, rightly in my view, in the civic republican tradition that dates to Cicero. This reexamination of Wollstonecraft’s thought has also allowed for greater scrutiny of her argument with Burke over the French Revolution, with some now seeing far more affinity between the two English thinkers than was previously thought.

Dumler-Winckler’s 2023 book deepens these insights by foregrounding Wollstonecraft’s early pedagogical texts as the key that unlocks her account of the virtues, which itself is the prism through which to view her entire corpus. Throughout Modern Virtue, Dumler-Winckler strongly contests the view that Wollstonecraft’s account of virtue resembles that of the rationalist Kant, or the “naked” rationalism of the French revolutionaries Burke rightly scolds. Early on, Wollstonecraft writes: “Reason is indeed the heaven-lighted lamp in man, and may safely be trusted when not entirely depended on; but when it pretends to discover what is beyond its ken, it . . . runs into absurdity.” As Dumler-Winckler shows, respect for the ennobling human capacity for reason and its limits can be found throughout her work.

But Dumler-Winckler also ably distinguishes the proto-feminist from the “sentimentalism” of Hume, the “voluntarism” of some Protestant contemporaries, and the “traditionalism” of Burke. Dumler-Winckler finds the golden mean by looking beyond the moderns with whom Wollstonecraft is too hastily classified to show her kinship with ancient and medieval thinkers, especially Aristotle and Aquinas. It’s in the rich Christian tradition especially that Wollstonecraft finds dynamic resources to bring to bear on her “modern” subjects (abolition and women’s education, in particular)—even as the late-eighteenth-century thinker refines the tradition for the “revolution in female manners” she seeks to inspire. Thus, the seemingly incongruous subtitle: “a Tradition of Dissent.”

“The main business of our lives is to learn to be virtuous,” the pedagogue wrote in Thoughts on the Education of Daughters (1787). For Wollstonecraft as for Aristotle and especially Aquinas, argues Dumler-Winckler, one learns to be virtuous through the gradual refinement of all the faculties in a dynamic engagement of the imagination, understanding, judgment, and affections—especially through imitating the patterns of Christ-like moral exemplars. “Teach us with humble awe to imitate the divine patterns and lure us to the paths of virtue,” Wollstonecraft writes. One “puts on righteousness,” then, not ultimately as a rule obeyed, a ritual practiced, or a philosophical tenet held, even as obedience, rituals, and understanding are, for Wollstonecraft, all essential components.

Rather, one “puts on righteousness” as a kind of “second nature,” which is the refinement and cultivation of raw, unformed appetites (i.e., “first nature”). Wollstonecraft artfully employs Burke’s metaphor of the “wardrobe of the moral imagination” to depict the way in which cultivation of the virtues refines one’s “taste” in particular matters. “For Wollstonecraft, nature serves as a standard for taste, only insofar as the passions, appetites, and faculties of reason, judgment, and imagination are refined by second natural virtues. . . . [I]t is never the gratification of depraved appetites, but rather exalted appetites and minds . . . which are to govern our relations.”

In imitating Christ-like exemplars from an early age, we develop the moral virtues that perfect our relation with God and others—being crafted and crafting oneself, in turn. “Whatever tends to impress habits of order on the expanding mind may be reckoned the most beneficial part of education,” Wollstonecraft writes in the introduction to the Female Reader, “for by this means the surest foundation of virtue is settled without struggle, and strong restraints knit together before vice has introduced confusion.” Pointing to this dialectical design, Dumler-Winckler shows how Wollstonecraft wished for Scripture and Shakespeare to “gradually form” girls’ taste so that they might “learn not ‘what to say’ but rather how to read, think, and even pray well, and how to exercise their reason, cultivate virtue, and refine devotional taste.” Schooling the moral imagination through imitation was the key to acquiring virtue for both Aristotle and Wollstonecraft—but so was making virtue one’s own: “[W]e collectively inherit, tailor, and design [the virtues] as a garb in the wardrobe of a moral imagination.”

Wollstonecraft’s argument with Burke, then, concerns not the evident horrors and evils of the Revolution itself (in which she agrees with him) but the causes of such evils. For Wollstonecraft, the monarchical French regime Burke extols had become deeply corrupt, with rich and poor loving not true liberty but honors and property. The French peasants (and the revolutionaries that emboldened them) thus lacked the virtues that would allow them to respond to injustice in a virtuous way: “The slave unwittingly becomes the master, the tyrannized a tyrant, the oppressed an oppressor.” “Absent virtue,” Dumler-Winckler insightfully writes, “protest unwittingly replicates the injustice it protests. For Wollstonecraft, true virtue is revolutionary because it enables one to justly protest injustice, and so not only to criticize but to embody an alternative.” Throughout the text, Dumler-Winckler exhorts those who would critique various injustices today to showcase, in their particular circumstances and unique oppressions, virtuous alternatives.

Women’s Rights for the Cause of Virtue

Just as Augustine distinguished true Christian virtue from its pagan semblances, Dumler-Winckler tells us, Wollstonecraft distinguishes true virtue from its “sexed semblances,” especially in the thought of Burke and Rousseau. For the latter two, human excellence was a deeply gendered affair. For Wollstonecraft, however, virtue is not “sexed”—even as men and women, with distinctive procreative capacities and physical strength, clearly are. Dumler-Winckler yet again turns to Aristotle and Aquinas as those who “supply the material” for Wollstonecraft to call upon women to imitate Christ and live according to their rightful dignity as imago Dei. Dumler-Winckler’s own words could summarize her important book: “The affirmation of women’s ability to recognize, identify with, and even emulate the divine attributes has been so crucial for the affirmation of their humanity and equality, it may be considered a founding impetus for traditions of modern feminism.”

Wollstonecraft knew that women’s education and role in society needed to be reimagined. Dumler-Winckler writes that “unlike most of her predecessors, premodern and modern alike, . . . Wollstonecraft could see that growing in likeness of God would require a ‘revolution in female manners’ and a rejection of ‘sexed virtues.’” Mrs. Mason, the Christ-like protagonist in Wollstonecraft’s Original Stories, taught her female pupils to think for themselves “and rely only on God.” Mason’s advice did not commend the Kantian autonomy so often extolled today, but rather “virtuous independence.” Mason taught: “[W]e are all dependent on each other; and this dependence is wisely ordered by our Heavenly Father, to call forth many virtues, to exercise the best affections of the human heart, and fix them into habits.”

And thus we come to the rationale behind Wollstonecraft’s late-eighteenth-century appeal for women’s rights. It was an appeal “for the cause of virtue,” as Dumler-Winckler quotes the proto-feminist again and again. Wollstonecraft grounded rights in the imago Dei, viewing them as both important protections against arbitrary and unjust domination, and specifications of justice’s demands: “what is due, owed, or required to set a particular relationship right,” as Dumler-Winckler nicely puts it. Unlike the Hobbesian Jacobins, Wollstonecraft recognized that “liberty comes with attendant duties, constraints, and social obligations at every step.”

Given that Wollstonecraft advocated rights on the basis of Christian theological anthropology, not secular Enlightenment ideals, it is odd that Dumler-Winckler picks a fight with the likes of Alasdair MacIntyre, Stanley Hauerwas, and especially Notre Dame historian Brad Gregory. It is true that in his 2012 Unintended Reformation, Gregory laments, in Dumler-Winckler’s words, “the eclipse of an ethics of virtue with a culture of rights,” but unlike MacIntyre and Hauerwas, who reject rights as a quintessentially liberal phenomenon, Gregory endorses an older grounding for rights. And this is precisely what Dumler-Winckler has shown Wollstonecraft offers: the good and the right held together, just as, both happily acknowledge, Catholic social teaching does today.

Unfortunately, however, modern rights theories have followed not Wollstonecraft or Catholic social teaching but the “conceptions of autonomy and self-legislation” that both reject. Though, for Wollstonecraft, “liberty is not, as Burke fears, a license to do whatever one pleases or as Gregory fears ‘a kingdom of whatever,’” that surely is a prevailing view of liberty today. Indeed, it manifests itself (in Gregory’s view) in acquisitive consumerism, “an environmental nightmare,” “exploitative (and often brutally gendered)” impact on workers, the decline in the “culture of care,” and much, much more, about which Dumler-Winckler and Gregory (and I) agree! But more crucial, perhaps, is a common solution: if “disordered loves are at root, Wollstonecraft suggests, a matter of idolatry,” the remedy depends, according to Dumler-Winckler, on “their reordering, on the cultivation of what Wollstonecraft considers ‘the fairest virtues,’ namely benevolence, friendship, and generosity. . . . Not to banish love and friendship from politics, . . . but to refine earthly loves.”

Dumler-Winckler and I disagree about some of the practical implications of Wollstonecraft’s vision today. She sees unjust essentializing in the TERF movement; I, in the gender ideology against which TERFs rally. She hails Kamala Harris as an exemplar; I, Justice Barrett. But that we share a common understanding—and indeed an intellectual framework—that each person “learning to be virtuous” can transform a society for the common good is itself a great advance from the MacIntyrian lament. It is one for which she and I can both happily thank an inspiring and insightful eighteenth-century autodidact, one who should be more widely read—and carefully studied—today.

A Step Forward in the Debate about Masculinity

Early in 2019, the men’s razor company Gillette raised eyebrows with a new commercial. The ad depicted stereotypically disordered male behavior—aggression, catcalling, and a “boys will be boys” indifference to both—and contrasted it with a new generation of men taking a stand against that patriarchal past: a father breaks up a fight between two boys, a young man cuts off another’s unwanted advances toward a female stranger.

Many on the right complained that Gillette drew a glib equivalence between masculinity and toxicity, and the commercial was admittedly both myopic and preachy. But the conservative response to the ad largely settled into a defensive, abrasive machismo. The ad was simplistic, but its critics missed an opportunity to think through the meaning of masculinity and the importance of male—and especially paternal—role models.

Conversations about sex and gender are surely just as difficult now as in 2019. Any talk about masculinity can easily veer into the same sclerotic patterns of the Gillette hubbub: a left that paints with uncritically broad brushes, and a right that gets defensive and dumbs down its beliefs. Richard Reeves’s latest book, Of Boys and Men: Why the Modern Male Is Struggling, Why It Matters, and What to Do about It, manages to avoid predictability, blending statistical insight and easygoing wit to craft a fruitful exploration of the male malaise.

Reeves, a liberal economist at the Brookings Institution, bookends Of Boys and Men by presenting the educational, economic, and cultural challenges men face, and he proposes policy and social solutions for each. They’re all insightful and (unsurprisingly) subject to debate, especially, in my view, his discussion of fatherhood and marriage. But one of the most important lessons of the book—which Reeves introduces to reassure readers that they can care about both women’s equality and men’s struggles—is that “we can hold two thoughts in our head at once.” In that vein, we can disagree, even deeply, about some of Reeves’s premises or proposals, while also recognizing that Of Boys and Men models the sort of intellectual dexterity needed to tackle complicated matters in our polarized times.

Men Falling Behind

Reeves begins his book by pointing out how boys are falling behind in the classroom. He surveys data showing they are 14 percentage points less likely than girls to be ready to start school at age 5, as well as 6 percentage points less likely to graduate high school on time. Look ahead to college, and young men are 15 percentage points less likely to graduate with a bachelor’s. Women are narrowing gaps between themselves and their male classmates in typically male-dominated subjects (such as STEM), while stereotypically female subjects like nursing and teaching remain female-dominated.

Men are also being outdone in the job market. Worries about the wage gap—which Reeves handles with careful nuance—or about a C-suite glass ceiling have some legitimacy, but overemphasizing them paints an incomplete picture. The lack of female representation among Fortune 500 CEOs tells us that a small—albeit influential—proportion of men are doing very well. But by the same token, the people struggling the most economically are more likely to be men. An astounding one-third of men with only a high school diploma (approximately 5 million men) are out of the labor force.

To address these labor market problems, Reeves draws from the women-in-STEM push and calls for similar efforts for men in health care, education, administration, and literacy—or as he calls it, HEAL. In the same way that Melinda Gates pledged $1 billion to promote women in STEM, Reeves proposes an equal “men can HEAL” push. He envisions a combination of government and philanthropic funds for training and scholarships, and for marketing these often well-paying careers. Child care administrators and occupational therapists, two examples Reeves cites, respectively earned on average $70,000 and $72,000 in 2019.

One of the most discussed policy proposals in the book deals with educational gaps, which Reeves wants to address largely by “redshirting boys,” or delaying their start in kindergarten by a year. It’s not that young boys are less able than their female counterparts, but rather that they cognitively develop at a different pace, about two years slower than girls. For Reeves, giving boys the “gift” of an extra year before starting school “recognizes natural sex differences, especially the fact that boys are at a developmental disadvantage to girls at critical points in their schooling.”

Reeves’s reasoning reflects an important aspect of his approach: he acknowledges the importance of biology, and thinks that understanding biological factors should moderate a tendency, frequently seen on today’s left, to confuse equality with sameness. After examining data on men’s and women’s different career interests and outcomes, for example, Reeves concludes that we should at the very least consider that biology and “informed personal agency” play some role in occupational choices. He rejects attributing all gender gaps to sexism, or expecting perfect 50–50 representation in all fields.

People of different political stripes can debate the scope and specifics of both the HEAL movement and redshirting boys, among other proposals in Of Boys and Men, but Reeves leaves the possibility of fruitful deliberation very much open. He helpfully frames his discussion of education and jobs so that potential disagreements will be about means, rather than fundamental ends.

But the same can’t be said about the third aspect of the male malaise Reeves identifies: the cultural status of fatherhood.

Fatherhood without Marriage?

Reeves recognizes that a sense of aimlessness has taken hold of many men’s most personal relationships: between men and women, and between fathers and children. But he primarily wants to address the latter problem, and to do so by envisioning fatherhood as an independent institution, considered separately from marriage. We should address and improve relationships between fathers and children first, irrespective of whether fathers are married to their kids’ mothers. Our safety net should expect more than mere economic support from fathers—particularly among noncustodial parents—and reward them for involvement in their kids’ lives. In a nutshell, our culture and policy should reconcile themselves with the reality that we live in “a world where mothers don’t need men, but children still need their dads.”

Why doesn’t Reeves concurrently advance a marital renewal? For one thing, he regards the traditional model of marriage as too rigidly predicated on the expectation of a male breadwinner, and by extension on the economic dependence of women. From it flows a notion of fatherhood that may have encouraged family formation in the past, but that is now “unfit for a world of gender equality.” The decline of marriage poses problems, but it’s largely indicative of positive gains in autonomy for women, in his view.

Reeves emphasizes that many unmarried, nonresidential fathers are very involved in their kids’ lives. Black fathers, for example—44 percent of whom are nonresidential—are more likely than white nonresidential fathers to help around the house, take kids to activities, and be generally present, according to one study he cites. Reeves argues that our cultural expectations of fatherhood “urgently need an update, to become more focused on direct relationships with children” (emphasis added). Since about 40 percent of births in the United States take place outside of marriage, he concludes that insisting on a model that assumes an indissoluble link between fatherhood and marriage is just anachronistic.

But Reeves doesn’t fully reckon with the gravity of divorcing fatherhood from marriage. Chapter 12 of Of Boys and Men, which Reeves dedicates to his independent fatherhood proposal, advances particular policies to support “direct” fatherhood unmediated by marriage, from paid parental leave to child support reforms, to encouraging father-friendly jobs. But he spends surprisingly little time directly arguing against the empirical and philosophical case for fatherhood within marriage as distinctly positive.

For example, Reeves writes that “there is no residency requirement for good fatherhood. The relationship is what matters.” Fair enough. But which model—fatherhood within marriage or nonresidential fatherhood—tends to facilitate more of the positive interactions needed to build healthy relationships between fathers and children? In general, the one where more of those interactions can potentially take place. A study by Penn State sociologist Paul Amato suggested this, reporting that from 1976 to 2002, 29 percent of nonresident fathers had no contact with their kids in the previous year, while only 31 percent had weekly contact.

In fact, contact with their biological father plays a positive role in advancing the two other major concerns Reeves has for boys: work and education. A 2022 report from the Institute for Family Studies found that “[y]oung men who grew up with their biological father are more than twice as likely to graduate college by their late 20s, compared to those raised in families without their biological father (35% vs. 14%).” According to the Census Bureau, approximately 62 percent of children lived with their biological parents in 2019, and 59 percent lived with married biological parents. In other words, growing up with a biological father is deeply intertwined with growing up with a married father—there are very few cohabitating biological or single biological fathers.

Beyond social science, there’s also a conceptual issue at the heart of Reeves’s proposal. It’s undeniable that many working mothers don’t need men in the same way past generations did—if what we mean by “need” is economic support. But it’s also very clear that mothers do need men. Without men, well, they wouldn’t be mothers (and vice versa) for the very simple yet profound fact that the sexes need each other in order to fulfill their biological end. This is more than a semantic trick—it’s a recognition that mutual dependence is at the heart of our biology. At the risk of putting too fine a point on it, men and women could not exist without each other—and that realization should caution us against overemphasizing independence.

Moreover, shouldn’t fatherhood also entail modeling what lasting commitment to a spouse looks like? Reeves is right to stress that fatherhood should mean more than just economic support, but he misses the fact that prospects for decoupling marriage from fatherhood are similarly discouraging in this regard. In a study by the late Princeton sociologist Sara McLanahan, 80 percent of unmarried parents were in a romantic relationship (with each other) at the time of the birth of their child. But here’s how McLanahan summarized her five-year follow-up with them:

Despite their “high hopes,” most unmarried parents were unable to maintain stable unions. Only 15 percent of all our unmarried couples were married at the time of the five-year interview, and only a third were still romantically involved. (Recall that over 80 percent of parents were romantically involved at birth.) Among couples who were cohabitating at birth, the picture was somewhat better: after five years, 26 percent were married to each other and another 26 percent were living together.

In divorcing fatherhood from marriage so hastily, Reeves misses an opportunity to consider why reimagining marriage is a key aspect of reinvigorating fatherhood as well. It’s the biggest drawback of Of Boys and Men.

Nevertheless, Reeves’s proposal has clearly opened a door to fruitful further debate. His case for direct fatherhood should temper a traditionalist reflex to assume that nonresidence is equivalent to abandonment—there are clearly many nontraditional, unmarried, or separated fathers who embody dedication to their children. But if engaged fatherhood is so empirically and conceptually intertwined with marriage, then we would lose key aspects of both by separating them. We shouldn’t idealize marriage or how it affects fatherhood, but we shouldn’t hastily decouple marriage and fatherhood either.

Successfully reimagining marriage and fatherhood will probably even entail letting go of some of the overly rigid gender roles that Reeves criticizes in Of Boys and Men: more female breadwinners and paternal homemakers, as well as various policies and workplace arrangements to support the parent–child bond, are all likely to be part of the future of the family. But that future should also give boys and men not just the tools to excel at school and work, but the habits and vocabulary to strive toward commitment, dedication, and love in the most personal dimensions of their lives.

Improving the Debate

Of Boys and Men is a book about men, written by a man, claiming that our culture doesn’t take men’s problems seriously. In the wrong hands, it would have been the latest entry in a seemingly incessant culture war, a callback to the silliness of the Gillette controversy. Yet it has been a resounding mainstream success. In matters of gender and sex, some might believe it’s impossible to stir interest in men’s issues across the ideological spectrum. The success of Of Boys and Men suggests that’s not the case.

Of Boys and Men has a point of view, but Reeves doesn’t close off the possibility of exchange or criticism by making a caricature of his opponents. This is the sort of book that not only exposes an often ignored issue, but elevates the quality of our conversations about it, even amid disagreement. That is perhaps its most impressive feat.

Thomas Aquinas: Revolutionary and Saint

Thomas Aquinas (1225–1274) is one of the two most famous Catholic theologians and philosophers; the other is Augustine (354–430). Seven hundred years ago, on 18 July 1323, Pope John XXII presided at Aquinas’s canonization as a saint. In 1567, Pope Pius V proclaimed Aquinas to be a “Doctor of the Church,” one whose teachings occupy a special place in Catholic theology. Thomas was the first thinker after the time of the Church Fathers to be so honored. In 1879 Pope Leo XIII issued a famous encyclical, Aeterni Patris: On the Restoration of Christian Philosophy, calling on all Catholic educational institutions to give pride of place to the theology and philosophy of Thomas Aquinas. Although today he is a well-established authority in both philosophy and theology, in his own time he was considered a revolutionary thinker whose views challenged the accepted intellectual establishment of the West.

As a young man in Italy, Thomas joined the recently founded Dominican Order. Along with the Franciscans, the Dominicans were part of a widespread reform movement within the Catholic Church. In many ways these new religious orders were urban-based youth movements and, to be effective in their apostolate of reform in the Church, the Dominicans sought out the ablest young men and quickly established houses of study in the great university centers, such as Bologna, Paris, and Oxford.

Thomas traveled north to Paris (1245) to study with Albert the Great at the University of Paris. Albert was already well known for his work in philosophy, theology, and the natural sciences. Like other Dominicans of his time, Thomas walked to Paris from Italy; and then, with Albert, he walked to Cologne (1248) where the Dominicans were establishing another center of studies. He came back to Paris (1252) to complete his studies to become a Master of Theology. After three years in Paris, he spent the next ten years in various places in Italy, only to return to Paris in the late 1260s. There, he once again occupied a chair as Master of Theology and confronted various intellectual and institutional challenges, especially concerning the proper relationship between philosophy and theology.

Aristotelian Revolution

Universities in the thirteenth century were relatively new institutions—centers of lively intellectual life. In addition to a liberal arts faculty, the universities had advanced faculties of law, medicine, and theology. Paris was especially famous for its faculty of theology. Thomas was a member of a brand-new religious order, and his professional life was often connected with this new institution in the West, the university. The growing intellectual authority, and relative institutional autonomy, of the new universities often represented a challenge to the established order, both religious and secular.

Thomas lived at a critical juncture in the history of Western culture. The vast intellectual revolution in which he participated was the result of the translation into Latin of almost all the works of Aristotle. Aristotle offered, so it seemed, a comprehensive understanding of man, of the world, and even of God. In the Divine Comedy, Dante calls Aristotle “the master of those who know.”

Aristotle’s works came to the Latin West in the late twelfth century and first half of the thirteenth century, with a set of very compelling, and often conflicting, interpretations from Muslim and Jewish sources: thinkers such as Avicenna, Averroës, and Maimonides. How should Christian theologians react to this new way of understanding things? Aristotle’s claims about the eternity of the world, the mortality of the human soul, and how to understand happiness seemed to represent a fundamental challenge to Christian revelation. Should the teaching of Aristotle be rejected, or at least restricted? Many Christian thinkers in the thirteenth century thought so, and there were various attempts—mostly unsuccessful—throughout the century to ban Aristotle from the curriculum of the new universities. The most famous were lists of condemned propositions in 1270 and 1277, issued by the Bishop of Paris, Étienne Tempier.

Thomas, following his teacher Albert, was among the first to see, and to see profoundly, the value for Christian theology that Aristotle’s scientific and philosophical revolution offered. Thomas was first of all a theologian, committed to setting forth clearly the fundamentals of the Christian faith. But, precisely because he was a theologian, he recognized that he had to be a philosopher and have scientific knowledge as well. His most famous theological work, the Summa Theologiae, contains many philosophical arguments that have their own formal independence and validity, yet are organized in the service of Christian truth. Thomas recognized that whatever truths science and philosophy disclose cannot, in the final analysis, be a threat to truths divinely revealed by God: after all, God is the author of all truth—the truths of both reason and faith. Thomas thought that by reason alone it was possible to demonstrate not only that there is a God but that this God is the creator of all that is. He argued that faith perfects and completes what reason can tell us about the Creator and all creatures.

As an astute student of Aristotle, Thomas was critical not only of those who ignored Aristotle, but also of those who, sometimes in the tradition of Averroës, read Aristotle in a way that did indeed result in contradictions of Christian belief. Thomas’s philosophical position was not mainstream in his own day; rather, his more traditional colleagues’ views were more widely accepted. Nowhere was this difference more apparent than in debates about the proper relationship between philosophy and theology on a wide variety of questions about human nature and the doctrine of creation. Those opposed to Thomas came from what can broadly be called an Augustinian tradition, which resolutely insisted that philosophy must always teach what faith affirms. Thomas did reject the kind of excessive philosophical autonomy found in Averroës, but he also rejected the tendency toward fideism found in his more traditionalist opponents.

Thomas wrote extensive commentaries on major Aristotelian treatises, such as the Physics, the De Anima (On the Soul), the Posterior Analytics, the Metaphysics, and the Nicomachean Ethics, in which he sought to present comprehensive views of these subjects. One can see Thomas employing principles drawn from Aristotle, as well as from other sources in Greek philosophy, in his wide-ranging collections of disputed questions On Truth and On the Power of God, in Summa contra Gentiles, and even in his biblical commentaries.

Creation and Causality

Throughout his writings, Thomas is keen to emphasize the importance of thinking analogically: recognizing, for example, that to speak of God as cause and of creatures as causes requires that the term “cause” be predicated of both God and creatures in a way that is both similar and different. For Thomas, an omnipotent God, complete cause of all that is, does not challenge the existence of real causes in nature, causes that, for example, the natural sciences disclose. God’s power is so great that He causes all created causes to be the causes that they are.

One of his greatest insights concerns a proper understanding of God’s act of creating and its relationship to science. The natural sciences explain changes in the world; creation, on the other hand, is a metaphysical and theological account of the very existence of things, not of changes in things. Such a distinction remains useful for discussions in our own day about the philosophical and theological implications of evolutionary biology and cosmology. The subject of these disciplines is change, indeed change on a grand scale. But as Thomas would remind us, creation is not a change, since any change requires something that changes. Rather, creation is a metaphysical relationship of dependence. Creation out of nothing does not mean that God changes “nothing” into something. Any creature separated from God’s causality would be nothing at all. Creation is not primarily some distant event; it is the ongoing causing of the existence of whatever is.

In one of the more radical sentences written in the thirteenth century, Thomas claimed that “not only does faith hold that there is creation, but reason also demonstrates creation.” Here he is distinguishing between a philosophical and a theological analysis of creation. Arguments from reason for the world’s being created occur in the discipline of metaphysics: they proceed from the distinction between what it means for something to be (its essence) and its existence. Since nothing whose essence is distinct from its existence can account for its own being, this distinction ultimately leads to the affirmation that all such existence necessarily has a cause.

The philosophical sense of creation concerns the fundamental dependence of any creature’s being on the constant causality of the Creator. For Thomas, an eternal universe would be just as much a created universe as a universe with a temporal beginning. From the time of the Church Fathers, however, theologians had always contrasted an eternal universe with a created one. Thomas did believe that the universe had a temporal beginning, but he thought that such knowledge was exclusively a matter of divine revelation, as disclosed in the opening of Genesis and dogmatically affirmed by the Fourth Lateran Council (1215).

The theological sense of creation incorporates the philosophical sense. Thomas viewed creation as much more than what reason alone discovers. With faith, he sees all of reality coming from God as a manifestation of God’s goodness and ordered to God as its end. This relationship is a grand panorama of out-flow and return, analogous to the dynamic life of the Persons of the Trinity.

As the German philosopher and historian Joseph Pieper observed, the doctrine of creation is the key to almost all of Thomas’s thought. The revolutionary nature of Thomas’s position on creation is evident in the way it differs from that of his teacher, Albert the Great, and his colleague at the University of Paris, Bonaventure. Both thought that the fact of creation could only be known by faith and that a created world necessarily meant a world with a temporal beginning. Neither fully appreciated Thomas’s observation that creation is not a change. In fact, one of the condemnations of 1277 by the Bishop of Paris noted that it was an error to hold that creation is not a change.

It may be difficult for us to appreciate the radical nature of Thomas’s thought, especially his brilliant synthesis of faith and reason that honored the appropriate autonomy of each. Thomas does not fit easily into a philosophical or a theological category. He thought that human beings could come to knowledge of the world and of God, but he also recognized that a creature’s knowledge of the Creator must always fall short of what and who the Creator is.

For Thomas, reason—deployed in all the intellectual disciplines (including philosophy and the natural sciences)—is a necessary complement to religious faith. After all, a believer is a human being: a rational animal. Faith perfects but does not abolish what reason discloses. In deploying his understanding of the relationship between reason and faith, Thomas offered what were for his time radically novel views about nature, human nature, and God. What commends these views to us—and what was later realized by the Catholic Church—is not their novelty but their truth. Ironically, in the context of today’s intellectual currents that reject metaphysics and embrace various forms of materialism, Thomas’s thought is once again a radical, if not revolutionary, enterprise.

The Ivy League Is Breeding Professional Virtue Signalers

Professor Evan Mandery has some harsh words for elite colleges, particularly Ivy League schools, because they tend to produce people like me. In the Chronicle of Higher Education, he presents an excerpt from his recent book, Poison Ivy: How Elite Colleges Divide Us, and argues that I am not among “the real villains,” despite my having written an essay in 2014 for the Princeton Tory arguing that “privilege” is a useless (at best) and pernicious (at worst) concept for understanding contemporary American life. The real villain is Princeton itself, among the other “institutions that indoctrinated” me into thinking that I had earned my success. All this despite Mandery’s admission that I—more accurately, 20-year-old me—might have deserved my critics’ “contempt.” (Mandery laments that people like me have a hard time learning gratitude, so I suppose I should first thank him for letting me off the hook.)

Mandery, who teaches at the City University of New York (where he has “never seen or heard anyone boast that their college status is deserved”), marshals a few related arguments to prove Princeton’s villainy and that of its peer institutions. His main point is that belief in the myth of meritocracy is generally corrosive, but elite colleges encourage students to believe that they have earned their success. This makes people like me “smug” and “annoying,” and prevents the “recognition that [we] are winners in a game that had been tilted in [our] favor from the start.” Ivy League graduates, moreover, “are not the best and the brightest or the hardest working,” but “some of the best, brightest, and hardest working among the very rich.”

There’s a lot in there, reflecting just what a nerve my essay struck and how unresolved its subject remains, despite the moral certainty that characterized so many critical responses. Much of Mandery’s assessment, coming from a professor criticizing a decade-old piece, is familiar—you’d be amazed how many people have told me that my penniless, liberated-slave grandparents were rich because they didn’t have it as bad as non-whites in America—and he seems to extrapolate my position from a few lines of the essay rather than reading it all the way through. Then, as now, most of my critics understood my position to come from a place of entitlement—that I thought I had earned everything I had by hard work alone, and resented being told that the game had been tilted in my favor.

I will admit I am no great fan of being told that the game is rigged, if only because that theory sorely lacks explanatory power. (How Jews, recent immigrants from Africa and the Caribbean, Indians, and white people raised by single parents figure into theories of systemic power is a topic that deserves an essay of its own.) But even getting sucked into that rabbit hole concedes too much. At no point did I deny that many other people’s choices and circumstances played a large role in my becoming who I am. Indeed, my whole point was that Americans should be grateful for what they have inherited, rather than embrace the totalizing suspicion inherent in the “privilege” discourse.

Most readers, many of them precisely the regular Americans Mandery claims I am liable to look down on, understood that. In the months following my essay’s viral takeoff, I got thousands of emails. For every one I received calling me a bigot who “just doesn’t get it” (and there were many of those), I got several more from people across the country, of all ethnicities, every tax bracket, and both sexes, telling me that I had expressed exactly the position they wished elites (people like Mandery) would understand.

It turns out that my essay had proved something of a Rorschach test. To some, my rejection of “privilege” discourse revealed that I was arrogant, ungrateful, and ignorant of the ways in which I was not solely responsible for my success thus far in life. But to others, it was evidence of gratitude and the desire to share my forebears’ recipe for intergenerational mobility as widely as possible, to reject the pessimism inherent in systemic thinking. Interestingly, most of my classmates fell into the first camp. For all Mandery’s theorizing about the Ivy League being awash with the sense that status is justly earned, I was in the minority on my campus for believing generally in just deserts.

The majority view on campus, which Mandery shares, leads to some puzzling places. The belief that elite admissions are inherently unjust would undermine my classmates’ case for attending Princeton instead of someone who probably needed the mobility boost more than they did. One way that universities avoid this conundrum is to emphasize the college campus as a community meant to be an interesting and fun place to spend four years, and to deemphasize the educational outcomes it produces. Top schools are not exactly home to the best and brightest from among the rich, as Mandery claims, but the most multi-talented and unique from among the best and brightest, with uniqueness (of cultivated skills and idiosyncratic interests) correlating closely with wealth.

For Princeton to maintain its median SAT score and intimate campus experience alongside a full complement of sports teams, a cappella groups, comedy troupes, literary magazines, and social justice clubs, it has to choose every one of its roughly 1,500 annual admits carefully. Students who take this view of the campus to heart can simultaneously hold the beliefs that they belong and deserve to be there in some sense, all without believing that those not admitted have lesser intellects or didn’t work hard enough. It is not that hard to conclude that you have to be lucky and good to go to a great college. And it seems praiseworthy to want to share what you believe to be the keys to your good fortune. Yet pessimists like Mandery would rather tear down the notion of earned success than encourage the kinds of behaviors that elite colleges select for, which really do make for vibrant and interesting communities, and whose cultivation, it stands to reason, makes our culture and economy more vibrant as well.

Nonetheless, I must admit that I agree with Mandery’s main point. Elite universities do their students a disservice when they pay excessive attention to student accomplishments at a juncture in their lives when they cannot have accomplished much. That spirit animates a great deal, though not all, of campus culture. What ends up happening is that, to stand out as a social elite among the elites, students begin to tout their unique accomplishments, frequently tied up with unique “identities.” One classmate was heralded for being the first gay man to summit some of the world’s tallest mountains (including, I think, Mount Everest). That kind of Mad-Libs self-branding was central to social climbing—a natural consequence of going from a 99th-percentile SAT score to suddenly feeling like you’re the slowest in your own dorm, not to mention the ridiculously competitive admissions process that encourages cultivating any marker of distinction from the outset.

But what seems most responsible for colleges fêting their students is the dominant cultural idea that we all deserve celebration just for being who we are. The story is well-worn: universities used to see themselves as centers of preparation for citizenship in a liberal republic, but now exist primarily as a stage for students to find and liberate their inner identities or “true selves” and show the world how terrific they are when they can live authentically. In short, colleges used to celebrate the process of becoming; now, they celebrate being. That is part of a much broader cultural force that encourages people of all ages to reject external constraints (social constructs, inherited norms, and so on) in order to achieve fulfillment through better service to the imperial self. If colleges have skimped on their obligation to fashion humble, grateful, selfless graduates, one need look no further than the liberationist movements that spawned on campus and continue to dominate there: the ones that celebrate every identity imaginable, as if to say, you are worthy of praise simply for being who you are. The “extraordinary accomplishments” referenced in every welcome-weekend speech are those of finding your unique brand; those referenced in graduation addresses refer to the strides students have made in developing the identity they will bring to bear on the world.

Such an observation only brings into sharper focus just how strange it is to blame a kind of conservative or classical liberal attitude about the relation between contribution and desert for the behaviors of individuals shaped by overwhelmingly progressive institutions. More likely is that having rejected the classical liberal view, today’s elite students have replaced the old signs of worthiness—SAT scores, a proper WASP background—with new ones befitting a progressive elite dedicated to an identity-obsessed worldview and its resultant demands for “Diversity, Equity, and Inclusion.”

The new signs are quite like those “virtue signals” we are always hearing about, but they are not just about letting your classmates know you are one of the good guys. In the elite campus context, signaling that you are committed to remedying all group-level disparities, that you understand that members of “oppressor” classes are epistemically stunted due to their privilege, and that you are up to speed on the new terminologies, identities, and problematics—all these show existing members of the progressive technocratic elite that you are prepared to use whatever power you will soon have to take up their cause. Publicly repeating the mantras that life was rigged in your favor and that Princeton is systemically racist is an excellent sign that you are a true believer who can be trusted with power, and that you have been properly trained to handle, for instance, a White House committed to equity and forgiving student loan debt. Fighting elitism qua elitism is just silly: there will always be elites, and there will always be institutions committed to producing those elites and teaching them the right signals. The key question is what kinds of virtues those signals stand for.

The signals are coming in loud and clear, showing that Mandery has underestimated the reach of the philosophy he seeks to advance at the expense of the one he identifies with me. He derides my “nihilistic straw man” who believes “no accomplishment is deserved.” Yet he notes the inconceivable irony that Michael Sandel’s Harvard students believe in merit despite reading John Rawls, who argues that rewarding socially beneficial behaviors is “unjust, since they’re the result of what amounts to a natural lottery.” Rawls is indeed the enemy of merit and the notion that reward should be commensurate to one’s socially beneficial activity. And joining Mandery on Team Rawls is everyone who has bought into the equity agenda, which aims to remedy disparities between groups that have emerged because of morally arbitrary mass preferences for certain behaviors (politeness, punctuality, preference for the written word, to name a few) that individuals only exhibit due to morally arbitrary factors. My “straw man” does not just exist; he has won the White House, MacArthur Genius grants, and the culture.

Allow me one more word in my defense. Nobody “indoctrinated” me into rejecting Rawls, believing that a combination of talent, work, and luck leads to success. No one brainwashed me into thinking that the whole privilege discourse lacked explanatory power, hurt those it was trying to help, and demeaned us all in the process. (And if they did indoctrinate me, it certainly didn’t happen at the opening exercises Mandery cites; I skipped those and spent the afternoon watching football.)

The privilege essay itself and the firestorm that followed are a microcosm of everything I am talking about. I came to the position I expressed based on a combination of logic, education, and experience, especially the influence of my (City University–graduate) parents. Writing it brought about many negative consequences, and some good ones, too. But nobody made me write it, and no one but me should reap those consequences. No matter who tries to rob my life of agency, or saddle others with accountability for my choices, I know that I alone had the choice whether or not to publish it, and I went ahead and did it. And, as a descriptive matter, this is what has happened: I live with the consequences, good and bad, every day—I and no one else.

I do not think I am exceptional. We all make choices against an infinite backdrop of characteristics, values, and experiences. Some elements of the backdrop are chosen, some unchosen. Some are chosen by those who came before us, who wished us good or ill. How we choose to conceive of our choice—and the choices of others in their own particular circumstances—is up to us. But that conception itself, whatever we choose, has consequences, and picking unwisely can lead only to despair, distrust, and moral backwardness, in which we punish the righteous and reward the guilty.

I will always pick believing that I and those around me have some agency in life. That doesn’t mean we deserve everything we have—we should be grateful for our good fortune—but it does mean that we can use our choices for good. Call me and my belief in desert naïve, but I will always choose to build up rather than tear down, and try to share my recipe for success—more accurately, that of my forebears—with others.

The Military Depends on Virtues That Are Fading

The U.S. military is facing a recruitment crisis. It is so severe that in September, Senator Thom Tillis referred to an “inflection point” for the voluntary-force model, threatening over a half century of precedent. In 2022, the Army failed to meet its recruitment goals across the board: active duty, reserves, and National Guard were thousands of personnel below target. The Navy met its recruiting quota for enlisted personnel, but not for officers. The Air Force met its recruiting quota for active duty, but not the reserves or National Guard. The Marine Corps barely hit its targets for active duty and reserves.

There are a few theories about the causes of this recruiting predicament, which is at its worst since the post-Vietnam era. Direct causes might be record-low unemployment rates, COVID-era restrictions that limited recruiters’ access to the public, and an increase in both mental and physical health problems among young people. The Public Interest Fellowship’s Garrett Exner, a former Marine special operations officer, points to cultural shifts in American schooling, such as lower standards and a refusal to subject students to any kind of adversity. Stuart Scheller, the outspoken Marine veteran and author, blames poor military leadership and misdirected shifts in financial incentives for servicemen.

While these hypotheses probably have some degree of truth, they do not explain why some services are better at recruiting than others: the Marine Corps, after all, made all its quotas, while the Army didn’t meet any of its recruiting goals. The Marine Corps’s success is surprising for many reasons: it has the highest physical fitness standards for retention (significantly higher than those of the Navy and Air Force) and a reputation for physical and psychological duress that eclipses the other services. Moreover, the military occupational specialties (career specialties) available to Marine recruits are dwarfed by the Army’s, which casts the widest net for competencies among the services. At a time when obesity alone prohibits a shocking percentage of American young people from serving, the Marine Corps’s recruiting performance is especially puzzling.

An explanation for the military’s recruitment challenges (and the Marine Corps’s comparative success) goes deeper than trends in the labor market. Ultimately, the civic honor on which voluntary service depends has quietly been eroding for some time, and it is being replaced by an ethos of individual self-fulfillment. Tragically, different parts of the military have absorbed this mentality to different degrees. Recent recruiting ads—one by the Marines and one by the Army—offer glimpses into these broader cultural shifts and the challenges that they pose to the U.S. military.

Ultimately, the civic honor on which voluntary service depends has quietly been eroding for some time, and it is being replaced by an ethos of individual self-fulfillment.

 

Which Purpose?

“Battle to Belong” is an online recruitment ad for the Marines that depicts young Americans’ sense of alienation and nihilism with splendid accuracy. It was released on YouTube in September 2020, after months of COVID-era lockdowns and mere weeks after the nationwide Black Lives Matter protests. Teens were spending an average of nearly eight hours a day on screens outside of their Zoom classroom time. At the time of this writing in February 2023, the ad had over three hundred fifty thousand views on YouTube.

“Battle to Belong” opens with a young man walking down an urban street littered with advertisements and floating widgets. “Searching for meaning in a relentless world,” begins the baritone narrator, “Always connected, but somehow alone.” These words establish that meaning is the object of the protagonist’s search and that a fog of empty, virtual relationships obscures his goal. Then, a malevolent digital clone of the protagonist stops him in his path. “Trapped by illusion,” the narrator continues, as the clone offers our protagonist virtual distractions and a menacing look. Shouting, the young man lunges through the hologram clone and falls into the mud at Marine Corps boot camp. “We offer another path where the battle to belong begins.” As the music swells and the protagonist completes the grueling training, the voice continues, “Awakened by a calling, united by purpose, defined by the cause you fight for. No one can ever take away what it means to be among the few, the proud, the Marines.”

This commercial defies expectation in two significant ways. First, it fully acknowledges America’s social decline and presents the Marines as a way out of that decline. Whereas other advertisements tend to suggest protection of the homeland as the compelling reason for enlisting, this Marine Corps commercial presents contemporary American life as culturally indistinct, gray, and alienating. Service offers an escape from home, not a fight to defend it.

Second, and relatedly, its argument for joining the military makes no explicit reference to the official Marine Corps mission—“the protection of our Nation and the advancement of its ideals.” Instead, the Marine Corps is attractive because it provides a purpose at all. It uses words like “path,” “calling,” “purpose,” and “cause,” but it leaves the specific goal undefined, an afterthought. The incentive to join the Marine Corps is not to serve or accomplish an end, but merely to belong.

The case the ad makes for joining the Marines is brilliant: it presents fundamental human needs, purpose and belonging, both of which are currently unfulfilled. It offers respite to potential recruits who feel desperately lonely and purposeless, not only because COVID restrictions abolished their everyday work and social lives, but also because the Internet encourages the cultivation of a shallow and harmful parallel identity. The Marine Corps obviously understands the corrosive effects of digital media. The young man’s hologram version of himself is clearly a threat to his well-being.

Yet the ad’s unwillingness to define military service in patriotic instead of purely psychological terms leaves undesirable possibilities open. If the purpose of service is to pursue a feeling of belonging, who cares what the purported aim of it is? Is the current population of recruits unable to grasp the moral value of the mission? Do they find it unworthy?

The case the ad makes for joining the Marines is brilliant: it presents fundamental human needs, purpose and belonging, both of which are currently unfulfilled.

 

Serve Yourself

If the 2020 Marine Corps ad can be faulted for mild banality, the Army’s YouTube ad, “Emma: The Calling,” shamelessly presents the Army as one among many morally equivalent paths to self-fulfillment. The ad has many millions more views than “Battle to Belong,” despite being released almost a year later. But its reception was so poor that the Army turned off the video’s comment section. Even Senator Ted Cruz responded, saying, “Holy crap. Perhaps a woke, emasculated military is not the best idea. …”

“The Calling” depicts a real soldier, Cpl. Emma Malonelord, who works with the Patriot Missile Defense System. Despite the gravity of Malonelord’s occupation, the ad employs colorful, visually inoffensive, and almost playful cartoons. Malonelord’s narration heightens this discordance:

“It begins in California, with a little girl raised by two moms. Although I had a fairly typical childhood, … I also marched for equality. I like to think I’ve been defending freedom from an early age.” After describing the arduous recovery of one of her mothers from paralysis and her parents’ wedding, she tells us that she graduated top of her class from high school and joined other “strong women” at her UC Davis sorority. She continues:

But as graduation approached, I began feeling I had been handed so much in life—a sorority girl stereotype. Sure, I had spent my life around inspiring women, but what had I really achieved on my own? … I needed my own adventures, my own challenge. And after meeting with an Army recruiter, I found it. A way to prove my inner strength, and maybe shatter some stereotypes along the way.

The Army ad pitches two ideas in ways that may have the reverse of their intended effect. First and more laudably, “The Calling” posits that one should join the Army to defend equality and freedom. But in presenting gay marriage as the paramount manifestation of these principles, the commercial immediately alienates potential recruits with traditional values or reasonable ethical concerns.

It’s as though her friends’ expensive (if challenging) personal experiences and her military service are all morally equivalent activities on the path to finding themselves.

 

Its second and far less commendable assertion is that one should join the Army to prove oneself a strong woman. This argument begins hopefully enough with the phrase “I had been handed so much in life.” A generous interpretation of Malonelord’s statement is that it’s an expression of gratitude to America and her desire to give back. But it becomes clear that this isn’t what Malonelord means: she goes on to compare her dearth of accomplishments at graduation to the activities of her fellow sorority sisters, who, she reports, climbed Everest and studied abroad in Italy. It’s as though her friends’ expensive (if challenging) personal experiences and her military service are all morally equivalent activities on the path to finding themselves.

To Malonelord, and by extension, to the Army, service is an avenue to “shatter stereotypes,” flex one’s personal capabilities, and compare favorably to others, rather than a morally resonant act of personal sacrifice to serve one’s nation. This attention-seeking outlook recalls the online narcissism that “Battle to Belong” so incisively rejects.

The Oath

Contemporary recruitment messaging is troubling. America’s military must draw on cultural reservoirs that prize self-sacrifice and the honor due to country. After all, the Oath of Enlistment speaks of bearing true faith and allegiance, recognizes a higher power, and insists on selfless service to the Constitution against its enemies. All recruitment efforts must ultimately attempt to persuade young Americans to take this oath at possible risk of life and limb, and the near-certain prospect of fear, physical strain, boredom, and homesickness.

Of course, people join the military for a variety of reasons and serve honorably once they join. The opportunities to improve oneself through access to education and medical benefits, to escape poverty, and to file for naturalization are among the many incentives that service members pursue. Every American owes a debt of gratitude to all service members, whatever their initial incentives for joining.

While some may find the Marine Corps’s call to belong compelling, this draw will not sustain the personnel needs of the entire armed services; after all, all kinds of careers, activities, and communities can give people the same sense of fulfillment.

 

Yet low recruitment rates and these commercials’ reluctance to mention the stated purpose of the military suggest that the reservoir of virtues on which voluntary service depends is running dry. A nation that is habitually excoriated by its political and intellectual leaders for such things as “perpetuating the unbearable human cost of systemic racism,” demonstrating “arrogance” in foreign affairs, wrongfully occupying ancestral land, and failing to address “the destructive nature of a system that is fueled by uncontrolled greed” (capitalism), while hesitating to celebrate its contributions to humanity, will inevitably struggle to rouse a genuine sense of duty from its youth.

To succeed at drawing in recruits, the military doesn’t necessarily need to make the oath’s normative content its central message, as it once did. The Marine Corps’s focus on the foundational human need for belonging is effective because it describes the transcendent value of character and self-sacrifice and carefully avoids the true yet controversial claim that America and its principles are their worthy beneficiaries.

Nonetheless, it’s alarming that the military must resort to generic appeals to meaning and belonging in order to recruit successfully. America’s national security is contingent on an all-volunteer force. While some may find the Marine Corps’s call to belong compelling, this draw will not sustain the personnel needs of the entire armed services; after all, all kinds of careers, activities, and communities can give people the same sense of fulfillment. Recruiters rely on Americans who believe in the moral preeminence of the United States and its constitutional principles. Our institutions—especially schools—must instill and embrace these truths: America’s national security cannot wait. 

Art, Beauty, and the Soul of the University

Valparaiso University, a Lutheran college in Indiana, recently announced plans to sell three pieces of art that are described as the “cornerstone” of the University museum’s collection. The sale—to include paintings by Georgia O’Keeffe, Childe Hassam, and Frederic Edwin Church—is planned to fund renovations to the freshman dorms, the condition of which, according to Valparaiso’s president José Padilla, presents an “impediment to student recruitment and retention.” Some may laud this plan as a shrewd business decision, but divvying up university resources and selling them piecemeal presents a far greater threat to the University than outdated dorms.

Like many private educational institutions, Valparaiso University is facing hardships. The campus census peaked in 2015 with 4,544 students. By Fall 2022 that number had fallen to 2,939, which has led to a decline in tuition revenue. There’s no question that the university—as well as many others like it—is in dire need of students and the tuition dollars that they represent. So it is understandable that the board and the president of Valparaiso University see this sale as a potential lifeline for the University. But University stakeholders should pursue every avenue to halt the planned sale, because the sale would undermine what it means to be a university.

Valparaiso’s plans to sell its art should interest anyone who cares about the direction in which higher education is heading: the debate that has ensued is emblematic of core disagreements about the purpose of higher education, and what role (if any) art and other “impractical and frivolous” pursuits play in a university. Beauty should occupy a privileged place in any educational scheme, but most especially at a Christian institution like Valparaiso.

Unfortunately, Valparaiso’s leadership holds a very different position. In defense of his decision to sell the University’s works of art, President Padilla has asserted that the Brauer Museum of Art is not a part of the “core mission” of the University, which he identifies as educating students. This conclusion requires a very narrow understanding of what education is.

Beauty should occupy a privileged place in any educational scheme, but most especially at a Christian institution like Valparaiso.

 

Universities should be repositories of culture and our shared civilizational heritage. They should be centers of intellectual inquiry and creativity. Universities provide space for scholarly reflection by professional academics. As such, they need artifacts and primary sources, which are the fruit of past generations’ intellectual inquiry and creativity. Similarly, universities need libraries, chapels, art collections, theaters, and concert halls just as much as they need classrooms, chalkboards, chemistry labs, and, yes, dorms. Universities should seek to be what their name implies—universal communities of teachers and scholars that pursue knowledge for the sake of preserving it in the minds of future generations.

In the weeks following the withdrawal of U.S. troops from Afghanistan in 2021, Harvard University professor emerita Ruth Wisse penned an opinion essay in the Wall Street Journal regarding the decades-long absence of an ROTC program on Harvard’s campus. While Harvard’s was the first ROTC program in the nation that prepared college students for military careers, it came under fire from a few within the university community in the Vietnam War era and was effectively banned from campus. Prof. Wisse argues that the message that this has sent to students is that “a flawed America [isn’t] worth defending.” Similarly, carving up a university’s cultural artifacts for the auction block may send an analogous message—that American culture is not even worth preserving.

Art Is Intrinsically Interdisciplinary

University communities tend to segregate along disciplinary boundaries. The divisions that house the humanities—history, languages, and philosophy—usually have little to do with the divisions that house the scientific disciplines of biology, chemistry, and engineering. There are good reasons for this separation, but a community of teachers and scholars that shares universal truths needs intentional bridges between the organizational divisions. This need for cross-pollination stems from the fact that different fields of inquiry aim to reach the same ultimate places, albeit by different means and methods. It is important for us to know why and how the Roman Empire collapsed in ways that only a historian can describe, but it is also important for us to understand how plants process materials found in their surroundings to make food in ways that can only be described by biologists and chemists. The knowledge of each of these seemingly unrelated things has an aesthetic quality that anyone can appreciate.

Sir Roger Scruton observes that “beauty is an ultimate value—something that we pursue for its own sake, and for the pursuit of which no further reason need be given. Beauty should therefore be compared to truth and goodness, one member of a trio of ultimate values which justify our rational inclinations.” The pursuit of beauty—even particular and unrelated types of beauty—is not the work of artists and poets alone. It is the work of economists, physicists, and mathematicians, too.

Art is a convening point for many different avenues of pursuing beauty. It is the bridge between chemistry and history and between theology and engineering. Beauty is something every specialist cares about, even in the fields that seem most technical. The quantum physicist’s immediate goal might be to understand how observation affects an electron’s motion. But why is she pursuing this knowledge? Perhaps because this information will aid in developing new technologies; but more fundamentally, there’s something beautiful and arresting about the peculiar ways the smallest units of matter behave.

The pursuit of beauty—even particular and unrelated types of beauty—is not the work of artists and poets alone. It is the work of economists, physicists, and mathematicians, too.

 

The fine arts are traditionally the fields most directly devoted to the pursuit of beauty, which is why it’s so important for universities to be repositories of art. Indeed, traditional art captures exactly why the yearning for beauty is so fundamentally human. Many works are an artist’s quest to understand more fully an aspect of creation by recreating an image of it. To complete the poignant Memorial to Robert Gould Shaw and the Massachusetts Fifty-Fourth Regiment, sculptor Augustus Saint-Gaudens spent years studying the details of faces of his subjects, making cast after cast after cast until their likenesses were accurately represented. He also had to understand metallurgy and the limits of casting technology in order to design a fitting tribute that was feasible to bring to fruition.

Human minds have, in Hume’s words, “a great propensity to spread [themselves] on external objects.” The art that is created in different times and places helps us to understand the minds of those who believed that the earth was flat, who feared the beasts of the bestiary, who had yet to be convinced that germs cause disease, and those wrestling with the horrors of war and natural disaster. It is an instructive thing to have many different types and examples of external objects—paintings, particle accelerators, and reference books—at the disposal of universal communities of teachers and scholars. Studying what past cultures and eras considered beautiful isn’t just a historical curiosity—because beauty is a universal and ultimate value, it also can inform and guide our own understanding of beauty. Again, this points to the importance of universities’ being home to fine art: art instructs us in the traditions we inherit, and tutors our own aesthetic sensibility—an indispensable aspect of learning.

The Heightened Responsibilities of a Christian Institution

“Nobody who is alert to beauty,” writes Scruton, “is without the concept of redemption—of a final transcendence of mortal disorder into a ‘kingdom of ends.’ In an age of declining faith, art bears enduring witness to the spiritual hunger and immortal longings of our species. Hence aesthetic education matters more today than at any previous period in history.” Christian universities such as Valparaiso are by definition committed to a worldview that is grounded in a story (the story, they would say) of redemption; that is, a story framing who human beings are, how we are meant to flourish, and how today we can move toward that flourishing—and away from the current toxic cultural undertows of hedonism and nihilism that threaten our very being. These institutions have a responsibility not only to the creative arts, but also to the creativity and innovation inherent in the good things produced in all disciplines. Aesthetic education in the sciences and the humanities, as well as in the arts, is an urgent need that Christian universities are uniquely situated to meet.

In addition to being universal and interdisciplinary, art is also transcendent—or at least represents the transcendent. As God’s image-bearers, human beings share in God’s creative capacity; and this is not limited to our interactions with the material world. Human beings are moral agents capable of knowing and doing good and evil, and who are responsible for acting according to this knowledge. The best art is a human endeavor to represent the transcendent values of truth, goodness, and beauty in media accessible to the senses of others. It bears witness to the creator of the art and the Creator of the artist.

In short, to trade the artifacts of transcendent gifts for the immanence of capital improvement is to sell our human birthright for a mess of pottage. Or, to use the language of the boardroom and the annual report, to divest a Christian university community of its artifacts of human creativity is to betray the heart of that community’s mission.

The best art is a human endeavor to represent the transcendent values of truth, goodness, and beauty in media accessible to the senses of others. It bears witness to the creator of the art and the Creator of the artist.

 

Pragmatic Considerations of Leadership

Not all universities own art, but the ones that do are placed in a position of special trust. A university’s holdings were purchased through the generosity of benefactors who saw the importance of keeping the art in the context of a Christian university. These benefactors thought that a collection of art was an important aspect of the identity of a university; they could have given their wealth to any number of institutions or endeavors, but they selected a particular endeavor at a particular institution. Valparaiso’s art was never just a vehicle for long-term financial investment; it is an investment in the mission of a university. Carving out an essential part of the university’s educational resources in service of more prosaic, pragmatic ends not only dishonors the sacrifice of previous donors; it undermines the university’s credibility in the eyes of prospective ones.

Further, one can only assume that the present collection of the Brauer Museum of Art was built by experts around these earlier acquisitions. Moreover, faculty and staff in the areas most directly affected by the collection have made career decisions based on proximity to the collection. Carving up the collection for the auction house disrespects the careful curation done by previous and current faculty and staff. The decision to auction off these works undermines the authority and expertise of those given responsibility to steward this particular area of the University’s resources and programs.

These challenging times require creative needle-threading that seeks to reshape the revenue structures, recruiting methods, and other aspects of university operations without altering or betraying the core identity of the school.

 

And finally, the works of art that Valparaiso’s board and president intend to sell are irreplaceable. If recruitment and retention stabilize once the freshman dorms are updated, and revenue climbs to a level that enables the university to invest further in the Brauer Museum, these works will not be on the market. They are never again likely to be the property of Valparaiso University. In fact, the sale of these works will very probably mark a turning point in the history of the University: a turn toward greater and greater decline and instability. Many institutions that ultimately close make these kinds of shortsighted decisions, which send signals to donors and prospective students that ultimately seal the institution’s fate. This sale could easily become one such future-altering decision.

In a time when demographic realities foretell stark economic decline for higher education, responsible boards and administrators must make difficult decisions. These decisions cannot be informed by a nostalgia that will handicap the forward progress of the institution. But neither can they focus shortsightedly on the aspects of university mission and business that are most easily quantifiable. These decisions must be reasoned, principled, and made in service to the mission of the institution and with deference to the good will of past, present, and future stakeholders. These challenging times require creative needle-threading that seeks to reshape the revenue structures, recruiting methods, and other aspects of university operations without altering or betraying the core identity of the school. It is hard to see how raiding the Brauer Museum will do more than provide a temporary patch on a much larger problem. The cost will be inestimable: a loss of unique resources, a loss of missional commitment, and a loss of reputation.

On Abolishing the Death Penalty: A Response to Daryl Charles

 . . . [Divine] Mercy is found even in the damnation of the condemned, for, while not completely loosening the punishment, It nonetheless lessens it short of what is entirely deserved. (Summa Theologiae Ia.21.4 ad 1).

I would like to address J. Daryl Charles’s argument, published here at Public Discourse yesterday, that the death penalty is a mandatory punishment for premeditated murder, necessary to achieve justice, and necessary to respect the image of God in the offender by holding him responsible for his acts. I cannot address everything that Charles argues in his essay. I will do only three things. First, I will argue that Charles has not successfully pressed, and cannot successfully press, the case that the use of the death penalty is mandatory in the exercise of punitive justice. Second, I will argue that it should be abolished in the United States, against the background of Thomas Aquinas’s argument (which Charles himself cites) that taking a criminal’s life is lawful in order to protect the common good. Third, I will conclude by reflecting on the implications of the image of God within us for justice and mercy.

The Early Church and Capital Punishment

On the first point, consider only the early Church figures that Charles cites. By and large, they do indeed recognize the legitimate authority of a political community to take the life of an offender. However, they also recognize something that Charles does not: it is within the state’s authority to refrain from this punishment and to instead extend mercy. (In this section, I summarize, paraphrase, and later quote the excellent article by Phillip M. Thompson, “Augustine and the Death Penalty,” in Augustinian Studies, January 2009, pp. 188–203.)

For example, Lactantius in one work forbids anyone in authority charged with the administration of justice from bringing a capital charge against anyone, but in another acknowledges the state’s authority to put someone to death. Tertullian recognizes the authority of the state to impose death, and yet forbids any Christian from doing so. So also, the Christian author Athenagoras, whom Charles does not cite, forbids Christians from participating in it. This is not yet a point about mercy. However, it suggests a certain abhorrence on the part of the early Church for the death penalty as inconsistent with the life of a Christian. The secular or pagan state may be permitted to impose death as a punishment, but these authors suggest Christians ought to play no part in exercising that power.

The early Church figures recognized something that Charles does not: it is within the state’s authority to refrain from capital punishment and to instead extend mercy.

 

Augustine, however, is particularly important when it comes to mercy. He recognizes the authority of the state to impose death as a penalty, particularly to protect the common good from a threat to its safety. And he does not forbid Christians from participating in it, as others had. But he also pleads for mercy on the part of governing authority. In one case, pleading for mercy, he writes, “[W]e do not in any way approve the faults which we wish to see corrected, nor do we wish wrong-doing to go unpunished because we take pleasure in it. … [However,] it is easy and simple to hate evil men because they are evil, but uncommon and dutiful to love them because they are men.” Even if one does not agree with Augustine that one should love the offender because he is a man, as it seems Christ commanded us to do, Augustine gives evidence of the recognition within the Christian community that justice is not the only task of the state: mercy is as much within its authority as is the power to execute the offender.

However, the importance of mercy amid justice is no sectarian Christian virtue. The responsibility of governing authority to show mercy is a fact recognized by Seneca, no Christian, in his letter to Nero, De Clementia, where he argues that mercy in a ruler is essential to governing. As a Stoic, however, his argument for mercy is significantly different from Augustine’s, focusing not on loving the offender as a fellow human being, but on the need to rein in both leaders’ and society’s passions of cruelty and savagery, passions that often accompany the just desire to punish. He goes on to argue that the power of the emperor to extend mercy is even greater and more manifest than is his power to condemn, “for anyone can take a life, but few can give it.” That power in a ruler is in fact godlike, according to Seneca. Aquinas agrees when he writes that among all of God’s attributes, it is in mercy that God’s omnipotence is most clearly shown. (“Unde et misereri ponitur proprium Deo, et in hoc maxime dicitur eius omnipotentia manifestari” ST IIaIIae 30.4.)

Just as one would be hard pressed to find a culture with a governing authority, biblical or otherwise, that had not at some point asserted and exercised the right to put capital offenders to death for heinous crimes, one would be equally hard pressed to find one that did not claim the authority to exercise mercy and punish short of death. Good government in the administration of criminal punishment will establish a range of possible punishments for a crime, acknowledging the need for both mercy and equity in judging which punishment is best in the circumstances. Again, this is a point recognized by the pagan Seneca, who argued that mercy does not come after the judgment of just punishment, to limit justice as it were, but enters into the determination of what justice is in a particular case.

Aquinas on Capital Punishment

Aquinas argues that this governing authority to establish the character of punishment and its application to cases is rooted in the natural law. But in the end, positive human law determines the actual force and scope of punishment. Any such “determination” of the actual punishment appropriate for a crime has only the force of human law, not the force of the natural law itself (ST IaIIae 95.2). We decide how crimes will be punished as a matter of human positive law, not by deriving them from natural law. This determination is part of our dignity as images of God: we participate in divine providence by being provident over ourselves. We use our reason both to recognize the natural law within us and to establish human law over diverse political communities and common goods (ST IaIIae 91).

In addition, like Seneca before him, Aquinas recognizes that it is also the task of judicial authority to exercise equity (epikeia) when determining punishment under human law. The judicial authority does this by taking into account circumstances not anticipated by the legislature when it crafted the law. In other words, judicial authority sets aside “the letter of the law” when applying it would sin against the common good (ST IIaIIae 120.1).

If we acknowledge Aquinas as an authority on these matters, as Charles seems to do in citing him, these points make it clear that it is well within the governing authority of a community to refrain from the use of the death penalty to punish crime. Political leaders can even refrain from legislating that capital punishment will be among the range of punishment for serious crimes, including premeditated murder. This decision ultimately requires reflection about how to preserve and promote the common good of a particular community in a particular place and time.

Political leaders can even refrain from legislating that capital punishment will be among the range of punishment for serious crimes, including premeditated murder.

 

A Case for Abolition

Now I would like to argue that the death penalty should be abolished, at least in the United States and many other nations as well. Of course, Aquinas’s argument about the lawfulness of a community’s taking the life of an offender is often cited by proponents of the death penalty in the way Charles cites it—as if permissibility requires the exercise of the death penalty in certain cases, that is, makes it “mandatory.” However, Aquinas’s argument is merely that it is lawful to take the life of an offender to protect the common good from threat. He does not come anywhere close to arguing that it is required or mandatory for certain crimes.

It is very important to notice that Aquinas’s argument is not based on principles of restitution, that is, on the need to redress the harm done to the one who has been wronged. Nothing can be done to redress the wrong done to the one who has been killed. Nothing can undo that, no recompense given, no restitution made. This is a point that Charles himself recognizes in the case of murder. In addition, though it might sound harsh, it is not the task of governing authority to criminally punish offenders in order to assuage the anguish of the family and friends of the murdered. Those left behind or affected by a murder may derive a certain amount of psychological catharsis from seeing the responsible parties punished, but it is probably not lasting. What’s more, in terms of restitution, nothing can undo what they have lost. That is in part the horror of the crimes done—that nothing can be done to restore to the one abused or to family and friends what has been taken from them. At best, punishment can be merely symbolic with respect to restitution in these cases.

Criminal law punishes not on behalf of the individuals who have been harmed by a crime, but on behalf of society’s common good, which has been harmed by a crime. It punishes to restore the order and peace of society that has been disturbed by the crime, and to further protect it from threat. But even here in the case of capital crimes, society cannot have restored to it what the crime has taken; society cannot have restored to it the human being killed. No punishment will return the order of tranquility and innocence lost to a society by the abuse of children, or to women and men by various forms of violence. However, a certain amount of peace and tranquility can be restored in protecting society from further threat of such things.

Close attention to Aquinas’s argument suggests that a malefactor loses, in a way proportionate to the gravity of the crime, the protections afforded by society to the innocent and thus becomes subject to the punishment of the law. This is its retributive aspect, that formally the gravity of the punishment is directed to the will of the offender in a way proportionate to the gravity of the crime willed by the offender. But, as we’ve seen above, the idea of “proportion” here is not a conclusion drawn from the natural law, but a “determination” of human law in light of the common good that punishment serves to protect. The offender becomes subject to the loss of property or the loss of freedom of movement, for example. Also, in some cases he becomes subject to the loss of life.

Aquinas’s argument is not that killing an offender is always lawful, much less that it is mandatory. It is lawful upon a condition, namely, that it is necessary to protect the common good from a threat.

 

What Aquinas does not argue, however, is that having become subject to punishment, even possibly the loss of life, a criminal’s loss of life is required or mandatory. It is not the so-called lex talionis—an eye for an eye, a tooth for a tooth, and a life for a life—that allows for the lawfulness of killing another human being. The lawfulness of ending an offender’s life is based on the need to protect society from the threat by the one who has lost the protection of society. So, one cannot claim that according to Aquinas the killing of a human being is lawful in order to pursue retribution or restitution for the wrong done. It is lawful to protect from harm, which is forward-looking, not backward-looking.

Thus, Aquinas’s argument is not that killing an offender is always lawful, much less that it is mandatory. It is lawful upon a condition, namely, that it is necessary to protect the common good from a threat. Absent that condition, Aquinas does not argue or even suggest that killing a malefactor, including one who has committed murder, is lawful. Indeed, in his discussion of clementia as mercy extended to those who are subject to punishment (ST IIaIIae 157.3 ad 1), he suggests that the desire to harm through punishment should be avoided and mitigated, even when someone deserves punishment. He also says that it is better when the one doing the punishing decides the wrongdoer has had enough, rather than pursue the full extent of punishment possible.

One can ask under what conditions governing powers should exercise mercy. The exercise of mercy ought also to take into account the common good. Aquinas’s argument that it is lawful to take the life of a malefactor to protect society from threat also suggests that mercy is legitimately exercised when no threat to the common good is posed by the offender. This would be the case when, for example, the offender can be rendered harmless to the common good by means other than killing. This point concerning the death penalty is made explicitly by Pope St. John Paul II in his encyclical Evangelium Vitae.

In the thirteenth century when Aquinas made his argument, it might not have been possible in general to render a murderous malefactor harmless short of killing him. However, it is now possible in the United States and many other modern nations to protect the common good from the threat of those who have committed murder and other heinous crimes. That being the case, we have no reason to think that it is lawful for those nations to kill human beings who in other times and places might pose a threat to the common good. Given the imperative to act with mercy as much as with justice, the death penalty ought thus to be abolished in the United States and other such countries.

It is now possible in the United States and many other modern nations to protect the common good from the threat of those who have committed murder and other heinous crimes.

 

Mercy and Justice

To conclude, Charles makes much of the idea that the image of God in human beings is not taken seriously if we do not hold offenders responsible for their crimes, particularly those who have killed with premeditation. I agree with that proposition. What Charles and other defenders of the death penalty do not take seriously enough is the thought that being made to the image and likeness of God is not a static fact of human nature, but a responsibility. Nothing can erase from the nature of a human being the image created within him or her by God, no sin or crime however heinous. We acknowledge that fact when we hold our fellow human beings and ourselves responsible for our actions. However, being made to the image of God is a vocation, that is, a call to being godlike in all that we do. So, those of us who seek justice in the punishment of others must ask ourselves what our godlike responsibility is in the circumstances in which we live.

God is not simply a God of justice, but also a God of mercy. Mercy and justice are not set against one another. As Aquinas argues, they are manifest in every act of God (ST Ia 21.4). What is more, divine justice is founded upon divine mercy. Even the pagan Seneca recognized that mercy is as godlike as justice. In addition, mercy is not a poor second cousin to justice. Again, Aquinas argues that among human virtues, mercy is the greatest of all, greater even than justice (ST IIaIIae 30.4). Mercy does not come in after justice to limit it. Mercy informs justice. Indeed, if we take Aquinas seriously, justice strives after mercy as its goal. “It is clear that mercy does not take away justice, but is, in a certain way, the plenitude of it” (Ia 21.3 ad 2). Justice must always, then, be informed by mercy. After all, with the responsibility to live up to the image of God within us, it is worth pondering the fact that not even the damned in Hell are punished by God as much as justice alone might allow. How much more, then, should punishment be loosened for those among us who have not yet been damned?

Capital Crimes and Capital Punishment

Today’s essay by Daryl Charles makes the case for capital punishment for those guilty of premeditated murder. Tomorrow’s essay will be a reply by John O’Callaghan, who argues for the abolition of the death penalty.

Our culture is morally confused about many things. Conservative Christians tend to focus on concerns about hot-button issues like abortion, sexuality, relativism, and education. But one phenomenon that is overlooked, though no less urgent, is murder. A mere listing of mass killings occurring in the United States in recent years boggles the mind. The year 2022 alone, by all counts, surely must be record-setting. As I write (early October of 2022), one reads that a fifteen-year-old in Raleigh, North Carolina, goes on a shooting spree, killing five (including his brother and an off-duty police officer) as he walks through a neighborhood; that five people have been shot in a northern South Carolina home; that in Bristol, Connecticut, two police officers are fatally shot and a third wounded in responding to a domestic violence call (which is said to have been a ploy to lure the officers); and that a thirty-two-year-old is charged with two counts of murder and six of attempted murder following a stabbing melee on the Las Vegas Strip. In the same week we read that four bodies have been found in the Oklahoma River. And this just in: a Florida jury recommends a life sentence without parole instead of the death penalty for the man who murdered seventeen people at Marjory Stoneman Douglas High School in 2018.

These randomly selected tragedies, of course, represent only the tip of the iceberg. Most of our cities are currently reeling from the sheer frequency of killings that have descended on our streets, our neighborhoods, our families, and even our schools. In June, July, August, and September of 2022, there were 73, 100, 71, and 70 mass shootings respectively, with 297, 439, 262, and 272 people injured or dead. Tucked away in these numbers are the horrendous tragedies and suffering that have visited communities such as Orlando, Uvalde, and Buffalo. According to the Marshall Project, more mass shootings (i.e., of four or more victims) have occurred in the last five years than in any other half decade since 1966.

On the rare occasions when the death penalty is discussed today, it is almost always viewed unfavorably.

 

The barbarism behind increasing murder rates suggests that debates over capital punishment should intensify. But this has not been the case. In fact, what is striking is the relative absence of discussion and debate about the death penalty. On the rare occasions when the death penalty is discussed today, it is almost always viewed unfavorably. The abolitionist argument takes any number of forms: the mental health of the murderer; the possibility of prematurely ending a criminal’s rehabilitation; debates over deterrence; the misconstruction of retribution (i.e., retributivism) as revenge; the fallibility of the criminal justice system; and modern notions of “civility.” Religiously motivated abolitionists sometimes point to the annulment of the Mosaic code; an assumed ethical discontinuity between the Old and New Testaments; and Christ’s teaching on forgiveness.

We have grown intolerant of meting out punishment that is perceived as “cruel” or “barbaric.” Strangely, however, our abhorrence of penal “barbarity” is displayed against the backdrop of increasingly barbaric criminal acts themselves. Indeed, until very recently, capital punishment was almost universally affirmed biblically, morally, and legally.

Christian History and Capital Punishment

Even though abolitionist arguments have been dominant in recent decades, it is worth recalling that for most of history, a variety of civilizations have used the death penalty and grounded it in serious moral reflection. Capital punishment was practiced in the earliest recorded history. Its prescription appears in the mid-eighteenth-century-BC Code of Hammurabi, in sixteenth-century-BC Assyrian codes, in fifteenth-century-BC Hittite codes, in the thirteenth-century-BC Mosaic Code, as well as in ancient Greek and Roman law.

Among the church fathers, one finds varying perspectives on the death penalty, although there is a general recognition of the state’s responsibility to implement capital justice. Tertullian (late second century) and Lactantius (late third century) affirmed that in the case of murder divine law consistently required a life for a life. Both Augustine (late fourth and early fifth century) and Theodosius II (mid-fifth century) acknowledged the state’s role in mediating capital sanctions. In addition, various councils from the seventh century (the Eleventh Council of Toledo) to the thirteenth (the Fourth Lateran Council) followed the lead of Leo the Great (fifth century) in forbidding clerics from engagement in matters of capital justice, even as they understood the state’s legitimate role in facilitating such matters.

It is worth recalling that for most of history, a variety of civilizations have used the death penalty and grounded it in serious moral reflection.

 

In the Summa Contra Gentiles, Aquinas insisted that the community had both the right and the duty to “cut away” an individual in order “to safeguard the common good.” The common good, he reasoned, is “better than the good of the individual,” notably because “certain pestilent fellows” serve as “a hindrance to the common good, that is, to the concord of human society.” Such persons, he concluded, therefore “are to be withdrawn by death from the society of men.”

Late medieval and Reformation-era theologians also affirmed the state’s duty before God to impose capital sanctions upon murderers. Even the so-called “left wing” of the Protestant reform movement—from which much modern religious opposition to capital punishment is thought to derive—recognized the death penalty. The Schleitheim Confession of 1527, an exemplary document adopted by the Swiss Brethren (the progenitors of earliest Anabaptism), reads: “The sword is an ordinance of God. … Princes and Rulers are ordained for the punishment of evildoers and putting them to death.” This Anabaptist declaration concurs with the 1580 Lutheran Formula of Concord, which prescribes for “wild and intractable men” a commensurate “external punishment.” In summary, the patristic, medieval, and late medieval periods generally mirror the church’s tacit acknowledgment of capital punishment in cases of murder.

There has been much debate about the place of capital punishment in Catholic social thought, but the pre-2018 version of the Catechism of the Catholic Church articulates the most sound position on the topic:

Preserving the common good of society requires rendering the aggressor unable to inflict harm. For this reason the traditional teaching of the Church has acknowledged as well-founded the right and duty of legitimate public authority to punish malefactors by means of penalties commensurate with the gravity of the crime, not excluding, in cases of extreme gravity, the death penalty. (no. 2266, emphasis added)

The Church’s teaching on the death penalty historically finds expression in Thomas Aquinas’s question “Is it legitimate to kill sinners?” (S.T. II-II, Q. 64.2). His response is predicated on “the common good,” with an analogy. If the well-being of the whole physical body requires the amputation of a limb, the treatment is to be commended. “Therefore if any man is dangerous to the community and is subverting it by some sin, the treatment to be commended is his execution in order to preserve the common good, for a little leaven sours the whole lump.” What is forbidden, Aquinas argues later, is to take the life of an innocent person (Q. 64.6).

Theological Arguments

It is no surprise that for most of Christian history, the death penalty was widely embraced. To understand the theological basis for capital punishment, we must first look to the nature of law. Fundamental to the biblical story is the depiction of the Creator as Lawgiver. Law, as the reader of the biblical narrative discovers, is of no human origin, nor does it issue out of pragmatic or popular consent. Its authority issues not out of cultural or intellectual enlightenment but from the universal created order. Whether individual cultures and regimes recognize this reality is, of course, another matter. Yet moral law operates like the law of gravity: disregarding it always and everywhere results in the rotting of social culture.

In a remarkable—though little remembered—series of homilies in 1976 (eventually published under the title Sign of Contradiction), Karol Wojtyła, at the time Archbishop of Cracow, delivered a sustained reflection on the question of man’s purification from sin in the present life. Guilt incurred by sin, he observed, constitutes a debt in the present order that must be paid. Punitive dealings, thus, provide necessary atonement and restore the balance of justice and moral order that has been disturbed. In theological terms, they prepare the human being for a destiny in eternity.

Punitive dealings provide necessary atonement and restore the balance of justice and moral order that has been disturbed.

 

In Sign of Contradiction, we find the necessary response to religious abolitionists who ground their bias against capital punishment in a particular (mis)reading of Jesus’ teaching in the Sermon on the Mount and a purported “forgiveness” ethic. Pitting a misinterpreted Jesus against Paul, they ignore ethical continuity in the broader sweep of biblical revelation and prefer, based on the flaws of human government, to dismiss its divinely instituted commission to punish evil and reward good.

This defect in religious thinking—“the cross is not a sword”—demonstrates conspicuous disregard for what is recorded in the text of Genesis 9, where God makes his covenant with Noah after the flood, the implications of which are stated in universal terms. It is precisely because the human being is fashioned in the image of God that one who purposely sheds the blood of another must die. Genesis 9:5–6 is to be understood foremost as an institution that protects life, in accordance with the Decalogue’s pronouncement “Thou shalt not murder.” The rationale for this moral imperative is none other than the safeguarding of human life. Retribution discourages violations of the sanctity of the human creature. (Here let us pause for a moment: any society that advances an “inversion ethic” of killing human beings in utero while refusing to take the life of a convicted murderer should strike us as barbaric.) The Genesis narrative suggests that premeditated murder is an assault on human life and is comparable, as it were, to an assault on the very being of God. The fact that justice demands punishment befitting the crime—i.e., just retribution—reveals the essential difference in character between revenge and retribution.

Because human beings are created in the imago Dei, human life is a sacred trust. The implication is clear: murder—i.e., the deliberate extermination of a life by another human being—is tantamount to killing God in effigy. Moreover, as confirmed in Mosaic legislation, premeditated murder is the one crime for which there exists no restitution. As C. S. Lewis noted in his wonderfully prescient essay, “The Humanitarian Theory of Punishment,” to be punished, however severely, because we in fact deserve it is to be treated as human beings created in the very image of God.

Justice, Charity, and Restitution

When a murder occurs in our community, we are obliged, regardless of our comfort level, to clear our throats, as it were, and make a public, communal declaration. The deliberate extermination of another human being is an absolute evil and therefore absolutely intolerable. Period. This was the argument advanced by David Gelernter in a 1998 Commentary essay. Gelernter, a professor of computer science at Yale, nearly lost his life when he was letter-bombed in June of 1993. The essence of Gelernter’s argument was that we execute murderers in order to make a communal statement: deliberate murder embodies evil so terrible that it defiles the community, and as a result the community needs the catharsis that occurs through criminal justice properly construed. Gelernter correctly noted, however, that in the face of murder, contemporary society is more prone to shrug it off than to exact just retribution and affirm binding moral standards.

In the case of premeditated murder, compensation is not available. Hence, as much of human history attests and as the biblical witness affirms, it is the one crime that carries a mandatory death sentence.

 

Justice has classically been defined as rendering to each person what is due. Desert, then, is foundational to ethics; it is a part of the human moral intuition. Parents know this, children know it, and indeed people from Kansas to Canada to Kenya to Kazakhstan know it as well. Retributive justice, then, is a social good, for it corrects social imbalances and perversions. Herein we find the wedding of justice and charity. When through social attitudes and the rule of law we affirm retributive justice, we are addressing several levels of “imbalance” and disturbance: we are attentive to (1) the victimized party who has been wronged, (2) society at large, which has been offended and is watching, and (3) future offenders who might be tempted to do evil. This threefold application of justice, as Augustine and Aquinas understood it, was in fact an application of charity. And as an extreme example, both saw going to war as an act that can express the wedding of charity and justice properly construed: it is charitable to prevent the criminal (whether domestically or in war) from doing evil, and it is charitable toward society, which needs protecting.

Punishment for a crime and restitution for the victim are interrelated concepts. In the case of premeditated murder, compensation is not available. Hence, as much of human history attests and as the biblical witness affirms, it is the one crime that carries a mandatory death sentence. To suggest or argue that the ultimate human crime should not be met with the ultimate punishment being meted out by civil authorities, at least in a relatively free society, is not some “higher” ethic as some might contend; nor is it “Christian” by any stretch of the imagination. Rather, it is a moral travesty because it fails to comprehend the nature and meaning of the imago Dei.

Civilized societies do not tolerate the murder of innocent human beings; uncivilized regimes do.

Black Is Beautiful

Georges Bernanos, the great French writer, had a particular distaste for the cult of optimism, especially its American variety. The idea that things somehow, naturally, turn out for the better struck him as laughably deluded, a form of “whistling past the graveyard.” In the real world, things often don’t turn out for the better. They get worse. And the graveyard can’t be ignored away, because death is real. We all sooner or later face it, and no amount of whistling will help us sneak past it. So how a culture deals with death speaks volumes about its mental health—and its understanding of who and what a human being is.

I turned 70 four years ago. Shortly thereafter I informed my family that I’ll be very displeased, despite being dead, if the priest at my funeral wears anything but black. They weren’t surprised. In the Catholic tradition, liturgical vestments have a catechetical role. They give meaning to the moment. Green is for Ordinary Time and the virtue of hope. Red is for Pentecost, the fire of the Holy Spirit, and the blood of martyrs. White, and occasionally gold, reflect the glory of the Christmas and Easter seasons, solemnities, and the great feasts. Purple is penitential for Lent and anticipative for Advent, and the color rose—hinting at the joy to come—is used on Advent’s Gaudete and Lent’s Laetare Sundays.

This “curriculum of color” in Catholic worship, like the annual progress of the Church calendar, has always made deep sense to me, mirroring the course of life itself. What doesn’t make sense is the absence of black today at exactly the emotional and religious event that most profoundly needs it. Since Vatican II, most Catholic funerals have been celebrated with white vestments, or the alternative, purple. Black vestments are still a legitimate option, but—at least in the United States, where optimists are (or until recently, were) the dominant sect—they’re rarely used. Many parishes no longer have them.

This is instructive. The reasoning for black’s exile goes something like this: white encourages trust in God’s love and mercy for the deceased. Purple communicates grief, but not fear. Black, on the other hand, is a downer: a hammer that drives home the nail of loss. And yet, ironically, this also makes it the sanest and most powerful of funeral colors. It acknowledges the pain of mourning as necessary to healing, and it implicitly reminds us that an immediate, happy eternity is in nobody’s automatic “win” column. If white implies a freeway diamond lane past any afterlife unpleasantness, black reminds us that many of us will be stuck in traffic. Some of us for a long time—or worse. Which suggests two important facts: that the deceased needs our prayers, and that our own lives, and their content, will be judged in due course by a God who is not only merciful, but also just.

Most Catholics and other Christians will understand the argument I’ve just made, even if they disagree with the particulars. Death is a universal fact of life, but as Charles Chaput, borrowing from the philosopher Hans Jonas, noted some years ago, man is the only creature, among all living things, that knows he must die: “No other species buries or remembers its dead as we do. The grave is a uniquely human fact. It reminds us that we’re not like other animals. Thus, repudiating the grave implicitly denies our distinctive humanity by denying one of its most important markers.”

Death has emotional “weight” because every human life is woven into a fabric of other conscious, intentional lives and loves. The loss of the individual matters to those others in a singular way. Death also has a sacred aura because, for the religious believer, each human life is unique and unrepeatable, with a God-given dignity. And the human body, especially for Christians, is not an irrelevant husk of the departed soul. It’s an essential element of his or her personhood, destined to one day rise again. Funeral liturgies are thus as much for the living as they are for the dead. They remind us of the type of creature we are; that life has purpose; and that the grave is not the end. As Jonas wrote, “metaphysics arises from graves.” Or to put it another way, the grave anchors us to reality: the reality of loss in this world, and the reality—or at least the promise—of unending life in the next.

Death also has a sacred aura because, for the religious believer, each human life is unique and unrepeatable, with a God-given dignity.

 

Death in a Secular Age

Now, having said all of the above, consider the following.

The country I grew up in no longer exists. The late Paul Johnson said that America was born Protestant. And while scores of millions of us still practice some form of a biblical faith, it’s more accurate to say that America was born from a marriage of Bible-based belief and Enlightenment thought. Mixed marriages can survive and thrive, and many do—but only if the partners are truly compatible, and stay that way. The original brand of American Christianity was rigorously Calvinist. As the philosopher George Parkin Grant argued in Technology and Empire, the nature of Calvinist theology “made it immensely open [to] empiricism and utilitarianism,” resulting in a religion of material progress marked by “practical optimism” and “discarded awe.” Translation: we’re a pragmatic people. We’re addicted to technological mastery. And our real faith—no matter how we label ourselves—is a practical agnosticism focused radically on this world. In effect, we’re becoming the most thoroughly materialist culture in history.

This has shaped American attitudes toward death in different ways. Serving the needs of the deceased, and surviving family members, with compassion and dignity is a noble vocation. For Christians, it’s one of the seven “corporal works of mercy.” But it can also be a lucrative line of work, because there’s no lack of customers. Jessica Mitford’s 1963 book, The American Way of Death, attacked the funeral industry of her day for its greed and phony piety. Terry Southern did the same with his screenplay for The Loved One, a fiercely satirical 1965 film based on the Evelyn Waugh novel.

But that was then. Feelings toward death today reflect the more secularized, irreligious nature of the country. High-end caskets have lost some of their luster. Cremation is common. The afterlife is often viewed as a fantasy, or a guaranteed Happy Place with furniture provided by a person’s private beliefs.

Some grieving survivors have taken to scattering the ashes of a deceased loved one in a place of special meaning. In an age when green is especially good, and environmentalism can verge on the cultic, “human composting” is increasingly available. After all, why not feed Mother Nature with the rich nutrients of your decomposing corpse? And it’s attractively priced: currently at $5,500 or less. But these low-tech approaches have an unsatisfying, romantic-theatrical feel to them. The real experts know, or think they know, that the best way to deal with death is to simply kill it.

The afterlife is often viewed as a fantasy, or a guaranteed Happy Place with furniture provided by a person’s private beliefs.

 

Denying Death

Futurists like Ray Kurzweil claim that we’re just a few scientific steps away from achieving immortality. The biotech company Calico Life Sciences focuses on extending human longevity and defeating age-related disease. Other research seeks to copy a person’s identity via memory chip, and then transfer it to new bodies as old ones wear out. The Cryonics Institute offers prompt cryogenic suspension and long-term storage for victims of terminal illness. For the low, low price of $28,000, a person can be frozen in the hope that, in the future, he or she can be revived and healed with new nano-technologies. Other companies offer similar cryo-options at price tags up to $200,000, with an economy version for the head only. And yet Leon Kass, the great Jewish bioethicist, has repeatedly asked the essential question: why would any sane person want to live another century, or five, or forever in this world, even under the best circumstances?

Survival is an end in itself only in the short term. An endless life of consuming and digesting successive experiences isn’t a “human” life at all. We have an instinctive hunger for higher meaning; for a shared purpose beyond ourselves. Without it, even cocooned in luxury, even with our senses anesthetized by noise and distractions, we end in despair. Yet this emptiness is exactly what American culture now breeds, which is why our rates of depression, mental illness, and drug use continue to climb. Even suicide can seem to make terrifying sense. Why stick around if this is all there is? Scattering ashes, human composting, radical longevity extension, cryogenic suspension: these all, in their own odd ways, involve a kind of self-delusion that masks unacknowledged, unhealable emptiness. They’re blind to what death at the end of a good life, well lived, actually is: the doorway to something greater.

An endless life of consuming and digesting successive experiences isn’t a “human” life at all. We have an instinctive hunger for higher meaning; for a shared purpose beyond ourselves.

 

In his letters, J. R. R. Tolkien wrote that the real theme of Lord of the Rings is death and immortality; that attempts to artificially prolong life in this world are a trick of the Evil One; that we’re made for another home; and that one of God’s great gifts to men “is mortality; freedom from the circles of the world.” Simply put: we’re more than clever primates with a gift for telling tall tales about some Sugar Candy Mountain in an afterlife. One of the tales is true.

I suppose what I’m saying here is simply this: all those we love, and we ourselves, will one day go through that great black door of death. We need to acknowledge that fact and not try to evade or soften it. Without God, life really is a tragedy, and our mourning is a meaningless biochemical reaction. But our story doesn’t end there. The door has another side: a side of light, with a waiting, loving God. That’s why I can write these thoughts and sleep very well tonight—although I do hope that if my pastor ever reads this, he’ll buy some black vestments. I’ll gladly kick in on the cost. Because black is beautiful.

An American Tragedy: On the Suicide of Liberal Education

The dust jacket of John Agresto’s new book, The Death of Learning: How American Education Has Failed Our Students and What to Do about It, depicts Gore Hall at Harvard in the 1870s. Perhaps this is a subtle indication of what lies within: Gore Hall, Harvard’s first proper library, was demolished in 1913. To be sure, it was replaced by the grand Widener Library, which is a treasure. But will it remain a treasure? In September 2020, the university announced in a press release that “Harvard Library has begun building an Anti-Racism team,” appointing its “first Anti-Black Racism Librarian/Archivist,” who “will work with colleagues across Harvard Library on objectives relating to centering anti-racism and diversity in our collections lifecycle.” Imagine being paid to string words together like this about a “collections lifecycle”—at Harvard.

But you don’t have to imagine: this is what has become of education in this country, not only at Harvard but pretty much everywhere. The situation, summed up in a sentence by Agresto: “There’s no way to view this as other than a tragedy.” Particularly insidious is that, unlike most examples of political attacks on education in the past, the recent “dismantling of the liberal arts comes from . . . within”:

It comes from radicalized departments of history, literature, classics, American studies, and all the myriad of other studies connected to ethnopolitical interest groups. It comes from virtually every school and college of education. This is why I have no hesitation in saying that liberal education in America is dying not by murder but by suicide.

Describing the self-destruction with grace, care, and regular doses of humor, Agresto does his best to imagine a better future. A superb writer who largely avoids inflammatory rhetoric, he is the rare gifted administrator who appears to have lost none of his humanity or sense of wonder when he entered the upper echelons of academic bureaucracy: as acting chair of the National Endowment for the Humanities in the mid-1980s, then as president of St. John’s College in Santa Fe from 1989 to 2000, and more recently as a founding trustee of the American University of Iraq, Sulaimani.

What are the liberal arts? This is Agresto’s pithy explanation, which he provides in highlighting italics: “a way of understanding the most important questions of human concern through reason and reflection.” He writes that “the liberal arts hold out the promise of freeing each of us from the captivity of prejudice, of platitudes and superstition, or of whatever it is that ‘everyone’ believes” and “aim at once to be truly radical and truly conservative,” demanding that individuals acquire a foundation in the wisdom of the past so that they can truly think for themselves.

Endangered Education

Unfortunately, he laments, “a rich and thoroughgoing liberal arts education [now] seems to me as endangered as the Sumatran orangutan.” As he goes on to note, when he first began to think about the book, nearly three decades ago, he drafted such sentences as “It’s clear that in the realm of education the words ‘liberal arts’ have always been words of high praise” and commented on how very rare it was to find a “faculty of liberal arts that does not think of itself as the crown jewel of the whole educational enterprise.” How things change.

Yet even in the 1980s, the bonfire of the humanities was well underway. Agresto knows this, of course. Indeed, he spends a good number of pages on the dismantling of Stanford’s Western Culture curriculum, which was last taught in 1987–88. Some five hundred protesters, including Jesse Jackson, initiated this fight by marching on campus chanting, “Hey hey, ho ho, Western Civ has got to go!” The march took place in January 1987; in February, Agresto’s teacher Allan Bloom published The Closing of the American Mind; and at the end of March 1988, the game was finally up when the Faculty Senate voted 39 to 4 to revamp the course to make it more global and, supposedly, less racist and sexist.

The fact is that the humanities—historically a subsection of the liberal arts: literature, music, history, etc.—have been in deep trouble for a long time. It is difficult to find a basic course on Shakespeare at most of the better-known colleges and universities; the rise of STEM, for all its wonders, has led in recent months to the Pandora’s box known as ChatGPT; and, worst of all, today’s would-be defenders of the humanities often don’t seem to have any idea what they’re defending or how to do it.

Part of the challenge is that defending the liberal arts involves clusters of questions and debates rather than a clearly articulated set of principles. Life’s biggest questions are almost never resolved to everyone’s satisfaction, and if we don’t study the differences between the Epicureans and the Stoics, between Locke and Rousseau, and between legal originalists and non-originalists, we are missing out on our own music: sometimes a battle of the bands, sometimes cacophony, always fascinating.

The diverse nature of the liberal arts means that being educated requires knowing not only the proverbial “best that has been thought and said in the world” but also the also-rans. “[W]e understand better the Founders’ Constitution,” Agresto writes, “by reading the writings of various Anti-Federalists alongside The Federalist Papers.” And beyond this: a liberal education should also “com[e] to grips with the very worst that has been said and done and understanding why.”

If we don’t study the differences between the Epicureans and the Stoics, between Locke and Rousseau, and between legal originalists and non-originalists, we are missing out on our own music: sometimes a battle of the bands, sometimes cacophony, always fascinating.

 

Agresto firmly rejects the idea that people who study the humanities are more humane, perhaps even more human, than those who do not. Obviously he is correct about this. “Are we humanists and liberal artists actually more moral than . . . owners of delicatessens?” Agresto asks. I am the grandson of owners of a delicatessen who did not attend college, and I have no hesitation in saying that they were more moral than I am, and than most people I have known over decades in academia.

It is one thing to extol the extraordinary, as we should all do. But it is deeply wrong to disdain or condemn the ordinary. Yet this is how the American elite is now acting. “[O]rdinary family life, heterosexuality, simple love of country, traditional virtues, traditional religious habits and outlooks”—all are under regular attack at institutions of higher learning, which at the same time present students with such gotcha questions as “Have you ever been to a gay or lesbian bar, social club, or march? If not, why not?” (courtesy of North Dakota State). It is good to remember Cardinal Newman, as Agresto of course does: “a University training is the great ordinary means to a great but ordinary end.”

For Social Justice Warriors, however, there is a horrifying new “ordinary.” Here’s how Agresto puts it, after reminding us that Social Justice was the name of Father Coughlin’s virulently anti-Semitic journal:

The last thirty years have seen the vandalizing of ever so much of higher education. The supposed reformers have entered the storehouse of centuries of accumulated knowledge, torn down its walls, thrown out its books, and toppled its monuments. For all their brave talk of justice, they have carried out what has to be seen as one of the most intellectually criminal acts of the ages, the modern equivalent of burning the libraries of antiquity. Today, acts that were unthinkable, unimaginable, just years ago now seem so very ordinary.

Agresto’s book is liberal, largely moderate, and explicitly American: liberal not as opposed to “conservative,” but in the sense of being about freedom (Latin libertas, the source of our “liberty”); moderate because moderation is “the virtue a liberal education cultivates best, as well as the virtue for which it is often criticized most”; and American for the reason that “[b]ecause we are diverse, for each of us ‘our own’ means not only what we hold in common but also what we hold separately.” This defense of liberal education will especially resonate with readers like me who at least used to consider themselves liberal, who try when possible to occupy the middle ground, and who find themselves increasingly aggressive about promoting American ideals and institutions.

It is one thing to extol the extraordinary, as we should all do. But it is deeply wrong to disdain or condemn the ordinary. Yet this is how the American elite is now acting.

 

Underestimating Outrage

Though the book is at times repetitive, you can pick up The Death of Learning, read almost any chapter on its own, and be edified. Its greatest flaw, in my view, is that Agresto does a better job of explaining “how American education has failed our students” than “what to do about it.” Not that he doesn’t say the right things: about the desirability of investing large sums in small liberal arts colleges; the importance of winning over the young and those who teach them (for which reason he ends the book with two heartfelt exhortations: “A Message to High School Teachers and Principals” and “A Message to High School Seniors”); and the possibilities that new universities offer—from Austin, Texas (disclosure: I am on the advisory board of the University of Austin) to the Kurdistan Region of Iraq. But, as I have already hinted, I found in the second half of the book more wishful thinking than original policy suggestions.

There are also a few spots where Agresto gets things wrong. Sometimes he should know better. Most egregious is his valorization of Anthony Fauci, who majored in classics at Holy Cross. Agresto links him with Martin Luther King Jr. as “discerning men of public presence, insight, persuasiveness, and judgment—and thus capable of doing great things,” but I for one would not use him as an exemplar of why a “liberal arts education [is] something peculiarly important and estimable” (italics in original).

On other occasions, however, Agresto overestimates goodwill and overlooks the extent of academic outrage about certain topics. He may be at his strongest when he explains how a liberal education can be not merely of value to us as individuals but of genuine use—Agresto does not shy away from this word—to our collective well-being as a country. He highlights the education and sense of civic responsibility of three of America’s Founding Fathers and its nineteenth-century “refounder,” but fails to note that in the past three years, prominent statues of Thomas Jefferson and Abraham Lincoln have been taken down or that James Madison’s estate, Montpelier, has become aggressively woke.

Fair enough, perhaps. But would he have predicted that a statue of John Witherspoon that an elite university erected as recently as 2001 would now be under serious threat? As I write, the Princeton administration is debating what to do about a prominently placed representation of the only clergyman and only college president to sign the Declaration of Independence: a citizen of the world after whom the organization that publishes Public Discourse is named. (Disclosure: Princeton fired me last year, but I remain a Senior Fellow at the Witherspoon Institute.) The controversy has made national news, including in these pages. To those who would take down the statue, I offer Witherspoon’s admonition to Princetonians of long ago: “Do not live useless and die contemptible.”

Let me end on a positive note. In his salutary reflections on an “alliance” between the liberal arts and vocational education, Agresto quotes Booker T. Washington, who was assuredly neither useless nor contemptible. Of a student who made use of grammar, chemistry, and other bookish subjects in the raising of an acre of splendid cabbages, Washington wrote, “[T]here is just as much that is interesting, strange, mysterious, and wonderful; just as much to be learned that is edifying, broadening, and refining in a cabbage as there is in a page of Latin.” He was right, and I’ll add to the discussion Pliny the Elder’s statement in the first century A.D., Brassicae laudes longum est exsequi (“it would be a lengthy business to enumerate the glories of the cabbage”).

Learning in all its forms can and must be saved. We do not live in the best of all possible worlds, and it is time for everyone to stop the destructive nonsense and return to cultivating our precious gardens, both agricultural and academic.

Eppur si muove: The Legend of Galileo

There are few images of the modern world more powerful than that of the humbled Galileo, kneeling before the cardinals of the Holy Roman and Universal Inquisition, being forced to admit that the Earth did not move. The story is familiar: Galileo represents science fighting to free itself from the clutches of blind faith, biblical literalism, and superstition. The story has fascinated generations, from the philosophes of the Enlightenment to scholars and politicians in the nineteenth and twentieth centuries.

The specter of the Catholic Church’s condemnation of Galileo continues to influence the modern world’s understanding of the relationship between religion and science. In October 1992, Pope John Paul II appeared before the Pontifical Academy of Sciences to accept formally the findings of a commission tasked with historical, scientific, and theological inquiry into the Inquisition’s treatment of Galileo. The Pope noted that the theologians of the Inquisition who condemned Galileo failed to distinguish properly between particular biblical interpretations and questions pertaining to scientific investigation.

The Pope also observed that one of the unfortunate consequences of Galileo’s condemnation was that it has been used to reinforce the myth of an incompatibility between faith and science. That such a myth is alive and well was immediately apparent in the way the American press described the event in the Vatican. The headline on the front page of The New York Times was representative: “After 350 Years, Vatican Says Galileo Was Right: It Moves.” Other newspapers, as well as radio and television networks, repeated essentially the same claim.

The New York Times story is an excellent example of the persistence and power of the myths surrounding the Galileo affair. The newspaper claimed that the Pope’s address would “rectify one of the Church’s most infamous wrongs—the persecution of the Italian astronomer and physicist for proving the Earth moves about the Sun.” For some, the story of Galileo serves as evidence for the view that the Church has been hostile to science, and the view that the Church once taught what it now denies, namely, that the Earth does not move. Some take it as evidence that teachings of the Church on matters of sexual morality or of women’s ordination to the priesthood are, in principle, changeable. The “reformability” of such teachings is, thus, the real lesson of the “Galileo Affair.”

But modern treatments of the affair not only miss key context surrounding the Inquisition’s condemnation of Galileo; they also misinterpret what the Catholic Church has always taught about faith, science, and their fundamental complementarity.

For some, the story of Galileo serves as evidence for the view that the Church has been hostile to science, and the view that the Church once taught what it now denies, namely, that the Earth does not move.

 

Galileo and the Inquisition in the Seventeenth Century

Galileo’s telescopic observations convinced him that Copernicus was correct. In 1610, Galileo’s first astronomical treatise, The Starry Messenger, reported his discoveries that the Milky Way consists of innumerable stars, that the moon has mountains, and that Jupiter has four satellites. Subsequently, he discovered the phases of Venus and spots on the surface of the sun. He named the moons of Jupiter the “Medicean Stars” and was rewarded by Cosimo de’ Medici, Grand Duke of Tuscany, with appointment as chief mathematician and philosopher at the Duke’s court in Florence. Galileo relied on these telescopic discoveries, and arguments derived from them, to bolster his public defense of Copernicus’s thesis that the Earth and the other planets revolve about the sun.

When we speak of Galileo’s defense of the thesis that the Earth moves, we must be especially careful to distinguish between arguments in favor of a position and arguments that prove a position to be true. Despite the claims of The New York Times, Galileo did not prove that the Earth moves about the sun. In fact, Galileo and the theologians of the Inquisition alike accepted the prevailing Aristotelian ideal of scientific demonstration, which required that science be sure and certain knowledge, different in some ways from what we today accept as scientific. Furthermore, to refute the geocentric astronomy of Ptolemy and Aristotle is not the same as to demonstrate that the Earth moves. Danish astronomer Tycho Brahe (1546–1601), for example, had created another account of the heavens. He argued that all the planets revolved about the sun, which itself revolved about a stationary Earth. In fact, Galileo himself did not think that his astronomical observations provided sufficient evidence to prove that the Earth moves, although he did think that they called Ptolemaic geocentric astronomy into question. Galileo hoped eventually to argue from the fact of ocean tides to the double motion of the Earth as the only possible cause, but he did not succeed.

When we speak of Galileo’s defense of the thesis that the Earth moves, we must be especially careful to distinguish between arguments in favor of a position and arguments that prove a position to be true.

 

Cardinal Robert Bellarmine, Jesuit theologian and member of the Inquisition, told Galileo in 1615 that if there were a true demonstration for the motion of the Earth, then the Church would have to abandon its traditional reading of those passages in the Bible that appeared to be contrary. But in the absence of such a demonstration (and especially in the midst of the controversies of the Protestant Reformation), the Cardinal urged prudence: treat Copernican astronomy simply as a hypothetical model that accounts for the observed phenomena. It was not Church doctrine that the Earth did not move. If the Cardinal had thought that the immobility of the Earth were a matter of faith, he could not argue, as he did, that it might be possible to demonstrate that the Earth does move.

The theologians of the Inquisition and Galileo adhered to the ancient Catholic principle that, since God is the author of all truth, the truths of science and the truths of revelation cannot contradict one another. In 1616, when the Inquisition ordered Galileo not to hold or to defend Copernican astronomy, there was no demonstration for the motion of the Earth. Galileo expected that there would be such a demonstration; the theologians did not. It seemed obvious to the theologians in Rome that the Earth did not move and, since the Bible does not contradict the truths of nature, the theologians concluded that the Bible also affirms that the Earth does not move. The Inquisition was concerned that the new astronomy seemed to threaten the truth of Scripture and the authority of the Catholic Church to be its authentic interpreter.

The Inquisition did not think that it was requiring Galileo to choose between faith and science. Nor, in the absence of scientific knowledge for the motion of the Earth, would Galileo have thought that he was being asked to make such a choice. Again, both Galileo and the Inquisition thought that science was absolutely certain knowledge, guaranteed by rigorous demonstrations. Being convinced that the Earth moves is different from knowing that it moves.

The disciplinary decree of the Inquisition was unwise and imprudent. But note the nature of the error: the Inquisition was subordinating scriptural interpretation to a scientific theory, geocentric cosmology, that would eventually be rejected. Subjecting scriptural interpretation to scientific theory is just the opposite of the subjection of science to religious faith!

In 1632, Galileo published his Dialogue Concerning the Two Chief World Systems, in which he supported the Copernican “world system.” As a result, Galileo was charged with disobeying the 1616 injunction not to defend Copernican astronomy. The Inquisition’s injunction, however ill-advised, only makes sense if we recognize that the Inquisition saw no possibility of a conflict between science and religion, both properly understood. Thus, in 1633, the Inquisition, to ensure Galileo’s obedience, required that he publicly and formally affirm that the Earth does not move. Galileo, however reluctantly, acquiesced.

From beginning to end, the Inquisition’s actions were disciplinary, not dogmatic, although they were based on the erroneous notion that it was heretical to claim that the Earth moves. Erroneous notions remain only notions; opinions of theologians are not the same as Christian doctrine. The error the Church made in dealing with Galileo was an error in judgment. The Inquisition was wrong to discipline Galileo, but discipline is not dogma.

The Inquisition did not think that it was requiring Galileo to choose between faith and science. Nor, in the absence of scientific knowledge for the motion of the Earth, would Galileo have thought that he was being asked to make such a choice.

 

The Development of the Legend of Galileo

The mythic view of the Galileo affair as a central chapter in the warfare between science and religion became prominent during debates in the late nineteenth century over Darwin’s theory of evolution. In the United States, Andrew Dickson White’s History of the Warfare of Science with Theology in Christendom (1896) enshrined what has become a historical orthodoxy difficult to dislodge. White used Galileo’s “persecution” as an ideological tool in his attack on the religious opponents of evolution. Since it was so obvious by the late nineteenth century that Galileo was right, it was useful to see him as the great champion of science against the forces of dogmatic religion. The supporters of evolution were seen as nineteenth-century Galileos; the opponents of evolution were seen as modern inquisitors. The Galileo affair was also used to oppose claims about papal infallibility, formally affirmed by the First Vatican Council in 1870. As White observed: had not two popes (Paul V in 1616 and Urban VIII in 1633) officially declared that the Earth does not move?

The persistence of the legend of Galileo, and of the image of “warfare” between science and religion, has played a central role in the modern world’s understanding of what it means to be modern. Even today the legend of Galileo serves as an ideological weapon in debates about the relationship between science and religion. It is precisely because the legend has been such an effective weapon that it has persisted.

Galileo and the Inquisition shared common first principles about the nature of scientific truth and the complementarity between science and religion.

 

For example, a discussion in bioethics some decades ago drew on the myths of the Galileo affair. In March 1987, when the Catholic Church published condemnations of in vitro fertilization, surrogate motherhood, and fetal experimentation, there appeared a page of cartoons in one of Rome’s major newspapers, La Repubblica, with the headline “In Vitro Veritas.” In one of the cartoons, two bishops are standing next to a telescope, and in the distant night sky, in addition to Saturn and the Moon, there are dozens of test-tubes. One bishop turns to the other, who is in front of the telescope, and asks: “This time what should we do? Should we look or not?” The historical reference to Galileo was clear.

In fact, at a press conference at the Vatican, then-Cardinal Joseph Ratzinger was asked whether he thought the Church’s response to the new biology would not result in another “Galileo affair.” The Cardinal smiled, perhaps realizing the persistent power—at least in the popular imagination—of the story of Galileo’s encounter with the Inquisition more than 350 years before. The Vatican office then headed by Cardinal Ratzinger, the Congregation for the Doctrine of the Faith, is the direct successor to the Holy Roman and Universal Inquisition into Heretical Depravity.

There is no evidence that in 1633, when Galileo acceded to the Inquisition’s demand that he formally renounce the view that the Earth moves, he muttered under his breath, eppur si muove, “but still it moves.” What continues to move, despite evidence to the contrary, is the legend that Galileo represents reason and science in conflict with faith and religion. Galileo and the Inquisition shared common first principles about the nature of scientific truth and the complementarity between science and religion. In the absence of scientific knowledge, at least as understood by both the Inquisition and Galileo, that the Earth moves, Galileo was required to affirm that it did not. However unwise it was to insist on such a requirement, the Inquisition did not ask Galileo to choose between science and faith.

Self-Exclusion and the Wounds of Sin: A Response to Cardinal Robert McElroy

I was thinking about Julia Flyte as I read the recent essay by Cardinal Robert McElroy in America. Lady Julia is a central character in Evelyn Waugh’s novel Brideshead Revisited. Married to a divorcé, and living out of wedlock with Charles Ryder, she has a near breakdown late in the novel when her callous brother explains why he cannot bring his fiancée Beryl to Julia’s house: “It is a matter of indifference whether you choose to live in sin with Rex or Charles or both—I have always avoided enquiry into the details of your ménage—but in no case would Beryl consent to be your guest.”

Julia’s response is powerful for expressing her awareness that, as much of an ass as Bridey might be, “He’s quite right. . . . He means just what it says in black and white. Living in sin, with sin, by sin, for sin, every hour, every day, year in, year out. Waking up with sin in the morning, seeing the curtains drawn on sin, bathing in it, dressing it, clipping diamonds to it, feeding it, showing it around, giving it a good time, putting it to sleep at night with a tablet of Dial if it’s fretful.”

This confession anticipates her final conversion of heart as her father is dying, a conversion that Charles could see coming “all this year,” and one that portends the end of his relationship with Julia.

Structures of Exclusion?

I’ll come back to Waugh, who in 1947 wrote a penetrating analysis of Brideshead for MGM, which was thinking of making a filmed version of the story. But first to Cardinal McElroy. The central concern of his essay is with the “structures and cultures of exclusion that alienate all too many from the church or make their journey in the Catholic faith tremendously burdensome.” Much of what he says of these exclusions will strike most Catholics as reasonable: certainly the poor, racial minorities, the incarcerated, and the disabled have all, in various ways and at various times, been marginalized in unacceptable ways. The cardinal also notes that the “church at times marginalizes victims of clergy sexual abuse in a series of destructive and enduring ways.”

What does it mean to speak of “structures and cultures of exclusion” in these contexts? The meaning will vary from case to case, but it is worth spending a moment on the last mentioned. For one might judge that in, for example, the case of Fr. Marko Rupnik, structures and cultures of opacity, secrecy, prestige, lack of concern for procedural justice, and the marginalization of women religious contributed to an egregious series of harms and a remarkable (and repulsive) valorization of the man who perpetrated those harms. Catholics might indeed hope that the institutional structures and culture that made such abuse possible over so many years will be the object of serious scrutiny and reform among the members of the hierarchy.

More surprising, and controversial, is the turn Cardinal McElroy makes late in his essay in a discussion of “exclusion” of those whose lives are discordant with the Church’s sexual teachings. Discussing those “who are marginalized because circumstances in their own lives are experienced as impediments to full participation in the life of the church,” the cardinal notes that these “include those who are divorced and remarried without a declaration of nullity from the church, members of the L.G.B.T. community and those who are civilly married but have not been married in the church.”

Cardinal McElroy is not, as he makes clear, discussing those who, having remarried, now live chastely, or those who, experiencing sexual desires or orientation toward acts at odds with Church teaching, live in continence. Rather, his concern is with those who because of their acts are “excluded” from the reception of the Eucharist. That, he argues, is at odds with the Church’s witness to “radical inclusion and acceptance,” which cannot be predicated on a “distinction between orientation and activity.”

It is unclear how human persons could know what they do of God and His relationship to human persons if they did not understand marriage and its potential for bringing forth new life, if they did not understand the norms of exclusivity and fidelity that flow from marriage, or the norms that exclude all sexual activity outside marriage.

 

Cardinal McElroy’s arguments on this point are worth more sustained consideration than I can give them here, but the four substantive points he makes seem to me mistaken or distorted in their emphasis. I will work backward, since it is the first point that I particularly want to focus on, by way of a brief discussion of Waugh’s memo.

The fourth of the cardinal’s points is that sexual activity is not at the heart of the hierarchy of Christian truths. This claim is not, however, unqualifiedly true, since marriage is the central sacramental image in the New Testament of the relationship of Christ to His Church; it is likewise the central image in the Old Testament of the relationship of God to His chosen people. And again, marriage is a prominent scriptural image of the Kingdom of Heaven, which is to be like a marriage banquet. This imagery teaches: it is unclear how human persons could know what they do of God and His relationship to human persons if they did not understand marriage and its potential for bringing forth new life, if they did not understand the norms of exclusivity and fidelity that flow from marriage, or the norms that exclude all sexual activity outside marriage.

The third claim is essentially a restatement of the very view that Cardinal McElroy is defending: that Eucharistic inclusiveness, rather than Eucharistic coherence, should be the guiding pastoral norm for the Church. But he gives little to no attention to the opposing view, that, in Pope Francis’s words, “This is not a penalty: you are outside. Communion is to unite the community.” But, the cardinal might respond, does this not put the excluded outside the community? Could he or she not be invited in?

The short answer to the first of these questions is negative: central to the Church’s teaching about sin is that it involves a self-separation of the sinner from God and His Church. The problem, which McElroy’s essay invites us to ponder, is how this claim can be squared with the appropriate answer to the second question: “Could he or she not be invited in?” For the answer to that is an unhesitating “yes,” a yes that, we will see, is central to Waugh’s understanding of Brideshead.

The cardinal also, in making this claim about Eucharistic inclusion, makes reference to Pope Francis’s Gaudete et Exsultate: “grace, precisely because it builds on nature, does not make us superhuman all at once. . . . Grace acts in history; ordinarily it takes hold of us and transforms us progressively.” Perhaps then the demands of Christian morality are too burdensome for those whom grace has not fully transformed. But while it is certainly true that perfect virtue does not happen “all at once,” the Church teaches that all persons have a sufficiency of grace for the avoidance of mortal sin—the kinds of sins that result in self-exclusion from communion with the Church and a full relationship with God.

Cardinal McElroy’s second point is concerned with conscience: “While Catholic teaching must play a critical role in the decision making of believers, it is conscience that has the privileged place.” Again, the claim is only half true, since for a Catholic, the Church’s teaching must hold the privileged place in the formation of the Catholic conscience.

While it is certainly true that perfect virtue does not happen “all at once,” the Church teaches that all persons have a sufficiency of grace for the avoidance of mortal sin—the kinds of sins that result in self-exclusion from communion with the Church and a full relationship with God.

 

The Wound of Sin

But now on to what I take to be the central dilemma posed by Cardinal McElroy, and to the import of Waugh’s memo regarding Brideshead. The first of the “dimensions of Catholic faith” that support Eucharistic inclusion is this:

The primary pastoral imperative is to heal the wounded. And the powerful pastoral corollary is that we are all wounded. It is in this fundamental recognition of our faith that we find the imperative to make our church one of accompaniment and inclusion, of love and mercy. Pastoral practices that have the effect of excluding certain categories of people from full participation in the life of the church are at odds with this pivotal notion that we are all wounded and all equally in need of healing.

Much, even all, of this is true. But it leaves unanswered the key question: what is the nature of the wound? And, given the nature of the wound, what is the key to radical inclusion?

The answer to the first question is sin, a concept scarcely addressed in the Cardinal’s essay. Sin is the central wound suffered by all humanity, and it is her awareness of that wound that makes Lady Julia’s response to her brother so powerful.

By its nature, the wound of sin involves rejection of the way laid before human beings by God. But that way is born neither of arbitrary command, nor of contingent means to the external end of eternal happiness. Rather, God’s way is the way of human happiness, and what He commands is only what is truly fulfilling of human nature.

God commands only what is fulfilling for human persons; and that fulfillment is precisely what He desires for His human creation. And so in rejecting the guidance of the natural law, or of revelation, human beings render themselves incapable of fully realizing the offer of friendship that God extends when He offers them a way to their own fulfillment. Sin damages the person and the person’s capacity for relationship with God simultaneously. It is thus a radical self-exclusion from the communion of those whom God has called both to fulfillment and to perfect communion with Him.

God’s way is the way of human happiness, and what He commands is only what is truly fulfilling of human nature.

 

“Twitch upon the Thread”

How is it that such self-exclusion is to be overcome? How is it that radical inclusion is to be achieved? Let us return to Evelyn Waugh’s memo. The answer to this very question, he says, “is in no sense abstruse and is based on principles that have for nearly 2,000 years been understood by millions of simple people, and are still so understood.”

Waugh identifies three principles that should be retained in the film adaptation of the novel:

The novel deals with what is theologically termed, “the operation of Grace,” that is to say, the unmerited and unilateral act of love by which God continually calls souls to Himself;

Grace is not confined to the happy, prosperous and conventionally virtuous. There is no stereotyped religious habit of life, as may be seen from the vastly dissimilar characters of the canonised saints. God has a separate plan for each individual by which he or she may find salvation. The story of Brideshead Revisited seeks to show the working of several such plans in the lives of a single family;

The Roman Catholic Church has the unique power of keeping remote control on human souls which have once been part of her. G. K. Chesterton has compared this to the fisherman’s line, which allows the fish the illusion of free play in the water, and yet has him by the hook; in his own time the fisherman by a “twitch upon the thread” draws the fish to land.

Waugh’s directives speak for themselves, but a few brief comments are in order. First, grace operates as the “unmerited and unilateral act by which God continually calls souls to Himself.” This is the foundation of the Church’s claim to radical inclusion: all are called by God, and this calling is truly radical, for it answers to no need on God’s part and to no desert on ours.

Second, God has a plan for each individual, and the paths by which one might be led to accept God’s offer might be radically different from person to person. For every person, however, the path will lead through sin, for again, sin is at the heart of the brokenness of all human persons. There is an additional lesson about radical inclusion here, convergent with at least some parts of Cardinal McElroy’s essay: while sin inevitably ruptures full communion, the efforts of the Church to overcome such ruptures must be unceasing.

Finally, the Church has the divinely given responsibility of continuing to “hold the line,” so to speak, identifying for its members what is and is not sin, and giving the appropriate twitch to the line to call back each member to God’s plan for him or her. Waugh’s, and Chesterton’s, metaphor is antithetical to both images we find in McElroy’s essay, that of an open door—too passive, not enough fishing—and that of an ever-expanding perimeter—too formless, no sense of boundary. The genuinely corresponding “inclusive” image to the metaphor of the twitch on the line is rather the apostles’ net, filled and overfilled at Christ’s direction, without bursting or breaking. The net is not ever-expanding and formless; rather, like the stable in C. S. Lewis’s The Last Battle, it is bigger on the inside than on the outside.

Too often the Church has been Bridey, insensitive and cloddish in its pastoral care of sinners; Cardinal McElroy is correct to note the failures. But the Church has also failed, now no less than in other times, to speak truthfully while resting assured, in the words of Waugh’s memo, of “how the Grace of God turns everything in the end to good.”

There Is No Thinking without Memorizing

“With the ancient is wisdom; and in length of days understanding.” Job 12:12 KJV

In education we are often told that we don’t want to simply teach students facts; we need to teach them “how to think.” My state, South Dakota, is currently debating whether to adopt new standards for K–12 Social Science. I was part of the working group that helped draw up these standards, which focus on learning content over developing skills. These standards would require memorizing such standard American texts as the opening paragraphs of the Declaration of Independence and the Preamble of the United States Constitution.

The debate playing out in South Dakota gets at fundamental disagreements about the nature of learning. One of the foremost critiques is that the standards require too much “rote learning” and not enough “thinking skills.” The question is whether these two phenomena, memorization and analysis, are actually at odds. What if the latter requires the former? What if “critical thinking” is not a technique, but a natural outcome of learning content?

Jargon vs. Thinking

The education guild, which opposes the standards we put forth, holds up Benjamin Bloom’s 1956 Taxonomy of Educational Objectives as a kind of Ten Commandments or Buddhist Eightfold Path to better education. For Bloom, asking students to “memorize,” “define,” or “explain” promotes lower-order skills, whereas students who are able to “critique,” “design,” or “formulate” have developed higher skills. The complaint is that the standards do not use Bloom’s vocabulary.

Bloom’s Taxonomy, ironically, tends to substitute jargon for actual thinking. As George Orwell reminds us in his justly famous essay on the English language, words can be overused to the extent that they cease to have any meaning. This is the case with education buzzwords such as “appraise” and “investigate.” As Orwell puts it, “A speaker who uses that kind of phraseology has gone some distance toward turning himself into a machine. The appropriate noises are coming out of his larynx, but his brain is not involved as it would be if he were choosing his words for himself.” The banality of educational jargon should give us pause.

What if “critical thinking” is not a technique, but a natural outcome of learning content?

 

A larger problem with Bloom’s Taxonomy is precisely that it assumes that education imparts skills rather than leads students to knowledge or truth. Like the architect encountered by Gulliver at the Grand Academy of Lagado, it represents an attempt to build a house from the top down. There is literally no foundation. In Gulliver’s third recorded voyage, Swift consistently mocks the notion of “thinking” abstracted from any real knowledge. It is quite difficult to “think for yourself” when you don’t have much to think about.

For example, some years ago I was one of many scholars and teachers who worked with the United States Department of Education to read grant proposals for the Teaching American History program (a program that no longer exists). The ultimate goal of the program, naturally, was to increase students’ knowledge of American history. Many of the proposals I read claimed that they weren’t going to settle for teaching mere names and dates. Instead, they were going to teach students how to “think like historians.” A fine goal, to be sure. Still, if at the end of the day we want students to think like historians, at the beginning of the day they must know some history. While history may be more than names and dates, it is at least names and dates. Try thinking like a historian about the American Civil War, for instance, without knowing about the Compromise of 1850, the Kansas-Nebraska Act, Bleeding Kansas, the Dred Scott decision, the Lincoln–Douglas debates, the various contestants in the 1860 presidential election (especially Abraham Lincoln), Fort Sumter, Robert E. Lee, Ulysses Grant, Jefferson Davis, Frederick Douglass, Harriet Tubman, the Emancipation Proclamation, and battles such as First Bull Run, Shiloh, Antietam, Gettysburg, and Cold Harbor. One could go on, but the point is made. To “think critically” about the American Civil War necessarily entails placing it in the correct half-century (which many students cannot do). Asking students to “question” something they know nothing about will yield success at the same rate as attempting to extract sunbeams from cucumbers.

If at the end of the day we want students to think like historians, at the beginning of the day they must know some history. While history may be more than names and dates, it is at least names and dates.

 

Critical Thinking

Still, especially when attempting to defend the liberal arts, educators often fall into educational lingo. As classics professor Eric Adler notes in his fine book on the history of the Classics in American higher education, the liberal arts and humanities are often defended as promoting “critical thinking.” The problem here is twofold. First, as Adler notes, studies on the matter are inconclusive about whether this claim is even true. Other disciplines may develop such thinking as much as the liberal arts. Second, and this I think is Adler’s ultimate point, there is a concession built into this argument: that the liberal arts and humanities cannot offer an internal defense of themselves. They must rely on social science to prove that the liberal arts produce a “skill” of “value.” As Adler notes, “[O]utsourcing claims about your value to other disciplines is a risky business.” Education conceived as conveying skills and technique, rather than wisdom and discernment, is a notion deployed by those who have forgotten—if they ever learned—what liberal education actually is. This defense ultimately reduces the liberal arts to servile arts.

Regarding thinking as a skill is not a pedagogically neutral stance. Recall my anecdote above about wanting to teach students to “think like historians” rather than simply “memorize names and dates.” The act of memorization is the act of handing down knowledge. Bear in mind that the Latin root of tradition, tradere, means “to hand down.” The bias of modern pedagogy is that handing down knowledge, passing on an intellectual tradition, is a kind of second-rate education. Memorizing, repeating, and recognizing are among the lowest levels of learning, according to Bloom.

We deploy these faddish educational notions to the detriment of our students. What is often derided as “rote-learning” is actually essential to sophisticated analysis. Many of us were drilled in memorizing our times-tables in math. This sort of memorization creates a base of knowledge. We draw upon this foundational knowledge as we engage in more conceptual thinking. Just as learning an instrument or an athletic skill takes repetitive practice, so does learning math, science, history, and literature. As systems engineer Barbara Oakley and computational biologist Terrence Sejnowski put it, we need both fast learning, the kind promoted by memorization, and slow learning, characterized by deliberation. Slow learning works in symbiosis with fast learning. Oakley and Sejnowski, who have applied their expertise to education, find that deliberation relies on the ability to quickly recall bits and bobs of information, knowledge acquired through repetition and habit.

One thinks of the chapter in Eamon Duffy’s masterwork The Stripping of the Altars on “How the Ploughman Learned his Paternoster” in medieval England. Even a barely literate farmer could learn various prayers, devotions, and doctrinal teachings merely through repetition in liturgy, artwork, and religious theater. My citation of Gulliver above works in like manner. I was able to draw upon a piece of knowledge, namely the plot of Gulliver’s Travels, and employ it in a different context when attempting to explain a complex concept.

Thinking is not a skill that must be taught. It emerges naturally from the acquisition of knowledge. In other words, students will instinctively think, even critically think, about what they know. The job of the teacher, especially at high grade levels and at the university, is to facilitate such thinking, much as the midwife facilitates childbirth.

Just as learning an instrument or an athletic skill takes repetitive practice, so does learning math, science, history, and literature.

 

New over Old

Emphasis on thinking as a skill, and the concomitant de-emphasis on memorization, favors love of the new over respect for the old. It is no accident that our famous twentieth-century dystopian novels, such as 1984, Brave New World, and Fahrenheit 451, all include destruction of the past. In Fahrenheit 451, Ray Bradbury’s fictional government burns books as official policy, but the novel makes clear that the elimination of books began by choice. Individuals would rather be amused by television, living merely for the present, than engage with the past via books. The book’s protagonist, Montag, meets a band of intellectuals who, like the monks of old, have worked diligently to preserve ancient ideas. They have done so through memorization. If we apply Bloom’s Taxonomy, these intellectuals have engaged in the lowest level of learning. Yet from Bradbury’s point of view, they have preserved civilization.

This aspect of Fahrenheit 451 rings truer to our time than the other dystopias do. In 1984 and Brave New World, literature is suppressed by some form of authoritarianism. Yet, as I noted above, in Fahrenheit 451 it is not the government that initially suppresses books. People simply stop reading them. Like the society of Bradbury’s fiction, we have achieved the goals of Big Brother and the World Controller without the surveillance and suppression of 1984 or the conditioning of Brave New World.

If you don’t introduce people to the past, it simply goes away. A people bereft of a tradition, all these dystopias argue, is easy to manipulate, easy to deceive. As Orwell famously put it, “Who controls the past controls the future; who controls the present controls the past.” The ideology of critical thinking carries within it a bias against the past, against the good of passing on the teaching of our ancestors. The job of the critical thinker is never to accept, never to venerate. It is always to debunk. Its approach to the past is not one of humility, but of cynicism. To this extent, the educational guild’s pedagogy is an enemy of wisdom.

Something to Think About

The first step to creating actual thinkers is giving them something to think about. Call it what you will: cultural literacy, liberal education, the grammar of the ancient trivium. Whatever the appellation, students need a firm grounding in the history, literature, and traditions of their people. This gives them a myriad of references on which to draw as they learn.

Much as people today regularly draw on popular culture, noting how this or that event mirrors something in a Marvel movie or how a particular notable figure reminds them of someone from Game of Thrones, better thinkers immerse themselves not just in the culture of today, but in the rich heritage of a civilization. This is true for two reasons, both anticipated in C. S. Lewis’s well-known essay “On the Reading of Old Books.” First, the new has not been tested, while the old has. While we must avoid antiquarian credulity, we should generally put more faith in the value of the old than the new. Second, each age has its prejudices. A healthy way to combat our temporal prejudice is to get outside our times and read and contemplate older works.

Disciplining our minds to bring multiple, rich references to bear on a particular matter makes for superior discernment. This is not a “skill” or a “technique,” but instead an organic outgrowth of a wide knowledge base. As Mark Bauerlein notes in a recent First Things essay, young people tend, to a degree, toward self-absorption. Any exercise that brings young people outside their own times is useful. Young people today tend to lack not just knowledge of the past, meaning names and dates, but also a historical imagination, the ability to sympathize with the past. Again, this is one of the pathologies of “critical thinking.” It encourages students and teachers to treat the past as a foreign country that we judge by today’s standards, lazily assuming that today’s standards must be superior.

Each age has its prejudices. A healthy way to combat our temporal prejudice is to get outside our times and read and contemplate older works.

 

In Bloom’s Taxonomy, understanding the past as it understood itself is low-level thinking, whereas critiquing is considered high-level. The taxonomy doesn’t encourage students to inhabit the minds of those who went before, or to make the effort to encounter literature that might be alien to our times. Robert Frost wrote in his poem “Carpe Diem”: “The present / Is too much for the senses, / Too crowding, too confusing— / Too present to imagine.” An education that encourages sympathy for the past, holding open the possibility that it might have something to teach us, allows us to escape the echo chamber of our time and gain perspective, which is the seed of wisdom.

Such a robust education would both hand down the treasures of civilization and encourage real thinking. One does not wish to suggest that students should easily agree with the past, though our tendency now is to easily disagree with it. And, of course, the past is not a monolith. To study a tradition is not to study a monologue, but a dialogue, even a polyphony. In memorizing old tales and history, students will as a matter of course encounter different views. They will, with only the slightest encouragement from a good teacher, start to think about those different views. Maybe even critically.

Liberalism and Leo XIII

Some integralist Catholics on the American Right look to Leo XIII for magisterial backing in their condemnations of the American experiment and the “liberalism” they argue undergirds it. Leo’s teachings on liberalism and the relationship of the Catholic Church to the state appear most prominently in his encyclical Immortale Dei. Integralists generally argue that Leo provides magisterial teaching on the proper relationship of the Church to the state. Leo suggests that the state must cooperate with the Church in order to enable individual human persons to reach their true End, Heaven. Under the Church’s direction, the state must curtail the liberties most associated with “liberalism”: religious, economic, and political freedom.

This is a plausible reading of Leo’s writings on politics and the Church. But it universalizes his teachings beyond the confessionally Catholic states of Europe that existed in Leo’s day, and thereby ignores the development of Catholic social teaching since his time. This reading also fails to take into account the advent of pluralism that Vatican II came to recognize in the Church’s statement on freedom of religion, Dignitatis Humanae. Indeed, it’s critical to look at what Leo XIII actually said, examine his writings as a whole, and especially to consider the context of his writings before judging their applicability beyond the late nineteenth century.

Liberalism and the Legacy of Rerum Novarum

Most Catholics know little about Pope Leo XIII beyond his 1891 encyclical Rerum Novarum (or “On the Condition of Labor”). This encyclical provides crucial context to Leo’s arguments about liberalism in Immortale Dei (which I will address later) and, studied alongside it, provides a fuller picture of Leo’s political teachings.

Rerum Novarum helped launch modern Catholic social doctrine by addressing pressing questions about labor, private property, and justice. Leo’s main concern is reflected in the encyclical’s title: the physical, mental, and spiritual well-being of workers. The massive scale of industrialization had left workers alienated, impoverished, and victimized by unjust wages. Leo’s concern was with justice for all citizens, particularly but not only the poor. Explaining that “the right of private property must be held sacred,” Leo connected property ownership to the need for just wages and promoted not policies but the principles of subsidiarity, charity, and recognition of the dignity of the human person. He counseled that it was precisely to protect the poor that the state must, in justice, ensure that all “private property . . . remain inviolate.” He condemned state socialism as inherently unjust, inhumane, more likely to hurt the “working man” than the wealthy, and rooted in the sin of envy.

Forty years later, Pope Pius XI celebrated Rerum Novarum in Quadragesimo Anno (or “In the Fortieth Year”), which further developed the principle of subsidiarity. Thirty years after that, Pope John XXIII argued in Mater et Magistra that in a just society economic growth will promote human dignity. Most importantly, Pope John Paul II in 1991 issued Centesimus Annus, which followed the collapse of communism and Soviet control in Eastern Europe and the dissolution of the Soviet Union itself. “Marxism had promised to uproot the need for God from the human heart,” wrote John Paul II, “but the results have shown that it is not possible to succeed in this without throwing the heart into turmoil.” John Paul argued that democracy and free enterprise, which find their roots in subsidiarity, encourage solidarity, another core principle of Catholic social doctrine.

These encyclicals not only develop Catholic social doctrine, but also address particular questions within their historical contexts. And as they address the specific problems of their eras, they are guided by what John Paul II called “fundamental principles.” The questions of 1891 were not the same as those of 1991. Neither was the condition of labor or capital the same. While Leo XIII recognized the dangers that state socialism posed to the individual human person and to society, he was mostly confined to theorizing about fundamental principles and socialism because at the time the world had yet to experience a successful communist revolution.

The illiberal, bloody history of communist states in the twentieth century, however, proved just how astute Leo had been about the soul-deadening and immoral nature of collectivism. In other words, the long twilight and then seemingly sudden demise of Soviet-backed communism between 1989 and 1991 proved Leo XIII right. The Church, in reflecting on the rise and fall of communism and the benefits of free enterprise, came to a new appreciation of democracy and a more nuanced understanding of liberalism or, rather, liberalisms. The Church emerged much more hesitant to wholly condemn anything merely for being remotely “liberal.” It turns out that not all democracy is like that of the French Revolution; not all liberalism is unhinged from virtue and moral norms; and a free economy is anthropologically sound and therefore more conducive to human dignity and flourishing than a state-controlled one. Democracy and freedom, John Paul II said in Centesimus Annus, come as a package.

Leo connected property ownership to the need for just wages and promoted not policies but the principles of subsidiarity, charity, and recognition of the dignity of the human person.

 

Beyond Rerum Novarum

Leo’s other encyclicals offer additional clarity about his views of liberty, liberalism, and the role of the state in human society. Those who condemn American liberalism often generously quote Leo’s critiques, but they miss that liberalism’s values such as human rights, freedom, and human dignity comport with Catholic social teaching as Leo presents it. Studying these encyclicals reveals that the pope held more nuanced views on these themes than liberalism’s detractors allow.

In 1888, three years before the release of Rerum Novarum, Leo XIII promulgated Libertas, or “On the Nature of Human Liberty.” “Liberty,” wrote Leo XIII, is “the highest of natural endowments.” Its exercise figures prominently in human dignity because liberty can be used to reach “the highest good and the greatest evil alike.” Moreover, man is free “to obey his reason, to seek moral good.” The highest good for which he was made is to know, love, and serve God. So why, Leo wonders, is the Church accused of opposing liberty? Because, according to Leo, the Church’s critics “pervert the very idea of freedom” or, worse yet, “extend it at their pleasure to many things in respect of which man cannot rightly be regarded as free.” In other words, like everything else, one’s liberty can be put at the service of good or evil.

The Catholic understanding of sin, vice, and virtue has long been clear: these are properties of the will, not of the intellect. One can know right from wrong and still do wrong. Choice matters. There is, however, an important connection between will and intellect: it turns out that knowledge matters, too. Therefore morally reasonable people should attain the knowledge necessary to exercise human liberty properly. All choices are also moral choices, which means they involve judgment, reason, and knowledge about what is good. Without knowledge, according to Leo, “the freedom of our will would be our ruin.”

Those who condemn American liberalism often generously quote Leo’s critiques, but they miss that liberalism’s values such as human rights, freedom, and human dignity comport with Catholic social teaching as Leo presents it.

 

If reason is how individuals come to know moral truth and attain their good, then laws—guided by the natural law—help direct societies and citizens toward their good. To demonstrate this truth, Leo XIII cites law as a “guide of man’s actions.” We reasonably conclude, then, that there must be law and order. Natural law confirms this and is “written and engraved in the mind of every man, and this is nothing but our reason, commanding us to do right and forbidding sin.” This is the same as eternal law, according to Leo—all of God’s creatures are inclined toward their proper end, the ultimate purpose for which they were created. And so, “what reason and natural law do for individuals, that human law, promulgated for their good, does for the citizens of States.”

To make these arguments even clearer, Leo XIII draws heavily from St. Augustine: “There is nothing just and lawful in that temporal law, unless what men have gathered from this eternal law.” Augustine continues, explaining that if “something be sanctioned out of conformity with the principles of right reason, and consequently hurtful to the commonwealth, such an enactment can have no binding force of law, as being no rule of justice.” Here, Leo enjoins believers to disobey unjust laws. Real liberty, Leo concludes, is experienced when we “live according to law and right reason,” that is, when we live under just laws. This freedom to do what we ought to do is “true liberty.”

Leo’s writings on liberty suggest that laws discordant with right reason and corrosive to human freedom rightly understood are unjust and therefore not real laws at all. Here, he provides a much clearer principle by which to evaluate politics than any wholesale condemnation of liberalism does.

Which Liberalism?

In his earlier encyclical, Immortale Dei, Leo does urge that “in a free State, unless justice be generally cultivated, unless the people be repeatedly and diligently urged to observe the precepts and laws of the Gospel, liberty itself may be pernicious.” This warning captures integralists’ concern about liberty going awry in liberal societies, which is an understandable and legitimate worry. But note that when Leo says people must be “urged to observe the precepts and laws of the Gospel,” he does not appoint the state to do the urging; this is, first and foremost, the Church’s responsibility. The lack of an established church in liberal regimes therefore should not be seen as a failure.

A survey of Leo’s other writings makes apparent what kind of “liberalism” Leo does condemn: the liberationist project of rationalist philosophers who make reason that is independent of natural, divine, and eternal law the supreme judge of truth. He wrote in Libertas: “These followers of liberalism deny the existence of any divine authority to which obedience is due, and proclaim that every man is the law unto himself.” This is an ethical system that masquerades “under the guise of liberty” and that substitutes license for true liberty. These liberals understand true freedom to be defined not “in any principle external to man, or superior to him, but simply in the free will of individuals.” The result is “that the authority in the State comes from the people only. . . . Hence the doctrine of the supremacy of the greatest number, and that all right and all duty reside in the majority.” Any liberalism that sees individual will as the ultimate authority, therefore, should be vehemently rejected. But of course, not every version of liberalism does this.

Leo XIII was operating in the context of the late nineteenth century, as “a prisoner of the Vatican.” He was thinking and writing amid the growing social question surrounding industrialization, socialism, and secularization in Europe. Context matters. Real events matter. Leo XIII even says this in Libertas—the Church must sometimes hold its nose and cooperate with states, to the benefit of the Church and the citizens.

To determine which arguments transcend time and which ones are context-driven, we must take historical events, disputes, and doctrinal developments seriously. Doing so helps us see how these factors shape and reshape usage and definition of terms like “liberalism.” The English Catholic historian Christopher Dawson was right when he said of “liberalism” that “there is no word—not even democracy—that has been used so loosely to cover such a variety of divergent elements.”
