An und für sich

Sacred Foundations and the mechanism of political theology

A few days ago, when I was about halfway through the book, I wondered aloud on Twitter whether Anna M. Grzymała-Busse’s Sacred Foundations: The Religious and Medieval Roots of the European State might need to join the mini-canon of Schmitt-style genealogical political theology. Having finished it, I now think it provides a key point of reference for a lot of projects in that strange field, though it is very much not in the “style” of the most influential works (or of the kinds of works that I have advocated adding to the mini-canon, like Caliban and the Witch).

It is a sober empirical analysis, at times even a little boring, but it supplies something crucial: an actual concrete mechanism for the kind of “secularization of theological concepts” that is our stock in trade. In a way, Grzymała-Busse’s lack of conceptual or theological ambition is necessary for her to uncover what has been hiding in plain sight: state institutions in medieval Europe quite literally copied practices and procedures from papal models. The reasons for this are both grandiose and mundane — on the one hand, the papacy obviously carried with it a unique kind of spiritual authority, but on the other hand, the church was the only institution that looked like it knew what it was doing. For things like literacy, documentation, regular procedures, disputes based on precedent and evidence, etc., etc., the church was for many centuries the only game in town.

The motivation to adopt church models for governance grew out of the papacy’s temporal ambitions, which produced a rivalry with secular states. In Grzymała-Busse’s telling, it also arguably led to a secularization of the church itself, as the papacy’s growing administrative efficiency and ability to project power went hand in hand with growing corruption and declining interest in spiritual and theological matters in favor of law. States that were lucky were able to adopt church templates and create their own parallel structures, allowing them to administer justice, collect taxes, and do all the other things at which the church excelled. States that were unlucky — such as the Holy Roman Empire or the divided Italian peninsula — found themselves intentionally impeded from developing the kinds of centralized power structures that would allow such ecclesiastical borrowings.

Grzymała-Busse’s main goal is to argue against purely secular accounts of state formation, the most popular of which attribute state centralization either to the demands of warfare, the need to develop some form of consensus to collect taxes, or both combined. As she shows — fairly conclusively in my view — those theories simply cannot be right. And in her concluding pages, she suggests that the idiosyncratic process of state formation in Europe, which was the only part of the world that had a powerful autonomous trans-national religious institution at the crucial period, should lead political theorists to make less sweeping claims about the universality or necessity of European state structures, much less the processes that led to them.

To me, the most interesting part of her argument from the perspective of “my” preferred brand of political theology is the view that the notion of territorial sovereignty actually grows out of the papacy’s contingent political strategies during the high middle ages. Grzymała-Busse argues that the notion that all kings are peers and no secular ruler has power over a king in his own territory was actually meant to head off the rise of a powerful emperor figure who could displace the pope — but the more the pope grew to function as precisely that type of figure, the more the notion of territorial sovereignty became a weapon against papal interference as well. In short, the Westphalian/United Nations model — in which the world is parcelled out among sovereign territorial units that are all to be treated as peers, with interference in their internal affairs being prohibited except in extreme cases based on international agreement — that has hamstrung any attempt at global regulation of capital or any binding climate action, effectively dooming humanity to live with a permanently less hospitable climate… turns out to stem from an over-clever political strategy on the part of some 13th-century pope. It sounds almost absurd when you put it that way.

Grzymała-Busse’s book abounds in such ironies. Every innovation that the papacy introduced to shore up its power in the short run wound up empowering temporal rulers in the long run. The very religious authority that provided the popes the opportunity to fill Europe’s power vacuum — and in the case of Germany and Italy, fatally exacerbate it with such skill and precision that it would persist for centuries after the conflict between church and state was decisively won by the latter — prevented the papacy from assuming the imperial prerogatives it worked so hard to prevent anyone else from having. Perhaps we can see now why Carl Schmitt was so enamored with the ius publicum Europaeum — it is quite literally a secularization of the papacy’s attempt at the indirect governance of Europe. (Meanwhile, I am at a loss for what this book could offer to the “politically-engaged theology” construal of political theology, because so much of what was formative for the “positive” aspects of secular modernity came from the “bad” period of papal history.)

There is more to say about this book, though perhaps my suggestion on Twitter that this book could serve as fodder for a book event was premature. It is a little too specialized and conceptually dry to spur the kind of discussion we normally aim to have. But I hope my political theology colleagues will read it, and if any of them have thoughts about it that go beyond what I say in this post, I would be happy to host them here.

akotsko

Hanging out

I must really be back in blogging mode, because I feel compelled to do that most bloggy of things — explain why I haven’t been blogging. My excuse is simple: I’ve been making great progress on my aforementioned book on Star Trek, which has left me very little energy for other writing. But I’ve been mulling a post on this topic since returning from my trip to the UK. The conference — at which I delivered my first official presentation on the Qur’an — was rewarding, and the trip My Esteemed Partner and I planned around it really hit the spot, with a more casual vibe in Edinburgh and a busy couple days in London. But what really stood out to me was how energized I was by simply hanging out with my academic friends. The combination of genuine friendship, shared intellectual interests, and — crucially — unstructured time was absolutely rejuvenating. We weren’t catching up over coffee in an appointment made months in advance, we were all simply there and available and up for conversation.

I italicize all these seemingly insignificant aspects of the situation just to highlight how bizarrely rare they tend to be in my life, and I assume most of our lives. I hit the trifecta late in college, then again in grad school. Cultivating various kinds of “third spaces,” like regular bars where I at least knew the bartenders enough to have random unstructured socialization, was also a good strategy, though primarily in my grad school years (when there were a lot of other Chicago-area grad students who similarly hovered around my regular place). At times, the old independent Shimer could approximate that feel, too, as there was a critical mass of people who had common intellectual points of reference and a willingness to kill some time, but the smaller size of the current Shimer faculty and student body has made that more difficult to achieve (though I’m hopeful that can change). I assume my experience now is more like that of a typical “busy” academic, who is relatively isolated outside the classroom and at many schools will not have ready intellectual interlocutors due to the perceived need for specialization and “coverage.” Often conversations among colleagues at the same institution will veer toward the shared topic of office politics, which can be cathartic but is not rewarding in the same way.

When we got home, I brainstormed with My Esteemed Partner about how to cultivate something similar in my normal life, and we came up short. Realistically, I probably need to wait for those recharge moments at conferences — or maybe I should plan to head to Berlin next year, so that I can have ready access to (apparently) every American academic in a humanities discipline. Yet it seems like a sad commentary that the settings that appear best positioned to provide that kind of intellectual community instead wind up dividing and exhausting us, so that we don’t have any energy to spend our rare free moments together on anything but venting impotently about the institution.

Part of the problem, surely, is that we haven’t read and thought about the same things. That is definitely crucial to my friend group, as to the conference itself. (The structure of the conference brought that home, as there were two “streams” with very different backgrounds and interests, so that it became very difficult to have productive conversation with the whole group.) That doesn’t have to be a static “canon” — I have definitely picked up things simply because people I admire are reading them and I want to keep up. Specialization and the demands of scholarly productivity seem to militate against the formation of that kind of open-ended sharing of interests in many cases, as do heavy teaching and administrative loads. I’m exceptionally lucky, as so often, in the Shimer setting, because we have all read the same evolving core body of texts, and hence I can always redirect the inevitable office-politics venting to something more worthwhile, at least for short bursts. But often even people in the same small department will not have that kind of overlap.

And of course there is the sheer issue of time, especially as so many of the young academics who most hunger for this kind of contact are starting families. I don’t know what to do about that other than to suggest potentially radical changes in our living habits — let’s start a commune! — that I myself don’t actually want to do. But it does seem like there’s still room to rebuild some of the social habits and casual third spaces that collapsed in the pandemic. For instance, I notice that the culture of riding the train together has fallen by the wayside among my North Central colleagues who live in the city — maybe it’s worth making more of an effort to aim for the same train home? I don’t know! Or I could just hang around in the local Irish pub on “if you build it, they will come” grounds. That’s it — I’ll do that. Problem solved!

akotsko

Enemies for Your Sake: The Figure of the Jew in Paul and the Qur’an

[I delivered this paper at the conference “Figuring the Enemy” at St. Andrews University, June 6-8. Thank you to Scott Kirkland for the invitation!]

In this paper, I want to draw a comparison between the treatment of the figure of the Jew in the Pauline Epistles and the Qur’an, with the goal of illuminating the necessarily polemical nature of historical, revealed monotheism. I will begin by providing some background as to why such a juxtaposition has been only seldom attempted, explain how I came to see these two texts as related, and briefly suggest how the parallels might have come about. I will then develop a more detailed comparison and contrast, laying the groundwork for a conclusion in which I draw out some implications for our understanding of monotheism, in critical dialogue with Jan Assmann.

I.

Paul is conspicuously absent from the Qur’an. The sacred text of Islam mentions many figures from the New Testament, dwelling at great length on Jesus, Mary, Zechariah, and even mentioning that Jesus — alone among the Qur’anic prophets — had a special group of followers known as the “disciples.” All of those references, however, are solely to the Gospels, or to adjacent apocryphal literature, such as the Protoevangelium of James. Only the Gospel is mentioned alongside the Torah as an authentic earlier scripture in the Qur’an’s reckoning.

Yet even if Paul had been mentioned, it would matter little: the Qur’an claims that the actual scriptural deposits held by contemporary Jews and Christians have been corrupted and that its retellings of biblical stories restore the authentic originals. Hence there has historically been little curiosity among Islamic scholars about the Bible — why bother with the corrupt version when you have the real thing? Virtually the only extended engagement with Paul in the medieval Islamic tradition is ‘Abd al-Jabbār’s 10th-century Critique of Christian Origins, which portrays the Apostle as a scheming Jew who was instrumental in corrupting the original Gospel message. Along the way, he introduces many fanciful and sometimes offensive stories and evinces only a fragmentary knowledge of the epistles themselves.

Modern scholars of the Qur’an have largely left Paul aside as well. The motives, to the extent we can assign motives to this kind of lacuna, likely vary. On the one hand, scholars who are broadly sympathetic to Islam accept the basic traditional narrative that the Qur’an was revealed to Muhammad during his lifetime and assembled within the lifetime of his close companions. Members of this group usually attempt to square the circle between a secular perspective on the Prophet’s activities and the Islamic claim that he was an “unlettered prophet” — i.e., he didn’t directly study previous scriptural texts, but may have vaguely picked them up by osmosis. On the other hand, there are more hostile scholars who often advance far-fetched and, in my view, borderline conspiratorial narratives of a late origin of the Qur’an, which cuts the figure of Muhammad out of the picture. In place of any serious engagement with the Qur’an as a theological text with cohesive themes, they attempt to trace the lost Syriac text (or whatever) that lay at the basis of it.

Nevertheless, simply as a reader who is familiar with both texts, I cannot help but think there is an important connection to be made here. I first approached the Qur’an for the sake of my teaching, as my dean called upon me to fill in a gap in our course offerings after the faculty member who taught Eastern religions retired. As a scholar of Christianity, I figured that Islam would be the nearest reach. Hence I set to work reading the suras of the Qur’an in approximate chronological order — with no particular “angle” or agenda, and perhaps even a little irritated that this demand was interfering with my summer research plans.

When I got to Sura 2, The Cow (which comes early in the printed text but relatively late in the Prophet’s ministry), I was struck by the following passage: “They say, ‘Become Jews or Christians, and you will be rightly guided.’ Say [Prophet], ‘No, [ours is] the religion of Abraham, the upright, who did not worship any god besides God’” (2:135; using Haleem translation here and throughout; all brackets represent attempted clarifications by the translator). This sounded very similar to Paul’s attempts in Galatians and Romans to get back behind the Law of Moses by connecting his preaching to the more primordial figure of Abraham. When I got to Sura 3, The Family of Imran, the parallel had become unmistakable:

People of the Book, why do you argue about Abraham when the Torah and the Gospels were not revealed until after his time? … Abraham was neither a Jew nor a Christian. He was upright and devoted to God, never an idolater, and the people who are closest to him are those who truly follow his ways, this Prophet, and [true] believers—God is close to [true] believers. (3:65-68)

How can one not think of the passages where Paul argues for the priority of Abraham’s pure faith over against the later covenant of Moses? In Galatians, he specifies that “the law, which came four hundred thirty years later, does not annul a covenant previously ratified by God” (Galatians 3:17), and in Romans he points out that Abraham’s faith was reckoned to him as righteousness “not after, but before he was circumcised” (4:10), so that he can be the ancestor of both uncircumcised and circumcised believers.

No one who had attended Ted Jennings’ seminar on Romans could possibly miss these parallels. Nevertheless, I did not find much affirmation of my intuitions in the scholarship. In my admittedly far from exhaustive study of literature on the Qur’an, I have found almost zero mention of any relationship between the Qur’an’s deployment of the figure of Abraham and the Apostle Paul’s. The one exception is the biography of the Prophet by Juan Cole, probably known to some in the audience as an anti-Iraq War blogger. Breaking with both trends of the scholarship I mentioned above, Cole both accepts the historicity of the Prophet Muhammad as the vehicle for the revelation of the Qur’an over a relatively short period of time and avoids the kid-gloves approach to the notion of the “unlettered prophet.” For Cole, Muhammad was a successful merchant and hence would have been multi-lingual as a matter of course. Moreover, he was a spiritual seeker long before he started receiving the Qur’anic revelations, and so he would have eagerly devoured any theological or scriptural literature he could get his hands on.

Hence Cole is able to make connections to texts from Judaism, Christianity, and various heretical or non-mainstream sects of both, as well as many other religious and intellectual movements — including the letters of Paul. Indeed, he takes the connection further. In an unpublished SBL paper he generously shared with me, Cole suggests that Pauline studies might provide a usable paradigm for Qur’anic studies, and in fact his biography of the Prophet, which aims to use the Qur’an rather than the traditional biographical narratives as the primary source, is constructed on the model of a biography of Paul drawn from the internal timeline implied by the letters rather than the account in Acts.

Although he is definitely an outlier in terms of reconstructing the Prophet’s literary borrowings, even Cole stops short of making any strong claim that Muhammad sat down and read the literal texts of Paul. I don’t want to make any strong claim, either. Instead, I want to suggest that it ultimately doesn’t matter. Even if Muhammad studied the texts of Paul, that does not, in itself, “explain” why those precise rhetorical moves appeared at such crucial moments in the Qur’an. As the man says, “the cause of the origin of a thing and its eventual utility, its actual employment and place in a system of purposes, lie worlds apart.” Whether they are borrowed or independently discovered, the rhetorical and theological parallels are rooted in their shared rhetorical and theological situation as messengers who have arrived “too late” and must shore up their legitimacy in the face of a pre-existing monotheistic tradition (or range of traditions).

The situation is doubtless more complicated in Muhammad’s case given that the centuries that separate him from Paul saw the rise of Christianity, in all of its bewildering and mutually antagonistic forms. In a sense, though, Paul himself is already dealing with multiple versions of Christianity as well, as shown in his “so I says to the guy”-style account of his debate with Peter in Galatians or the delicate tightrope he walks with Apollos in 1 Corinthians. For both Paul and Muhammad, one would initially assume that the debate with Christians is the more salient one—in Paul’s case because he is primarily concerned with the requirements for Gentile believers to join the Christian movement and in Muhammad’s because orthodox Trinitarianism and Christology clearly violate the Qur’anic prohibition of “associating partners with God” — yet both focus much more on the original bearers of the monotheistic revelation: the Jews.

II.

At first glance, Paul’s and Muhammad’s respective positions with regard to the Jews could not be more different. Most notably, Paul is himself a Jew of impeccable credentials: “circumcised on the eighth day, a member of the people of Israel, of the tribe of Benjamin, a Hebrew born of Hebrews; as to the law, a Pharisee; as to zeal, a persecutor of the church; as to righteousness under the law, blameless” (Philippians 3:5-6). As the Apostle to the Gentiles, Paul has the role of bringing an essentially Jewish message to the previously excluded nations. Muhammad, for his part, claims no Jewish descent, and the Qur’an proudly declares itself an Arabic revelation for Arabs. Paul is also preaching a message that he understands to be the culmination or fulfillment of the Jewish revelation, whereas — at least until the later years of Muhammad’s career — the Qur’an presents the Prophet’s message as merely the latest in a series of such messages to individual nations.

Despite those very important differences, however, both Paul and Muhammad are clearly rankled by the failure of the Jews to accept their respective messages—albeit not to the same degree. Whereas the Prophet recognizes that the Jews have a special relationship to God and to the Scriptural heritage, making their failure to acknowledge his prophetic message a clear challenge to his prophetic legitimacy, that is not as pressing an issue as the refusal of Paul’s fellow countrymen to accept their own ostensible messiah. The gap begins to close as Muhammad becomes a political leader in Medina, where some Jewish groups explicitly ally with him and thus become quasi-insiders, but even then, Muhammad’s ultimate goal is to establish an ecumenical community of monotheists: “The [Muslim] believers, the Jews, the Christians, and the Sabians—all those who believe in God and the Last Day and do good—will have their rewards with their Lord. No fear for them, nor will they grieve” (2:62). (No one is quite sure who the Sabians are, but surely that only highlights the open-ended inclusiveness of the Qur’an’s vision!)

Once we take into account those key differences in their respective situations and degrees of emotional investment in Jewish acceptance of their message, what’s remarkable is how similar their strategies are for negotiating their relationship with the Jewish monotheistic heritage. For the sake of imposing some order on two infamously disorganized bodies of literature, I will assess their approach to three interrelated issues in turn: the question of Jewish privilege, the place of the Jewish law, and the prospects of salvation for the Jews.

So first, Jewish privilege. As we have seen, both Paul and the Qur’an attempt to displace Jewish privilege by using Abraham to make an end-run around Moses. Both also deploy the primal scene of the Garden of Eden to emphasize the universality of their respective messages, though this move is much more prominent in Paul than in the Qur’an. This likely reflects the fact that the Prophet’s mission in the Qur’an is not initially envisioned as having the same universal scope as Christ’s. Indeed, the very fact that no prophet is fully definitive, that each is part of an open-ended series, seems to displace the privilege of previous prophets such as Moses or Jesus.

This strategy is especially evident in the earlier revelations, which intersperse the familiar biblical prophets with messengers to various ancient cities of Arabia — whose ruins would regularly be seen by traveling merchants — and radically downplay the fact that the biblical prophets are all part of the same family tree. For the Qur’an, a perennial monotheistic message has come down through many prophets at many places and times, and only very late in the Qur’anic revelation do we get any hint that Muhammad, as “seal of the prophets” (33:40), is anything other than one prophet among many.

Yet a form of particularism does return, precisely through the figure of Abraham. As we have seen, Paul turns Abraham into the father of those who believe in Christ and, in Galatians, reinterprets Christ as the singular “offspring” who will receive God’s promises to Abraham (3:16). Similarly, the Qur’an introduces a Muslim particularism into the Abraham story by stealthily replacing Isaac with Ishmael on Mount Moriah (37:99-111) and then has the father-son pair found the shrine at Mecca (2:125-129). Hence the “religion of Abraham” is both the perennial monotheism announced by all the prophets and the specific Mecca-centered rites preached by the Seal of the Prophets. In both cases, Jewish privilege is displaced and then reappropriated for the new movement.

This brings us to my second point, namely the role of the Jewish law as the most visible marker of Jewish particularism. Here the convergence between Paul and the Qur’an is most remarkable, given their very different starting points. For his part, Paul has two core concerns that bring him into collision with the Jewish law. Like all Jesus-followers, he must account for how the messiah’s shameful crucifixion and death as an outlaw fit into the economy of salvation, and in terms of his specific mission, he is deeply committed to the idea that Gentiles share in the messianic reality precisely as Gentiles—not as converts to Judaism. In Galatians, that leads him to identify the law as a curse (3:10-14) and a form of slavery (4:22-5:1) and to joke darkly that anyone who is interested in circumcision “would castrate themselves” (5:12). In the later epistle to the Romans, this view has softened: though the law is “holy and just and good” (7:12), its role is the fundamentally negative one of highlighting the omnipresence of sin. In both letters, then, Paul concludes that one is better off joining Christ outside the sphere of the law.

For the Qur’an, the conflict arises from a repeated concern that people should not simply make up prohibitions that God has not actually revealed. In the case of dietary restrictions, the Qur’an repeatedly states that God “has only forbidden you carrion, blood, pig’s meat, and animals over which any name other than God’s has been invoked” (2:173). This raises the question of whether the Torah’s much more restrictive dietary rules represent the kind of imposture that the Qur’an decries. Though the Qur’an is generally comfortable performing a line-item veto of individual laws or plot points from biblical stories, disqualifying the majority of Jewish practice as a fraud is apparently a bridge too far. It decides that the restrictions are real, but they apply only to the Jews, as a form of punishment: “For the wrongdoings done by the Jews, we forbade them certain good things that had been permitted to them before” (4:160). The Qur’an also suggests that the Sabbath may have a similar punitive origin, claiming that “The Sabbath was made obligatory only for those who differed about it” (16:124).

For both the radical messianist and the Seal of the Prophets, then, the Jewish law is reversed from a blessing to a curse, or at least from a privilege to a burden. This obviously complicates my third point of comparison, the ultimate fate of the Jewish people, which is also the place where their strategies most differ. The key passage here in Paul is the labyrinthine Romans 9-11. On the one hand, Paul is unequivocal that “the gifts and the calling of God are irrevocable” (Romans 11:29). Jesus is and always remains the Jewish messiah, and his death and resurrection make possible the unexpected extension to all nations of God’s promises to the Jewish people. Even the fact that they have rejected the messiah and become “enemies for your sake” (11:28) only highlights the Jews’ special role in the economy of salvation—they had to step aside temporarily in order to make room for Gentiles to come in. Yet Jewish particularism is only redeemed through accepting Christian particularism, as their jealousy at seeing the Gentiles enjoy the benefit of God’s promises will lead them to relent and accept that Jesus is the messiah (11:11). Only in that sense can Paul say, “all Israel will be saved” (11:26).

By contrast, in the Qur’an, the standards for the salvation of a Jewish person are the same as for anyone else: believe in God and the last day, pray, and live a righteous life. The prophetic message provides no information about what happens if they don’t fulfill their special dietary obligations, though it does include an enigmatic story in which God tests the Jews’ faithfulness by sending them a fish that surfaced only on the Sabbath (7:163-167). They give in to temptation, leading God to declare that “until the Day of Resurrection, He would send people against them to inflict terrible suffering on them” (7:167) — a dire punishment, to be sure, but one that only applies to the life of this world. One assumes, as with every commandment in the Qur’an, that the faithful Jew’s duty is to do their best to obey their extra commandments and not be too hard on themselves if some necessity or misunderstanding prevents them. The displacement of Jewish privilege thus leads to a situation where they have no particular advantage or disadvantage in salvation. Indeed, the Qur’an, in a distant echo of Paul’s rude comment from Galatians, can sometimes taunt Jews who believe that their special relationship to God means they are automatically saved, telling them that they should therefore wish for death (62:6).

Nevertheless, this live-and-let-live attitude does not exclude a bitter enmity, at least for a certain subset of Jews. The direct cause here is the fact that some Jewish groups explicitly submitted to Muhammad’s leadership through the so-called “Constitution of Medina,” then failed to fulfill their obligations. As Juan Cole is at pains to clarify, many verses that seem to vent fury at the Jews as a whole should instead be interpreted as applying only to specific groups who behave in the specific ways described. That contextualization is helpful, yet it does not dispel my impression that the Qur’an’s attitude toward the Jews is on something of a hair trigger—their presumed favor and cooperation are especially coveted, making their opposition all the more galling. The fact that Christians, who on the face of it violate the Qur’an’s radical monotheism, do not face the same love-hate dynamic only heightens this suspicion. That may simply be a result of a smaller or less unified Christian population in the Hijaz — or it may reflect a sense that Christians, as another post-Jewish monotheistic movement, are more natural allies.

III.

Overall, then, both the Apostle Paul and the Prophet Muhammad arrive at very similar strategies for negotiating their relationship with the original bearers of the monotheistic message. Both tend to displace Jewish particularism and replace it with their new movement’s own particularism. Both reinterpret the Jewish law as a burden or punishment rather than a sign of God’s blessing. Thankfully, these negative moves do not lead either one to exclude the Jews from salvation, but both nonetheless insist that any redemption they experience will conform to the standards of their new revelation—which was of course the true meaning of the old revelation all along. The specific theological paths they take to get there differ based on their respective historical contexts and emotional investment in the Jewish community, but the fact that such different starting points can lead to such similar results is, at the very least, unexpected—especially for two texts that the medieval and modern scholarly traditions have taken to be completely unrelated.

My contention is that this convergence ultimately stems from the very nature of historical, revealed monotheism. Here I am drawing on the work of Jan Assmann. In The Price of Monotheism, Assmann argues that revealed monotheistic traditions represent “secondary” religions. Whereas the “primary” traditions, retrospectively called polytheism, have an open-ended and inclusive quality — new gods can always be added to the pantheon, and the gods of other groups can be “translated” into their local equivalents—the secondary religions style themselves as the correction of the errors of paganism and have a built-in intolerance. This intolerance extends not only to the unwashed masses outside the monotheistic circle, but to the backsliders and compromisers within it, who fall short of the monotheistic demand encapsulated in a scriptural deposit. Living traditions will inevitably lapse into such “betrayals,” and they will just as inevitably be met with Reformation-like demands to “return” to the pure religion represented in scripture.

All of this seems to me to be basically right, and classroom use has shown me that students find Assmann’s concepts helpful for negotiating the differences between polytheistic and monotheistic traditions. Where Assmann seems to me to stumble is in his account of the relationships among the existing monotheistic traditions, which for him grow out of the tensions between universalism and particularism in the monotheistic idea. On the one hand, monotheism has universal implications—the God it reveals is the God of everyone. On the other hand, monotheism insists on particularity—the God it reveals is a particular named God who has participated in specific historical events, and all other gods are false and/or demonic. Judaism manages this tension in a straightforward way:

In Judaism, the universalism inherent to monotheism is deferred until a messianic end-time; in the world as we know it, the Jews are the guardians of a truth that concerns everyone, but that has been entrusted to them for the time being as to a kind of spiritual avant-garde. For Christians, of course, this end-time dawned some two thousand years ago, putting an end to the need for such distinctions. That is why Christian theology has blinded itself to the need for such distinctions. (17)

He then adds, almost as an afterthought, that Islam suffers from a similar blindness, which is why both traditions have at times embraced an intolerance and violence that seems to contradict their universal message.

This is fine as far as it goes, but it doesn’t seem to provide much explanation for why so much of that intolerance and violence has been directed at a group that both traditions agree to be fellow worshippers of the one true God: namely, the Jews. Here the problem doesn’t seem to me to be that Christianity and Islam can’t admit that their universal message is rooted in particularism. It’s that they can’t admit that their new revelation is new. This is not merely bad faith or willful blindness, but a structural necessity of participating in the revealed monotheistic tradition. The one true God is not a vague philosophical principle of unity or a trans-historical ideal that may be reflected in many different ways — he is a particular, named deity who has reportedly done particular things at particular times and places. The claim to follow this God therefore requires maintaining some form of continuity with the existing deposit of revelation, even as the felt demand for a new approach demands some critical distance.

The ideal outcome from the perspective of a budding new prophet would of course be that the Jews accept the new “purified” version of monotheism en masse. Every new monotheistic movement seems to include a moment of delusional optimism that this will occur — even Martin Luther expected that Jews would rush to embrace the “real” Christianity that had been obscured for so many centuries. This blessed outcome somehow never occurs, leading to the kinds of mental gymnastics I have documented above.

The necessarily polemical nature of revealed monotheism is therefore directed precisely at the original bearers of the monotheistic demand. Their very faithfulness to the divine command is recast as a form of stubbornness or arrogance. Their entire history — here drawing on authentic threads in the Hebrew Bible — is interpreted as one of rebellion and disbelief. In the Qur’an, which proclaims to the Jews that God has “blessed you and favored you over other people” (2:47), the chosen nation is reduced to a perpetual object lesson for the believers. Even worse, the Christian tradition of typological exegesis that finds its beginning in the Pauline epistles reads the entire history of Israel as a series of unwitting anticipatory pantomimes of the life of Christ.

The Jews are wrong, yet necessary, indeed necessarily wrong. They become, in Paul’s words, “enemies for your sake,” constitutive enemies at the foundation of a new tradition that can structurally never understand itself to be new. And so the new revelation’s declaration that the Jews’ special favor with God, represented by the divine law that structures their lives and sets them apart from all nations, is actually a burden and a curse becomes a grim self-fulfilling prophecy. For the crime of bringing revealed monotheism into the world and sustaining its demand against all odds, the Jews are condemned to perpetual suspicion, exclusion, and persecution—precisely by their fellow monotheists. The dynamic may be more virulent (as in Christianity) or less (as in Islam), but it is nonetheless real and destructive.

I began this paper by considering the understandable reasons that scholars have ignored the possible connection between Paul and the Qur’an, and I will end by asking why Assmann might downplay the toxic theological dynamic that my comparison has highlighted. The answer, it seems to me, is that he is attempting to extract some redemptive core to the monotheistic revolution, which will allow him to declare it, despite everything, a progressive step toward the inclusive secular world order he is clearly hoping for. That core, he claims, is the universal demand for justice, which could form the basis of a universal law, though never a universal religion. Yet the dynamics I trace in this paper call the justice of God deeply into question, as an entire nation’s history is reduced to an object lesson or a ladder to truth that can be safely kicked away. It is no mistake that Romans 9-11, where Paul grapples with the salvation of the Jews, is also Christian theology’s locus classicus for the doctrine of predestination, which seems to reduce God to an arbitrary monster. The hope for a universal justice is surely a valid one, but to get there, we need to break more definitively with the habits of thought that the secondary monotheisms have bequeathed to us.

akotsko

The pandemic — which isn’t over, by the way!

Once in grad school, Anthony Paul Smith and I had the same temp job. It was a terrible job, doing tedious data entry to convert the Sunday circular coupons into a clickable webpage. Seldom has a temp job felt more purely pointless and degrading. And yet, a few months later, we caught ourselves fondly recalling those times, and Anthony suggested that we need to resist the urge to be nostalgic for something simply because it’s in the past.

I find myself thinking that about the pandemic lately. I am finally having a calm summer vacation at home, and especially now that My Esteemed Partner usually works from home, our routine is reminiscent of the pandemic. We followed stricter guidelines than most, for longer than most. For me, the “lockdown” lasted a year and a half, and even when I returned to teaching, our social life was very constricted.

I am a relatively healthy person, so I wasn’t especially worried about catching the novel coronavirus myself. It was more a matter of not wanting to accidentally harm others, since My Esteemed Partner was potentially more vulnerable, and I live in a building full of elderly people. The effect was much the same, though — the instinctive fear and avoidance of other people, who could all be a disease vector, the preference to huddle at home whenever possible. I remember once, the first pandemic summer, My Esteemed Partner wondered aloud if we could get a rental somewhere in Michigan, so that we could get out of the city but still be isolated. I always have some degree of travel anxiety, but this time around I was straight-up afraid — not just of the travel itself, but specifically of traveling to a place full of covid scofflaws. (Eventually, right-wing militants in Michigan would be arrested for plotting to kidnap the governor over covid restrictions.)

As I’ve written before, we had ideal circumstances — we lived in a very covid-compliant area, we both kept our jobs, we didn’t have childcare to worry about, and we even came out financially ahead. Ultimately we bought an apartment, something we wouldn’t have seen as a possibility just a couple years earlier. But I can tell it has lingering effects. My emotional equilibrium still feels “off.” I have less resilience, and I feel like my moods swing more than they should. Above all, my social muscles feel like they have atrophied. I have a friend who recently moved back to town, who has invited me out for last-minute drinks a few times, like back in grad school. I almost always turn him down, and though I can always think of some particular excuse, I am realizing that part of the problem is that I simply need more lead-time to build up my reserves and face a social situation.

Part of the issue is that I was never truly alone for any considerable amount of time. I know others who were single when the pandemic began who are dealing with a whole other level of difficulty, so I’m grateful that My Esteemed Partner was with me. Yet solitude has been a major part of my emotional equilibrium since I was a very young child, and I have been largely deprived of it for years. During the pandemic, my only time alone in the apartment would be when My Esteemed Partner was walking the dog. Eventually, doctor’s appointments or haircuts would provide longer windows. But I have had maybe two or three instances in the past few years where I had the house to myself for the better part of a working day, and that makes it hard for me to recenter. Thankfully My Esteemed Partner is doing a couple days in the office most weeks now, which seems to be helping.

Of course, I didn’t do myself a lot of favors with the transition to normal life. When I went back to teaching, I was so excited to be in-person again that I accepted a hugely undercompensated teaching overload — teaching more than I’d ever taught before in my life, two semesters in a row, complete with an extreme early-morning schedule. And this past year, I took on a faculty governance role that proved more time-consuming and otherwise stressful than I had anticipated. In other words, work has objectively been more stressful and demanding, so we would expect that my social and emotional resilience would be down from historic levels. And yet I feel like I was starting from a weaker baseline. These past couple years would have been hard no matter what, but would they have been this hard?

And ultimately, of course, I did get covid, and I did give it to My Esteemed Partner. We both had a very mild case, and she recovered faster than me. My main concern was not the physical discomfort, but rather the isolation. I also technically had long covid — though it manifested only in a persistent cough and a strange rash (verified by an actual doctor as a post-covid symptom!), rather than the scary symptoms we all picture when that dreaded, mysterious condition comes up.

None of this is to say that the disease isn’t serious for others, nor that we shouldn’t have taken more or different precautions as a society. I’m just suggesting that for me, and probably for many, the most enduring post-covid effects are due to the isolation itself rather than the disease.

And I wish that our public sphere were not structured so that bringing up those concerns automatically triggers the reaction of “OH! so you wish the elderly and immune-compromised would just die for your convenience, huh?!” No, I don’t. I sacrificed a not-inconsiderable percentage of my life to try to avoid harming others — primarily My Esteemed Partner, but also many people I barely know or have never met. That imposed a cost on me that is very real and that I am still dealing with. And it still seems like every day on social media I see a stray remark implying that “lockdowns never happened” or social isolation is easy and fun. Even the repeated refrain that “the pandemic isn’t over, by the way!” grates, because it seems to carry with it the assumption that we should return to that level of anxiety and restriction.

What is the point of this post? I’m not sure. Ultimately, this is my blog and I sometimes blog about personal things. If there’s a political point here, it might be the suggestion that maybe liberals don’t need to add a constant reminder of that terrible pandemic that ruined all our lives to their litany of smug truisms and that maybe it doesn’t need to be an article of faith that the side-effects of all pandemic restrictions are either non-existent or by definition swamped by their life-saving intentions. In any case, not seeing constant reminders of the pandemic or implicit accusations that I’m a bad person for feeling bad in its wake would help my mental health.

akotsko

Why write about TV?

I’ve written a great deal about TV — three short books on negative character traits in contemporary television, a peer-reviewed article and now a planned book on Star Trek, and countless blog posts and online publications. I’m even teaching a course that’s primarily about television this fall, namely a study of Watchmen and its HBO adaptation (with the latter being the main object of interest for me). Yet I find myself a bit exhausted by, and disengaged from, the culture of TV commentary. Part of that is simply the fact that there has been a vast overproduction of commentary and “takes.” Many of these pieces are written by people I admire and are of very high quality, but the sense of being rushed or forced somehow haunts even the best pieces for me.

I would like TV analysis to be “insight recollected in tranquility,” and the current online publication culture simply is not compatible with that. Trying to keep up is the only way to get read at all. In six months, no outlet is going to publish your piece about how you just realized something about Succession — there’s a window, and that window is now. I can blog about it and my friends will see it and maybe even like it, but that’s no way to build a reputation or a career as a writer. I understand that it’s a privilege that my full-time teaching job allows me (and in many ways requires me) to sit that out, and perhaps part of my fatigue is a form of survivor’s guilt, because there are many possible alternative timelines where I might have been pushed out of academia and seen the TV commentary game as the only way to maintain some kind of intellectual engagement in my work.

I don’t think that overproduction or weird personal vibes are the only factors here, though. There’s a fundamental unclarity about the task of TV writing. Sometimes, as in episode-by-episode write-ups, the task seems to be to help people remember what happened or process basic plot points — or keep up with events on the show without actually watching it. I notice that sometimes people respond to those write-ups as though they contain “smart” commentary, when it seems to me that they are mostly just summary. Everything about that corner of the TV writing game makes me feel sad — though I would totally accept a TV write-up job for a Star Trek series if offered.

The write-up partly makes me feel sad because I can tell that the writers know the task is beneath their dignity and beneath the dignity of their readers. This is not the case for the true lowest of the low — the kind of TV commentary that suspends disbelief permanently and responds to events as though the characters were real people. This seems to characterize a lot of the Succession takes circulating right now. They amount to gossip columns about fictional characters. At a slightly higher level, perhaps, are speculations about what might happen, especially if they are keyed into what would please or surprise fans the most. Though the latter concerns are superficial, they at least bring into view the show’s status as an intentionally crafted aesthetic object, rather than a window into a fictional but “real” world.

But this is the problem — the TV show’s status as an aesthetic object is never fully secure. Even “prestige drama” is haunted by the anxiety that it’s still just… TV. Is Mad Men a soap opera? Is Succession a weird kind of sitcom? Clearly they are. But are they just that? It’s never okay for a TV show to be precisely and exactly a TV show, and especially to typify a TV genre. The greats have to somehow transcend their medium. The Wire was, famously, like a Victorian novel. Except it wasn’t a novel — it was a TV show, with visual storytelling parcelled out in serialized hour-long units. Even film seems to have enough prestige at this point to be an object of aspiration, so that the most poorly-paced blob of formless content on Netflix can be pitched as a “10-hour movie.” And surely much of the prestige of “prestige TV” comes from the adoption of cinematic-quality production values and performances, though that gap has been narrowing.

If we can’t hold firm to the TV show as a worthy aesthetic object, then, we inflate its importance in another direction — usually by turning it into a source of political insight. Every show produced in the US can be pressed into service as a window into the American soul, almost by definition. How this is supposed to work is unclear to me. The American people did not produce the show. There was not an election in which they got to choose which shows would be made. Ratings provide some kind of measure of popularity, which must mean there’s some kind of resonance there. But I’ve seen similar claims made that, for instance, Star Trek: Enterprise — by all standards a failed show, which struggled to stay above a million viewers in its final seasons — demonstrated how Americans tried to navigate the tensions in a post-Cold War world or whatever. How can we draw any real evidence for American attitudes in general from such a marginal entertainment product?

Even less plausible than the political reflection thesis is the quest for a political prescription in the TV show, which of course always manages to fall short of the critic’s (usually unstated) standards of “correct” politics, or “correct” representation, or what-have-you. Sometimes such pieces seem to veer toward a disguised form of “Monday-morning show-runner” — the political prescription serves as an alibi for the critic’s preference for the plot to have gone in another direction. Strangest of all, though, are the ones that want to see positive political guidance from the TV show, or at least political “lessons.” The sense that this is what TV is somehow “for” leads to a related syndrome of lamenting that a portrayal of bad politics will somehow give people the wrong political ideas — because presumably people get their political ideas directly from TV shows.

What I’d like to see — and what I hope to practice — is a form of analysis that centers the TV show as a work of narrative art with its own strengths and limitations, its own genre expectations and standards. This would mean pausing before lamenting that the show didn’t take your preferred direction and asking why the writers did choose what they chose. It may turn out that their implicit reasons don’t make sense or work at cross-purposes with something else, such that we can lament that the urn is not as well-wrought as we wish it could be. Similarly, before reading off political messages (positive or negative) from a show, we might ask ourselves why such issues are being foregrounded.

For instance, in Andor — widely praised for its gritty political realism — we might note that the goal is to impart a kind of sophistication to an IP that is primarily oriented toward children. The same would presumably hold for the HBO adaptation of Watchmen and its unexpected centering of racial issues. The politics are not the “goal,” they are part of the aesthetic effect. And I guess sometimes people are basically saying that they like TV shows better when they align better with their politics — which is only fair, but is perhaps a point that could be stated more forthrightly, instead of dressing it up in this weird quasi-normative stance. There is nothing preventing a show from genuinely having good political lessons or — more likely — supplying powerful political metaphors, nor is it by any means impossible that a show’s politics could have deleterious real-world effects (e.g., West Wing). But I can’t help but feel we’d get a better handle on that kind of thing if we contextualized it in a formal-aesthetic analysis of the show.

Of course, there is no audience for the kind of criticism I’m calling for, because it feels like English class and everyone hated English class for stealing away their naive enjoyment of literature or whatever. So I’m left blogging, or writing for academic or para-academic presses, or just tweeting out complaints about writers who are really just doing their best. You do you, everyone! Everything is fine and nothing matters.

akotsko

Who is my neighbor?

In the wake of the killing of Jordan Neely on the New York City subway, a new meme has emerged on the right: the killer, Daniel Penny, was acting as a “Good Samaritan.” A more craven and blasphemous distortion of Jesus’s parable is hardly imaginable. In fact, I almost hesitate to dignify it with a response. Neely himself is so obviously the victimized party here, and if anything, his murder shows what happens when a “Good Samaritan” doesn’t show up. Moreover, the fact that a story that is so obviously about moral decency that crosses lines of ethnic enmity and distrust — the Jewish victim’s co-religionists pass him by, while a member of a hated, supposedly half-breed sect provides generous help — can be deployed to apply to a member of a privileged in-group using lethal violence against a multiply marginalized person displays the kind of willful, spiteful ignorance that only committed racists can pull off.

This isn’t the first time the story has been misunderstood. There are numerous accounts of preachers crafting a contemporary version of the parable where a priest and a deacon pass the victim by, while an atheist (or an illegal immigrant, or a trans person, or whoever else) generously helps. The punchline is always that the parishioners — who have presumably known this story all their lives — inevitably find this retelling offensive and insulting.

This misunderstanding is all the more puzzling given that Jesus clearly intends for the listener to identify with the victim. The interlocutor asks Jesus “who is my neighbor,” presumably to get out of the exorbitant demands of Jesus’s teaching by applying them only to a limited in-group. Jesus tells the story and then asks essentially, “Okay, who was that guy’s neighbor?” The pride and presumption of the interlocutor, who wants to be able to pick and choose his neighbors, is undercut by a scenario in which he is radically vulnerable and is in no position to turn away any neighborly assistance.

Except! Yes, that’s right, there is a catch, and it’s the fateful last exchange: “Which of these three, do you think, was a neighbour to the man who fell into the hands of the robbers?’ He said, ‘The one who showed him mercy.’ Jesus said to him, ‘Go and do likewise.’” Go and do likewise — suddenly the interlocutor is no longer identified with the victim, but with the hero. What’s more, this isn’t just any guy off the street, but a teacher of the law. There is nothing a teacher of the law knows better than how to get out of things, so we can imagine the gears turning: “Go and do likewise — but in what respect? Was it not the case that the Samaritan was helping my fellow Jew, filling in for the neglectful priest and Levite? Perhaps I, like the Samaritan, should help Jews in need so that they aren’t put in the embarrassing position of relying on the help of a Samaritan, who surely has his own Samaritan problems to attend to. And if I’m supposed to take away the message that Samaritans are worthy of respect, surely the best kind of respect is not to impose on them, right?” And so it goes.

There are other trap doors as well. Could we not see the scenario as precisely a failure of policing? Again, we wouldn't have needed to bother the poor Samaritan if our Roman men in uniform had done their jobs! Better, perhaps, than cleaning up after someone is victimized would be to intervene before it gets to that point, right? In this interpretation — presupposing, of course, the racist premise that Neely was somehow primed for violence, which no empirical evidence supports — Penny was a true neighbor to everyone on that train, a kind of super-Samaritan! And the fact that he is being persecuted for his actions by the usual rogues' gallery of liberals and reporters and various social justice warriors shows that he must have done the right thing. Maybe he's even a little bit like Jesus! In fact, I wonder if we can detect some Christ-like imagery in the dramatic photos portraying Penny between two subordinate figures, like Christ crucified between two thieves.

All these various plot holes and "outs" may be an indication that entrusting the moral formation of one's society to a half-remembered story that may have been told by an apocalyptic preacher in first-century Palestine is a questionable move. This is not, I hasten to add, because those stories are garbled or incoherent. No, the reason this is a risky procedure is that they are designed as traps. At one point, the disciples ask Jesus why he preaches in parables. His answer is not that they are more memorable or easier to understand or anything we might expect on a common-sense level. Instead, he offers a more paradoxical answer:

He answered, ‘To you it has been given to know the secrets of the kingdom of heaven, but to them it has not been given. For to those who have, more will be given, and they will have an abundance; but from those who have nothing, even what they have will be taken away. The reason I speak to them in parables is that “seeing they do not perceive, and hearing they do not listen, nor do they understand.” With them indeed is fulfilled the prophecy of Isaiah that says:
“You will indeed listen, but never understand,
and you will indeed look, but never perceive.
For this people’s heart has grown dull,
and their ears are hard of hearing,
and they have shut their eyes;
so that they might not look with their eyes,
and listen with their ears,
and understand with their heart and turn—
and I would heal them."'

As my theology professor Craig Keen loved to paraphrase this passage, Jesus is saying, “I tell them parables rather than preaching straightforwardly because otherwise they might turn and be saved.”

A parable, in other words, is not a memorable tale or a moral lesson. It is a judgment — or better, it is a way to get people to pass judgment on themselves. We can try all we want to show the people identifying a cold-blooded murderer as a Good Samaritan just how badly they have misunderstood the text, but the text is doing what it is meant to do — it is giving them the opportunity to demonstrate that they are well and truly lost. What we do with that information is unclear, given that we do not expect the imminent coming of the Kingdom of God, but the information itself is unequivocal. Anyone who could bring themselves to utter such a blasphemous thing is beyond help, beyond hope. They are damned, and to live as they do is surely a living hell.


What I’ve learned

On Monday I submitted grades, and this afternoon I reviewed my teaching evaluations. That closes the books on my 14th year as a college professor. I am currently 42 years old, so by my math, I have been doing this for roughly one-third of my life. That is strange to think about! I’ve been a higher ed teacher for longer than I myself was in higher ed, and longer than I was in public schools. Over the next couple years, I will be going through a major evaluation, so I’m in a reflective mood. Obviously the way I’ve chosen to live my life indicates that learning is very important to me. What have I learned?

Other than a two-year period as visiting faculty at Kalamazoo College, my entire teaching career has been spent as part of the Shimer Great Books program, first at the independent school in Chicago and subsequently at North Central College. As I've written many times before, that program has a very distinctive ethos. All of our courses are discussion-based seminars built around important primary sources — no textbooks, no lectures, no high-stakes in-class exams. Since joining North Central I have been called upon to teach outside the Shimer program and have needed to fold lecture-based pedagogy back into some courses, but the discussion model remains my center of gravity. My goal is always, somehow, to get as close as I can to the day when my students can sit in a circle and talk open-endedly about what they've read.

This consistent pedagogical training has had a huge impact on me as a person. First of all, it has mellowed me out. I am still in many ways the irritable and impatient person that this blog made infamous when I was in my 20s, but that part of myself comes out much more rarely, and essentially never with students or colleagues. I've always been interested in some kind of intellectual community, but engaging with my students in extended dialogue day after day, year after year, brought home to me how much cultivating that community is an act of service and care. I was and still am attracted to the "sage on stage" model (which I am able to indulge periodically in invited lectures), but the kind of teaching I've been called upon to do has forced me to stay more in the background, facilitating the process of other people working through ideas rather than showing off what I have figured out. This has led me to claim that I'm one of the only male academics out there who knows how to shut up and listen to others.

But it's not just about listening. I decided early on that I would always be honest and straightforward about my own viewpoint and interpretation, whenever it seems appropriate to share. I find it off-putting and arrogant when professors proudly announce that, for instance, they want students to get to the end of the semester without knowing whether the professor is an atheist or a believer, a Democrat or a Republican. When I put forward a particular interpretation of the text — supported by textual evidence, of course! — my students set to work assessing it for themselves using those same means. Often I learn that my reading is half-baked and I need to go back to the drawing board. I've never had a student adopt my viewpoint because it was my viewpoint. This experience makes it hard for me to take seriously the idea that students are or could be indoctrinated in college.

That being said, I don’t think it’s boastful to say that I know a lot about a surprising range of topics. That’s the other side of the Great Books curriculum — since the center of authority is the class materials rather than the professor, scholarly expertise is not only unnecessary but can be a positive obstacle. I’ve spent most of my career at Shimer teaching outside my areas of expertise, and I’ve always found it pedagogically helpful for me to be learning alongside the students.

In fact, a big part of our training was to literally be learning alongside the students, by auditing courses. My first semester at Shimer College also marked my return to the classroom as a student, where I sat in on Humanities 1: Art and Music. Fine arts topics have since become a staple of my teaching, as I try to incorporate some art and music into every course where it’s halfway plausible. It’s been incredibly rewarding to reconnect with classical music and to gain a deeper appreciation for painting and sculpture. As a direct result of these teaching experiences, I am now a regular at the Chicago Symphony Orchestra and I have the holdings of the Art Institute of Chicago virtually memorized.

The Shimer program pushed me to develop a whole new teaching competence in Islam, which is now becoming a scholarly interest for me as well. It gave me the unique opportunity to teach courses in the history of science, including actual labs — how many theology PhDs can say they have taught lab science courses? It has broadened and deepened my knowledge of the classics of the Western tradition and pushed me to engage with the classics of other traditions as well. I left grad school as a committed generalist, but my teaching has transformed me into what I consider a truly educated person — not just someone who happens to know a lot, but someone who has learned how to learn in a wide and expanding number of areas. As My Esteemed Partner characterizes my approach to teaching, my motto is, "I haven't taught that — yet!"

That experience of broad and deep learning and the genuine satisfaction and joys it brings has in turn looped back to my teaching. I may have a special knack for this kind of thing, but I believe everyone can become an open-ended sincere learner and that they deserve the chance to do so. At the old Shimer College, we had students of all ability levels, and everyone who took the process seriously grew as a thinker and as a person — “it works if you work it.”

We had the advantage back then of working with a self-selecting group of students who had opted into a particular kind of intellectual culture. Now we have to work harder to reach students who might be simply checking off a box, who might even resent more traditionally academic classes as an unnecessary waste of time when all they want to do is get a job. I haven’t found the secret formula by any means, but I do manage to reach some — to connect them with a part of themselves that is curious and interested and therefore interesting as well. Sometimes I feel like I’m putting out so much energy that I’m shaving days or weeks off my life expectancy, but enough students seem to find my performance of intellectual curiosity compelling that my classes can basically work as sites of some kind of inquiry — even that oddball gen-ed senior seminar that half of them were told was going to be about sprucing up your resume and practicing mock interviews but turns out to be a study of utopian and dystopian futures.

I don’t want to paint an unrealistically rosy picture. There are bad days in class and just plain bad classes. There is drudgery and conflict and stress and precarity and status anxiety and all manner of disappointments and frustrations that I have not yet learned to handle with as much grace and dignity as I’d like. But I’m still collecting regular paychecks, against all odds, and still living the life I’ve always wanted to live.


Summer plans

My last post feels like a lifetime ago, along with the positive hopeful attitude it reflects. The end of the semester is always a sprint, but it has become much more so now that I have taken on a faculty governance role that entails participation in faculty meetings and Board meetings. I feel drained, exhausted, and irritable. But soon it will be over, and I will be able to experience my first "normal" summer in many years — uninterrupted stretches of time to devote to activities primarily of my own choosing. Between the pandemic, buying a house, and then doing a ton of travel, this hasn't happened in a while. Other than the pandemic, all of those things were net positives for my life, yet they didn't represent the kind of recharge and regroup I've thought of as a normal part of my annual routine.

This summer, we’re planning only one trip, to the UK, taking advantage of a conference invitation. There I will be presenting some initial research drawing parallels between the Apostle Paul and the Qur’an — which I believe to be a genuinely novel scholarly niche I’ve discovered. Before the conference, we will hang out in Edinburgh, then we will spend a weekend in London before heading back home. And then, aside from a weekend trip or two, I will be settled at home for the entire summer — a prospect I am relishing.

Obviously I have to prep for classes somewhat over the summer. For two of them, that shouldn't be a problem. I am offering Logic and Critical Thinking through the philosophy department for a second year in a row. The first iteration was successful, and I have good notes that should make an updated syllabus the work of an afternoon. I also have the unique opportunity to teach a half-semester Honors Seminar on Watchmen, covering both the original comics and the HBO series. Reviewing those works and some of the critical literature on them should not be burdensome. Finally, I'm offering Shimer's seminar course on logic and math for the first time — something I only wanted to take on once I had the standard textbook approach to logic under my belt. A retired colleague has offered to work with me on that and give me some guidance in how to organize and run the class, which due to the subject matter tends to be more structured than our typical free-wheeling discussions. I'm hoping that working through some Aristotle and Euclid at first hand will lay more groundwork for me to eventually return to Hegel's Logic — I have a book on Hegel and Aristotle pencilled in for my winter break reading.

My biggest project will be a book on Star Trek, for a University of Minnesota Press series on franchise storytelling co-edited by Gerry Canavan and Ben Robertson. I plan for it to be a relatively short work, somewhere between the length of Why We Love Sociopaths and Neoliberalism’s Demons. I am taking a somewhat unusual approach. Most scholarly overviews of Star Trek focus on the foundational moment of The Original Series and the huge popularity and cultural impact of The Next Generation, then conclude with a decline narrative as subsequent spin-offs enjoyed less commercial and creative success. Often the last nail in the coffin is the hated prequel series Enterprise, which is regarded as a total creative dead end.

I’m taking a different approach and actually starting with Enterprise, viewing it as the beginning of a new era for Star Trek — an era in which it becomes much more self-conscious of its status as a franchise, more self-referential, and increasingly obsessed with returning to its foundational moments. My chapter outline is as follows (with an intro and conclusion, of course):

  1. Enterprise
  2. The "Novelverse" (a sprawling continuity that developed in the tie-in novels once the cancellation of all the shows meant that they were unlikely to be "overwritten" in the foreseeable future)
  3. The Abrams reboot films and IDW tie-in comics (since the production team initially chose not to have novel tie-ins for the new films, comics took on an unexpectedly central role in fan culture after being marginal for most of Trek history)
  4. Discovery (covering all four extant seasons, despite the bizarre 900-year time jump that happens in the middle)
  5. Picard (which conveniently ended its run mere weeks before I plan to start writing, providing something of an organic “stopping-point” for the streaming era of Star Trek)
  6. Homage Series (covering Lower Decks, Prodigy, and Strange New Worlds)

The goal is to use what I call “late Star Trek” as a model for the challenges and opportunities that franchise storytelling opens up, as well as the obstacles and deadlocks that model inevitably confronts. And since this is Star Trek, I will naturally have to address the ways that commercial imperatives distort or contradict the franchise’s anti-capitalist post-scarcity ethos and the unexpected directions this famously “optimistic” and even “utopian” franchise takes its social commentary. I have actually managed to keep up some semblance of a research routine even in these extremely busy weeks, so I have some good momentum here — and it’s obviously a topic I am quite passionate about and have been thinking about a lot.

My hope is to have a full manuscript to submit by the end of summer vacation, though I don’t think it would be a big deal if I wound up taking a little longer. And because I am a weird person with a unique lifestyle, this whole process does not stress me out or intimidate me in any way. I am genuinely looking forward to it. I miss writing, and this is both a “fun one” and a labor of love. If I don’t point out the genuine artistic achievements of Enterprise, who will? Literally no one, that’s who!

And more than that, I am hopeful that this will help me continue my journey toward feeling more like “myself” after some really hard years marked by a lot of stress and uncertainty and some pretty serious burnout. Some of those things will obviously continue — I have one more year of meeting-o-rama due to this governance role, and my Shimer colleagues and I are going through back-to-back years of evaluations to finally regularize our faculty status at North Central after a long probationary period — but I hope I can regrow some resilience through a period of relative solitude and self-directed creative work.

So there you have it! Maybe not the best blog post I could have written, but I feel like I have to put some points on the board to keep up with Beatrice’s Substack, to which all AUFS readers will definitely want to subscribe.


Nature is healing: Reports from a self-imposed sabbatical

As long-time readers know, about a year ago, I declared a self-imposed sabbatical from all academic work that wasn't directly required by my job. While I created a carve-out for invited lectures, I announced that I would say no to any invited contributions to journals or edited volumes, any op-ed writing, and (especially!) any peer reviews. My only writing outlet would be the blog, which I hoped would help reconnect me to the fun of writing again.

I think it — worked? While I was still a little tentative and insecure with the talk on Star Trek I gave at ACLA, I was much more confident and engaged with the talk on neoliberalism I gave this past week. I'm also working actively on the research for my forthcoming Star Trek work, which I'm finding energizing and productive rather than intimidating and draining. I'm looking forward to having a normal-for-me summer — one academic conference parlayed into a vacation with My Esteemed Partner in early June and then a wide-open summer for writing.

The real indicator that my brain is healing, though, was the way I spent my morning: book shopping. The Seminary Co-op has been an intellectual lodestar for me since late college. The amount of student loans I had forgiven under the public service program this year was probably about equal to what I spent there over the course of grad school. During the pandemic, when the store was struggling and they asked for donations, I actually gave a substantial sum. This was shortly after the death of Ted Jennings, and when I mentioned my donation to My Esteemed Partner, I spontaneously said, "I already lost Ted, I can't lose the Co-op too." Yet over the past couple years, going there felt stressful and exhausting — almost guilt-inducing. Every book stared at me from the shelf as an accusation and a demand for more work. The joy of browsing was gone. I remember taking an out-of-town visitor down there, thinking it was the ultimate academic's escape, then realizing that I had made a mistake and couldn't cope.

Today, I had a couple specific titles in mind and, having one last day more or less free from obligations before the end-of-semester sprint, decided to combine my cafe work routine with book shopping. I checked a few things off the to-do list in the outdoor area of the nice cafe adjoining the bookstore, then went shopping shortly after they opened. I found my ultimate purchases early on, but I lingered over the front table and new releases, before wandering to various haunts.

On a deep dive, I will usually hit the Islamic studies area pretty thoroughly, swing through philosophy, and then see where my curiosity takes me. This time around, I considered picking up a volume of Hegel’s Philosophy of Nature or Aesthetics but passed on them — too expensive, plus they’ll keep. No one is going to be snatching them off the shelves any time soon. If forced to choose, I’d probably have preferred Philosophy of Nature, due to how often old-timey science figures into my teaching in the Great Books program. (I was astounded at how helpful my teaching of the history of chemistry proved for Hegel’s Science of Logic.) Maybe next time! This time, I wandered to the Classics section, browsing to see if they had a particular student’s edition of Virgil’s Aeneid about which I’d heard good things — in other words, I was feeling ambitious about language work, which hasn’t happened in a long time. I combed through the medieval studies section, to see if they had particular titles by Bynum and Le Goff (they did not). The Soviet history section was right there, so I gave it a look as well.

This was my most expansive and leisurely shopping trip in years. I picked up my purchases — Agamben's new one, When the House Burns Down, and Anna Grzymała-Busse's Sacred Foundations: The Religious and Medieval Roots of the European State. I thought about popping over to the Regenstein to see if they had any other recent Agamben works in Italian that I could browse, but decided I already had the requisite PDFs and the bureaucracy of getting in would be annoying. Instead, I made my way to Powell's. I hit similar targets — Classics (right by the entrance, of course, as a crowd-pleaser/impulse-buy for the Hyde Park set), philosophy, Islamic studies, then Judaic studies and general theology. I considered a translation of Hegel's lectures on logic, but decided that they, too, would "keep." In theology, I found one of my long-time white whales: Blumenberg's Legitimacy of the Modern Age, which is super expensive new but was $10 (marked down so sharply because of what appeared to be 20 pages with some markings in pencil). This was a once-in-a-lifetime find, so I had to act immediately. Having picked up that weighty tome, I reasoned that my bag was already pretty loaded down and decided to call it a day.

This type of trip used to be a weekly occurrence for me in grad school, if not more. I didn't buy things every time, but I did pick out targets, visit familiar aspirational volumes and sections, and basically daydream about all the things I could explore. Losing that sense of joy over the past couple years frankly scared me. I wondered if that part of my life was over and I should just become a full-time assessment coordinator or something. Having it back today felt exciting — I felt like myself again, in a way I haven't in a long time.

I realize this achievement may be fragile and I have no plans to push myself too hard, especially given that I still have a very busy last week of classes and finals week (along with many, many meetings) between me and summer vacation. And I also realize that I need to be very deliberate about what I take on in the coming year, especially since I’ve committed to a book project for this summer — a short one, and a fun one, but still a book! I’m well aware I could relapse, but for now it feels like the worst of my burnout is behind me, and when I think about how my normal trajectory would have been to dig myself deeper into that for the sake of a few more superfluous CV lines or $250 checks, it makes me understand how unsustainable my pace has been for a long time. But still — it worked! I’m back, baby! Kotsko is good again — awooo….


Some rambling reflections on truth and violence

I have never advocated political violence in any published writing or in any talk. You can read the talk I posted yesterday, for instance, and you will find no recommendation of left-wing political violence, indeed no mention of that possibility. Yet it inevitably happens, in Q&A sessions, that the topic comes up. The way it generally unfolds is that my listeners or readers observe that I make the following claims: the existing political system lacks democratic legitimacy; those in a position to wield institutional power are unresponsive to popular demands; and both major parties fully support police violence, with the Republicans growing ever more tolerant and even encouraging of vigilante violence. Hence, in order to reach the kind of goals I lay out, it seems like some form of political violence would be inevitable. So am I advocating political violence?

I personally do not intend to commit any political violence, nor would I encourage anyone else to do so. I’m at a loss, though, for why anyone considering such a thing would view me as an appropriate confidant or mentor. I am far from an activist. My praxis is objectively that of a middle-class liberal intellectual, and even on the level of individual choices and the various virtue-signals one tends to send, I am not particularly left-coded (e.g., I’m not a vegetarian or vegan).

In fact, I don’t want to be advocating anything at all — I want to undertake a purely analytic and diagnostic project. The problem is that contemporary academic culture will not allow me to do that. If I didn’t put down some kind of prescriptive agenda, people would simply hallucinate one on my behalf. So yes, I end Neoliberalism’s Demons by saying that we need to eliminate the market society in favor of a radically democratic planned economy. That’s the only way to make sure something like neoliberalism can never happen again. That’s the political goal that informs my analysis. It’s not “realistic,” but at least it’s explicitly stated, so I don’t have to bat away a bunch of phantom political agendas that people arbitrarily foist on me.

How do we get from here to there? I don’t know. I’m an idea guy. If we can get there by reading books and talking about them, then I should be among the leaders of the movement. If it takes something else, then maybe I can play more of a supporting role — ideally teaching classes, but maybe writing up some propaganda or even washing dishes or something.

My real agenda, my personal investment, for my intellectual project is that I want to figure out and share what I take to be the truth about our political situation, to the best of my ability. And as far as I can tell, the truth is that we are in a really, really bad position. The power of nonviolent resistance has been exhausted at this point. The media is too corrupt and the political class too brazen and arrogant to concede to popular demands, no matter how much we maintain the moral high ground. The electoral system continues to be a site of political blackmail rather than a venue for the public to influence the direction of public policy. No governing party in any major country seems to be at all serious about addressing the most urgent problems we face — and those problems are genuinely urgent.

In that situation, with all formally legitimate means of political change cut off, my question — which I repeated quite forcefully in the Q&A for my talk on Milton Friedman at Wabash College — is what the powers that be think is going to happen. We are told that the human race is facing environmental ruin that will kill millions, and yet no one with the power to do anything actually does it. Is it not inevitable that someone will take matters into their own hands?

It does seem inevitable — but it largely isn’t happening. And that in itself is an aspect of our situation that I struggle to understand. It seems like in a world where people open fire into random crowds because they can’t get laid or drive into protestors because they’re worried the Jews are going to replace them, someone would get it into their heads to physically threaten the people destroying the world. Why aren’t they?

Maybe the problems seem simply too big, or the system too unassailable. (I'm presumably contributing to the latter in some small way with my buzz-killing analysis.) Maybe the record of the times when the left gave itself permission to use unlimited ultra-violence has disillusioned people — maybe that means so poisons the end as to render it unattainable. Or maybe everyone has decided, collectively, that if we can't get it done via the institutional structures that happen to exist at this moment of history, then that's a sign we shouldn't do it. It works if you work it! And center-left parties can eke out small, largely symbolic gains that the right completely swamps, until our cities flood and our crops fail and millions upon millions of people die in inescapable heat waves.

What can we do as individuals in the face of this mass apathy and willful ignorance? Not fucking much! Not much at all. As the man says, the wrong life cannot be lived rightly — yet there is a certain duty to truth, a certain obligation to acknowledge it as the wrong life. But despite everything, despite the inauguration of Donald Trump, despite the fall of Roe v. Wade and the resulting rise of almost unimaginable misogynistic state violence, despite the fact that the weather is obviously unfixably permanently broken and half the country burns down every year — despite all that, there are a lot of people out there who really don’t see that it’s the wrong life, who in fact are offended that I would criticize the team they’ve chosen to back in the game of politics, a team that is surely made up of good people who are doing their best.

And to that, all I can say is: no. That's the one ethical duty that I fully and unhypocritically live by. I don't know what they will do with it — probably nothing. It will probably even make them dig in their heels and become even more apathetic and willfully ignorant, more attached to whichever team of ghouls and empty suits they've pathetically become fans of. But they can't say no one ever told them.


Jacob in the Bible and Abraham in the Qur’an

A question that might occur to the reader of the Hebrew Bible is why exactly Jacob, who becomes the namesake of the nation of Israel and father of the twelve tribes, is portrayed in such a negative light — scheming, manipulative, always striving for advantage. My dear friend Bruce Rosenstock, who sadly passed away recently, once gave what must be the right answer: somebody has to want it. Every other character in Genesis simply hears and obeys, but Jacob alone actively seeks out the blessing. The fact that he does so in morally questionable ways only reinforces the point.

Teaching my class on the Qur’an, I was recently thinking related thoughts about the figure of Abraham. This is not to say that the Qur’an portrays Abraham as morally ambiguous — that would be completely contrary to the theological goals of its appropriation of the biblical heritage. Instead, Abraham seems to be portrayed as a kind of meeting place between reason and revelation. He doesn’t fight and scheme to get God’s blessing, but he does “independently” want it, because he reasons his way to it before God explicitly reveals himself.

In the Bible, Abraham simply receives the commandment to leave his family and country and obeys. The Qur’an gives us many more episodes (reminiscent of rabbinic traditions in some cases) from Abraham’s youth and his conflicts with his family’s idol-worshipping ways. One key episode is found in Sura 6 (Livestock):

Remember when Abraham said to his father, Azar, “How can you take idols as gods? I see that you and your people have clearly gone astray.” In this way We showed Abraham [God’s] mighty dominion over the heavens and the earth, so that he might be a firm believer. When the night grew dark over him he saw a star and said, “This is my Lord,” but when it set, he said, “I do not like things that set.” And when he saw the moon rising he said, “This is my Lord,” but when it set, he said, “If my Lord does not guide me, I shall be one of those who go astray.” Then he saw the sun rising and cried, “This is my Lord! This is greater.” But when the sun set, he said, “My people, I disown all that you worship beside God. I have turned my face as a true believer towards Him who created the heavens and the earth. I am not one of the polytheists.” (Qur’an 6:74-79, Haleem trans.)

This is a systematic, almost “scientific” investigation — Abraham turns to successively greater heavenly bodies, eliminating each in turn as limited and concluding (albeit with a bit of a logical leap from our perspective) that there must be something even bigger beyond them all. Here I detect a distant echo of this sequence in another text I frequently teach, ibn Tufayl’s Hayy ibn Yaqzan, where the study of astronomy becomes a crucial step in the title character’s journey to God.

Abraham's logical bent is also seen in a story from 21:51-70, wherein he smashes idols and claims that the gods were fighting among themselves — forcing the idolaters to admit that idols can neither act nor speak. But the most crucial moment is in the Qur'an's reworking of the sacrifice of Isaac — which may in this case be the sacrifice of Ishmael, the legendary ancestor of the Arabs:

When the boy was old enough to work with his father, Abraham said, "My son, I have seen myself sacrificing you in a dream. What do you think?" He said, "Father, do as you are commanded and, God willing, you will find me steadfast." When they had both submitted to God, and he had laid his son down on the side of his face, We called out to him, "Abraham, you have fulfilled the dream." This is how We reward those who do good—it was a test to prove [their true characters]—We ransomed his son with a momentous sacrifice, and We let him be praised by succeeding generations. (37:102-109, Haleem trans.)

Crucial here is the fact that his unnamed son is an adult and that the two reason through the revelation together before concluding it must be real and agreeing to follow through with it. I can't help but think of the hadith stories about Muhammad's much less measured early reactions to his own revelations — which were far less distressing than an apparent commandment to sacrifice one's own son! — and the way that his wife Khadija had to "talk him down" from his fears that they may have been poetic inspirations from demons rather than authentic divine messages.

Qur'anic storytelling is remarkably spare in its use of details, so the fact that any distinctive character trait of Abraham clearly emerges from the stories is a strong signal. In this case, the reasonableness of Abraham fits well with another important feature of Abraham in the Qur'an's rhetorical economy: in an unexpected echo of Paul's arguments in Galatians and Romans, Sura 2 (The Cow) repeatedly tries to find a way "back behind" the historical revelations of Judaism and Christianity to reconnect with the foundational religion of Abraham. And you can tell this religion is foundational because it is so simple, so reasonable, so free of the accretions of spurious laws, morally ambiguous stories, and challenges to monotheism that mark Islam's predecessors. Believe in God and the last day, say the prayers, give to the poor — you could almost reason your way to that on your own!

But in another turn of the screw, Abraham also becomes the foundation of Islamic particularism, because the same sura presents Abraham and Ishmael as the builders of the shrine in Mecca. Hence his destruction of his father’s idols becomes a prefiguration of the cleansing of that shrine, which becomes the crowning achievement of Muhammad’s career as a prophet. This is where Abraham and Muhammad cannot finally follow Hayy ibn Yaqzan. The monotheistic demand, despite its universal claim on human reason, is always already embedded in a particular history — and that history is always one of failure and betrayal, which the supposed reasonableness and self-evidence of the revelation only serves to throw into starker relief.


What is Star Trek About? Federation, Fan-Service, or Freedom

[The following is the paper I delivered earlier today at the American Comparative Literature Association’s annual meeting in Chicago, as part of the seminar “Franchise Cultures,” organized by Benjamin Robertson. It was made up primarily of authors planning to contribute to the University of Minnesota Press “Mass Markets” book series, co-edited by Robertson and Gerry Canavan. As I indicate at the end of the talk, I am set to write a book on what I call “late Star Trek” — i.e., the material that has appeared in the 21st century, after the end of the Next Generation era.]

The Star Trek universe is one of the most robust commercial storyworlds in existence. Aside from the DC, Marvel, and Archie comic book universes, it is arguably the oldest to be in more or less continuous operation. New Star Trek stories have come out essentially every year since the early 1970s, even when the show and its spin-offs were off the air. And though this observation opens up serious ontological questions, it is the oldest fictional universe that purports to take place—outside of the brief interregnum of the JJ Abrams reboot films—in “the same” universe and timeline since the original series began broadcasting. Despite fans’ love of theories involving forked timelines, the clear intention of the writers and producers is that there has been no Crisis on Infinite Earths, no reset, nothing overwritten, nothing lost. Unless it is very explicitly flagged otherwise, everything we see on TV really happened within what is known as the Prime Timeline.

The Prime Timeline represents an exceptionally long span of time. Discounting a couple visits to the origin of life on Earth and the Big Bang itself, it stretches from the 20th century (where our heroes periodically visit) to the 32nd. Within that broad sweep, four main eras have been established in greater detail: the mid-22nd century (home to Enterprise, the unsuccessful prequel series), the mid- to late 23rd century (home to the Original Series and original cast films, as well as Strange New Worlds and the early seasons of Discovery), the late 24th century (which includes Next Generation, Deep Space Nine, Voyager, Lower Decks, Prodigy, and Picard, the last season of which has finally crested the 25th century), and the early 32nd (seen in the later seasons of Discovery). Among the shows currently running, then, all major historical eras other than the 22nd century are represented, with three shows set at staggered points within the Next Generation era. And in the second season of Picard, they double down on one of the strange curiosities of Star Trek lore that fans have for decades treated as a sacred shibboleth—namely, the idea that Star Trek's timeline "forked" from our own at some point in the 20th century, meaning that when Admiral Picard and friends traipse around present-day Los Angeles, it is not actually "our" Los Angeles.

That is a ton of fictional history to grapple with—even leaving aside the secondary histories found in the ancillary novels and comic books, which have been periodically swept away by the arrival of new on-screen "canon." The question I would like to explore in this paper is whether the Star Trek universe is ultimately a complicated sandbox or whether all of those aggregated stories add up to a single story. In other words, paraphrasing Jesse Pinkman's famous characterization of Saul Goodman: is Star Trek a *story*-world, or a story-*world*? This question is not merely a personal idiosyncrasy. As I hope to make clear, it is one that the writers and producers clearly struggle with, and increasingly so as the franchise continues to expand.

The question of what Star Trek is ultimately about is a lens through which to view each new installment’s struggle to justify its existence—a task that becomes ever more difficult as the burden of history grows greater and greater. On the one hand, it would seem that if a new show is to have a purpose for existence, it must make a difference to the Star Trek universe. From an artistic perspective, it should offer some new take on the material, which dovetails nicely with the commercial imperative to reach audiences that are not already invested in Star Trek as it currently exists. On the other hand, making a difference would constitute “changing things,” which fans hate (especially if those changes take place under the auspices of a prequel series). This is a risky move, since there is no guarantee that alienating old fans will advance the cause of gaining new ones.

In my detailed study of Star Trek canon, I have isolated three basic strategies that the writers and producers use to escape this dilemma. The first is the search for freedom from the constraints of Star Trek canon (as well as the constraints of Star Trek’s storytelling style, ethos, etc.)—in short, the declaration that this is not your father’s Star Trek. The second is fan-service, which simply gives long-time devotees of the franchise further adventures with familiar settings and characters. The third, which tries and fails to synthesize the previous two, is to claim that all of these stories really do add up to one big story, namely the story of the Federation.

These three strategies can only be ideal types. As the Star Trek fans in the seminar room have probably already recognized, any given production is likely to blend the strategies in some way. From a commercial perspective, this is understandable as a way of hedging their bets—but I want to claim that something deeper is at work. I will argue that, in a kind of space-age cunning of reason, the contradictions in each strategy necessarily call forth the others. (Hegel: the final frontier.)

So first: freedom. This is not your father’s Star Trek. The first example that springs to mind is probably the JJ Abrams films, particularly Star Trek (2009), which goes to such lengths to have multiple characters proclaim, on screen, that their fates are not determined by what came before. But this is only an extreme version of what Star Trek does literally every single time it starts a new era. The Motion Picture starts off with the basic coordinates of the Original Series unfixably broken—not only has the old gang broken up, but everyone (other than maybe McCoy) is acting completely wrong. Kirk is insecure and making all the wrong calls. Spock is a virtual zombie. Clearly this is not going to be just another episode. The Next Generation starts off nearly a century into the future with all-new characters and a radically new situation: peace with the Klingons. Deep Space Nine inverts the exploratory concept of Star Trek by putting our heroes on a space station, and in case we didn’t notice the implicit rebuke, our new commanding officer tells Picard to his face that he hates him.

Those of us who have all of Star Trek memorized are doubtless jotting down a rejoinder for Q&A. But Adam, the plot of The Motion Picture is a thinly disguised remake of an Original Series episode! And The Next Generation not only clearly models Q on “The Squire of Gothos,” but does a remake of “The Naked Time” in its second episode. In both cases, though, the intention is, paradoxically, to distance themselves from their sources. No one on the refit Enterprise thinks to say, “Hmm, isn’t this kind of like the time we encountered Nomad? Could we talk V’ger to death, too?” It’s as though it never happened—and indeed, Gene Roddenberry resisted the idea that the Original Series should be “canon” for the films or Next Generation. As for “The Naked Now,” the entire point of the episode is for them to refer back to the logs of Kirk’s Enterprise and realize that they have to find their own solution. It’s not quite as heavy-handed as bringing Jean-Luc Picard to Deep Space Nine for a brutal dressing-down, but the fundamental gesture is similar. Yes, this past Star Trek happened. We realize that. We’re doing something different—deal with it!

The second strategy is fan-service. The best recent examples are Lower Decks and Strange New Worlds. Neither has any pretensions of changing the face of the franchise. Just the opposite: they seek only to pay tribute. Lower Decks provides a fun and ironic window on the most successful era of Star Trek, while Strange New Worlds returns to the Original Series’ episodic style and heavy-handed moralism with modern production values and better-looking actors. This is also something that Star Trek has always done, though normally in less authoritative material. The Animated Series is perhaps the earliest canonical example, as approximately a quarter of the episodes are direct sequels to the Original Series. The novels and comics, despite their often underrated quality and creativity, have also been structurally committed to providing more of the same.

In the Next Generation era, the strategy of homage was used sparingly, most often in lampshaded tribute episodes that signalled respect for the Original Series while highlighting the decisive break between the two eras. In the final season of Enterprise, by contrast, fan-service became a survival strategy, as season 4’s new production and writing staff sought to win back alienated Star Trek fans with episodes that explored the mysteries of the Eugenics Wars, the Orion Slave Girls, and the urgent question of why Klingons on the Original Series don’t have the same make-up as in later productions. (Shockingly, this last-ditch effort at fan-service did not forestall cancellation.) Much the same arguably happened with Deep Space Nine, which increasingly called back to concepts from the Original Series—including the Mirror Universe, the Orion Syndicate, and the dangerously fertile tribbles—as a way of establishing its Star Trek bona fides for skeptical fans. (In another context, I would pause here to make the argument that what we call the “Star Trek universe” is, perhaps ironically, an emergent property of Deep Space Nine.) Even the reboot films, after so forcefully proclaiming their independence, spent much of their run in homage mode, with Into Darkness clumsily rehashing Wrath of Khan and Beyond hinging, improbably enough, on plot elements from the unpopular Enterprise—as if recognizing that fans were already tired of this bold new version of Star Trek and hungered for any information at all from their beloved Prime Timeline, even the most hated instance of it.

In the case of Enterprise, I tend to think of the fan-service-heavy season 4 as an “apology tour.” The series had intentionally broken with tradition in many ways—from the tacky Rod Stewart-esque theme song to the unfavorable portrayal of the Vulcans—and even intentionally clouded the question of its relationship to the other series through the complex Temporal Cold War plot, and ratings had clearly suffered as a result. But the repetition of the pattern makes me think that there is a deeper necessity to this retreat from the freedom strategy into the fan-service strategy. Simply put, most of the time the writers free themselves from the weight of tradition only to find themselves with no clear purpose. If your goal is to get away from Star Trek, why are you doing Star Trek at all?

On a more pragmatic level, returning to the familiar routine is much easier than heroically reinventing the concept of Star Trek. The latter does happen, but only rarely. To pick the most obvious example, Wrath of Khan did heroically reinvent Star Trek, establishing that actions have consequences and no one has plot armor. The next movie undid literally all of that stuff—but subsequent generations of Star Trek have repeatedly (and mostly embarrassingly) tried to recapture Wrath of Khan itself. Similarly, the Borg represent arguably the greatest creative achievement of the Next Generation era, if not Star Trek as a whole, and First Contact radically reimagined them while rewriting Star Trek’s own fictional history—introducing concepts that would be foundational for the later seasons of Voyager, all of Enterprise, and the first two seasons of Picard. The bold gesture of reinvention is thus relentlessly assimilated by the tradition, to the point of becoming invisible or worse: a cliché.

This is not simply an example of a familiar dynamic where artistic innovation gives rise to its own kind of tradition. In the case of Star Trek, the problem is exacerbated by the existence of the ongoing storyworld. No true break is possible because everything is happening within “the same” history. Hence it was “always like that.” This happens literally in the text of First Contact, which retcons the Borg as an intrinsically hybrid species that assimilates its victims using nanoprobes, at the behest of a creepy yet undeniably sexy Borg Queen, and which inserts our future heroes at the origin of their own fictional history. In both cases, it was always like that, as Seven of Nine—a character who basically embodies everything First Contact introduced into Star Trek canon—establishes when she declares, on screen, that the events of the film were a predestination paradox.

The storyworld does provide a safety net, guaranteeing that no permanent damage can be done. After all, a failed innovation can always be reversed or explained away. Yet it produces contradictions, as every attempt at fan-service is in danger of being perceived as a heretical innovation. When filling in a character's backstory or reconciling an apparent continuity error, there is always the possibility of contradicting a novel that had a solution fans preferred, or breaking with established fan theories, or simply not "feeling" right. Innovation becomes fan-service, fan-service becomes innovation—where does this dialectic lead?

Clearly the mechanism of continuity mandates that all stories ultimately collapse into something, but what? The out-of-universe explanation, of course, is that they all collapse into a marketing strategy that forces fans to consume every new series and episode in order to stay up to date on their favorite fictional universe. I admit that I fall victim to this, lavishing intellectual energy on the mediocre current-day shows while refusing to dignify The Orville with my viewership. Even if The Orville is “better”—as everyone assures me—it’s still not “real” Star Trek. But again, surely I’m not invested in Star Trek merely for the sake of completism, much less out of a desire to contribute to the revenue of Paramount Plus’s rip-off streaming service. I am drawn to Star Trek as an atmosphere, an ethos, a set of values, even if they are incoherent ones. At the end of the day, I’m drawn to Star Trek because it’s a world I would want to live in, even as a lower decker, even if I were consigned to Raffi’s sad little trailer.

If Star Trek is ultimately the story of that world and how it came to be, how it lives up to its values or fails to, then that means Star Trek is the story of the Federation. It takes a while for that story to emerge clearly—in many early Original Series episodes, it’s not totally clear who Kirk and friends are even working for—but by the Next Generation era, it’s well established. Next Generation is about the Federation triumphant, while Deep Space Nine is about the underside of the Federation’s values and what happens to them when they face a serious existential threat. Voyager is about getting back to the Federation and spreading Federation values as you go. And of course, the big selling point of Enterprise is that it was going to show us the foundation of the Federation—something the finale stops just short of doing. Discovery starts off by retconning the Federation’s history vis-à-vis its greatest enemy, the Klingons, then jumps to the distant future so that it can re-found a Federation that has collapsed. And Picard’s first season clearly wants to be more than just a continuation of the title character’s personal arc—since he is the embodiment of Federation values, it has to be about the Federation itself, how it has failed and how it can (abruptly, unconvincingly) right its course.

The problem is that there’s no story to be told here. The Federation is one of the fundamental presuppositions of Star Trek, part of the furniture. It’s always-already there, even in Enterprise, when Daniels, Captain Archer’s time-traveling guardian angel from the distant future, reveals to him that his role is to found the Federation. It’s not all Daniels’ fault, though. The same dynamic recurs when he is not explicitly involved, as in “Dear Doctor,” an episode that purports to give an origin for the Prime Directive but can only presuppose it as obviously desirable.

After so many shows documenting the implacable rise of the Federation, the only genuinely new story to tell is the Federation's equally predestined decline and fall. Discovery flirts with this idea, but barely deigns to explore the post-Federation world it has posited and instead devotes its first distant-future arc in season 3 to exposing the source of the natural disaster that shattered the Federation—ushering in an era of isolation, the reintroduction of large-scale slavery, etc., etc.—as a weird one-off fluke that is easily avoided and corrected in the future. By the end of season 4, we have virtually returned to the status quo ante. Similarly, in Picard, the revelation of a mole in Starfleet motivated by a tragically mistaken extremist ideology is enough for everyone to forgive and forget the betrayals that had led our hero to resign in disgust.

The recognition of this basic stasis in the Federation concept is enough to undermine Star Trek’s pretense of a coherent “history” with recognizable “eras.” What truly differentiates them? Is it the characters? Well, by the magic of time travel, any character can appear in any era. Is it the technology? The ship travels at the speed of plot regardless—even Archer’s primitive Enterprise NX-01 can get to the Klingon homeworld in a jiffy. Is it the political situation? That is always subject to revision and upheaval at any time, as when the Klingons casually break from the era-defining alliance with the Federation in Deep Space Nine, only to return to it just as casually in a later season. The writing and aesthetics of the current streaming shows confirm this—aside from the animated series, both of which look radically different from each other and any other Trek, all the shows look broadly the same, with the same deficient lighting and the same angular ship design. More than that, the themes of each new season seem to echo the last season to air, no matter which show and which era. Discovery season 2 had a dangerous AI that threatened to wipe out all biological life—and right afterward, so did Picard season 1, set 150 years in the future (or 700 in the past). Picard season 2’s jaunt into the early 21st century raised the vexed question of genetic engineering, and wouldn’t you know it, a week after that season concluded so did Strange New Worlds season 1 (set 150 years in the past, or 250 years in the future).

So what is Star Trek about? No Hegelian synthesis is possible here—we are dealing with a pure "bad infinite." Star Trek is about generating riffs on familiar scenarios and concepts and character types, about chewing on bits and pieces of lore until you can make something out of them, about making bold declarations of one's value and importance and only occasionally backing them up. It's about itself, it's about our future, it's about the future of a timeline that diverged before any of us were born, it's about selling streaming subscriptions and novels and comics, it's about liberal humanist values until it's about conservative militarist values. At this late date, it's essentially "about" the deadlocks of franchise storytelling itself—of which it is the most venerable and accomplished example and, despite its recent questionable choices, arguably the only long-running franchise that still maintains the capacity for genuine creativity and insight, as shown by (most of) Discovery season 1. But that is a topic for another day, namely the day when I finally submit the manuscript for my book on all the redheaded stepchildren of the Star Trek franchise, from Enterprise to the streaming era.


The Moral Cost of Capitalism

[The following is a talk I gave this afternoon as part of a faculty colloquium on “Radical Futures” at North Central College, part of the Intellectual Community series co-sponsored by the Faculty Development and Recognition Committee (of which I am chair) and the Center for Advancing Faculty Excellence and organized by my colleague Sean Kim Butorac.]

Since I teach in the Shimer Great Books program, I will begin with an experience teaching one of the all-time greats, Aristotle's Nicomachean Ethics. In my ethics class this semester, we were discussing Book 1 and came to a passage where Aristotle had isolated three human goods that seemed like plausible candidates for happiness—by which he means the human good that we pursue for its own sake, with no need for further justification or explanation. The first is pleasure, which is presumably self-explanatory. The second is honor, which we could paraphrase as respect or esteem. The third is contemplation, which we could see as a form of knowledge or understanding. In all three cases, Aristotle believes, it wouldn't make sense to ask why we are pursuing these goals. Why do you want pleasure? Why do you want people to like and respect you? Why do you want to figure things out? The question doesn't make sense.

The list feels pretty exhaustive, but Aristotle goes on to introduce a fourth possible candidate: money. Initially it seems to fit the bill—all things being equal, no one will turn down more money. But Aristotle points out that money is not truly an end in itself, but rather a pure means. We only want money because of the things we can do with it. And this, I point out, is an area where Aristotle is out of date. He can’t imagine living a life for the sake of stockpiling as much money as possible, much less orienting an entire society around it. We can.

The shift to a pure market society, to a kind of totalitarianism of capitalism, was so successful that it has become almost invisible to us. Like many other analysts on the left, I choose to call that transition—ushered in by Pinochet, Thatcher, and Reagan, and then adopted by virtually every governing party in the West and every international organization—the shift from the postwar Fordist economic model to neoliberalism. One way to gauge this shift is to think in terms of means and ends. In the postwar era, the existence of an alternative economic model in the form of the USSR—which at the time was experiencing the highest economic growth in human history up to that point—meant that capitalism had to justify itself. It had to prove that it was better, not just at stockpiling money, but at creating broadly shared prosperity that lays the groundwork for national greatness. And through a combination of heavy government intervention, very high marginal tax rates on the wealthy, and high union density, the capitalist system really did mostly fulfill its promises, at least for the stereotypical white suburban family. In effect, capitalism set itself an empirical standard that certainly included economic criteria but was not limited to them—in other words, capitalism was a means to an end.

Since the fall of the USSR, capitalism has felt increasingly unburdened by the need to justify itself. Instead, competitive markets are taken as ends in themselves and as models for every area of social life. The reason we want markets is not that they produce better results or they’re more efficient or whatever else—we want markets because we want markets. Market logic is self-evident, the final standard, the final word. It is no longer the means, but the end. And once money is set up as the ultimate end—not even personal wealth that someone could potentially use, but the depersonalized money of endless “economic growth” and endless increases in asset prices—then everything else becomes a means. Where once we made friends, now we network, in the hopes that our social contacts will advance our career. Where once we relaxed and had fun, now we practice self-care, in order to recharge and guarantee increased future productivity. And to bring it a little closer to home, where once we went to school to develop our full intellectual capacities, now we seek the hot job skills employers crave.

Of course, the “before” of this “before and after” dynamic is a bit simplified and idealized in my presentation. There were no good old days when people at large pursued only the highest ends with unmixed motives. Yet I would submit that in past eras, people were better equipped to discern that such ends existed and that the mixed motives were less than ideal. This comes through clearly, for example, in a famous essay by John Maynard Keynes, “Economic Possibilities for Our Grandchildren.” As is well known, this text predicts that within two generations, humanity would essentially begin “cashing out” of capitalism by trading productivity gains for reduced working hours so that people could spend time on what was really valuable in life.

By my math, this would have been my parents’ generation, so obviously this did not occur. But for my purposes, the most interesting thing about his failed prediction is how he characterizes the benefits of the transition. One of the architects of the postwar capitalist order, as well as a gifted financial speculator, Keynes proposes that once humanity has leveraged the immense productivity of capitalism to set itself free from economic necessity, it will be a relief to admit that none of those wealthy businessmen was really as admirable as we pretended they were, that there was something a little disreputable and sad about the way they’d chosen to live their lives.

The neoliberals did everything they could to squelch that insight, to the point where we are supposed to believe that an obviously broken and miserable man like Elon Musk is a genius-level benefactor of humankind, for instance. It’s easy to point and laugh at Musk’s pathetic army of admirers on Twitter, but we academics are guilty of our own distortions. The other day I was meeting with a major in our program, a very strong student who I had not had in class before. We wound up talking for a good half hour, and at a certain point the thought slipped into my head that it was a good thing I was doing this and making her feel so supported, because we really need majors…. A very rewarding part of my job, which I was doing for its own sake and even enjoying, suddenly felt like a cynical manipulation.

I know I’m not the only one to fall victim to this line of thinking, because I’ve heard similar remarks in many other discussions. For instance, once a faculty discussion about providing mentoring and support for students of color devolved into a reflection on the importance of reaching Latinx students for our bottom line. A question of justice becomes a question of finances. No one intended for that to happen—it just rolls right off our tongues.

And more broadly, of course, we are all well-versed in defending our disciplines in market terms. The humanities provide valuable job skills! Employers tell us they want liberal arts majors who can think on their feet! Liberal arts majors eventually catch up to and even exceed the incomes of their STEM counterparts! I understand that such rhetoric is tactically necessary, especially in a media sphere full of misinformation about the value of different fields of study. I also happen to think these things are true! But even though they’re true, it’s harmful to frame the value of education in such narrowly instrumental terms. I did not get into this line of work so that Johnny can get that big promotion years down the road or Suzie can contribute to better quarterly results for her department.

But of course Johnny and Suzie need to be able to get those employment outcomes, or else they aren’t going to be able to pay off the student loans that are financing their education here. And this brings me to another way in which the full-saturation capitalism that I call neoliberalism degrades our moral sense: it shrinks our political horizons. Once installed in a given area of life, marketization produces a feedback loop that constrains our choices within a very, very narrow range. And living a life where the most important choices about our lives and livelihoods are made by an impersonal mechanism—by everyone and no one—cultivates habits of deflection and irresponsibility. We aren’t making decisions or value judgments—we are simply responding appropriately to the demands of the market, and if we don’t, we will lose out to someone who does. In practice, this leads to the conformism of “best practices” that one of our colleagues criticized today—the alibi that we should do what everyone is doing because everyone is doing it.

I am not sure in detail how to get out of this self-reinforcing doom loop of marketization, though in my book Neoliberalism’s Demons, I suggest that we need to embrace the abolition of the market and the establishment of a system of democratic economic planning as a long-term goal. In our more immediate context, I would suggest that—beyond changing our rhetoric about the cash value of majors or the financial urgency of student retention—we need to find a way past our competitive zero-sum approach to curriculum design. Instead of outsourcing those decisions to student choices, we need to find ways to discuss, collaboratively and creatively, how we can best deploy the North Central faculty’s massive talents and expertise to deliver the kind of education we want our students to have. Our marketized system has instilled deep habits of cynicism and fatalism in most of us, but as Aristotle teaches us, the only way to develop more virtuous habits is to practice.

Rebuilding the Closet

Gender and sexuality are a spectrum. In common discourse, we lose sight of what that means. Very Online approaches to gender and sexuality seem to say that gender and sexuality are a spectrum, but everyone is at a very specific and static spot on that spectrum. That fits with the more everyday discourse that was able to absorb the normalization of homosexuality on the condition that every individual clearly fits into one specific box. But that’s not how it is, and everyone probably understands that. Even among people who are exclusively heterosexual, there is a spectrum of how attracted they are to the opposite sex — how many partners they seek, how much monogamy is a struggle for them, how sexually motivated they are at all, etc. Enough people seem to be able to rest more or less content with monogamy that the whole thing basically “works,” but if we’re being honest, there are some people for whom it was never going to happen and who therefore never should have been expected to get married or have exclusive relationships.

Everything relating to sex and gender is like that. On the spectrum of same-sex desire, for instance, there are those for whom it’s a non-negotiable exclusive preference and others who could make a basically heterosexual lifestyle work, and a whole range in between. We see this from history — there are a lot of men, for instance, who were known to be primarily same-sex attracted but were able to hold together a marriage and have children. By the standards of the time, those marriages may have even been relatively happy! And on gender identity, there are people who absolutely need to transition or else their life will be constant suffering and others who can tolerate living in public as their assigned identity as long as they have some private release, and a whole range in between.

The political strategy of the “closet” was to require those people who exist in the more liminal spaces to hide, then relentlessly stigmatize and persecute the people for whom conformity was simply never going to be an option. The latter incentivizes the former — you’d only choose to live as homosexual or trans if the cost of denying it was worse than the social costs of acknowledging it. All but the youngest generations are familiar with this dynamic at first hand. Every 80s kid, including myself, looks back and is horrified at the casual homophobia that was flung around the schoolyard in those tense days just before public acceptance of homosexuality gained critical mass. We were being groomed, from a very young age, to be homophobes. And the goal of that project was emphatically not to convert homosexuals or trans people, at least not among intelligent conservatives. The goal was to use the non-negotiable homosexuals and trans people to make sure that everyone who could stand to conform, would conform. Those who couldn’t conform and were never going to be able to conform were made into living sacrifices to normative heterosexuality.

That’s why the strategy of coming out of the closet was so powerful. The entire system depended on the idea that sexual minorities were freaks and monsters, and the majority could sustain that belief because so many people with those inclinations kept them secret. Once they stopped keeping it secret — often at great personal cost to the earliest generation of activists — the dynamic could no longer hold. Sexual minorities were not those strange scary outsiders. Everyone knew someone who belonged to a sexual minority, often intimately. The strategy was so powerful that it led to the legalization of gay marriage by a conservative Supreme Court — a move that would have been unthinkable in my childhood, but seemed obvious and long overdue when it finally happened.

Now the first generation of children is growing up for whom this new regime of gender and sexuality is normal. And what many — including myself — would now like to see is an inverse of the strategy of the closet. Instead of a default assumption of conformity unless the non-normative is totally irresistible, the approach should be to allow young people to experiment and see what really works for them. Hence young people should ideally be allowed to follow their curiosity and attraction before claiming a sexual orientation. More kids will wind up dating or even having some intimate contact with people of the same sex than wind up “being” homosexual in the long haul, and that’s okay and natural.

The same would apply to gender identity. In the past, only those who were in indisputable agony could pursue some kind of gender reassignment, and only at the cost of pathologizing themselves. Now, however, people who are uncomfortable with their gender identity — which, if we’re honest, includes almost everyone, at least at some times and to some degree — should be able to experiment, including living publicly as a member of the target gender (i.e., “socially transitioning”). The assumption is that more people are going to experiment than wind up adopting a gender identity other than the one assigned at birth, and especially more than wind up surgically transitioning. And that’s okay!

In contrast to the closet system, which aimed to churn out as many passable cisgender heterosexuals as feasible, this system aims to make sure that no one whose life would be enriched by non-normative gender or sexual practice is missed. The reality of evolution probably indicates that the majority of people will still remain other-gender attracted and have gender identities that correspond with those assigned at birth. But the number of people who wind up claiming non-normative identities will be larger than it was under conditions of systematic persecution and repression.

And the number of people who temporarily try out those other identities can be expected to balloon, given the realities of the teenage libido and the quotidian body-horror of going through puberty. More people are going to pursue that faint stirring of attraction to someone of the same sex when they don’t have to worry about being beaten up after school (including by that cute boy or girl) and more people are going to see if living as the opposite gender is the solution to their discomfort with their own body than they would in a situation where such a thing would have been simply unthinkable — both conditions that held during my lifetime (meaning during the lifetime of people who are raising young kids today).

All of that is happening now, at least in areas where policy enshrines some kind of openness to gender and sexual minorities. The fact that it is happening was predictable, and it is good. It opens up a situation where fewer people have to live lives of quiet despair for the sake of fulfilling an arbitrary role. It is the one way in which our children’s lives might be better than ours.

And so of course, a vocal minority of parents absolutely hate it. In response to this massive, positive social change, they are trying to reinstitute the closet. The strategies are the same as always — tarring all sexual minorities as pedophiles, equating all non-normative practices with the most extreme (e.g., acting as though social transitioning is tantamount to irreversible surgery), stripping gender and sexual minorities of basic political rights, etc., etc. The goal cannot be to eliminate homosexuality and trans experience — every intelligent person knows that’s impossible. The goal, rather, is to make the cost of expressing homosexual inclination or trans identity so high that the marginal few who could go either way find a way to make conformity work. In other words, a hard core of people who have no choice but to express homosexual inclination or trans identity will have to live thwarted, persecuted lives to marginally increase the odds that some bigot’s son or daughter will suck it up and settle into a “normal” marriage and produce a grandchild or two, so that the next generation can in turn suck it up and conform as well.

It’s an ugly political strategy that draws in ugly people and makes them uglier. People are going to die — whether by vigilante violence, or “gay panic” or “trans panic,” or suicide — because of this. And all to perpetuate a form of life that isn’t really making anyone happy at the end of the day. Why would people spend their lives and tarnish their souls for this? They claim it’s out of love, but I think it expresses a profound hatred of their own children, or at least of what their children might be or become apart from them. Perhaps it’s even a hatred of the part of themselves that wishes it could have had free rein to experiment! It’s probably not helpful to speculate about that too much in individual cases, though. The larger reality is that the political strategy of the closet was a brutal, violent system, and a brutal, violent system produces brutalized, violated people who go on to be brutal and violent.

And it is by no means obvious that they will fail in their ambition to reinstitute the closet! The strategies are right there, familiar and ready to hand. For all but the youngest generation, they are a sad kind of muscle memory. All it takes is for the forces of repression to seem stronger and suddenly a lot of people will find a way to conform — as we can already see in the rank cowardice of most ostensibly “liberal” politicians and commentators on trans issues. Surely we are all old enough to know that progress is not automatic, that social justice does not depend on the date on the calendar, that every gain is reversible. The acceptance of minority gender and sexual identities was a contingent historical achievement, and allowing those gains to be reversed will have been a contingent historical failure — on the part of people who responded to irrational hatred with a pose of “reasonableness,” flinching in the face of a bully just as most of us did in the schoolyard.

Identity Politics vs. Identity Office Politics

In real life, identity is a structuring principle of human experience, which is by definition neither good nor bad. For individuals, it can be constraining or life-enriching — or more likely, some mixture of both. For groups, identity can be the starting point for a broader engagement with the world, an alibi to turn inward, or even a spur to active hostility. Whether its effects appear to be positive or negative in any particular case, though, it is not something we can do without — especially on the political level, which by definition requires the creation or mobilization of an identity group toward some end. Every politics is in that sense an identity politics, even on the Marxist model, which requires the members of the working class to identify with their world-historical role as the proletariat.

Everybody who thinks seriously about identity and politics knows that this is the case. The Combahee River Collective knew it, and presumably even Slavoj Žižek knows it. Why, then, do people so frequently denounce identity politics as a blind alley, a distraction, a cynical ploy, etc., etc.? I would suggest that it’s because there are actually two things that go by the name of “identity politics.” The first, which I have described, we could call “real-world identity politics.” The second, which people mostly hate, would best be designated as “identity office politics” — i.e., how identity functions in neoliberal institutional settings, most notably universities and corporations.

Obviously corporations and universities are different — if decreasingly so — but broadly speaking they share a certain neoliberal ethos, which I would summarize in two points. First, these institutions are irreducibly individualistic. Ideally, from their perspective, individuals would relate to the institution solely as individuals, without forming autonomous groups not authorized by the institution. Second, these institutions legitimize themselves by claiming to dispense rewards (pay, recognition, promotion) and punishments (disciplinary action, firing) upon individuals based solely on their own individual actions and merits.

Within such settings, it is difficult to see how something like a group identity (other than identification with the institution or a defined subunit thereof) would function. Yet social and political pressures to do justice to identity have only grown throughout the neoliberal era, amid increased awareness of forms of injustice that are systemic — i.e., irreducibly non-individualistic. How can the neoliberal institution translate this demand into its own terms? First, it defines identity as primarily an individual trait. An individual is this or that identity. Identity belongs to the individual, rather than the individual belonging to the group. Second, it tends to treat this identity-trait as the grounds for some kind of differential treatment — positive, in this case, to make up for the negative differential treatment of the identity group.

A historical group grievance is thus leveraged as an individual asset within the terms of the institution’s reward-and-punishment system. To make up for the fact that members of your group have faced systemic discrimination, you individually get a leg up, officially or unofficially. Of course, that leg up is often illusory or carries with it so much extra work (like serving on all the diversity committees, doing extra mentoring, etc.) as to negate the benefit. Indeed, the very systemic problems that the individual identity assets are supposed to resolve have definitely not gone away, even within those institutional settings themselves. That is most fundamentally because of the mismatch between the systemic injustice and the individual solution. And on a practical level, the individuals who benefit from the individual solution are, almost by definition, not the individuals most affected by the historical disadvantage — those most disadvantaged never have the opportunity to enter the institution and compete for its rewards in the first place. In fact, the recipients of identitarian advantages often have more than enough other advantages to compete successfully in any case. (This is the dynamic that Olúfẹ́mi O. Táíwò characterizes as “elite capture.”)

The neoliberal institution does not and cannot care about this mismatch. From its individualistic perspective, systemic problems simply cannot register as such. If anything, neoliberal institutions need systemic inequalities to continue. To the extent that neoliberal institutions exist to generate hierarchy, the more “sorting” goes on (whatever the basis) prior to the time individuals present themselves to the institution, the easier the institution’s job is. All the neoliberal institution cares about is satisfying the external demand for “levelling the playing field” for its identity-burdened participants, thus indemnifying itself against legal action.

The shift from real identity politics to neoliberal identity office politics is therefore a shift from a complex lived reality to a counter in a game, which exists primarily to avoid lawsuits. It should go without saying that the latter is not a path to social justice. To that extent, the leftist critics of what is commonly called “identity politics” are right, because the individualistic and competitive presuppositions of neoliberal identity office politics can never produce the kind of solidarity and emergent collective identity a successful leftist movement would require.

But why would anyone ever get the idea that anyone thought that DEI-webinar-style identity management would produce liberation? There are two factors at work here. The first is simply the fact that most individuals spend a great deal of time in neoliberal institutions, and identity office politics seem to be the only lever to address identity-based injustices in that context. The institutions shape our behavior, which in turn shapes us — this is in large part why institutional reform is so urgently important. The second is the role of social media platforms, which expand the individualistic and competitive presuppositions of neoliberal institutions into every social interaction. In place of the relatively defined competition of the institution, social media engulfs us in an amorphous and endless competition in which we are all judge, jury, and HR coordinator. This grassroots form of neoliberal managerialism promotes ever more exaggerated claims of identitarian disadvantages (to be leveraged into individual discursive advantage) and ever more elaborate codes of conduct to govern interactions with people who claim a certain identity (as in the ever-lingering possibility that the most innocuous utterance could be declared somehow “problematic” and worthy of punishment).

The effects of social media identity office politics — objectively a somewhat sad, niche hobby — are then amplified in the mainstream media, which trumpets the dangers of “wokeness” to populations that are normally not granted the ability to leverage group grievance into individual advantage: namely, white people, especially white straight men. Paradoxically, this absence of grievance becomes the greatest grievance of all, as neoliberal identity office politics threatens to devalue the social capital once associated with whiteness. We can see this logic in the media stunts surrounding so-called “critical race theory,” which aim to protect oppressed white children from being burdened with generational guilt, etc., etc. Presumably one day Florida’s colleges and universities will offer special scholarships for white students who can prove they had a “woke” teacher — bringing the entire project of neoliberal office politics full circle by staging a bail-out for the now-toxic asset of white identity.

There is no solution to be found in the milieu of neoliberal office politics, no “right” way to implement it. The goal should be to abolish the individualistic, competitive neoliberal institutional form and find a new way to live together that can allow us to explore and enjoy our identities in a more authentic and organic way, unmediated by HR offices and mandatory trainings.

Sustaining Attention

To start, three short vignettes on attention:

  1. Like many of us, My Esteemed Partner and I were surprised when Chantal Akerman’s Jeanne Dielman, 23, quai du Commerce, 1080 Bruxelles was named the number one film of all time by Sight & Sound — not least because we had never heard of it! We quickly corrected the oversight and found ourselves absolutely spellbound. We felt we could watch her do chores all day long, becoming deeply invested in the small changes to her routine — setting us up for the director to thwart our curiosity. We are normally impatient with films over two hours, but we were strangely disappointed that this one was only three hours long — we could have easily gone for another hour. And it struck me that, aside from its intrinsic merits, this was precisely the film to elevate in this historical moment, because it showcases the habits of attention that only cinema can truly cultivate.
  2. Over winter break, I like to read a “big novel” whenever possible. The past few years, I have been working my way through Pynchon, but this year I decided to do something a little more traditional: Middlemarch, that behemoth of the Victorian era. I started off reading it in fits and starts, but as soon as my schedule cleared up for a few days, I realized it was now or never and spent whole days reading — for the first time in years, maybe even since grad school. It took me a day or so to hit my stride, but by the final day, I was reading hundreds of pages. And I was attentive! If I caught myself scanning or skipping, I went back. It was incredibly rewarding. And it strikes me that it’s a rare enough experience for someone of my age cohort, but that it may feel almost completely impossible to my students, who have trouble focusing on a reading of more than ten pages, regardless of difficulty. But then I can hardly blame them, because outside of this incredible feat, I rarely read for more than 20 minutes straight without looking at my phone.
  3. This weekend, we went to the symphony. It was a pretty accessible program: Prokofiev’s “Classical” Symphony and Rachmaninoff’s Paganini variations and Symphonic Dances. The couple behind us were obviously radical newbies — not just to the symphony but to classical music in general. This had happened to us before, when we found ourselves next to a young woman who asked where to find the “set list” in the program and then writhed in agony through a 70-minute Bruckner symphony. But even after 15 minutes of the Prokofiev, our new friends seemed impatient. The sheer virtuosity of the piano performance placated them for a time, but by the end of the third piece, they seemed unable to resist whispering to each other. I was annoyed — it was a piece I had purposefully planned to hear for the first time in live performance, and they were breaking my concentration — but I was also sympathetic. I remembered the struggle of My Esteemed Partner to figure out how to cope with the demands of classical music when we first started going regularly, and I had to admit that even I sometimes wondered whether those demands were exorbitant — I was not suffering as much as the hapless newbie, but I became extremely impatient with the Bruckner myself.

These three stories seem to me to point in a similar direction — toward the collapse of a certain regime of attention. In all three cases, we are dealing with a classically modern genre that is conceived as a kind of paradoxical mass solitude. We all file into the concert hall or movie theater, we all buy the mass-produced novel everyone is talking about — and we enjoy it alone, together. Western classical music has made high claims for itself over the centuries, but one area where it is surely an outlier among world musical traditions is in its near-total prohibition of audience participation. Its bag of tricks for emotional manipulation was selectively looted by cinema, which is now the dominant venue for orchestral music, and its successor artform also inherited the expectation — though not always the reality — of a passive, endlessly attentive audience. The horror and disgust that some filmmakers have expressed about the idea of watching a film on a phone (a prospect I also find unappealing) surely is not solely about the diminished screen size, but also about the expectation of attention.

As an educator and simply as a human being, I mourn the loss of a cultural expectation of this kind of sustained attention. Truly great artworks and monuments of thought are becoming inaccessible in a way that will become increasingly difficult to overcome. That is a loss to humanity, full stop. But the entire regime of attention — deployed in obviously positive ways by Eliot and Akerman, and in an enjoyably harmless way by Rachmaninoff — was much more ambivalent than contemporary jeremiads against social media and mass distraction want to admit. There is obviously an authoritarian element to the high modern demand for endless attention, and it’s not clear to me that an easily distracted population is easier to control than one disciplined by habits of sustained attention — I would compare my virtually non-existent discipline problems in the college classroom with those of a grade-school teacher, for instance.

This is not to say that it’s subversive to constantly look down at your phone or whatever. Yet we might observe that contemporary media effectively demand just as much sustained attention as the classic modern genres — social media doomscrolling and especially video games are intensely immersive and often transfix their users for many hours at a time. Meme culture certainly has its own complexity, including a (self-)referentiality that could put T.S. Eliot to shame, and people make high claims for the storytelling power of video games. What the user is supposed to do with this sustained attention is obviously different from the classically modern demand to cultivate subjective inwardness, above all in the expectation of audience participation (in the form of contributing content, rating others’ content, and, well, playing the game). The greater interactivity and user control have also, paradoxically, meant that I could easily get my wish of a four-hour film — but instead of Jeanne Dielman, it would be The Batman.

My empathy for those trapped in the new regime — including, at least in part, myself — is not oriented toward surrender, much less celebration. I really do want to find a way to usher at least some of my students into the best experiences of the old regime. Surely part of that means finding points of contact between the two regimes, but that always risks feeling like the classic Steve Buscemi meme. More than that, I wonder if the key is to model enjoyment — which might mean cutting down or changing the “canon” of works we highlight, for instance, so that the hypnotizing Jeanne Dielman replaces the “I guess you had to be there” Citizen Kane. One benefit of the passing of the old regime is that we no longer have to pretend that it fulfilled its promises all the time — the waning hegemony of the genre undercuts a certain amount of “affirmative action” for its less distinguished practitioners.

Ultimately, the true greats are going to be fine. They will find their audience. As someone whose professional calling turns out to be the curation and transmission of the cultural heritage, though, I want to find as many ways as possible to welcome people into that audience — in a non-patronizing, non-judgmental way. And my God, no Bruckner! What is he even thinking? There is no musical reason for all that repetition! Agony, pure agony!

Be the navel you want to gaze at in the world

It’s time for that oldest of blogging customs: explaining why you haven’t been blogging. This moment is especially fraught since I didn’t declare that I was taking “a hiatus.” My readers are feeling tense, abandoned. Didn’t Adam say he was back? Wasn’t he taking a whole big sabbatical from writing, all so he could blog again? What happened?

A lot has happened. Interesting things have happened in class. I’ve read good books. I’ve had illuminating conversations with friends that sparked my thinking. I’ve watched TV shows and movies and gone to concerts. I even noted with interest that a prominent figure in my field wrote a widely shared article that divided readers! But not even my appetite for ill-advised controversy could rouse me from my blogological slumber.

It’s not a lack of material that caused this unannounced hiatus. Rather, it is the fact that having a full-time job turns out to be a full-time job. My teaching schedule this semester is demanding — I’m teaching an 8am class that is also much larger than my typical Shimer classes, then doing two more courses back-to-back after that. This schedule requires me to get up at 5:15am, which is hard even for a morning person like me.

I’ve also taken on a faculty governance role. In principle, the duties should be simple: I’m chairing the committee that assesses applications for sabbaticals (the real kind) and internal grants. My experience with governance at the old Shimer and my research profile make me a good fit for the position. Our main work amounts to two bursts of activity — basically assessing two piles of applications — and it’s interesting and rewarding, since I get a window into my colleagues’ research. And it’s kind of cool to be more of a public figure on campus. Now that I have to address faculty meetings, everybody knows my name! Amazing.

The implications of being chair, however, are considerably more extensive. I am an ex officio member of three committees — the Steering Committee (which plans faculty meetings and coordinates activities among the elected faculty committees), the Academic Advisory Committee (which brings various deans together with the elected faculty committee chairs), and the Board of Trustees Liaison Committee (which also entails observing a meeting of one of the Board subcommittees during the three annual Board meetings). All of this sounded great and doable to me — I’d be getting an inside window on the institution, just like at the old Shimer! But it turns out to be a lot of meetings for someone who has been strongly meeting-avoidant for the last several years — especially since a major leadership search has generated even more meetings! I recognize that it is an honor and a major responsibility to be included in all these processes, and I take my role in all these settings seriously. But wow. Yeah.

To some extent, I’ve also done it to myself, because I started to think — what is something that I could do as chair that wouldn’t have happened if it wasn’t me in that position? How could I make my mark? I had the wise idea to start an internal speaker series to highlight North Central faculty’s research. Thankfully, another faculty member was thinking along similar lines and volunteered to do essentially all of the organizational work — but I effectively assigned myself even more meetings. Good meetings! Interesting meetings! We had the first one, and I learned a lot and enjoyed myself and felt like I connected with old faculty friends and made new ones. But still. More meetings!

I am committed to a two-year term and anticipate that next year will be more manageable. I’ll be more experienced with all the processes, for one, and there will presumably not be another major leadership search of that scale two years in a row. The speaker series will either fizzle out or become more routinized. In fact, even now things are slowing down a bit — I don’t have any meetings or any extra trips to campus planned this week outside of teaching! At the same time, the unofficial sabbatical is starting to wind down — I have three speaking events coming up, and I’ve found time to do research toward the long-deferred book on Star Trek that I plan to draft this summer. All of these things, I suspect, will make me feel a little bit more like myself, easing me back into the routine of research and writing.

Not much of a sabbatical from most people’s perspectives, I bet! You could even argue that it made very little difference, since I wouldn’t have had much time to take on more writing in any case. But the fact that I wouldn’t have had time very much does not mean I wouldn’t have taken it on. What the public declaration did for me was commit me to say no to things for the foreseeable future, until I could reach the point I’m at now — namely, where I actively look forward to my return to writing, instead of (as was the case as recently as winter break) dreading it.

So yeah.
