
Thriving as a Graduate Writer

Over the past few months, in the lead-up to the publication of my book, I’ve used this space to share brief excerpts. Now the book is out! If you want a copy, you can order it from the University of Michigan website (or other popular book ordering places!). In case you haven’t decided whether this book would be a good addition to your library, here’s a brief overview.

I wrote Thriving as a Graduate Writer because I believe graduate students can reframe their experience of academic writing. We all know that writing is at the heart of the academic enterprise. It is both how we communicate and how we are assessed. That combination can be brutal for any writer, and it’s particularly fraught for graduate writers, who must learn disciplinary writing practices while being judged on their early efforts. Recognizing these challenges is valuable; graduate students are better off knowing that their difficulties with academic writing are entirely legitimate. This recognition, however, is only the first step. The next step must be to find ways to ameliorate those challenges.

In the book, I offer a discussion of principles, strategies, and habits that I think can help. (The table of contents can be found below, so you can see the breakdown of this material.) The principles point to a way of thinking about academic writing. Since writing takes up so much time and energy, it is worth exploring foundational ideas that can ground a writing practice: writing as thinking; writing as revision; writing as reader awareness; writing as authorial responsibility. Those principles lead into concrete strategies that can transform the experience of creating and revising an academic text. The heart of this book is the five chapters that unpack these approaches to working with text: managing structure; managing sentences; managing punctuation patterns; managing momentum; and building a revision process. The final element of the book is the consideration of writing habits. Even with a solid approach to academic writing and a range of useful strategies to hand, we all still need to find ways to get writing done. Graduate writers, in particular, need exposure to writing productivity advice that is rooted in their unique experience of academic writing. This chapter provides a range of strategies to help build a consistent and sustainable writing routine: prioritizing writing; setting goals; finding community; developing writing awareness; and grounding productivity in writing expertise.

This book is a short (only 226 pages!) self-study text. You can read through the whole book—in whatever way works for you—and then use it as a reference. The manner in which you refer back to the book will depend on what you currently need to concentrate on. Most readers will benefit from returning to two chapters: Establishing a Revision Process (Chapter Eight) and Developing Sustainable Writing Habits (Chapter Nine). Those chapters are organized around charts that are distributed throughout the chapter (and that appear again at the back of the book). Since every writer has their own challenges and their own optimal writing process, I urge readers to take those charts and rework them—on an ongoing basis—to suit their needs. In addition to the charts, you will also find other resources at the end of the book: guides to using the book in a graduate writing course or graduate writing group and a brief account of the blogs and books that I most recommend to graduate writers.

Overall, this book aims to inspire graduate writers to think differently about the nature of writing and then offers concrete strategies for managing both their writing and their writing routines. It was a labour of love to craft the writing advice that I offer everyday—here and in the classroom—into a more coherent and enduring form. I hope it gives you the capacity to approach this indispensable part of academic life with more confidence and more enjoyment. I look forward to hearing what you think!


Thriving as a Graduate Writer is now available from the University of Michigan Press. To order your copy, visit the book page. Order online and save 30% with discount code UMS23!

rcayley

Learning to Learn; or, Online Barriers for Total Beginners

Coming off of Reclaim Open, one of the things I’m thinking about is online resources for self-teaching beginners. When we were interviewing people for the documentary, we asked people what they were glad the internet had now, in the present, that it hadn’t had in the past. And a lot of people — not everyone, but a lot — talked about how there’s a plethora of learning resources for beginners on just about any subject. Which got me thinking about the learning resources that I’ve used and the tutorials I’ve tried to follow.

There are so many things that I want to learn. I’ve got a post in the works about teaching myself to draw. About a month ago, I hit a milestone on my Duolingo streak (800 days!). I used to practice guitar, though I’ve fallen out of that habit in the past year. For a while I was experimenting with some of the beginner guides to Unity. I have an abundance of tutorials and resources on various topics bookmarked — a beginner’s guide to Ruby on Rails, Codecademy, HackerRank, etc. — which I’ve used… at some point in the past. I keep a list of topics to research that only gets longer and longer.

All this, and I still feel like a dabbler in everything. Part of it is that I’ve put aside topics for long periods of time (almost everything except Duolingo, honestly). That’s naturally led to skill atrophy and forgetting what I was doing, which means difficulty picking up where I left off. But for the one thing I have stuck with, I don’t feel like I’m getting any better — my Italian is beginner-level at best, with a poor grasp of grammar and difficulty remembering vocabulary when I need it.

So I’m thinking: what are the differences in the resources I’ve looked at? What do they require? Where do they go together, and where do they fall short?

The framework I’ve got in my head right now for self-teaching is structured vs unstructured learning resources.

Structured resources are things like Duolingo or Codecademy, a series of tutorials designed to build on each other. Unstructured resources are more like the YouTube video tutorials that exist for drawing or guitar, and their related practice tools (guitar tab websites, figure drawing photo banks).

Structured resources are designed methodically by one group in a way that emphasizes logical progress from point A to point B to point C. There’s a general focus on fundamentals first, then building up to more advanced concepts, with exercises designed to practice each new topic. The exercises are usually short and easy enough that lessons can be completed in 5-10 minutes max, to encourage making learning a routine and habitual practice. The focus is on progressing through the course.

With unstructured resources, there's a wide range of sources from various unconnected groups, each specializing in different topics. Learning is self-directed, since there's no clear path connecting everything, and there are few if any pre-built exercises (a given resource might have 2 or 3, but none of them hang together). Learners can focus their studying on their weakest areas, or specialize in the topics that most interest them, and the lack of pre-built exercises means that their learning goals shape what they're working on — which means that there's more intrinsic motivation to learn, since they're tailoring their practice to their own interests. The most common advice I hear for people who want to learn guitar is "Pick a song you like, and learn to play it." There are simplified versions of just about every song out there, so beginners can learn the most basic version, and once they have that, they can try something more advanced. It's learning by doing.

With structured learning, there are issues of pacing, attention span, and motivation. Short, easy lessons are designed to keep attention and build routine, so you can do a little bit every day, but if you do only a little bit every day it might feel like you're taking months or years to get anywhere. That damages motivation, which is doubly bad because you're working towards proficiency but not a specific intrinsic goal; that makes it extra-hard to measure progress.

Curated, pre-designed exercises may also not be right for all learners, and self-structured online learning can create pitfalls of its own. For example, one major issue I have with Duolingo is that because of the way its lessons are structured, there's no way to have exercises strengthening true composition (written or spoken). There are options for translating back and forth between your native language and your target language, but there's nothing along the lines of "Write a paragraph about your favorite book" or "Talk about your most recent vacation". That's a major barrier to fluency, since being able to read and listen in your target language is only one half of communication, and it's the less challenging half.

With unstructured learning, though, you still get pacing, attention span and motivation issues. This time the issue is that it’s hard to know how much time to spend on certain topics, and where to start or how to build on them. Dumping time into something while feeling like you’re stumbling in the dark trying to figure out what you need to do next is a sure way to damage motivation, which can in turn make your attention focus elsewhere.

Exercises for unstructured learning can also feel repetitive, since your resources only give you a few. Everyone says the way to get good at drawing is to practice figure drawing, which is true — I have definitely improved as a result — but I don’t know how to vary it up to keep learning fresh or which details to pay attention to in order to practice most effectively. And if it’s not repetitive, it’s chaotic — everyone has an opinion, and everyone disagrees. Who do you listen to, and how do you cut through the noise and really decide how to spend your time?

This is a long way of saying: I've never learned to teach, and I don't know how to learn to learn. Self-structured learning is very different from learning in a classroom, or in a group, or with a mentor. There's no external framework to keep you accountable, to provide feedback, or to provide any of the other benefits that come with a learning community.

When it comes to self-directed learning, there are so many principles I keep hearing about — resilience, goal-setting, failing forward, varying your practice, etc. — but all the resources I've found assume learners are coming to them with those principles already well-developed, and that all that's left is the skill-building.

Which makes sense! Teaching your learners how to self-teach before teaching them what they actually came to learn, is an absurd thing to ask. But for pretty much everything I learned in school, I learned from other people; I almost never got practice teaching myself.

So there are a lot of beginner-friendly resources out there. And they're great if you have one or two specific things you need to learn. But for people starting in total ignorance who want to work their way up to overarching mastery, how beginner-friendly are they really?

From Experience to Insight – the Personal Dimension of Philosophy

Written by Muriel Leuenberger

The more philosophers I have come to know, the more I realize how deeply personal philosophy is. Philosophical positions often emerge from personal experience and character – even the seemingly most technical, detached, and abstract ones. As Iris Murdoch wrote: “To do philosophy is to explore one’s own temperament, and yet at the same time to attempt to discover the truth.” Philosophy is an expression of how one sees the world, a clarification, development, and defense of “an outlook that defines who someone is” to add the words of Kieran Setiya.

This personal dimension of philosophy becomes evident in the new philosophical positions and topics that emerge when people with different personal experiences and points of view start to do philosophy. The most prominent example is how women in philosophy, particularly in the last 50 years, have contributed new perspectives – a breath of fresh air in old, stuffy rooms. Philosophy's allegedly objective view from nowhere was rather the view from a particularly male perspective. Care ethics, feminist philosophy, and philosophy of pregnancy are just some areas where the inclusion of women in philosophy with their own outlook and priorities has advanced the discipline.[i]

The relational turn that can be observed in the philosophy of identity can be seen as a recent addition to this list. Relational identity is the idea that who you are is not just defined by your own properties and characteristics but also by how others define you. Others define us through concepts and norms we acquire in a social context that shape how we see ourselves and the world; they define us through our relations with them as friends, siblings, or members of an ethnic group or a book club; and they have the power to constrain our scope of action or provide opportunities. The latter can be a particularly incisive way of being defined by others. For example, by banning women in Afghanistan from universities, the Taliban is defining who those women can be. They can no longer become doctors who dedicate their lives to and find meaning in caring for their patients. Insofar as we are defined by our actions, we can be defined by others who exercise control over what we can do in our lives.

Philosophy has typically been pursued by people whose life was in some sense open to them. They had a range of opportunities – doing philosophy was one of them – and did not face strongly limiting constraints and expectations, as in the example of an Afghan woman today. Academia, and with it philosophy, has become more accessible in many parts of the world. This means that more people are doing philosophy who either experienced more limiting constraints posed by others or are aware that only very recent changes, or the fact that they were born in a certain country, spared them from a life of far-reaching constraints. People who have experienced or can readily empathize with how others can define one's identity have entered the debate on identity. This development makes the emergence and rising popularity of relational identity views comprehensible.

I want to highlight a further, related reason for how the personal dimension of philosophy creates new trends besides the commonly mentioned shift in who is doing philosophy. The growing literature on philosophy concerned with topics and positions relevant to and based on the experience of a more diverse range of people can also be traced back to a diversification in whose testimony is being heard and taken seriously. As Miranda Fricker argued, marginalized groups are often faced with testimonial injustice – their testimonies are considered less credible due to prejudices related to their identity. For most of the history of philosophy, testimonies of experiences and viewpoints of women, non-western, non-binary, and non-white people were not heard, not taken as seriously or relevant, and not readily accessible. Globalization, digitalization, and a cultural shift towards more openness and equality are gradually changing this (although we still have a long way to go). The increased accessibility and ascribed credibility of testimonies of diverse experiences can inspire new topics and positions in philosophers who do not share those experiences but have come to learn about and empathize with them.

Philosophy clearly profits from taking other perspectives into account. We can get a richer picture of reality, a broader understanding of the moral landscape, raise interesting metaphysical questions, and new philosophical positions can come into sight that challenge established old doctrines. The deeply personal character of philosophy makes the inclusion of and attention to different voices all the more pressing.

[i] Vintiadis, Elly (2021, August). The view from her. Aeon. https://aeon.co/essays/is-there-something-special-about-the-way-women-do-philosophy

Who Gets to Be a Person?

Written by Muriel Leuenberger


The question of who gets to be a person is one of those old but never outdated classics in philosophy. Throughout history, philosophers have discussed which human beings are persons, when human beings start to be persons, when they are no longer the same person, and whether non-human beings can be persons – and the discussion continues.

The task of defining the concept of a person can be approached from a purely ontological angle, by looking at what kind of entities exist in the world. There are those beings we want to call persons – what unites them and what separates them from non-persons? This ontological project has, at least at first sight, nothing to do with how the world should be and purely with how it is.

But many moral practices are connected to this concept. Persons deserve praise and blame, they should not be experimented on without their consent, they can make promises, they should be respected. The status of personhood is connected to a moral status. Because of the properties persons have they deserve to be treated and can act in a certain way. Personhood is what can be called a thick concept. It combines descriptive and normative dimensions. To be a person one must meet certain descriptive conditions. But being a person also comes with a distinctive moral status.

Defining thick concepts is particularly tricky. Those definitions are not just judged for their descriptive plausibility but for whether they imply acceptable moral practices. In the debate on personhood, philosophers have repeatedly drawn boundaries on the descriptive level that lead to normative implications they do not want to support. Notably, individuals who they would like to see treated as persons do not meet their criteria for personhood because they do not have certain cognitive capacities.[i] Most recently, this happened in this year's John Locke Lecture by Susan Wolf on Selves Like Us.[ii] She argued compellingly for a definition of character as a complex of dispositions and tendencies that reflect and express one's distinctive way of seeing the world. She furthermore seemed to imply that certain types of attitudes, such as resentment, gratitude, forgiveness, anger, or love (Strawson's reactive attitudes[iii]), can only be directed towards 'selves like us' who meet her definition of having a character. In her account, character requires cognitive faculties of "active intelligence". Because of this, the question arose of what this implies for individuals with cognitive disorders. She replied that she would certainly not want to exclude them from being appropriate objects of reactive attitudes and would have to do more research to work out how they would fit in her framework.

There seems to be a disparity between our intuitions and opinions on who should be treated as a person and descriptive definitions of the term. One attempt at fixing this problem has been to stipulate that while the suggested definition of personhood excludes, for instance, people suffering from dementia from being persons, this does not undermine their moral status.[iv] But because the normative and descriptive dimensions are intertwined in thick concepts, such attempts at separating them do not seem to be successful. It’s too little too late to reassert the moral status of an individual whose personhood has just been denied. The rhetorical power of denying that someone is a person should not be underestimated – a reassurance that this does not affect their moral status seems insufficient to counteract it.

Personhood is usually defined via capacities, such as moral agency, autonomy, self-awareness, narration, or rationality. Those capacities require certain brain functions – they are tied to biological facts about the individual. But biology is fuzzy, gradual, and full of multiple but slightly different solutions for the same problem (e.g., for realizing a capacity). As David DeGrazia[v] argues, those capacities are multidimensional and gradational. For instance, there are different kinds of self-awareness (bodily, social, introspective) and they come in degrees. To know whether, for example, great apes are persons, we would have to define arbitrary cut-off points for the capacities that are defined as essential to personhood. Thus, personhood is a vague concept, meaning that there is no non-arbitrary way to define whether an individual is a person. Because it is also a thick concept, arbitrary cut-offs are particularly worrisome since they can have far-reaching normative implications.

In the face of those considerations, we should be aware of and thematize the limits of definitions of personhood (or selves). Marginal cases can and should remain undecided. This does not mean that philosophy has nothing to say about what is distinctive of persons. Identifying common properties of clear, paradigmatic cases of persons can make salient in which way marginal cases differ. Differences in moral practices can be accounted for through distinct properties, instead of an overarching term like personhood or self. This allows for more nuance in our moral practices.

Pattern theories of personhood or self, which take a range of properties and capacities into account, can be particularly helpful in this regard.[vi] According to a pattern theory, personhood or self is constituted by a cluster of dimensions that interact with each other and that take a different value and weight for each individual. A self might, for instance, be constituted by embodied, experiential, affective, behavioral, intersubjective, and narrative dimensions. Someone becomes a person through the dynamic interaction of a range of capacities, such as moral agency, autonomy, self-awareness, narration, and rationality. Changes to one dimension may cause modulations in others. Concepts like personhood or the self are not reducible to any one of these aspects but are complex systems that emerge from the dynamic interactions of those constituents.

Pattern theories can illuminate how a range of properties and capacities interrelate to produce characteristics typical of clear cases and make salient in which ways other individuals differ. Instead of either ascribing marginal cases the status of personhood or not, pattern theories can describe them in terms of different types of persons (with gradual transitions in-between) which warrant distinct moral practices. Thereby, they can help us to avoid the philosopher’s compulsion to draw clear lines where there are none.


[i] On the other hand, definitions of personhood can of course also appear to be overly inclusive.

[ii] Self and person are often used interchangeably. Definitions of the self face the same problems because the self tends to be considered a thick concept as well (albeit less obviously than in the case of personhood).

[iii] Strawson, P. F. (2008). Freedom and resentment and other essays. Routledge.

[iv] Schechtman, M. (1996). The Constitution of Selves. Cornell University Press.

[v] DeGrazia, D. (1997). Great apes, dolphins, and the concept of personhood. The Southern Journal of Philosophy, 35(3), 301–320.

[vi] Leuenberger, M. (Forthcoming) A Narrative Pattern-Theory of the Self. In: Personhood, Self-Consciousness, and the First-Person Perspective. Edited by Markus Hermann. Brill mentis.

Gallagher, S. (2013). A Pattern Theory of Self. Frontiers in Human Neuroscience, 7, 443-443.

Mummification and Moral Blindness

By Charles Foster

Image: The Great Sphinx and Pyramids of Gizeh (Giza), 17 July 1839, by David Roberts: Public Domain, via Wikimedia Commons

Words are powerful. When a word is outlawed, the prohibition tends to chill or shut down debate in a wide area surrounding that word. That tendency is much discussed, but it’s not my concern here. It’s one thing declaring a no-go area: it’s another when the mere use or non-use of a word is so potent that it makes it impossible to see something that’s utterly obvious.

There has recently been an excellent and troubling example. Some museums have started to change their labels. They consider that the use of the word ‘mummy’ demeans the dead, and are using instead the adjective ‘mummified’: thus, for instance ‘mummified person’ or ‘mummified remains’. Fair enough. I approve. Too little consideration is given to the enormous constituency of the dead. But using an adjective instead of a noun doesn’t do much moral work.

Consider this: the Great North Museum: Hancock has on display a mummified Egyptian woman, known as Irtyru. Visitor research showed that many visitors did not recognise her as a real person. The museum was rightly troubled by that and sought to display her 'more sensitively'. It's not clear from the report what that means, but it seems to include a change in the labelling. She will no longer be a 'mummy', but will be 'mummified': a 'mummified person'. She'll still remain in a case, gawped at by mawkish visitors.

The museum manager told CNN that he hoped that 'our visitors will see her remains for what they really are — not an object of curiosity, but a real human who was once alive and had a very specific belief about how her body should be treated after death.'

Let that sink in.

Whoever Irtyru was, she did indeed have a ‘very specific belief about how her body should be treated after death’. It did not involve lying in Newcastle, causing school children to scream. To describe her as ‘mummified’ rather than ‘a mummy’ does nothing whatever to address the offence of displaying her in a way wholly inconsistent with that ‘very specific belief’. That the museum apparently thinks it does is a symptom of moral blindness. There is a real issue about the display of Irtyru: it is not addressed by tweaking a word. More worrying is that that tweak seems to render invisible the very moral issue it purports to address. I’m not saying that Irtyru shouldn’t be displayed: I am suggesting that changing a word is no substitute for proper deliberation – let alone real change.

This is an example of a more general and sinister malaise. Virtue signalling has taken the place of serious, difficult ethical discourse.
