Attention

Yesterday was a lovely, quiet Saturday. I got up early, went through my morning routine, and then went for a walk in the park. I did laundry, I had lunch, I took a little nap. I spent part of the day with the book I’m currently reading (Judith Halberstam’s The Queer Art of Failure, if you’re curious), and then in the late afternoon started listening to a series of lectures from an Oxford University general philosophy course. The lecturer is really quite good, and the narrative he presents quite compelling. I’m listening to this course, in part, because one of the more woeful gaps in my education (of which there are many, alas) is labeled “philosophy”; as an undergrad, I let the introductory formal logic course — which you had to get through in order to be admitted to any further courses — deter me, and so I have embarrassingly little grounding in the history of many of the ideas I want to be working with. I’m hoping that this series of lectures might at least give me enough of an overview so that I know a bit more about what it is I ought to know, but of course I’m certain that I won’t really know much of anything without taking on a more systematic, thorough course of reading.

This seems obvious, and yet what’s painful about it is all bound up in what Tim Parks and Corey Robin have lately written about: an increasing difficulty with actually doing the reading I set myself to do. I find myself lacking both for time (of which I seem to have precious little) and attention (of which I have less and less). Whatever the reason, it feels increasingly difficult to sit still and read much at all of late. I can’t tell how much of that difficulty is the technology-assisted monkey mind described by Parks — constantly looking for the next bit of incoming information, the next thing to click — or how much of it might be the distractions provided by other parts of my life, or (what I most fear) how much of it might simply be an aging brain. Would my attention span be shrinking even without all my surrounding technologies, in other words, or are the technologies interfering in my attention in the ways that I sometimes fear?

I’m working on some practices (a little meditation; a little bit of writing time in the early mornings) that I hope will help me better develop and maintain the ability to focus my attention on what’s in front of me, rather than constantly grasping for the next thing. But as I started pondering this problem in a bit of journaling this morning, it occurred to me that there’s another side to the question of attention that I hadn’t really connected here before. And it may be that they’re only connected by a sort of linguistic coincidence, but it nonetheless seemed significant.

As I started writing about my concerns about reading and my seemingly diminishing attention span, it hit me that this is the kind of thing that in the not-too-distant past I’d have written as a blog post, that I’d have shared almost reflexively. I felt little to no inclination to do that today, and so I started wondering what has changed. Is there something else different in my relationship to attention — not just the attention I pay, but the attention I seek, or more generously to myself, the attention I want to bear? One can read throughout my posts here since spring 2011 a series of not entirely successful attempts to work through my sense that my new position required (or seemed to require, at least) a reconfiguration of my public presence, my sense that I was at times a little more visible, a little more exposed, than might in the new order of things be ideal. There have also been, across that same period of time, some changes in the climate that have made working ideas out in the open feel a good bit less easy than it once was. But whether the changes are predominantly internal or external, the result is that I’ve become reticent about thinking in public — and that’s not just a shame but in fact a pretty painful irony, given that thinking-in-public is both the source of whatever impact my work has had and the thing that I was hired to support.

In that support role, though, I’ve retreated somewhat behind-the-scenes, and I find myself somewhat reluctant to share the things I’m working on, in part because I get so very little time to work on them that all my ideas feel desperately under-baked. But the combination of what feels like my shrinking attention span and my reluctance to be public with my thinking have me more than a little worried about how (in fact whether) my work might proceed from here. I am hoping to find some strategies this summer to get myself past both of these hurdles, to work my brain in ways that help to grow my attention span again, and to re-develop my bravery about drawing attention to my work as it happens.

Being Wrong

Intermittently over the last year, I’ve found myself fumbling around an idea about critical temporalities. That is: ideas keep moving, keep developing, even after you’ve locked them down in print or pixels. You continue developing your own ideas, one hopes, but others who encounter your ideas develop them as well, often in very new directions. And given how much critical development takes place in the negative (demonstrating the fundamental incorrectness of previously held ideas, as opposed to building beside or on top of those ideas), the conclusion I keep being drawn back to is that everything that we are today arguing will someday be wrong.1

On the one hand, there’s a bit of a lament in this: the half-life of an idea seems desperately short today; the gap between “that’s just crazy talk” and “that’s a form of received wisdom that must be interrogated” feels vanishingly small. How nice it would be for us to linger in that gap a little longer, to find there some comfortable space between Radical Young Turk and Reactionary Old Guard. To get to be right, just a little bit longer, before those future generations discover to a certainty just how wrong we were.

On the other hand, there’s a perverse freedom in it, and the possibility of an interesting kind of growth. If everything you write today already bears within it a future anterior in which it will have been demonstrated to be wrong, there opens up the possibility of exploring a new path, one along which we develop not just our critical audacity but also a kind of critical humility.

The use of this critical humility, in which we acknowledge the mere possibility that we might not always be right, is in no small part the space it creates for genuinely listening to the ideas that others present, really considering their possibilities even when they contradict our own thoughts on the matter.

Critical humility, however, is neither selected for nor encouraged in grad school. Quite the opposite, at least in my experience: everything in the environment of, e.g., the seminar room made being wrong impossible. Wrongness was to be avoided at all costs; ideas had to be bulletproof. And the only way to ensure one’s own fundamental rightness was to demonstrate the flaws in all the alternatives.

As a result, we were too often trained (if only unconsciously) in a method that encouraged a leap from encountering an idea to dismissing it, without taking the time in between to really engage with it. It’s that engagement that a real critical humility can open up: the time to discover what we might learn if we are allowed to let go, just a tiny bit, of our investment in being right.

If time inevitably makes us all wrong, maybe slowing down enough to accept our future wrongness now can help us avoid feeling embittered later on. The position of critical humility is a generous one — not just generous to those other critics whose ideas we encounter (and want to contradict) today, but to our selves both present and future as well.

It’s no accident that I’m thinking about this today, on the cusp of a new year, as I try to imagine what’s ahead and look back on what’s gone by. It’s a moment of letting go of what’s already done and cannot be changed, and of opening up to new, as yet unimagined possibilities ahead. I wish for all of us the space and the willingness to linger in that moment, even knowing how wrong we will someday inevitably have been.

The Tree

Mom & me, next to the tree (December 1967).

I hunted through the cabinets where I’ve stored the old family photos to find this one this morning. It’s probably my favorite Christmas picture.

There are so many things about this picture that I’m haunted by, my mother chief among them. She’s barely 23 here — quite mature, by the standards of the time, to have had her first baby, and yet I can never see this picture without focusing on how unbelievably young she is. I want so badly to reach back through the image and help.

I also can’t help but focus on how tired she looks: I’m about to be four months old, and it looks like it’s been a pretty eventful four months. Her wrists are so delicate, and her skin is so pale. And yet for all that superficial fragility, she would hold everything together a few years down the road, when it all must have seemed like it was falling apart.

Youth aside, exhaustion aside, in this picture is my most intense connection to my mother. But for a slightly different nose, the girl holding the baby could perfectly well be me. My life, just starting in this picture, could have circled around to this point with no effort at all.

So much of the path I’ve taken — that she helped me take — has been different, and yet it all for me starts here, in the open-mouthed wonder of it all. How did they get this thing in here? And what for?

Merry Christmas, and happy holidays.

Tools and Values

I’ve been writing a bit of late about peer review and its potential futures, in an essay that’s been solicited for a forthcoming edited volume. Needless to say, this is a subject I’ve spent a lot of time considering, between the research I did for Planned Obsolescence, the year-long study I worked on with my MediaCommons and NYU Press colleagues, and the various other bits of speaking and writing I’ve done on the topic.

A recent exchange, though, has changed my thinking about the subject in some interesting ways, ways that I’m not sure that the essay I’m working on can quite capture. I had just given a talk about some of the potential futures for digital scholarship in the humanities, which included a bit on open peer review, and was getting pretty intensively questioned by an attendee who felt that I was being naively utopian in my rendering of its potential. Why on earth would I want to do away with a peer review system that more or less works in favor of a new open system that brings with it all the problematic power dynamics that manifest in networked spaces?

In responding, I tried to suggest, first, that I wasn’t trying to do away with anything, but rather to open us to the possibility that open review might be beneficial, especially for scholarship that’s being published online. And second, that yes, scholarly engagements in social networks do often play out a range of problematic behaviors, but that at least those behaviors get flushed out into the open, where they’re visible and can be called out; those same behaviors can and do take place in conventional review practices under cover of various kinds of protection.

It was at this point that my colleague Dan O’Donnell intervened; by way of more or less agreeing with me, Dan said that the problem with most thinking about peer review began with considering it to be a system (and thus singular, complex, and difficult to change), when in fact peer review is a tool. Just a tool. “Sometimes you need a screwdriver,” he said, “and when you do, a hammer isn’t going to help.”

Something in the simplicity of that analogy caught me up short. I have been told, in ways both positive and negative, that I am a systems-builder at heart, and so to hear that I might be making things unnecessarily complicated didn’t come as a great shock. But it became clear in that moment that the unnecessary complications might be preventing me from seeing something extremely useful: if we want to transform peer review into something that works reliably, on a wide variety of kinds of scholarship, for an array of different scholarly communities, within a broad range of networks and platforms, we need a greatly expanded toolkit.

This is a much cleaner, clearer way of framing the conclusions to which the MediaCommons/NYU Press study came: each publication, and each community of practice, is likely to have different purposes and expectations for peer review, and so each must develop a mode of conducting review that best serves those purposes and expectations. The key thing is the right tool for the right purpose.

This exchange, though, has affected my thinking in areas far beyond the future of peer review. In order to select the right tool, after all, we really have to be able to articulate our purposes, which first requires understanding them — and understanding them in a way that goes deeper than the surface-level outcomes we’re seeking. In the case of peer review, this means thinking beyond the goal of producing good work; it means considering the kind of community we want to build and support around the work, as well as the things we hope the work might bring to the community and beyond.

In other words, it’s not just about purposes, but also about values: not just reaching a goal, but creating the best conditions for everyone engaged in the process. It’s both simpler and more complex, and it requires really stopping to think not just about what we’re doing, but what’s important to us, and why.

If you’ll forgive a bit of a tangent: I mentioned in my last post that I’d been reading Jim Loehr and Tony Schwartz’s The Power of Full Engagement, which focuses on developing practices for renewing one’s energy in order to be able to focus on and genuinely be present for the important stuff in life. I only posted to Twitter, however, the line from the book that most haunted me: “Is the life I am living worth what I am giving up to have it?”

At first brush, the line produces something not too far off from despair: we are always giving up something, and we frequently find ourselves where we are, having given up way too much, without any sense of how we got there or whether it’s even possible to get back to where we’d hoped to be.

But I’ve been working on thinking of that line in a more positive way, understanding that each choice that I make — to work on this rather than that; to work here rather than there; what have you — entails not just giving up the path not taken, but the opportunity to consider why I’m choosing what I’m choosing, and to try to align the choice as closely as possible with what’s most important.

In the crush of the day-to-day, with a stack of work that’s got to be done RIGHT NOW, it can be hard to put an ideal like that into practice. And needless to say, the opportunity to stop and make such choices is an extraordinary privilege; thinking about “values” in the airy sense that I’m using it here becomes a lot easier once things like comfort, much less survival, are already ensured.

But this is precisely why, I think, those of us in the position to do things like create new programs, or publications, or processes, need to take the time to consider what it is we’re doing and why. To think about the full range of tools at our disposal, and to select — or even design — the ones that best suit the work that is actually at hand, rather than reflexively grabbing for the hammer because everything in front of us has always looked like a nail.

So, an open question: if peer review is genuinely to work toward supporting our deeper goals — not just getting the work done, but building the future for scholarship we want to see — what tools do we need to have at our disposal? What of those tools do we already have available, even if we’ve never used them for this purpose before, and what new tools might we need to imagine?

Engage. Disengage. Repeat.

I believe that I have caught myself just this side of a major case of burnout.

If that sentence is an exaggeration, it’s not by much. A few friends who had the dubious pleasure of talking with me just after I arrived at THATCamp Leadership last week can attest that I showed up with an attitude that was in need of a little adjustment. Whenever I was asked how I was, I’d find myself starting out by saying “things are great,” which I meant, but which gradually gave way to a Five-Minute Complaint. I kept trying to stop myself, but it kept bubbling over. I’d hit some kind of limit, and my self-censor was just gone.

It wasn’t that I was unhappy about being where I was; I was very pleased to be back at George Mason, to be seeing my friends, to be participating in an event that promised to be both important and energizing.

It wasn’t that I was unhappy about where I’d just come from; I’d had an excellent, if action-packed, visit to talk with faculty and administrators at an institution thinking seriously about its digital initiatives in the humanities.

It was more that where I was and where I’d just come from were on the tail end of five solid weeks of travel and committee meetings, involving eight cities (not counting New York) and more planes, trains, and automobiles (and one unexpected van) than I can count.

It was thirteen nights in eight hotels over a five-week period, capped off with a musty room with two double beds (rather than one king) on a low floor (rather than a high one) with an industrial rooftop right outside my window (rather than pretty much any other view possible from that building).

Something about that room was the last straw, the thing that sent me right over the edge into a bitter litany of complaint aimed at anyone who would listen. But it wasn’t the room, and it wasn’t the trip: it was everything I’d gotten myself into over the previous month and a half, and — especially — knowing full well that I’d done it to myself. That no one was responsible for where I was, or for the mood I was in, except me.

I’ve spent the week-plus since trying to figure out how to rectify this situation, how to pull myself back from the edge of complete flaming disaster.1 Because, of course, my major projects did not grind to a halt in the office while I was traveling. Nor did the deadlines for the writing I’ve promised people this fall get any further away. It has become painfully clear that something has got to give — or that something will be me. And so, after a lot of thought, I think I’ve figured out what I need to do in order to make things better.

Less.

I need to do less.


You would be fully justified in rolling your eyes at this point. Because, yeah, duh. But this is a lesson that I have had to teach myself over and over.

I can read about the importance of significant downtime and totally get it. I can even go so far as to write about the degree to which stress has become the contemporary sign of our salvation or about the role of goofing off in the most important, most creative work that I do.

But I somehow cannot internalize it all enough to refrain from over-scheduling myself. Or at least I have not done so. And even when I think I’ve done a good job of protecting myself, of determining what’s enough and trying not to go beyond it, I manage to cram enough tiny things in around the edges that I end up just as over-scheduled and exhausted as ever.


If I’m going to be completely honest with myself — and this is hard — a huge percentage of this over-scheduling is about ego. People like my work enough to want me to come talk to them, and they’re nice to me when I get there, and that feels awfully, awfully good.2 There’s of course also a general people-pleasing aspect to the difficulties I have turning down requests. And as long as I’m at it I’ll acknowledge that I’ve also fallen under the spell of competitive busyness; every time somebody says “I don’t know how you do it” about my travel schedule I get a sad little boost.

Ha, I don’t know how I do it either.

I feel as though I’ve been able to do some good out there in my travels — as though I’ve been able to help some departments and institutions jumpstart some much-needed conversations, and as though I’ve been able to help demonstrate some of the possibilities for the academy’s future. But I also know, when I’m willing to look at it squarely, that I’ve gotten a lot out of just feeling important. But that’s finally wearing thin, and the toll is beginning to make itself known.


It’s perhaps not a coincidence that during this same period I’ve found myself withdrawing from the various venues where I engage with colleagues and other folks online. I haven’t been very present on Twitter, and I certainly haven’t posted here. Some of that withdrawal has been about not having enough time or space or whatever to devote to figuring out whether I had anything worth saying. Some of it has been about a level of conflict of late that I haven’t had the energy to face.

In any case, for someone whose job is focused on fostering productive online engagements, this withdrawal has not seemed to me a Good Sign, and it’s been one more thing that’s had me worried.

But I’m now thinking that the withdrawal is in part about the conservation of energy, and as such may not have been such a bad thing after all. Total disengagement would be a problem. But disengaging enough to restore oneself, in order to be better prepared to re-engage, is utterly, utterly necessary.

It’s like sleep. It’s cyclical. And you’ll go crazy without it.


I’ve been reading a fair bit of self-help type stuff of late, partially3 because I’m interested in the genre, in how it can describe and shape lived experience, and in the purposes it might serve in a scholarly context, and in part because I have felt myself in need of something that might help me personally figure out a better path. A more manageable way of being in the world.

Among the things I’ve read lately is Jim Loehr and Tony Schwartz’s The Power of Full Engagement, which, if they’ll forgive me, is a rotten title for a very important book.4 The key lesson in the book — heck, it’s in the subtitle, but if you’re interested, read farther than that — is that we are dead wrong in believing that time is the resource we are shortest on, the thing that, if only we had more of it, would let us do what we need to do. In fact, the resource we are shortest on is energy, and we resist many of the things we need to do in order to conserve and restore our energy because they look to us like enormous wastes of time.

However, it’s clear that those wastes of time are precisely the things that allow us to step out of the barrage of the urgent long enough to discover, focus on, and make room for the important. In order to be genuinely engaged where it most matters, in other words, you have to find regular, routine ways to disengage. And to somebody as completely inculcated into our always-on, more more more culture as I am, that disengagement does not come easily.

Or at least it doesn’t come easily in a productive form. But it’s becoming clear that if I don’t figure out some better strategies for managing productive disengagement, a few much more damaging modes of disengagement are lurking just around the corner.


So, doing less. It’s not just a matter of saying no to more things. I keep trying to find some quantitative limit for how much I can do — no more than one trip every two weeks! no more than three major service commitments! — and yet it keeps not working. The over-extendedness just gets worse.

I finally realized something about why last week. In talking with my coach5 about the issue, it suddenly became clear that the problem is the nature of the quantitative itself. If I set a limit of four trips per semester, it becomes very hard to distinguish between four trips and four with one little add-on. Or five, for that matter. With maybe one small side thing tucked in there too. And something local, because that’s not really a trip. And next thing you know, I have a calendar filled with five solid weeks of three-city trips and am railing at my friends over cocktails.

It’s the nature of the more more more culture: if you can run two miles, isn’t it better to run five? If you can write an article about something, isn’t it better to turn it into a book? If you can speak in four places this semester, isn’t it better to add on just… one… more…?

The quantitative will do you in every time, precisely because so much of how we operate is all about finding our limits and pushing past them. So it’s becoming clear to me that I’ve got to turn my attention to the qualitative, if I’m going to change anything, even if it’s not entirely clear what in this context the qualitative might mean.


One key to the qualitative, I think, is figuring out how to determine what’s important, and how to separate it from what’s just nice, or ego-gratifying, or adding to the frequent-flyer record. But the real challenge in that is that I don’t mean “important” in some externally-defined sense: what’s best going to further my career goals, or promote my organization, or what have you. I mean what is most important in a very personal sense: what’s most in line with the things I value, the things I want to be, the ways I want to live. What’s going to support me not just in getting more done, but in doing what I most want to do, and doing it better.

What am I doing it all for, is the question I keep asking myself.


As I’ve been working on this post, I’ve been hoping that some conclusion would present itself to me, some anecdote that would cheerily illustrate everything I’m pondering here. I’m not sure that anything can; I’m not sure that concluding, in fact, is the right way to end this line of thought. As the links above might suggest, I’ve written too many times before about the need to recalibrate and reshape the way I’m living, and yet. Here I am. Again.

I had, however, a near-perfect day yesterday. I did a bit of work in the morning, and then went and got a fantastic haircut, and had a great lunch with a friend I haven’t seen in eons, and then headed back home. And on a whim, I told R. that I wanted to take a walk in the park. Rather than push it, though, in the ways that I usually do (surely you can go a little faster!), I let myself just… walk. A bit faster than a stroll. Kind of an amble. It only took about five minutes longer than usual to make the loop of the park, and in the process, I got to do two really important things. I got to spend the hour really talking with R., and I got to look around.

And the trees. If it’s not peak leaf around here yet, at least a few of the trees are there: flaming reds and yellows mixed in amongst the still-rich greens. It was absolutely gorgeous, the best moment of my favorite season.

It’s uncomfortably obvious (see footnote 5 above) to point out that it will all be gone in the blink of an eye. But it will be. And I’m grateful, really really grateful, not to have missed it.

That’s what I’m doing it for. That’s what I want to keep my eye on. How the things I elect to do can better contribute to my ability to engage with the here and now, and, when I need to recover, can let me gently disengage.

I do not know how. But I do know why. And that’s at least a start.

UGH.

I honestly don’t know what’s worse: that I never knew these lyrics at all until @Karnythia linked to them in the context of #SolidarityIsForWhiteWomen, or that I have not been able to shake the mouth-full-of-marbles version I’ve heard since childhood ever since.

I mean, I knew it was racist and sexist in that generalized “boy, aren’t YOU exotic! I could just eat you up” kind of way — which is to say, little worse than most rock music. But now it’s become the world’s most vile earworm.

I’m horrified that I didn’t know. And I’m disgusted that I can’t get rid of it.

Running

I’ve had an on-and-off romance with running for nearly 20 years now. I came to it late; I hated running as a kid, and I avoided it as much as I could in high school. And given that on the one hand I was pretty notably underweight until my mid-20s, and on the other, I grew up in a time and place that hadn’t yet been touched by things like girls’ soccer teams, nobody really forced me to think about anything like exercise. I joined a gym here and there; I took the occasional aerobics class. Never anything with any focus.

Until I went back to grad school. For some reason, that first semester at NYU I got serious. I went to Coles (which I recall being pretty shiny and new then) and took a prescriptive fitness class, where I learned a few basic things about exercise and was supervised through a range of circuit training programs. I remember spending a lot of my cardio time on a stair climber, until one of the instructors stopped me one day and said “mix it up a bit, Kathleen!” So I got on a treadmill and ran a mile in 10 minutes — the first mile I had ever run in my life. I was 26.

And I was hooked. R. and I started running together whenever we could. I was way slower than he was, always, but he pulled me along and got me to do more than I thought I could. And I ran by myself, too: endless tight little laps on Coles’s roof track, at first, and then once I moved to Hell’s Kitchen, early morning loops of lower Central Park. Those years, I was probably in the best physical shape of my life, and it was clear that the running was helping keep me on an even keel through the craziness of grad school.

But, being a grad student, I let the running gradually come, like everything else, to be about Accomplishment. There’s nothing wrong with that, at least in the abstract, but it did something to the experience for me. It drove me to do more and more, well past the point at which I really should have just let myself settle into a more meditative routine.

In 1997, as I went on the job market, moved into high gear trying to finish my dissertation, and took on a full-time load of freelance work, my number came up in the lottery for the NYC marathon. And so I added training for that race to my schedule.

The marathon itself was amazing, though I ran it about half an hour slower than I’d hoped (partly for reasons out of my control; partly because of some less than optimal choices I made). It was an astonishing day, though, and I have no regrets whatsoever about the marathon itself. Training for the marathon, however, was another story. For months, I got up well before dawn to go run before settling down to work. I gave up hours and hours during the week, and pretty much a full day on the weekend, to running. And everything hurt pretty much all the time — not from an injury, just that overstressed, overused, constant ache.

I recovered slowly after the race, and gradually got back to a more normal level of running. Sort of. Something about all of those hours made me kind of dread running, and so once I graduated and moved to Claremont and started the business of being an assistant professor, I gradually… just… stopped.

Which is when the running dreams started, I think. I’d have these incredible dreams about running very strange race courses — across cities, in buildings, down stairs, through stores. Or I’d start running to try to catch someone, and just keep going. In my dreams, I was fast, and I felt great. A little nudge from the unconscious, I think, saying “don’t you want to feel this again when you’re awake?”

So I did gradually pick the running back up again, but wound up following the same cycle: ran well and felt great; ran more and felt better; decided to see if I could run another marathon. That one was Los Angeles, in 2005, and again the race itself went super well. And again, all the running before and after, a bit less so. I blew out one of my arches due to all the overtraining, and wound up with orthotics, which I never really got the hang of running with. And gradually, again, I stopped.

I picked the running back up a bit during my sabbatical a couple of years ago, but things started hurting again, and so I backed off, trying to find my way to something that would be enough. Since then I’ve done some yoga, and a bunch of walking, but nothing has felt quite as good as running at its best has felt. And if I actually get to move into the apartment that I’m hoping I’ll be moving into soon, I’m going to have wonderful access to another amazing park, and I want to be able to take advantage of that.

So I’m back at it, running again. And I’m trying to get myself to think about “enough” on the front end, as I’m starting up, rather than when things begin breaking down. I’m nearly 20 years older than I was when I ran that first mile, and I weigh a fair bit more, and things just don’t work quite like they used to. I eased my way into running this time with a lot of walking, and then slow short running intervals, gradually increasing them until I could run continuously. I’m a couple of months in, and it all feels pretty decent — nothing hurts, and I’m recovering from my runs well.

But I’m slow. What used to be my steady training pace is now my all-out intervals pace. I can feel my younger selves sneering at what my steady training pace has now become.

I remember telling R. years ago, in those early running days, that the key aspect of discipline for me was less about the need to make myself go do something than it was about the need to keep myself from doing too much. And so I’m trying to be very disciplined about things, to build strength slowly, to keep plodding forward, to focus on the years ahead rather than the miles right now.

Doubts

It’s not easy to write or talk about doubts. The things we have doubts about are often precisely those things that are most important, both to us and to those around us: a relationship, a job, a major life choice. If they weren’t important, our ambivalences and worries wouldn’t reach the level of real doubt.

But those things are often so important that even feeling a little bit of doubt around them (did I make the right choice? is this going to work?) can become a crushing weight. Doubt in those cases seems tantamount to betrayal, especially when it’s clear that acknowledging those doubts would create anxiety in the people around us. How can you possibly admit to feeling doubt? It would only let everyone down.

Or, if it won’t disappoint someone else, doubt can feel like an admission of error — and the stakes of such error can be too high to countenance. (Having spent ten years preparing for a career, for instance, experiencing doubt about the choice not only feels like failure, but like a failure so long-term that it raises the possibility that one can have wasted one’s life tout court.)

So the doubt gets suppressed, stuffed into the corners of our lives that we ignore. And sometimes that works, and in the busyness of the day-to-day, in the daily struggles and triumphs, the doubt fades. But sometimes it festers in those corners, and feeds on itself and everything around it, becoming much worse than is necessary.

Finding the sweet spot between letting doubt metastasize within yourself and infecting others with it is an enormous challenge. This is the kind of thing that people rely on trusted advisors, therapists, clergy, and really close friends for — airing doubts with someone who won’t freak out, someone who can act as a reality check and reflect the doubt back at an appropriate size.

I find myself, however, wanting to write about my doubts, to air them publicly, in part as an attempt to demonstrate — as I have found myself doing over and over with a range of professional fears and failures — that we all experience this pain. I’m confident, in fact, that we all feel painful levels of doubt, precisely because that doubt is a core element of the intensely self-reflective careers that we have chosen. Not-knowing, uncertainty, insecurity, second-guessing — without them, we wouldn’t have questioning, investigation, development, growth.

So here’s the admission: I have doubts. Big honking doubts. Now more than ever. I’ve been asked more times than I can count over the last two years how my career transition has gone, how I feel about the change, and my standard response has been to say that 90% of the time, I’m absolutely certain I’ve made the right choice. And I think that’s all anybody can ask for.

What I don’t tend to say is that 10% of the time, the doubt can be all but paralytic. And I also don’t say that it’s gotten more intense lately, now that I’ve taken down the safety net. In fact, though, it’s been particularly acute for the last few weeks, as I’ve felt myself not getting done the things I want to do, and not doing well at the things I need to do, and as I’ve been left wondering whether I’m really cut out for this new gig at all, and what if I’ve made a horrible, terrible, irreversible mistake.

It’s not at all coincidental, I think, that my doubts — indeed, my self-doubts — have become so much more painful and pronounced just as I’m inching up on closing the largest financial transaction of my life: I’m buying an apartment in New York.

(That sound you hear is me hyperventilating.)

It’s not just a transaction with huge financial implications. It’s putting down roots. It’s not just saying “I’m not going back there,” as I did some months back. It’s saying “I’m staying here.”

And on a day when, for one reason and another, I just don’t feel like I’m good at my job, the weight of those doubts becomes unbearable.

* * *

I had a dream over the weekend that I think is about all of this doubt. I’ve been dreaming about work more or less non-stop for weeks, anxiety-filled dreams about trying to get stuff done and being unable to keep the details from skittering off everywhere. But this one was different: I dreamed I quit. I told the people around me that I just couldn’t handle it anymore.

Right in the middle of that, I remembered a couple of my projects — in fact, the biggest, scariest projects that are actually on my desk right now. I realized that I wasn’t going to be involved in seeing them through. And I was suddenly, crushingly, disappointed.

I wanted to be involved in those projects. I wanted to be the one who would get to see them through.

And so I ran off, trying to find Rosemary (hi, Rosemary! Don’t worry; it turns out well) to take my quitting back, to tell her I’d changed my mind. But I couldn’t find her, and I was horribly afraid it was too late.

And just as I told someone that, a huge airliner came flying in right overhead. Way. Too. Low. And it pulled up hard, but too late, and it clipped the top of the building across the street, and flipped over, and fell to earth upside down.

Everything else I was thinking just stopped, and I stared at the upside-down plane. Literally: the upside-down plane. It wasn’t wreckage. It wasn’t on fire. It was just sitting there. And all the passengers, who I had been sure were dead, were filing off in an orderly fashion.

And I thought, Huh. It’s all okay.

Which is when I woke up, thrilled beyond belief that I hadn’t in fact quit my job, no matter how stressful it can be at moments. Certain I could work through the doubts.

* * *

I started writing this post on the subway yesterday morning, feeling as though I needed to do some public thinking about the nature of doubt and what it means for the choices we make. Got into the office, put it aside, and took care of business. And proceeded to have a day utterly full of win.

The doubts will — undoubtedly, ha — come back. But even if I crash, it doesn’t mean I have to burn. It is really possible, even when it doesn’t seem so, that it will all be okay — maybe because being willing to embrace the doubt means that I’m ready to do the impossibly scary things ahead.

Productivity and Goofing Off

Lately I’ve found myself in one of those periods — perhaps we might refer to it as “my forties” — in which I’m so overwhelmed with the details involved in just keeping up with the most immediate and pressing tasks ahead of me that not only have I not gotten to do any writing, I’ve barely even found the space to contemplate the possibility of what I might write if I had the time.

This makes me profoundly sad.

It’s not just about feeling too busy — it’s about the busy making me feel unfocused and unproductive. As though the big picture is slipping away in the masses of tasks that take up the work day and bleed over into evenings and weekends. And days off: not too many weeks ago, I’d made a pact with a friend to observe the oddity of the Presidents’ Day holiday by really making it a day off, celebrating by lying around reading a novel. Instead, I spent the day catching up on the many work and para-work tasks that just cannot be gotten through in the office. I got a lot done. I couldn’t tell you what, but it was a lot. It was kinda great, and kinda awful.

Another friend recently noted that I’ve come to refer to my plans to take a genuine day off by saying “I’m going to lie around and read a novel.” And as a professor of literature, at least in my not-too-distant past, I’ve got to marvel a bit at the association I’ve managed to build between novel-reading and leisure. Sloth, even: it’s not just reading, it’s lying around reading.

At some point, probably right about when I stopped teaching literature classes, the prior association I’d had between reading fiction and work began to fade. Reading fiction became play again, the way it had been when I was a kid. In part, the sense of fun in reading came back because I let it — I gave myself permission to read whatever I wanted, without any pressure to make use of what I was reading by either teaching it or writing about it. Without any pressure for the reading itself to be important. It was just about pleasure.

What happened shouldn’t come as much of a shock: I started reading more.

I’m looking now for a way to return that sense of play to my writing, to lessen the pressures that my preconceived notions of productivity have placed on it. I want writing to become a retreat from work again, rather than being all about work. I want it to be the thing I can’t wait to escape back into.

In order for that to happen, I think I’ve got to give myself a similar permission not to take it quite so seriously. What might be possible if I didn’t feel the pressure for my writing to be of use — if I didn’t need for it to be important? What if I could let my writing be just about pleasure?

Can I build an association between writing and goofing off?

Can a day spent sitting around writing come to feel like a holiday?

If You Can’t Say Anything Nice

Folks, we need to have a conversation. About Twitter. And generosity. And public shaming.

First let me note that I have been as guilty of what I’m about to describe as anyone. You get irritated by something — something someone said or didn’t say, something that doesn’t work the way you want it to — you toss off a quick complaint, and you link to the offender so that they see it. You’re in a hurry, you’ve only got so much space, and (if you’re being honest with yourself) you’re hoping that your followers will agree with your complaint, or find it funny, or that it will otherwise catch their attention enough to be RT’d.

I’ve done this, probably more times than I want to admit, without even thinking about it. But I’ve also been on the receiving end of this kind of public insult a few times, and I’m here to tell you, it sucks.

I am not going to suggest in what follows that there’s no room for critique, even on Twitter, and that we all ought to just join hands and express our wish for the ability to teach the world to sing. But I do want to argue that there is a significant difference between thoughtful public critique and thoughtless public shaming. And if we don’t know the difference, we — as a community of scholars working together online, whose ostensible goal is to make the world a more thoughtful place — need to figure it out, and fast.

There are two problems working in confluence here, as far as I can tell. One is about technological affordances: Twitter’s odd mixture of intimacy and openness — the feeling that you’re talking to your friends when (usually, at least) anyone could be listening in — combined with the flippancy that often results from enforced, performative brevity too frequently produces a kind of critique that veers toward the snippy, the rude, the ad hominem.

The other problem is academia. As David Damrosch has pointed out in another context, “In anthropological terms, academia is more of a shame culture than a guilt culture.” Damrosch means to indicate that academics are more likely to respond to shame, or the suggestion that they are a bad person, than to guilt, or the indication that they have done a bad thing. And he’s not wrong: we all live with guilt — about blown deadlines or dropped promises — all the time, and so we eventually become a bit inured to it. But shame — being publicly shown up as having failed, in a way that makes evident that we are failures — gets our attention. That, as Damrosch notes, is something we’ll work to avoid.

And yet, it’s also something that we’re more than willing to dole out to one another. There’s a significant body of research out there — some of my favorite of it comes from Brené Brown — that demonstrates the profound damage that shame does not only to the individual but to all of the kinds of relationships that make up our culture. Not least among that damage is that, while a person who feels guilty often tries to avoid the behavior that produced the feeling, a person who feels shame too often responds by shaming others.

So, we’ve got on the one hand a technology that allows us, if we’re not mindful of how we’re using it, to lash out hastily — and publicly — at other people, for the amusement or derision of our followers, and on the other hand, a culture that too often encourages us to throw off whatever shame we feel by shaming others.

Frankly, I’ve grown a little tired of it. I’ve been withdrawing from Twitter a bit over the last several months, and it’s taken me a while to figure out that this is why. I am feeling frayed by the in-group snark, by the use of Twitter as a first line of often incredibly rude complaints about products or services, by the one-upsmanship and the put-downs. But on the other hand, I find myself missing all of the many positive aspects of the community there — the real generosity, the great sense of humor, the support, the engagement, the liveliness. Those are all far more prevalent than the negative stuff, and yet the negative stuff has a disproportionate impact, looming way larger than it should.

So what I’m hoping is to start a conversation about how we might maximize those positive aspects of Twitter, and move away from the shame culture that it’s gotten tied to. How can we begin to consider whether there are better means of addressing complaints than airing them in public? How can we develop modes of public critique that are rigorous and yet respectful? How can we remain aware that there are people on the other end of those @mentions who are deserving of the same kinds of treatment — and subject to the same kinds of pain — that we are?