Warmish Take: What is a community?

Hello and welcome to my semi (very semi)-regular series of Warmish Takes: ruminations of impeccable (cough) wisdom on topics that are almost certainly past their sell-by date. Takes that are, let’s say, the quintessence of l’esprit de l’escalier.

Today’s take is about representation, particularly where it intersects with creative work and its audience. Now, I’m on record here as saying that recognizing and celebrating more authors from marginal groups is a good thing, not least because exposure to writers and artists who are outside the dominant norm expands our ability to recognize good work when we see it. Soliciting their insight and spotlighting their work is not only just, it’s enriching.

All the same, I am extremely chary of describing myself as an OwnVoices author or anything like it. For one thing, my use of a label like “aro-ace” is casual; it’s shorthand for an explanation that normally would have a lot more syllables. I described myself as “asexual” in a medical context recently and was startled to see the word appear in the clinical notes readout of the visit afterward. I thought, “Do I…want…to be known as Asexual in my medical chart? Isn’t that going to invite some assumptions?” I could already feel myself wanting to prepare some pre-explanations for future scenarios.

It’s the same with writing. If I say that I’m “representative” of the aro-ace marginal group, will people think that limits my insight into other people’s experiences? Will they assume I don’t know how to write sex, or can’t delineate the shape of a romance? Of course, in this case all I would have to do to avoid being pigeonholed is keep my mouth shut. Writers who are people of color, for example, don’t have such a luxury.

The thing about marginal groups…is that they’re marginal. Every human being is fully representative of humanity: there should be nothing prescriptive or limiting about the fact that they may also be an example of a non-dominant group. We ought to value the contributions of marginal writers, not because they are marginal, but because they exist. The margin only exists to be turned inside out.

In my view a lot of this is due to an irksome habit people have of throwing around the word “community” like it means the same thing as “demographic.” Like, “the gay community,” for example. Which one? Where do I show up for the meeting? Who takes the minutes? Who presents the treasurer’s report? What most people mean when they say something like “the gay community” is “various and sundry gay people you run into in online spaces,” and that’s not a community. A demographic is a counter that any person can have a spectrum of experience with; a community is always actual and specific people who interact with each other on purpose.

The compounding of technologies in the Internet Age has left our human brains struggling to catch up in a lot of ways. We used to have to go to some place to encounter a community, or to meet people outside it. Now, you don’t even have to join another social media platform to have someone screenshot a tweet into your Facebook feed. Little wonder that we would try to use the language of hospitality to make this limitless landscape of encounters make sense.

Community is human messiness and drama and blood and marrow and tears and helpless laughter, and you know with whom you are doing it. Representation as such is just public demographic presence; it’s not activism and it’s definitely not community engagement. But neither is it mere tokenism. Representation at its best disturbs ossified categories and clears space for real engagement. It is not the work; it is the ground-clearing for the work.

It’s important to make this distinction because communities have authority over their members and over their message; when we treat a demographic as a community, we expect the whole demographic to police its members, issue coherent statements and intentions, and feel individual responsibility for how the whole demographic is seen. But those things aren’t possible, because a demographic isn’t a fucking community.

So while this isn’t all there is to be said about the subject, I definitely want this statement to have a memorable impact, so I’ll stop here. By the way, the seed of this Warmish Take was germinated in one of my Morning Lights newsletters, so if you want your Warmish Takes a little warmer, by all means subscribe.

And now for a g-&-t.

Jane Eyre, trauma, and the writer’s id

So in my summer odyssey of brain fog, I became a bit of a YouTube junkie, because that was a relatively effort-free distraction from my back injury and its sequelae, as well as my complete inability to make and carry out plans. (As any ADHD person knows, you have to make a plan to make a plan, so sometimes you’re just SOL on a bad brain day.) There, that should fill you in, gentle reader, on what was going on while I wasn’t blogging.

But, I’m back, with a whole list of YouTubers whose channels I’ve been enjoying, and today I’m linking a recent video by Dr. Octavia Cox, who does close readings of 18th- and 19th-century literature as a public service, and really, why more English majors don’t do this, I don’t know. Dr. Cox invites people to open discussion in the comments, but to be honest there’s no way I’m going to fit my ersatz Romanticist reax into a mere YouTube comment, so I’m blogging it instead. Plus, it has a bearing on the kind of writing talk I do here, so that’s where I’m going with this post.

You should really watch the video for the nuances — it’s only 20 minutes — but the gist is this: a much-celebrated passage in Jane Eyre, in which Jane-as-narrator castigates the cultural bonds that give women no scope for action and creativity, is bookended at the beginning by Jane’s rather repressive methods as a governess, and at the end by the bitter laugh of Grace Poole (who is really Bertha Mason, though Jane doesn’t know it yet), which serves as a metaphorical commentary on Jane’s feminist mental rant.

I think Dr. Cox is mostly right in her analysis of the passage (she is pretty good with these close readings generally — I particularly commend her commentaries on Jane Austen). What I’d like to discuss is the wider angle of Charlotte Brontë’s engagement with the themes of non-balanced power dynamics.

Jane Eyre is one book — among all the other books I read in surveys of the period — that all but demanded that I read it like a writer. I mostly do that anyway, but I think CB deliberately invites the interlocutor into the space where the story is being created: “Reader, I married him” seems to me another way of saying, “Writer, this is how I’m doing this story.” At some points of the text this invitation seems almost like daring the reader/writer to argue; at others it seems to presume a collaborative listening on our part, and this is where I’m reminded that the Brontë children made up stories together in a literal collaboration of writing/reading.

When it comes to this feminist/counter-feminist tidal lock in Jane Eyre, I have to (pause to groan) bring up The Professor. I’m not going to say go read The Professor if you haven’t, because you probably will wish you hadn’t. It’s an extremely idtastic early novel of CB’s in which the titular professor goes to Belgium, courts one of his students and marries her, and finally achieves a relationship in which he can be the dom he’s dreamed of being all his life, but whom no one in real life would ever want as a dom. If you think I’m exaggerating, this novel really puts the sub in subtext, and the reason I bring it up is that this novel is also written in first-person POV — but from the POV of the male character. The female character (well, all the other characters, really, but the love interest in particular) is seen entirely from the outside and is objectified by the narrative as well as the professor. My overall impression of this story is that CB had to write it to cleanse her writerly palate; but the point is this. The D/s elements in The Professor are very strong, counter-feminist, and appear to be quite unexamined; but in Jane Eyre they are brought to the center of the narrative and deliberately engaged by the author with the intent of making a fully integrated story, realized not just in the POV of Jane the character but in the 360 degrees of vantage surrounding her.

What this suggests to me is that while Jane the character is replicating the repressive education she herself received, the narrative is interrogating it, and the author is continuing a process of engaging with elements of her own interior world that she is working out through stories.

That’s one of the things that makes Jane Eyre so exciting as a novel, in my opinion; this deliberate cultivation of the id in story to narrate and re-narrate the experience of powerlessness minus trauma. Part of the mechanism of that in Jane Eyre is an actual redressing of the balance of trauma — Rochester has to suffer in order for Jane’s coming back to him to work as a story. But part of it is also setting up situations in which sexually-inflected power imbalances are handled without threatening the integrity of the person who has less power. I’m thinking particularly of St. John whatsisname and how he tries to tell Jane who she is and is destined to be, which is of course his obedient wife, very Professor-like; so, she leaves. And goes back to Rochester, who may be chaotic but at least seems to get her. I have a very strong memory of the scene in which Rochester is begging her to be his bigamous mistress and becomes so insistent and tearful that Jane in the narrative voice says “in another moment, I should be able to do nothing with him” — i.e., if she doesn’t change the trajectory of this scene he is going to make her his mistress by force. Jane frames the threat of rape by someone she loves who lied to her as a situation in which she can’t “do something” with him — she can’t make him be obedient, tractable, calm, or docile as she can with his ward, Jane’s pupil. The Professor this is definitely not.

No, Jane Eyre the character is not reliably feminist as a governess; one would be surprised if she were. But her counter-feminist tendencies are mingled with this element of dominant-submissive power exchange as part of the author’s ongoing project of recasting potential and even actual traumas as more integrated stories. Conceptually, feminism and D/s interplay are two different issues, but in the human heart, it ain’t necessarily so. Charlotte Brontë has invited us into her parlor as collaborative listeners as she tells this story; she sets the parameters, and we have the opportunity to reimagine trauma as integrity along with her. I think it’s this aspect of the book that makes it a feminist project, more than the sum of its ideological parts.

Part of the problem for my generation of writers, though, is that the New Critics stand between us and the Brontës, with their insistence that “objective” (which is to say, established and therefore male) storytelling is superior to that which draws on the author’s id; that the recasting of trauma and power imbalances as integrated stories is a contemptible project for a writer to undertake. To which, at this point in my life, I want only to make the same reply that Captain Marvel did: “I don’t have to prove anything to you.”

But I am writing this blog post: I’m glad people are still reading Jane Eyre and grappling with its implications, because it’s still a hugely important book, and I can only aspire to the kind of narrative theology that CB’s achievement represents.

Warmish Take: “Careful or I’ll put you in my novel.”

Hello and welcome to the newest segment on this here blog, Warmish Takes. There are already plenty of places where you can get Hot Takes, but what yours truly promises here are Warmish Takes, straight off the bat.

Mind you, many of my takes are Warmish because it takes me so damn long to string enough words together: hence the hiatus here while I coped with Summer Doldrums and Plot Problems. More on that in another post. Fortunately, some of my Warmish Takes receive a flush of renewed warmth by coming back round again in the social media turbine, and that’s the case with today’s take.

In keeping with the warmishness of my takes, I’m not going to link out to any NYT articles or dissections of the short story “Cat Person” (the value of which for me is primarily in improving my Current Events percentage in my online trivia league). I’m just going to address this whole idea of “borrowing” (or “stealing” or whatever) other people’s lives and personalities to write fiction with.

And yes, I’m a longtime fan of Anne Lamott too, who says, “If people wanted you to write warmly about them they should have behaved better.” I’ve read any number of tweets the last few months in which writers defend themselves against the charge of sociopathy with something along these lines. Don’t want to appear in a writer’s fiction? Don’t have writer friends. Or friends with writers for friends. Or something.

As pointed out on Twitter, this particular hazard seems to be more endemic to the literary fiction world.

And really, why not? Literary fiction is more likely to involve situations and personalities that can be more easily lifted (or at least recognized) from the people around us. It does seem like a natural kind of hazard. I suspend judgment, like a shiny trapset cymbal to bang upon when the mood strikes. After all, aside from the more obvious heists, writers are the last people to know what alchemy induced them to come up with and sustain a story or a character — I say sustain because no matter how juicy a bit of goss might be, the writer just might not be into it for creative purposes.

No, I suspect there has to be a constellation of motivations in order for a writer to satirize a real-life person in the fiction they write. There are plenty of coffee mugs and bumper stickers warning the public at large: “Careful, or I’ll put you in my novel.” I usually take this for a pretty light jest; some writers pay compliments to people they love by drawing on them for a character and killing the character off. And a real friend eats that shit up with a spoon!

So yeah, judgment suspended. But…I can’t be the only writer who doesn’t really do this?

I mean, not that you’re not all interesting, you crazy multifaceted diamonds, you. It’s just a way of going about things that is really foreign to me. I just don’t get the concept of fictionalizing people and things that are really out there, and I don’t get fictionalizing my own life or any of my experiences; all of that stuff is like wool sheared from the sheep, destined to be carded and dyed and spun and become something, well…else. Not rearranged into the shape of the original sheep and framed on a wall. It just doesn’t make any sense to me as a writer at all.

It’s not dazzlingly unique to say that all my characters are made out of me-stuff, out of things I’ve thought and felt and experienced; and I’m sure that’s true of these other kinds of writers too. Who knows, maybe I do have a roman à clef kicking around in me somewhere. But as of this warmish moment, it’s not interesting to me, either to write or to read.

And that’s my Warmish Take.

Sex scenes and the aroace writer

Ask ten aroace folks what they think about sex scenes in stories and you’ll probably get seventeen different answers. So consider this more a meditation than an assay at representation.

Writing always starts with reading. In the olden fandom days I used to complain a great deal about how ship fics crowded out nearly everything else in the pipeline. It’s vastly irritating to work hard on a piece that isn’t terribly explicit or shippy (or God forbid, gen) and see it drop in the pond without a ripple, while the one-off explicit story one writes for a challenge gets an avalanche of recs. There are still a few embers of annoyance there, when I care to stir them. Why the hell does a story have to have a sexually-driven throughline to compel widespread interest?

Yet I don’t hate to read it myself. I quite like a good romance from time to time, particularly if it’s got mystery or mayhem to go with the sex. True, I’ll often skip, skim, or gloss the sex scenes, not because I am disgusted by them, but mostly out of a reflex I used to call “smut sunblindness” — one doesn’t want there to not be a sun, but staring at it is A Bit Much.

I’m leaning toward a food metaphor for it these days, though. If I order a hamburger or somesuch, I rarely ask to omit the tomato and onion, even though I inevitably end up fishing them out and leaving them in the wrapper. Why? Because I like the flavor those things bring to the party, to paraphrase Alton Brown. I just don’t necessarily want to eat more than a bite or two of them.

“A sex scene should be about sex and something else,” counseled the author of a writing-advice book I once read; I can’t remember whether he was quoting Vonnegut or merely using Vonnegut as an example. (He also took care to excerpt some bad, overwritten sex scenes to, ahem, nail down the point.) I think this is quite right: the more a sexual interlude drives character development or plot arcs, the more likely it is that I’ll want to read it, and the more likely it is to be erotically interesting, too.

This principle informs my writing, too, naturally. If you’re not going to tastefully fade to black, there ought to be a reason for staying in the room where it happens, so to speak. Say perhaps that the encounter is the locus of a turning point between the characters, or a catalyst for the motives of one or both; say that it’s an occasion for release, or recognition, or ruin. That much should be clear whether or not one chooses to use unequivocal language.

I don’t, for the most part; my goal in writing scenes like that is to evoke emotion and sensation by an indirect approach, which is, as I said above, more erotically interesting to me. Once the scene is written, though, I tend to treat it just like the sex scenes I didn’t write: I gloss it on reread and sift for the emotional throughline on the other side.

This is another instance of how hobblingly inadequate writing advice like “Write what you know” can be. It so easily becomes “Write only what you know,” and that is manifest bullshit. If we wrote nothing but what we know, we would write nothing but memoirs. Often I turn that around and say, “Know what you write,” but in this case, I could also say “Use what you know.” As an aroace person I know for a fact that that “something else” is what turns a piquant sexual interlude into a compelling one; that access to emotion and sensation is the goal of good prose; and that, as Lord Peter Wimsey observed, sex isn’t some separate thing “functioning away all by itself; it’s usually attached to a person of some sort.”

So as a person of this sort, I happily invite all and sundry to make use of my expertise. Happy belated Pride.

The self-suspicion of the (woman) artist

I’m not sure how or why this 2017 essay by Claire Dederer washed up on my Twitter timeline, but it was an interesting and layered read. Its question was: what does one do with the art of monstrous men? And of course, in that #MeToo moment, it was a question on everyone’s lips. And, since the essay invites its readers to weigh in with their perspective, I’m going to.

Hildegard of Bingen: now there was a woman who could write things in accordance with reality!

Dederer chose to peel these layers using the particular onion of Woody Allen. Which is interesting because I know exactly two things about Woody Allen: his movies are supposed to be towering comedic art, and he’s a child predator. Have I seen said movies? I have not. Have I read in depth the accounts of Allen’s misdeeds? Also no.

This is because I was raised in a strictly evangelical Christian environment. My parents may have watched a Woody Allen movie or two; I don’t know. When I became a fully independent adult, I had a nearly limitless array of modes in which to revolt; “watching Woody Allen movies” just didn’t make it onto the list. Diving into the liturgical church; reading, writing, and watching sci-fi and fantasy; and excusing myself from marriage and motherhood occupied most of those energies.

But. I’m intimately familiar with the self-suspicion Dederer describes. Am I a monster? I was asking myself this while I was still a child. I asked myself this when I was a callow college student. I asked myself this while working as an underemployed adult. I asked myself right up until I was 38, when one morning I made contact again with an old memory of fleeting cruelty from a man when I was very small. But for the first time ever, instead of focusing on how furious and helpless it made me feel to remember it, I thought: I was right.

At the time I had said to myself: I must be mistaken. This can’t be sadism. This must be something else. I must be making a mistake.

But I wasn’t. I was simply telling myself a necessary lie, a lie that the powerless have to tell themselves for the time being. My perception is messed up, that’s what the problem is. No, what the problem is, is that lies like that throw out little metastatic filaments and snare the rest of your soul and make you think you’re fundamentally broken. Evil, even.

A monster.

But I’m not a monster. I have a fully functioning human instrument. My perception is just what a human’s perception ought to be: limited, but a miracle of function. My insight is a fine blend of acuity and experience.

It’s interesting to me that Dederer describes the indignation against monstrous men making good art and moves from that toward suspicion of herself as — too selfish? not selfish enough? — a secret monster making good art, or an aspiring monster in order to make good art. Yes, it’s all very sturdily Jungian; do your shadow work.

But this meditation is centered around a movie apparently written as an elaborate apologia for a middle-aged man fucking a 17-year-old girl. A girl who, because Allen is a good writer and has a sense of “balance” in these things, is miraculously free of the neuroticism that the grown women characters display. Listen: show me a girl who is preternaturally mature at 17, and I will show you a girl who secretly suspects she is the real monster in the room.

I believe that the only thing that has kept me perpendicular and sane these last four years is that moment of unbelievable escape beforehand, when every single one of those protective lies unraveled and fell to my soul’s feet. It was easier on me for a time to think of myself as a monster rather than stare my helplessness in the face. It took escaping one to also escape the other.

Perhaps this is why none of these terrible revelations about monstrous men behind closed doors have given me more than a few layers’ worth of pause about their art. Yeah, I felt a little guilty watching Carol — not because it was a film about lesbians, but because it was a Weinstein property. But there’s just not much shadow work to be done there, if I’m honest. No, what I’m thinking about is the parable of the demoniac who got rid of his demon, only to have it come back with seven friends and make things worse. Jeffrey Toobin is back on CNN as a pundit, after how many months in exile? Not many. They filled an empty chair with Toobin because there was an empty chair there.

This is not about selfishness, though arguments about selfishness are the stuff of (women) artists’ lives. This is not even about monstrousness, though the troops of House Depiction Is Endorsement come out to bay across the valley at the giants of predatory cruelty.

This is about insight. To claim insight is the ultimate act of temerity. Dederer lost a male reader because she questioned Allen’s insight in making Manhattan; she was not an obedient audience. She could make bloodthirsty remarks about butchering men in the street, apparently, without giving this man a qualm; and indeed why not? That can be dismissed as derangement. Derangement and neurosis, or demure nubile receptivity: no place for actual insight, in stories or in life, for people who are not white men. If a white man is not sitting in the chair, it’s an empty chair, amirite?

Yes, I say these things because the reality on the ground makes me angry. But it’s a mathematical anger. A logical anger, even. A Zachary-Quinto-saying-Live-long-and-prosper-when-he-really-means-Fuck-you kind of anger. A Stacey-Abrams-writing-a-shedload-of-romance-novels kind of anger.

An insightful anger. An anger that finishes what it starts.

In the summer of 2017, while Dederer was working on this essay (and her book on the subject), I was feverishly finishing the manuscript of Ryswyck. It’s an interesting thing to remember, the galvanizing power of that anger. I wasn’t marching in the streets; I was sweating in front of a computer screen in my apartment. In the same 24 hours, I wrote the last sentence, and John McCain turned his thumb down on ACA repeal. In such acts, visible and invisible, the resistance propounds itself.

We’ve had our fill of monstrousness, and even with the Abuser in Chief gone, there are still plenty of inexplicably cruel people willing to be monsters in public, and occasionally it feels really demoralizing. So it’s good for me to remember that I got free of that debilitating self-suspicion, and when I did I vowed to set free as many other people as I could.

In that sense, the pen isn’t mightier than the sword. It is the sword.

The Protagonist Opportunity

First, go read this amazing essay by Ada Palmer and Jo Walton. I’ll wait.

So, speaking of hotel clerks, there once was a man who went to a conference at a hotel whose customer-service motto was “There are no such things as problems, only opportunities.” He went up to the desk and said to the clerk, “There’s a problem with my room.” “Ah,” said the clerk, tapping the sign, “but there are no problems, only opportunities.” “Call it what you want to,” the man retorted, “but there’s a woman in the room assigned to me.”

Yes, it’s a stupid joke, and faintly creepy to boot, but it plays into what Palmer and Walton are talking about in their essay, which is at bottom an issue of displacement, in the Archimedean sense. It reminds me of the time when I, with disastrous naivety, joined a writer’s group while I was working on Ryswyck. At one point another member grilled me about who the protagonist was in the story: I tried to say that if anything, Speir and Douglas (and specifically the friendship between them) was the protagonist, if there had to be one — but that wasn’t sufficient. I finally allowed as how the reader’s-eye POV belonged to Speir, but refused to follow the logic that was being pressed on me: they wanted me to refocus the story on one person and leave out what wasn’t relevant to her directly.

Needless to say, this was the beginning of the end of my participation in this group, but I’m really grateful to Palmer and Walton for bringing an even wider angle lens to this issue — for describing the continuum of storytelling from protagonismos through braided POV through tapestry. Not only does this perspective explain why I find pitch advice for aspiring writers so desperately annoying (“make sure to identify your protagonist and her conflict/desire/pain point!”), it shows how dangerous for our collective narrative diet it is to read no stories except those driven by protagonismos.

Of the tapestry stories mentioned in the essay, I’ve read only the last — Edward Rutherfurd’s Sarum, which a housemate lent me as a favorite book of his (in exchange for Doomsday Book, if I recall correctly). I would never have picked up this immense book on my own, but I was fascinated by this “tapestry” mode of storytelling, in which all the characters, and the landscape itself, are like the striations of a muscle, working away to drive the story along. I do believe that even a plague flea was given a brief POV in Rutherfurd’s book.

Like Palmer and Walton, I’m not entirely sure what made Rutherfurd the final outlier in the trend away from tapestry storytelling, but I remember the 90s, and recall how much of the fin of that particular siècle was dominated by avatars — the Tank Man in Tiananmen Square, Bill Clinton, O.J. Simpson, Ryan White, Tupac Shakur, Michael Jordan (a lot of men are coming to mind, for some reason!). Stories were avatarized: A Night to Remember became Titanic; D-Day became Saving Private Ryan. Nowadays we’re getting villain origin stories, as if the only way to make Cruella de Vil interesting or compelling is to protagonize her. And let’s not get into Star Wars, shall we?

As the essay points out, the trend has swung so hard that a series like Martin’s “A Song of Ice and Fire” — which in another frame would be seen as a bog-standard braided-POV story — is regarded as an outlier for having a large ensemble cast. Ensemble casts have been actively discouraged as making books unwieldy and hard to sell. My friend and fellow indie author Erica H. Smith has embraced the cast-of-thousands approach — structurally, her books are made up of disciplined POV braids mostly in tight-third, but every other chapter she finds herself inventing another fascinating walk-on character to stir things along and I’m usually like, “Ooh! I like them; are we going to see them again?” “…Maybe.”

This is one of the uses of independent publishing. Ensemble casts, intricate POV braids, walk-on multitudes, tapestries — they may not sell like hotcakes, but someone has to write them. Else the protagonismos displacement might go the way of the Ever Given and block global sea traffic for weeks.

Thanks to this essay, a widened perspective shows me that my own instincts were what I thought they were — a braided ensemble like the cast of Ryswyck is not grotesque, nor is it fully a tapestry story. But as I’ve mentioned in other places, I made sure that the turning point of the plot depends not on Speir or Douglas or any of the other POV characters, but on the most ordinary and unsophisticated character in the cast, a character whose legacy will ultimately cast a longer shadow than a charismatic would-be protagonist like Barklay. I did my best to make sure not only that every character had a trajectory but that nearly all of them are indispensable to the community and to the solution of the story’s dilemma.

The fact that this essay exists is a harbinger of what I certainly hope will come: stories whose moral imperative is based in community, with hope that doesn’t spring from powerful avatars or narrative exceptionalism.

Now excuse me while I go seize an opportunity.

On Specialist Knowledge

Some years ago, a priest who was teaching a class I was auditing sat down with me to teach me how to chant a collect. (A collect — accent on the first syllable because it is a noun — is a prayer said by an officiant on a specific occasion to “collect” the prayerful intentions of the whole gathering. It has three main parts: it names God in a specific way, asks for a blessing in keeping with that name, and finishes with a doxology. I digress, but this will be useful in a moment.) The lesson didn’t last very long, because she discovered that I already knew where to put which cantillations. “It’s a grammar,” I said.

But here’s the thing. I knew how to chant a collect because I had been listening to priests who knew what they were doing chant collects year in and year out till I picked up the grammar by instinct. I still don’t know what that grammar is, diagrammatically. I have the knowledge-by-acquaintance of how to chant a collect; I don’t have the specialist knowledge of how these cantillation structures work.

In my aside above about the definition of a collect, I mentioned the emphasis on the first syllable “because it is a noun.” Until someone on social media mentioned this rule in passing, it hadn’t occurred to me to notice that in English words that double as noun and verb, the accent falls on the first syllable for the noun form and the second syllable for the verb form — so a collect is a prayer that collects; a record is what results when you record something, and so on. Do we need to know this information? No, but somebody should know it. That context is meaningful, and may at times be crucial.

On the other hand, there was the time when I was six and an instructor was trying to teach me how to ski down a slope. “Put your weight on one foot,” she said, and I tried to put one ski on top of the other. “No, it’s more like leaning,” she corrected, and I almost fell over. It wasn’t till a few weeks later, when I was playing and thinking about something completely different, that her meaning clicked and I said, “Ohhhh!” I didn’t have the experiential knowledge needed to grasp the special skill she was teaching me. I didn’t yet have the muscle memory of purposely shifting all my weight to one hip, that poised flex of the bearing knee, that sweet spot of placement for my center of gravity (what’s that?).

Many times, we pick up knowledge by experience and we don’t know what we know until we are presented with specialist knowledge. We have to make a successful handshake for the two knowledges to integrate, and sometimes that’s a real challenge.

Such a challenge came up for me last night when Adam Neely’s latest video dropped. I’ll wait here while you watch it. It’s worth all 27 minutes.

Yes, it’s about Céline Dion; yes, it’s about a power ballad I always thought cheesy — though thanks to Adam Neely I am now aware that it deliberately quotes Rachmaninoff’s Second Piano Concerto. Which is no wonder, because — Unpopular Opinion time — I think Rachmaninoff and a lot of the other Late Romantics are cloyingly overwrought. But that doesn’t matter, because Adam Neely’s actual topic is fascinating: it’s an exploded diagram of the aural and emotional effects of a key change on a sustained note.

As someone who plays music as a craft but is not a practitioner of it as an art, I can appreciate the specialist knowledge Neely brings here, and I can even bring to bear my own experience of feeling myself in or out of tune with the ensemble when playing the flute, or the experience of blending when using my voice (I’m told I blend well, but I don’t get much of a chance to practice these days). I have the ghostly memory of what it means to sustain a note and feel the context change around it so completely that I have to hold up against a chill. I know what it’s like to try to sing without succumbing to the emotional power of the music. But even with all that experience, I still had to rewind the video three or four times in places and go, “Okay, Adam, run that by me again.”

It gives me a renewed appreciation for specialist knowledge.

But while it’s true that we don’t know what we know, we also don’t know what we don’t know. This is the basis of what’s called the Dunning-Kruger effect, which is how you get assholes convinced they are experts pontificating about shit they clearly know very little about. A friend recently sent me this article about the interpersonal pitfalls of encountering such people when you have specialist knowledge. When people experience a missed handshake between their experience and specialist information, it can read to their brains like an actual threat. The experience of being wrong can be felt as a kind of death, and the person inflicting that experience becomes a killer.

I don’t have to elaborate, do I, about how we’re seeing this aggression toward “experts” in the public square, to the point where “science” itself is a loaded catchall term for any situation in which we don’t put up with someone talking out of their ass? Okay. Let’s skip to what I said to my friend M who sent me that article.

What I said was, “I think women experts actually go through those stages [that Venkatesh Rao talks about] in reverse. We doubt ourselves; then we try to help; then we are reduced to manipulating people; then we wash our hands of it.” Who the expert is makes a vast amount of difference to the level of threat people feel when they encounter that uncanny valley between what they know and what they don’t know. I don’t think it’s an accident that expertise itself is being disparaged at a time when women and minorities are completing post-secondary educations at unprecedented levels.

Worse, post-secondary education has become inextricably tied up with class, so that we are all too likely to see someone with a college degree as someone who was able to complete a class gatekeeping ritual where others could not. The degree, and the jargon that comes with it, have no other meaning than that.

Yet this can’t be entirely true, or else Adam Neely wouldn’t have thousands of people watching (and rewinding to rewatch) his explanations of music theory every time he drops a new video. It helps that Neely’s not threatening: he’s a cute young white guy with a Baptist haircut (an aesthetic I happen to like, so I’m not disparaging him here), operating on a social media platform. He’s clearly leaning in to all these advantages for his living; and why not, if it results in thousands of folks having fun while learning about music theory?

Step one, getting the expertise, is difficult enough. Step two, making use of it for the public good, is often dependent on whether we are the kind of person others want to recognize as an expert, and is therefore not necessarily within our control. But when we succeed, it’s nearly always because of a personal encounter: a priest teaching a theology student, or a ski instructor helping a six-year-old negotiate a slope. Even a one-way encounter on social media is still a place where one person (me) on a quiet Friday night during a pandemic can navigate that uncanny valley between what she knows and what she can’t yet grasp.

I don’t think that being an expert in something obliges us to try to reduce people’s threat level in any given encounter. But it seems to me that a reduced threat level is part of the exploded diagram of a successful encounter between someone’s experience and specialist knowledge, whether that’s within our reach or not. And I don’t blame people for washing their hands of some folks for whom, clearly, the least scintilla of acknowledgment is a crucifixion. Some of these folks are just going to have to go through some things.

I guess my takeaway this morning is that we need specialist knowledge, and we need people who are practitioners of it, and we need those handshake moments without which we cannot integrate our lives as we’ve lived them so far with what comes next. It’s an uncanny valley, and the tone colors are amazing. Meet me here.


So that time-dishonored topic of self-insert characters has rolled around on the Birdie App again, and I had Opinions:

“I suspect this scorn for authorial “self-insert” has leached into the water supply from the early psychologists and New Critics, who liked to tout the “objectivity” of high art as against art that draws on personal narrative, i.e., what women were doing at the time.

This got mixed in with the whole "Mary Sue" Disk Horse and cemented "self-insert characters" as a benchmark sign of bad writing.

It's bullshit at the root. As better people than I have pointed out, it's easy to seem "objective" when your POV is already dominant enough to be widely understood.

But the point of creating stories is to speak to some truth. Everyone with a functioning human instrument can do that; it's just that our culture wants to pretend that only some of us are worth hearing stories from.

I once wrote half a million words centering on what I called a "Mary Sue on purpose" — but my so-called "self-insert" character quickly took on a life of her own, which is as it should be.

I’m not writing any overtly self-modelled characters right now, but I reserve the right to if I goddamn want. So there.”

Originally tweeted by L. Inman (@linman) on January 31, 2021.

This is a topic Erica and I revisit occasionally: how we make characters out of our own soul-stuff, how we spin a creation from the ephemera of our minds. All characters are, as I quoted above, made of the author in one way or another — modelled, acted out, mimed, wept out of our own tears. You can’t “insert” anything into a story of your making, even a simulacrum of your own self for metacommentary’s sake.

Yet there are these hidden rules of criticism like bear traps in the path, that the reader is obliged to guess what parts of your story are biographical, and your job is to make the guessing very difficult. But, as it turns out, it’s always pretty easy to guess — wrong.

Oh, certainly, a better-written story is seamless in its elements, and nothing feels manufactured or out of place. But: the rules are a lie. You don’t have to guess the author’s biography, and there are no prizes for guessing right. Cynicism is not the opposite of naivety; the trajectory away from naivety goes in an entirely different direction.

I find it kind of telling that an author like John Scalzi, who is a Notorious Feminist Patsy Ally, is being tagged here for “bad” writing that is associated with the “bad” writing of women. We all know that women can’t come up with fiction that isn’t based on their own meretricious lives, amirite? But it’s different when F. Scott Fitzgerald does it.

Which is not to say that we don’t occasionally run across a story in which the id of the author is painfully obvious. It’s just that I don’t think that kind of discernment is useful as a critical apparatus — or at least, not as a primary driver of criticism. In that sense, the project of the New Critics was a worthwhile project. It’s just that they started out with a lot of begged questions, and that doesn’t do the reading public any favors.

We need a new new criticism, for these factionalist times.

On Killing Characters: or, the parabolic functions of SFF

It’s almost inevitable that at some point in a project, a writer shakes out the Evil Author cap, dons it, and puts a character to torture or to death. I’ve known and read plenty of Evil Authors through the years, and claimed the label myself on occasion: usually it’s with a slight deprecating laugh, like when disclaiming one’s internet search history. How long does it take a stab wound to close? Asking for a friend.

Was it readers who first started the Evil Author moniker, or did writers start calling themselves that in reflexive self-defense? Impossible to say, but that in itself underlines that Evil Authorship is usually conceived in terms of the relationship between writers and their readers. (“You killed Major Blue! How could you??”) In an age in which readers have almost immediate access to authors on social media and via email (and authors use those media to seek new readers), this dynamic is often the opposite of abstract and hypothetical. It’s a prominent feature of a very real landscape; but it isn’t exactly anything new.

All this is by way of saying that I hadn’t given much thought to the matter for a while. Then I ran across a tweet thread that gave me to think:

(Once again catching up on old topics now that my site is back up. NB: some database capabilities remain offline until the site is migrated to the big server being set up by my web host. If you have a subscription it should then be restored. When I’m in my new server home I hope to implement some expansions. If this blog is Relevant to Your Interests, perhaps you’d like to subscribe to a regular newsletter. I toyed with starting one but then 2020 happened. Anyway, back to the topic.)

What’s it really like to kill a character? What is that process? I have heard some testimony from other writers, but ultimately I can only speak for myself. When I conceived the story that would become Ryswyck some years ago, some structural framework was immediately apparent, and none of it really surprised me because I knew what kind of story I like to tell myself.

If any given writer has their own narrative preoccupations, mine have been apparent for a while. I’ve always been fascinated with the dynamics of forgiveness — what it’s really like to deal with a wrong done you by someone who matters; what it’s really like to be that person who did the wrong; what it’s like when the person who wronged you isn’t sorry, or doesn’t know enough to be sorry, or is committed to other priorities. What kind of things actually happen in the mind and heart when trying to cope with a wrong. What that might mean for the restoration of human dignity to people who were robbed of it.

Still, although I’d tortured plenty of characters in the service of my preoccupying narrative, I hadn’t killed any that I recall. Yet as the proto-structure of Ryswyck emerged, the death of a particular character was there from the beginning, and the real question in my mind was whether I would actually use his POV in the story. (He insisted.) The day I wrote the scene in which he was killed, I felt tired and drained, but mainly from hard work. Emotionally I felt firmly satisfied: I thought the scene was solid, and the story still what I wanted to tell.

No, it was killing a different character altogether that gave me trepidation. Here was an ordinary, likable supporting character, bluff, sensible, inoffensive. And one afternoon, between the writing of one early chapter and the next, I realized there was a storyline in which he was not only killed but tortured first. The more I thought about it the more it made horrible sense: how he matched up to a foil character, how he could act as a catalyst for the endgame, how thematically appropriate his end would be, how parabolic not just for my future readers but for the other characters. I was going to do it.

I did the same work: laid the same foreshadowing, traced the same thematic touches, made sure that an appearance from my foil character in the narrative was followed by him being onstage, or vice versa. When I wrote the scene in which he was killed, I felt all the same tired satisfaction at good work well done. But I also IMed my betas: “I need a drink.”

Another tweet I can’t currently find has crossed my ken recently, something to the effect that instead of asking writers why they built a non-sexist fantasy world, why we don’t ask other writers why they built a sexist one. And fair play to that; we don’t want to give sexist tropes a pass. But it’s hard for me to imagine a non-sexist fantasy world not being remarkable: because it is remarkable when compared with ours. Of all the genres of storytelling, SFF is the most specifically parabolic; from “The Cold Equations” to Ancillary Justice, from The Blazing World to Frankenstein to The Inheritance Trilogy, when we tell these stories we are all but explicitly measuring the moral curve between the world of that story and our own.

A parable isn’t deterred by the prospect of unsettling its audience. In fact, it would happily afflict the comfortable as well as comfort the afflicted. This is so deeply embedded in our understanding of the genre that in order to get away with using sexist and racist tropes these days, the writers of them try to re-identify who the afflicted and the comfortable in our world actually are. That the worlds are to be thrown side by side is never in question. The only question is what the ambition of the author is. What effect on our world are they aiming for?

I suppose that’s why, although I killed a lot of characters in Ryswyck, I was never less disposed to plume myself with the Evil Author epithet. An Evil Author might aim to make readers howl, but she isn’t out to mend the world with her song. I was out to imbue my characters with the power to bear witness, in life and death alike. Not to mention tell a cracking good story.

But I still probably need to disclaim my internet search history.

Artifacts of the Internet Old: the digital dig

Part I of an as-yet undetermined series.

Further to my last post, Gretchen McCulloch’s book got me thinking about the history of my own experiences getting on the internet. I’ve been threatening for a while to do posts about the lexicon of Fandom Olds, but McCulloch’s book made me realize that for this era when the internet is new, the time at which you got on it is itself an artifact worth examining — worth, even, recording for the benefit of people who study these things.

Before John Keats died in 1821, he wrote his own epitaph, which his friends duly inscribed on his gravestone: Here lies one whose name was writ in water. It reflected his conviction that the fame he had achieved in his short life would not endure. Ironically, of course, Keats’s name and work turned out to be far more durable than his perception: people thought enough of his poetry to canonize it, thought enough of his letters to save and collect them, thought enough of his life to write biographies of it, and 200 years later you can take a class at nearly any university covering Keats as a subject in himself, or together with his set, or as an indispensable part of a survey of the Romantic literary period.

But in our own era — with the rise of the internet and the informal writing we use to navigate it — our usernames are writ in ether, and that can be a bug or a feature depending on when we joined the online world.

McCulloch divides the internet generations by adoption rather than age. Old Internet People are the early adopters, the techies and people with specialist interests who used bulletin boards, Usenet, and listservs as their platforms when they joined the internet. Full (and Semi) Internet People joined a bit later, in the late 90s and early 2000s, using blogs, LiveJournal, MySpace, GeoCities, and the like to create their web presence. The teens and young adults joining the internet now don’t remember a time when there was no internet; their first platforms were Facebook, Twitter, and YouTube, or later Snapchat, Instagram, and WhatsApp.

The fact that new platforms (together with their new practical uses) are continually rising to replace the old ones means that there is a lot of digital archeology building up. Some of it can be dug — and some of it can’t.

My first job out of graduate school was a temporary gig as a library tech working in the manuscripts and special collections department of my alma mater. I was also working on a novel at the time, and I was aware that the emails and AIM chats I was exchanging with my friends about the project, not to mention the chapter files themselves (stored on a handful of 3 1/2-inch floppy disks maddeningly subject to random corruption), were far less easy to archive than the 20th-century manuscripts and correspondence I was handling at work. I made a haphazard effort to print out a lot of these, sometimes cutting and pasting chats into Word documents, but I wasn’t very thorough. I don’t know where that file is now, and as I recall it is very thin compared to the virtual reams of communication that died with my defunct AOL and Earthlink accounts. (I should probably archive my Yahoo email account, now that I think about it; but I never think about it.)

I identify myself as one of the Full Internet People McCulloch describes, because I am by nature a late adopter of new technology and new platforms, delaying to join until a critical mass of my acquaintances have already done so. I wouldn’t have known online fandom existed, much less gone looking for it, if a graduate school friend hadn’t liked a series I recced to her well enough to find a listserv for it.

But the fact that what she found was a listserv, and the fact that most of the people I met there were early adopter types who were already versed in BBS and Usenet, already had their own websites, were getting into the brand-new craze of blogging (“Blog — it’s a web log! Geddit?”), means that everything I learned about the internet I learned from Old Internet People. I learned enough HTML to code my own GeoCities website, followed my online friends to LiveJournal and learned to use Photoshop so I could make icons, absorbed enough CSS to tinker with the theme I was using, and occasionally joined chats for multi-person discussions.

More than that, I was in continual engagement with people who were older than I was, both online and off. Most people my age were not participating in fandoms or hanging out in chatrooms; they were launching careers and starting families. If it seems weird now that one had to choose between the two, what I was doing seemed even weirder to the people I knew offline. To my older friends — fellow members of my religious community, coworkers, friends’ parents — I was wasting vast tracts of time communicating with people who would never be proper friends, about things that were by definition ephemeral. I wrote half a million words of fanfiction when I could have been writing original work of my own.

Bearing battle scars from arguing my case against this offline disapproval, I find it incredibly odd now that the internet — and fandom with it — are ubiquitous in “real” life, as if these arguments had never happened at all. You can hear phrases like “spoiler alert!” on the radio or television, and nobody is confused. News isn’t just discussed on Twitter, it happens there. My mother is on Facebook.

It’s my lifetime — not the lifetime of my parents, and not the lifetime of young people now — that has seen the full effect of the internet as a new and massive accelerant of change. When I was a college freshman, you checked your email by going down the hall to a small room of terminals in your dorm, typed “vax1” into a command prompt on a green screen, then put in your username and password. When I was a graduate student four years later, webmail came in, and I finally had an email with an @ sign and domain name, and accessed it via a browser. Ten years later, when my brother started at the same school, my university had graduated to using a Gmail client, and he probably had built-in DSL, too.

It sounds like I’m singing the old song “when I was in school we walked uphill both ways, over broken glass!, etc.” — but that’s not what joining the internet was like then. We knew it was new. We knew it was an innovation. We built our mental ships to take those waves, and willingly charted the new reaches of online communication. And, maugre the opinions of my offline relations, my online friendships are the ones that have lasted longest: they were formed from the start to withstand physical separations and other vicissitudes that make intimacy hard in the modern era. Erica, my longest-standing beta and the one who gave me McCulloch’s book, I met on that first listserv in 1998. (Or thereabouts; we didn’t really get close till after we’d both joined the LordPeter list, so the details are hazy.)

So if this were an episode of Time Team, consider this Trench 1 in my digital dig. We know what kind of site we’re on; next, we’ll see what kind of finds we get.