wasted money on a cab again, but the Museum of the Moving Image is all the way in Astoria and I woke up too late and hungover to navigate the buses and trains to Queens. It’s one of those white, sterile modern buildings, both inside and out, very iOS—“futurist” in that way that’s always doomed to feel dated when people remember that the promise of technology doesn’t preclude the use of color. Even the font on the MoMI logo—custom-made for the museum by Icelandic-German-NYC firm Karlssonwilker, no less—was created to fit the retro-futurist theme. In the designers’ own words:

“Once inside you are enveloped in angular whiteness and digital projections, the whole thing reminiscent of a cartoon imperial destroyer.”

That’s a 40-year-old reference to Star Wars, folks. And why is our vision of the future hopelessly démodé these days? My guess is that the realities of climate change make it difficult to imagine anything beyond either dystopia or nostalgia. At best we can slap a little vintage sci-fi lacquer over austere, “clean” interiors, pinning the little cartoon robots of our parents’ childhood onto Helvetica hairshirts. The stagnation in design aesthetics reminded me of the Gramsci line:

“The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of morbid symptoms appear.”

In the midst of a splitting headache, I managed to mentally pat myself on the back for remembering a relevant text, and being the self-congratulatory millennial asshole that I am, I literally bought myself a cookie at the museum gift shop as a reward. As if to corroborate my dime-store design analysis, the cookie was shaped like Pac-Man (1980). It was dry and tasteless.

Theorizing the Web is now in its seventh year, its fifth in New York since moving from the University of Maryland. It’s billed as an “inter- and non-disciplinary annual conference that brings together scholars, journalists, artists, activists, and technology practitioners to think conceptually and critically about the interrelationships between the Web and society.” I couldn’t draw much of a bead on the target audience from such a generic cattle call. I had anticipated a crowd of bloodless, unerotic Silicon Valley expats, disruption in their eyes and Soylent pumping through their veins, but the quiet, shuffling flock of conference-goers exuded a distinctly MFA vibe. Lots of sprezzatura, “effortless” frayed denim and chunky heeled mules—you know the look, this season’s Gallery Girl uniform. And of course there was the timeless art school sea of monochrome black. Repetitive fashions aside, attendance was more diverse than I anticipated with regard to gender, age, and race. Really, it looked as if the only underrepresented demographic was The Homely.

Style observations aside, the young, NYC-centric coterie of New Media Theory isn’t merely a generational subculture; they represent a distinct ideological schism from their libertarian counterparts in techno-optimist Silicon Valley. Though the politics of an event like Theorizing the Web tend to be fairly meandering and inchoate, gone is the Occupy-era techno-utopianism of a wild web frontier, and certainly any fantasy of the inherent “democratizing nature” of the internet. Of course, you’d have to be delusional to still believe in Liberation by Internet. Because a record of every injustice, from man-made environmental disasters to police violence, proliferates online to exponential redundancy, no reasonable or observant person can any longer argue that interconnectivity is the key to a Better World. In the wake of this disorienting disappointment, no dominant political ethos yet pervades the Theorizing the Web set, but the academia-borne political culture of Twitter and Tumblr abounds.

For example, one panelist began her presentation with a “land acknowledgment,” a concept I was familiar with despite not having heard the exact neologism. After issuing a trigger and content warning (not uncommon at TTW), the speaker prefaced her presentation on surveillance and police violence with the following statement: “I wish to acknowledge the Native land on which we are gathered here today. The state of New York and the area surrounding the Museum of the Moving Image is traditional Iroquois and Six Nations Territory of which is part of 12 nations and tribes of the area, amongst many others which make up the whole of North America, traditionally known as Turtle Island.”

It was a strange little prayer to deliver to this particular crowd, none of whom I would wager were unaware of the indigenous genocides of the Americas. But as is the case with social media, even dating back to its primitive roots on message boards, the language produced by these platforms favors a kind of ritualistic set of grammars that is extremely well-suited to communicating fellowship and shared values, arguably sometimes at the expense of more compelling or challenging dialogue.

All photos by Aaron Thompson, http://www.aaronthompson.photo/

At the same time, other panels dealing with seemingly shallower subjects pushed for compelling theoretical work. I didn’t expect to get much out of the discussion on “Selfie Feminism”—I’m just generally skeptical of the political utility of vanity and self-care, as indulgent in both as I may be. One speaker presented her study of an online fashion game, where her results essentially reasserted what Kenneth and Mamie Clark had already illustrated in 1939 with their famous doll experiment: that white hegemony produces a decidedly racist beauty standard. Another woman discussed the self-produced softcore porn of Instagram, but I zoned out slightly, unable to concentrate on the repetitive slides of T and A. It’s not that I find porn monotonous per se, but the surfeit and uniformity of Instagram beauty gets old very quickly. Daisies are lovely, and varied from flower to flower, but when gazing at a field of them they become less of a pageant and more of a texture. I only really got annoyed when talk turned to that insidious spectre of The Male Gaze, which feminists are generally assured they neither need nor want, as if they somehow have a choice in the matter. My head was spinning with the sort of antagonistic questions I was too sick to ask without sounding bitchy: What’s wrong with women performing for male attention? Is not the indignity of performance innate to the social aspect of human sexuality? Is it even possible for sexuality to exist in a state of pure solipsism? And if so, is that even a desirable relationship to the body? Aren’t we all at times just putting on a show so that people will want to fuck us? Why are the lights so bright in here? Does anyone have any aspirin? Is there a bar nearby?

As if to answer my questions (at least the first few), the final speaker led with a refreshing Margaret Atwood quote:

“Male fantasies, male fantasies, is everything run by male fantasies? Up on a pedestal or down on your knees, it’s all a male fantasy: that you’re strong enough to take what they dish out, or else too weak to do anything about it. Even pretending you aren’t catering to male fantasies is a male fantasy: pretending you’re unseen, pretending you have a life of your own, that you can wash your feet and comb your hair unconscious of the ever-present watcher peering through the keyhole, peering through the keyhole in your own head, if nowhere else. You are a woman with a man inside watching a woman. You are your own voyeur.”

What followed was a rousing call to disinvest ourselves from the myopic and individualistic analysis of feminine content, to desist from reading tea leaves in atomized incidences of a larger social phenomenon (as if every selfie contained a secret feminist message to be decoded). She argued viewers should instead focus on the mediums and means by which women actually produce this content, bolstering her argument with some slick Susan Sontag and “Whoa Dude” photography theory.

But then came the Q and A, and a woman asked that old, inescapable classic: can I be a feminist and still shave my legs? I went to the bathroom to throw up. “When I am Kommissar,” I thought to myself between heaves, “there will be a ten-year moratorium on discussing the feminist implications of epilation.”

Of course, it’s the unfortunate tendency of all intellectuals to overthink the trivial and mistake the symptomatic for the causative, often exposing the boundaries and limitations of their particular school of thought (except for Marxism of course, which is always appropriate and correct). New Media Theory is no different. On one panel, a journalist brilliantly dissected the racist ramifications of the sharing economy; on another, there was a presentation on the political nuance of “race-bending” in online fan fiction (Theorizing the Web is the sort of event where a speaker wouldn’t be expected to spend any time defending the relevance of such a topic, which is… something). One speaker analyzed the implications for the DSM when delusional paranoia is promulgated online; then there was the woman nearly in tears of rage over a gender-neutral chatbot. Thankfully in decline seems to be the toothpaste-and-orange-juice trend of pairing pseudo-populist pop culture with opaque cultural theory—I sniffed out barely any “What would Deleuze and Guattari think of Kylie Jenner” type panels. I’m cautiously optimistic for the death of that particular party-trick-for-people-who-go-to-corny-parties. (The New Inquiry is dead, long live The New Inquiry.)

Of course Post-Internet Theory suffers from a lot of the same pomo pop culture pomposity that academia does, and of course it’s mediated heavily by the social atmosphere of woke campus activist politics. And obnoxiously, both of these tendencies are replete with tedious jargon, some of which overlaps (like saying “bodies” instead of “people,” for example). The use of this jargon often indicates more about the speaker than it does the subject being spoken of: academic slang is deployed to show that you’re Smart, while activist slang is deployed to show that you’re Good. And whenever either breed of jargon is employed to excess, you can bet it’s there to mask a lack of content. But amidst all the Small Liberal Arts lingo and critical theory pretensions, there are some brilliant and dynamic thinkers who believe it is an intellectual’s job to elucidate a complex world, rather than mire it further in bullshit (either obscurantist or sanctimonious). I mean, I followed that selfie feminist chick on Instagram, and what better endorsement is there, in this day and age?

As with most professional writers, my entire career clings like a barnacle to the unwieldy garbage barge that is the Internet. However, I had the distinct impression I was approaching Theorizing the Web as an outsider, and not only because “tech” and “conference” are two of my least favorite words. I like Evgeny Morozov as much as the next Marxist who only reads one tech theorist, but I’ve always been a late adopter to every technology and platform, and I was even more out of the loop than usual due to a recent exit from social media. Grand speeches on one’s exodus from online are the nadir of media onanism, so I left no note, but my reasons for logging off were fairly banal.

The truth is that my already mercurial attention span was suffering from too much screen time, and as I found it more and more difficult to concentrate, I realized social media wasn’t even really fun or interesting to me anymore. Twitter wasn’t making me laugh the way it used to, and scrolling through one petty non-event after another—usually cynical, careerist media spats masquerading as “the discourse”—left me bored and bleary. I figured if I was watching the timeline out of habit and resignation I might as well deactivate. Facebook was also taxing; the interaction there is more intimate, and therefore more repellent to me. Said goodbye to Zuckerberg after reading yet another status denouncing “someone who shall remain unnamed (but I think you all know who it is).” I have Snapchat, but I spend more time reading about its hyper-inflated stock bubble than I do actually using it. The camera is nice, but the filters are far too twee for my tastes. I’m also too egotistical to spend time on ephemeral media. Why would I create something that disappears as soon as it’s viewed? I want everything I produce to be engraved onto the face of a mountain, preferably one that children are forced to visit on field trips. As a result of my own conceit, I have never actually sent a Snap. That left only my Instagram, which I had set to private for years. Feeling admittedly sort of “disconnected” (from what I cannot say), I actually set it to public for a few months, but then I got sick of people trying to discuss politics with my selfie, so I deactivated. I held out the longest on Instagram because it was the least intimate platform; I used it to quietly capture the little moments of my life that I thought I might like to rifle through one day, and (in defiance of doctrinaire selfie feminism) so I could remember that some people think I’m attractive. Now I just wear shorter skirts when I go out.

When I got home from Theorizing the Web, a nice man lay stoned in my bed scrolling through Twitter. He informed me that another online social justice “personality” had recently fallen from public favor, arguably for being an insufferable asshole. The nice man slurred at me dejectedly, as if from rote, “The internet has brought to the fore the most wretched and contemptible aspects of human life, and I fear it, because any form of human progress requires some minimum level of fellowship.” He’s not wrong, but there is no moving backwards now, and I’m strangely heartened that there are people who sit around thinking about this stuff, people who are neither techno-utopian nor Luddite. Neither of those scenarios is a real future; they’re both just nostalgic fantasies. As for being relatively offline myself, I think I will stay here; every experiment needs a control, and in some ways I have an advantage as an outsider. When there is bullshit, I am more apt to declare the Emperor naked. When there is brilliance, I still make a decent observer, like a precocious chimp among a gathering of Jane Goodalls. It’s not a bad feeling.