In one of my first acting classes at college, we performed a Stanislavski exercise. “You’ll come up to the front of the class and perform an activity,” the professor said. “Something mundane. Something you do every day.” One classmate brushed her teeth. Another did her makeup. Another, a guy, walked to the stage wearing nothing but a pair of boxer shorts and proceeded to get dressed. When my turn came, I sat down and painted my toenails, perched on a chair with my knee bent up under my chin. I was on my second-smallest toe when the brush slipped, and I swore.
When my performance was over, my professor asked: “Would you have done that if you were painting your nails in your room alone?”
“Yes,” I said.
But the truth was, I wasn’t sure — and 20 years later, I still think of this moment every time the brush slips when I’m painting my toenails and I say “god damn it”. I wonder: am I authentically a person who swears when she messes up her pedicure? Or have I become one because, all those years ago, I said I was?
I thought of that moment, too, while reading Emily Bootle’s new book, This Is Not Who I Am, an examination of authenticity in the digital age, which attempts to suss out the “realness” of everything from politics to personal brands. One early definition of authenticity, offered in the book’s introduction, is drawn from Rousseau: that you must not only be who you are but be seen to be who you are. But surely being seen, especially if you’re trying to be, influences your behaviour? If you dance (or paint your toenails) like no one is watching but in full awareness that an audience does, in fact, exist, where does authenticity end and performance begin?
This was a difficult enough question to answer back when authenticity was synonymous with the mundane — when those “They’re Just Like Us!” photo spreads of celebrities running errands or exiting the gym wearing sweatpants were the realest thing out there. It’s even more difficult in an era when authenticity is increasingly synonymous with tragedy, when being real requires nothing less than eviscerating your own trauma in public so that everyone can see the suffering. That well-known meme, “pics or it didn’t happen”, implies a constant burden of proof, a sort of Turing test whereby you prove your humanity by posting. The irony: there remain circumstances when whipping out a camera to post the moment is decidedly inhuman.
One of the most disturbing things I saw amid the posting-mad days of the pandemic, now thankfully deleted, was a selfie from a woman standing in a hospital room, the camera angled so that the patient, her father, was clearly visible in the bed over her shoulder. His eyes were closed, his mouth was open, and not only was it unclear whether he was merely dying or already dead, it was hard to say which of those would have been worse. “Overexposure to this type of image diminishes their power to induce revery,” Bootle writes in the chapter titled “Celebrity”. She’s talking about Beyoncé’s pregnancy photo shoot, but she could just as easily be talking about this. This is the content we’re most rewarded for sharing: the Botticelli-style pregnancy photo shoot or the deathbed selfie (but not much in between).
Having become synonymous with the ugliest moments of life, the notion of authenticity intersects with another term, relatability, which is equally hard to pin down. Bootle notes that it is “overwhelmingly used to describe a generic self-sabotaging personality that leans towards extremes”:
“It is relatable to lie in bed all day; it is also relatable to try to cram too much into the day. It is relatable to run out of money… it is relatable to spend a lot of money on something frivolous but not on something sensible. It is relatable to have a panic attack. It is relatable to eat burgers; it is relatable to forget to eat. It is not relatable to resist sleeping with your ex, and it is not relatable to have a gym membership… It is not relatable to get married. Relatability, in short, suggests mess.”
The question, then, is what happens when we start thinking of our mess as something to be posted about instead of something we need to clean up.
For every person who quietly recoiled from the sight of that hospital room selfie, from the gaping mouth of the dying man looming like a black hole over the picture-taker’s shoulder, another punched the retweet button and praised her bravery, her realness, in sharing that moment. This is perhaps unsurprising — it’s widely known by now that negative posts receive more engagement online because they make people feel good — but less studied is the peculiar alchemy whereby positive posts can get the same sort of traction if they make people feel bad.
Last month, a woman who posted about the pleasure of spending mornings drinking coffee in her garden with her husband became Twitter’s main character of the day. Her authentic expression of joy drew outraged demands that she check her privilege. When a person is seen to be who they are, but who they are is a happy and well-adjusted adult, we don’t quite know what to do with it. We don’t like it. We don’t trust it. And if you won’t show us the mess lurking under the surface of your seemingly happy life, we’ll just have to make one for you.
This toxic breed of engagement isn’t unique to Twitter, but it does seem to find its purest expression on that platform. Recently, on a friend’s podcast, I listened while he spoke ruefully about the behaviour that Twitter incentivises, comparing the site to a drug from which he wished some external force would cut him off. But I wondered how much we can get away with blaming the platforms for the behaviour of the people on them — and said as much. My friend, a sweet and mild-mannered guy in real life, has an entirely different persona on Twitter: pompous, caustic, bullying, openly antagonistic. Surely this is at least partly a choice, I said.
My friend conceded that Twitter does not bring out the best in him. Then he told me that personal responsibility is a Right-wing talking point — which is, ironically, exactly the kind of argument that would play maximally well on Twitter. Does this manifestation of extremely online silliness in a non-online dialogue just go to show the extent to which Twitter has rewired our brains, just as that acting class might, I sometimes think, have rewired mine? Or is it simply revealing what was already, authentically, there?
Bootle asks questions like this without ultimately offering answers. She concludes: “If any self we put out into the world is subject to the same analysis and questioning, to the same rules of performance, is being authentic materially different from being inauthentic?” As a means of understanding who we are and how we live in the present, then, her book is inscrutable — but I don’t think that’s the book’s fault. Perhaps the only conclusion that can be drawn about authenticity is that it’s become a useless term.
But the emptying-out of how we gauge “realness” has very real consequences. When Elon Musk took over Twitter, he announced his intent to upend the old system that delineated verified versus unverified accounts. Overnight, the site’s blue check was transformed from a sort of “trusted source” badge into a commodity anyone could buy — and within hours, Twitter was overrun with accounts touting “inauthentic” blue ticks tweeting all manner of nonsense and wreaking all manner of havoc. When you consider that this is where some people get their news, where some influential people form their opinions, where political movements have been known to begin: authenticity is more than a question of what your average social media user feels able to share with a small circle of friends online.
But as for the average social media user’s search for authenticity, it seems fundamentally self-defeating. The most recent great hope was the app BeReal, which encourages the sharing of in-the-moment mundanity during an arbitrary two-minute window each day. This is understood to be the nearest thing available to a spontaneous and honest glimpse of a person’s life, yet it is achieved only through the most rigid of constructs, the strictest rules — which, of course, self-conscious and digitally savvy users are already figuring out how to manipulate (and sometimes using to go viral on other, more performative platforms). BeReal might be the app on which posts share the closest relationship with the truth, but it is not the truth — any more than brushing your teeth or painting your toenails in an acting class, for an audience, is precisely the same as the real thing.
And even if you manage to draw back the curtain on your real life so that the internet can see everything, the act of drawing it back is inherently, well, unreal. Erasing the barrier between our inner and public-facing selves doesn’t reveal our authentic selves; it creates a third self, something uncanny, like those mirror-image photos that show what you would look like if both halves of your face were perfectly symmetrical.
Obviously, we all contain multitudes; obviously, we all make context-dependent choices to display some aspects of our inner selves while withholding others. Obviously, it is fully possible for a person to be a terror on Twitter, a doting cat dad on Instagram, a glib interlocutor in political debate, and an absolute peach to have a beer with. But which of these personas is real, or most real? All of them? Or none?
Source: UnHerd. Read the original article here: https://unherd.com/