Yesterday we talked about how a recent social media conflict between journalists and the tech world might be better framed as a conflict between managers and their employees. Today I want to look at that conflict from another angle — how it played out on the buzzy, audio-only, invite-only social network Clubhouse. Like many social startups before it, the company neglected to develop or enforce strong community guidelines before launch — an oversight that could derail a startup valued at $100 million while still in private beta.
Let me acknowledge up front that Clubhouse is barely a few months old, and currently has just two full-time employees — its founders, Paul Davison and Rohan Seth. I’ve known Davison for about seven years, and have always found him fun to talk to. He’s charming, he’s had multiple wild visions about what the future can look like, and he has repeatedly convinced venture capitalists to part with millions of dollars so that he can build it.
But one of the core principles of The Interface says this: “Most tech CEOs are intelligent, kind, hard-working people who want to make the world a better place, and this is largely beside the point.” And so this is not a column about Davison’s intentions, which I assume to be good. Instead, it’s about the way he has built products to date — and the gap between that style and the way I think modern social networks ought to be built.
Let’s start at Pinterest. In the summer of 2016, that company had hired the team behind Highlight, a boldly invasive app that broadcast your name, photo and other information to other users in hopes of introducing you to strangers. Highlight was led by Davison, a former Googler who had an expansive vision for transforming what previous generations would have regarded as privacy invasions into products. “If you don’t push things a little bit, you miss opportunities,” Davison told me in 2013. “Fifteen years ago, it would be crazy to post your resume online. This is new territory we’re figuring out.”
Highlight never got traction. Neither did the company’s next boundary-pushing effort, Shorts, which invited you to share your camera roll with friends and friends of friends. “If you look at the most interesting and loved and useful social products over the last 20 years, you’ll find that lots of them have pushed us to share a little more openly than perhaps we felt comfortable doing,” Davison told me about that one.
Once he was acqui-hired by Pinterest, Davison took on a refreshingly mundane challenge: taking over the development of “tried it” pins, a feature that lets users post photos of the activities they’ve completed related to Pinterest posts. If you find a recipe for a cake on Pinterest and make it, for example, the feature lets you post your version in a thread attached to the original pin. The feature was in testing the summer that Davison joined, and he oversaw its development until it was released in November.
When it was released, though, there was a problem: the feature was not connected to the systems that screen content for pornography, harassment, and other violations of Pinterest’s content policies. As a result, Pinterest saw a surge in pornographic content uploaded to the service, two former employees told me. “About one out of every dozen photos uploaded was a penis for a good while,” one told me.
Pinterest told me the problem was fixed shortly after launch. Through a spokesperson, Davison declined to comment.
But to one former employee I spoke with, the lapse was emblematic of an overly laissez-faire attitude to content moderation on Davison’s part. “His entire perspective was always to push for, how do we get users to expose more data in the product?” the former employee said. “User trust and safety was completely an afterthought.”
All of that feels like necessary context for understanding how Clubhouse found itself at the center of a now much-discussed conflict between New York Times reporter (and friend of The Interface) Taylor Lorenz and the investor Balaji Srinivasan. When Lorenz joined a conversation about herself in the app — one in which she would eventually be accused of playing “the woman card” in complaining about harassment she was receiving on Twitter and elsewhere — she could not have reported it even if she wanted to.
The reason is that Clubhouse does not allow users to report harassment or other violations of its terms of service through the app. And Lorenz, who wrote an enthusiastic early profile of the app in May, told me she has been besieged by Clubhouse trolls. The app offers no ability to block users, and so some users have changed their profile pictures to photos of Lorenz’s antagonists to taunt her while she uses the app. Screenshots of beta tester forums that I obtained show users begging Clubhouse’s founders to, among other things, write comprehensive community guidelines. (Its published terms of service are largely just legal boilerplate.)
“Writing up community rules to include expected behaviors, actions and giving people a place to appeal is super important,” one woman wrote in the private user forums. “It’s just as important to enforce these actions including timeouts / re-education and suspension when warranted. I don’t think Taylor’s incident is going to be the last, unfortunately.”
Davison called Lorenz to discuss the harassment she had faced, she told me, and asked her for suggestions about what Clubhouse could do. She offered several, including banning people who harass other users; none have so far been implemented. Lorenz told me she felt disappointed when Davison went on to like a tweet that read, “Honestly in this whole Taylor vs Balaji S., Clubhouse won.”
During my reporting, I’ve also heard from Clubhouse users who have reminded me, in exasperated fashion, that the app is currently in a closed beta. Traditionally, the invitation-only stage of a social app has been used to build the exact systems these users are now clamoring for. A two-person startup that goes from idea to a $100 million valuation within a few weeks has countless problems to worry about, Clubhouse supporters tell me. The founders also give out their email addresses to users, and respond to many of their complaints personally.
At the same time, we’ve seen enough social networks come and go that we now understand the consequences of making content moderation an afterthought. Ask Reddit, which only a few weeks ago got around to explicitly banning hate speech — years after nurturing communities of racists, nonconsensual porn distributors, and other blights on the internet.
And for Clubhouse, moderation issues promise to be particularly difficult — if the app is ever to escape closed beta successfully, they will require sustained attention and likely some product innovation. Tatiana Estévez, who worked on moderation efforts at the question-and-answer site Quora, outlined Clubhouse’s challenges in a Twitter thread.
Audio is fast and fluid; will Clubhouse record it so that moderators can review bad interactions later? In an ephemeral medium, how will Clubhouse determine whether users have a bad pattern of behavior? And can Clubhouse do anything to bring balance to the age-old problem of men interrupting women?
“Is this impossible? Probably not,” Estévez wrote. “But in my experience, moderation and culture have to be a huge priority for both the founding team as well as for the community as a whole.”
Moderation does not appear to have been a huge priority at Highlight, or at Shorts, or on the team that built the “tried it” feature at Pinterest. If Clubhouse is to live up to the potential its investors clearly see in it, its builders should consider making it one, and soon.