A new social-media platform wants to enforce “kindness.” Can that ever work?

October 7, 2020

It’s also extremely vague and subjective, though, especially when protecting some people can mean criticizing others. Questioning a certain point of view, even in a way that seems critical or unkind, can sometimes be necessary. Those who commit microaggressions should be told what’s wrong with their actions. Someone repeating a slur should be confronted and educated. In a volatile US election year that has seen a racial reckoning unlike anything since the 1970s and a once-in-a-generation pandemic made worse by misinformation, perhaps being kind is no longer enough.

How will Telepath thread this needle? That will fall to its in-house content moderation team, whose job will be to police “kindness” on the platform.

It won’t necessarily be easy. We’ve only recently begun to understand how traumatic content moderation can be, thanks to a series of articles from Casey Newton, formerly at The Verge, which exposed the sweatshop-like conditions confronting moderators who work on contract at minimum wage. Even if these workers were paid better, the content many deal with is undeniably devastating. “Society is still figuring out how to make content moderation manageable for humans,” Matias says.

When asked about these issues, Estevez emphasizes that at Telepath content moderation will be “holistic” and the work is meant to be a career. “We’re not looking to have people do this for a few months and go,” she says. “I don’t have big concerns.”

Telepath’s organizers believe that the invite-only model will help in this regard (the platform currently supports approximately 3,000 people). “By stifling growth to some extent, we’re going to make it better for ourselves,” says Estevez.

But the model presents another issue. “One of the pervasive problems that many social platforms that have launched in the US have had is a problem with diversity,” Matias says. “If they start with a group of users that are not diverse, then cultures can build up in the network that are unwelcoming and in some cases hostile to marginalized people.”

That fear of a hostile in-group culture is well founded. Clubhouse, an audio-first social-media app used by many with Silicon Valley ties, was launched to critical acclaim earlier this year, only to devolve into the type of misogynistic vitriol that has seeped into every corner of the internet. Just last week, it came under fire for anti-Semitism.

So far, Telepath has been dominated by Silicon Valley types, journalists, and others with spheres of influence outside the app. It’s not a diverse crowd, and Estevez says the team recognizes that. “It’s not just about inviting people; it’s not just about inviting women and Black people,” she says. “It’s so that they [women and Black people] have a good experience, so they see other women and Black people and are not getting mansplained to or getting microaggressions.”

That is a tricky balance. On the one hand, maintaining an invite-only community of like-minded members allows Telepath to control the number of posts and members it must keep an eye on. But that type of environment can also become an echo chamber that doesn’t challenge norms, defeating the purpose of conversation in the first place and potentially offering a hostile reception to outsiders.

Tan, the early adopter, says that the people she’s interacting with certainly fall into a type: they’re left-leaning and tech-y. “The first people who were using it were coming from Marc [Bodnick] or Richard [Henry]’s networks,” she says, referring to the cofounders. “It tends to be a lot of tech people.” Tan says the app’s conversations are wide-ranging, though, and she has been “pleasantly surprised” at the depth of discussion.

“Any social-media site can be an echo chamber, depending on who you follow,” she adds.

Telepath is ultimately in a tug-of-war: Is it possible to encourage lively-yet-decent debate on a platform without seeing it devolve into harassment? Most users assume that being online involves taking a certain amount of abuse, particularly if you’re a woman or from a marginalized group. Ideally, that doesn’t have to be the case.

“They need to be committed, that this isn’t just lip service about ‘being kind,’” Citron says. “Folks often roll out products in beta form and then think about harm, but then it’s too late. That’s unfortunately the story of the internet.” So far, but perhaps it doesn’t have to be.
