Social Media
Tech leaders discuss how social media is broken and what we can do about it
Toxic culture, deadly conspiracies and organized hate have exploded online in recent years. At TechCrunch Sessions: Justice we talked with Color of Change’s Rashad Robinson, Accountable Tech’s Jesse Lehrich and Naj Austin of Somewhere Good and Ethel’s Club about how much responsibility social networks bear for the rise of these phenomena, and how to build healthy online communities that make society better, not worse.
On building intentional social spaces that cultivate positive behavior
When we think of social networks, we think of the huge platforms that dominate that conversation today. But those aren’t the only models for how digital communities can grow. Alternative social networks designed with specific communities in mind could cultivate healthier, more positive experiences for the people who need those spaces most.
Austin: One of the aspects that was really important to us when we got started was size. And so one of the factors of Somewhere Good is kind of connecting people with smaller, more intimate communities, because people tend to behave better in smaller networks. One of the things that we use in our product meetings to kind of base a lot of our decisions off of is mimicking a real-life dinner party. Who would you invite to that party? How do people interact when they’re in those kinds of smaller, intimate spaces? It tends to, again, be a lot more aligned with better behavior. And then that’s, you know, that’s our perspective in terms of the many ways in which social networks can sort of be more user friendly in a multitude of ways. (Timestamp: 1:26)
“… Even with Ethel’s Club, the physical location in Brooklyn, we built it out, so that it necessitated smaller, more intimate interactions with our members. And so we took that same little nugget of knowledge and used it digitally and Somewhere Good is now our technology platform that allows us to build that out on a much larger, faster scale, but still at that level of intention that people want to be connected in a more intimate manner with one another.” (Timestamp: 2:49)
On tech industry exceptionalism
Tech, in spite of its historic wealth and power, lobbying efforts, addictive products and global effects, would never liken itself to something like the oil and gas or tobacco industries. But now that we’re seeing some of the society-wide ills Silicon Valley has sown, tech’s exceptionalism is looking more misguided than ever.
Robinson: I do think that in Silicon Valley, there’s… a tendency for people to be maybe a lot more impressed with themselves and impressed with their politics and impressed with their sort of like, ability to believe in something bigger…
Sometimes it’s a lot easier to deal with a Coca-Cola where like I go in, and I know that these people think that they’re making soda, not making like a new society for all of us. And I can like, deal with the impacts of something that they’re doing. And we can all be on the same page, because they don’t think that they should get a Nobel Peace Prize. But in the tech world, people think that they can code, that we can code our way out of structural racism, when in fact the code is just amplifying structural racism. (Timestamp: 8:15)
On lessons about online hate and disinformation from the Clinton campaign
In 2021’s pandemic-ravaged world, 2016 feels like a world away, but Donald Trump’s successful campaign for president, the Russian disinformation scandal and an open embrace of white supremacy in mainstream politics gave a telling glimpse of what was to come in the next four years — and beyond.
Lehrich: Did I see QAnon coming? Probably not. But did I see this, like, horrifying crawl toward more and more explicit, you know, racism that’s always been there, obviously, but had been sort of at least relegated to the corner… You know, like, out-and-out racists, proud to be out in the open, being part of these kinds of communities? That, we definitely saw on the horizon.
And I don’t think we necessarily grappled with it the right way. I don’t know, it was a really challenging thing to try to navigate at the time, but I very much felt like something really ugly is coming. And I definitely poured every hour of my waking time during the year and a half into that campaign, in part because I was like, terrified of where the country was headed if we didn’t win the election, and everything and more that I shared has definitely come to fruition. (Timestamp: 13:07)
On how social platforms mirrored entrenched racism in society
Like more traditional sectors, the tech industry is dominated by white men who have created wealth by building what they know. The industry may be ahistorical by definition, but without looking back and grappling with the ugly side of societal power structures like misogyny and white supremacy, tech is doomed to perpetuate those same inequities at scale.
Austin: I don’t think the issues we’re seeing come from the internet and social platforms being free. I think it’s deeply embedded systemic problems, as Rashad mentioned, that have haunted this country since day zero. I think the people that are creating these platforms are creating what they know, which is to create patriarchal, you know, systems that live within platforms that look glossy, and have illustrations and use fun, human-centric language, but at the heart of it don’t make space for what marginalized communities feel on those platforms. And so I think that’s just embedded in almost every platform we use. As Rashad mentioned, we’ve got… the fact that [Zoom] didn’t recognize that there are people who want to purely cause terror to Black people and people of color, for fun — it is a big issue. (Timestamp: 16:35)
On how lived experience influences design
One way to build social networks that allow a diverse range of communities to flourish is to have those people in the room to begin with, building together on day one. Imagining online social spaces that feel safe and enriching is a natural process when your community has had to deal with online hate and harassment everywhere else for years.
Austin: How do you get in front of these issues that my team, which is composed of Black people, Latino people, Asian American people, queer people, non-binary people, experiences every single moment of being online, and sort of a larger explanation of that, right? That can be Tumblr, it can be Twitter, it can be almost every platform. All the things that we’ve lived through, we are saying: What if we didn’t have to? (Timestamp: 16:35)
On how the government should hold social platforms accountable
Federal and state governments are interested in cracking down on tech’s outsized power for the first time. A good place to start might be regulating tech the way we already regulate issues like food safety.
Robinson: We need some sort of CFPB or FDA version of infrastructure at the government level. Because anyone who has watched some of the hearings knows that hearings on nuclear power, or even keeping our milk safe, would look the same if we didn’t actually have government infrastructure. The people we elect to Congress or the Senate can’t be experts on all these issues.
And so part of having that infrastructure is the same way I talked about having buildings that meet code. It’s not because our elected officials are experts. It’s because we build the infrastructure at the government level, and we actually have fines and accountability at scale to make sure that there are consequences. (Timestamp: 22:19)
The big challenge in regulating big tech companies is that they’ve become so large and so powerful that financial punishments can’t even make a dent at this point. Meaningful change needs to realign the industry’s incentives by examining the structures that allowed these companies to grow so powerful with no oversight to begin with.
Lehrich: The fundamental incentive structure around large social media platforms right now is so perverse, they are incentivized to amplify the most toxic content, disinformation, hate speech — like that stuff drives engagement. And so long as they have platforms where they have no accountability… talking about Section 230 reform, there’s no way to hold them liable from a legal standpoint, and the FTC is not hitting them with fines that really hurt them. There’s just no friction anywhere.
… Until we fundamentally, fundamentally upend that incentive structure, they’re gonna continue profiting off hate and distortion and deceit and delusion and discrimination. And that’s just the reality. (Timestamp: 24:39)
You can read the entire transcript here.