The average person spends about two hours a day on social media platforms. If that sounds like a lot, it’s nothing compared to the nine daily hours the average teenager devotes to social media. As a society we’re becoming more and more addicted to posting, tweeting, viewing, responding, opining, and sharing online—and what’s going to come of it all?
The debate rages on about whether Facebook and other social media platforms are good or bad for humanity, and whether they should be held responsible for the actions of their users or third-party advertisers.
These are some of the questions Roger McNamee and Meighan Stone discussed in a panel moderated by David Kirkpatrick at Singularity University’s Global Summit last week. Roger McNamee is a venture capitalist and investor; he was an advisor to Mark Zuckerberg and an early investor in Facebook. Meighan Stone is a senior fellow in the Women and Foreign Policy Program at the Council on Foreign Relations. And David Kirkpatrick is the founder and editor in chief of Techonomy and the author of the 2010 book The Facebook Effect: The Inside Story of the Company That Is Connecting the World.
The Ills of Social Media
For decades, Silicon Valley was something of a golden child, with almost every technology coming out of the enclave receiving enthusiasm and praise. Over the last decade, though, that mindset has reversed, and the Valley’s reputation has declined dramatically, with the biggest tech companies now widely cast as downright evil.
How about finding a middle ground between these two ends of the spectrum? Stone called for a more moderate view of tech as a whole, and of social media specifically. “We need to be careful not to have an absolutist kind of argument,” she said. “I think we need to get out of this dichotomy. It’s a gray area, right? Guess what? Just like all of us. Just like people.” Social media is a tool and is ultimately what we make of it.
However, the panelists all noted that what’s good for social media and tech companies is often at odds with what’s good for society. Others have made the same argument with growing outrage as more bad actors are uncovered exploiting the platforms. A recent Vanity Fair article by Nick Bilton put it simply: “For Twitter, fewer jerks mean less revenue.”
Writing in the Washington Monthly, McNamee pointed out, “Thanks to the US government’s laissez-faire approach to regulation, the internet platforms were able to pursue business strategies that would not have been allowed in prior decades.” Gathering data without users’ consent and amassing a huge portion of the market share are two glaring examples.
Foremost in the public consciousness today is the meddling in the 2016 US presidential election by Russian political entities. Many of the ads posted on Facebook would not have been allowed to air on television, and the panelists argued that Facebook is complicit in the damage to our democratic process.
In Myanmar, the stakes are even higher. Making an impassioned argument for greater oversight, Stone explained that Facebook was used to spread hate speech during the genocide of Rohingya Muslims; in a country with no free press, this propaganda went unchecked on Facebook and led to riots.
But this doesn’t mean social media platforms are inherently negative. Stone acknowledged that in the 1994 Rwandan genocide, people spread hate speech and sowed discord over the radio. This ugly side of humanity has always been around; platforms like Facebook are simply another outlet for it, and, given their reach, possibly an even more harmful one.
What Can Be Done?
Kirkpatrick shifted the focus away from assigning blame and towards being proactive. “What should we do?” he asked. “Let’s go deeper into that.”
Enforcing the terms of service companies already have in place would be a good start, said Stone. Often, safeguards exist but are simply disregarded. She noted that Alex Jones, the conspiracy theorist who engages in hate speech, was flagrantly violating Twitter’s rules for quite some time before he was suspended. However, Twitter CEO and cofounder Jack Dorsey and others have argued that Jones had not, in fact, violated Twitter’s rules.
But what are Twitter’s rules? Stone stressed the importance of social media platforms clearly defining their policies. And, she added, “We need to support journalists who are kind of our modern-day muckrakers, who are out there saying, ‘Wait a minute, you’re not even enforcing your own rules.’”
The panelists also agreed that social media platforms should be subject to government regulation. This might take the form of a statute of limitations on the use of consumer data or banning bots that impersonate humans. After all, Stone pointed out, we readily accept regulation for cars and transportation. Cars themselves are neutral; they can be a major convenience to our lives or they can be used as a weapon. We have laws in place—everything from drivers’ licenses to vehicle emissions tests—to try to make cars safe. Likewise, social media platforms are inherently neutral, but with controls and safeguards they could be used for good more than for ill.
McNamee noted that the lean startup model turned out to have limits, and platforms like Facebook weren’t meant to scale to their current size and reach. “I think antitrust law is the most pro-growth form of regulatory intervention,” he said. “We all are here because the AT&T 1956 consent decree took the transistor and put it in the public domain, and there has never been a technology antitrust action that didn’t leave both the industry and target company better off than they were before.”
More generally, McNamee said, there are four separate areas to look at with respect to social media platforms: democracy and election security, the effect on individual psychology and mental health, privacy, and the monopoly power held by a few enormous companies. He continued, “You need to essentially do one of two things. You either need to change the business model, or you need to have lots and lots of heavyweight regulation for each of the four failure modes that are going on now.”
The panelists agreed that governments, journalists, boards of directors, and, most of all, ordinary citizens must become activists for ethical behavior, both on and off social media. This is easier said than done; it seems pretty harmless, after all, to scroll through your feeds and click “Like” now and then, or to share your latest photos and opinions with your friends. But social media has already seeped into crevices of our lives where it may not belong, and tempering its power could start with individual users being more aware of those boundaries.
Image Credit: Julia Tim / Shutterstock.com