Amy Klobuchar: No More Blind Trust in Big Tech
If you have a Facebook account, your data brought Facebook $51 last quarter. That’s because even though Facebook presents itself as a free service, it uses its platform to gather personal data and sell targeted ads, turning its own users into profit centers—and our lack of data privacy laws helps its bottom line. In fact, the company brings in twice as much money from its American users as it does from users in other countries with more stringent protections.
It’s not just Facebook’s use of personal data that makes it dangerous, it’s the lengths to which the company will go to keep users online. Frances Haugen, the Facebook whistleblower, showed us how its algorithms are designed to promote content that incites the most reaction. Facebook knows that the more time users spend on its platforms, the more data it can collect and monetize. Time and time again, it has put profits over people.
For too long, social media companies essentially have been saying “trust us, we’ve got this,” but that time of blind trust is coming to an end.
While my colleagues on both sides of the aisle are committed to reform, these are complicated problems. We have to come at this from multiple angles—starting with data privacy. We need to make sure that Americans can control how their data gets collected and used. When Apple gave its users the option to have their data tracked or not, more than 75 percent declined to opt in. That says something. We need a national privacy law now.
We also know that one-third of kids ages 7 to 9 use social media apps, so we need stronger laws to protect them online. Right now, American kids are being overwhelmed by harmful content—and we don’t have nearly enough information about what social media companies are doing with their data. We can’t let companies put their profits above the well-being of children.
One reason Facebook can get away with this behavior is because it knows consumers don’t have alternatives. In CEO Mark Zuckerberg’s own words in a 2008 email, “It is better to buy than compete.” Who knows what user-friendly privacy protections competitors like Instagram could have developed if Facebook hadn’t purchased them? To protect competition in the digital marketplace, we have to update our antitrust and competition laws and make sure the enforcement agencies have the resources to do their jobs.
Finally, we need transparency—and action—on the algorithms that govern so much of our lives. Between Facebook’s role in promoting health misinformation during the pandemic and Instagram’s directing kids to accounts that glorify eating disorders, it’s clear that online algorithms can lead to real-world harm. Congress has to look at how harmful content is amplified.
We know from experience that these companies will keep milking users for profits until Congress steps in. Now is the time to act, and we have bipartisan momentum to stop just admiring the problem and finally do something about it.
Ms. Klobuchar, a Democrat, is a U.S. Senator from Minnesota.
Facebook CEO Mark Zuckerberg testifies before Congress on April 10.
Nick Clegg: Facebook Can’t Do It Alone
The debate around social media has changed dramatically in a short time. These technologies were once hailed as a liberating force: a means for people to express themselves, to keep in touch without the barriers of time and distance, and to build communities with like-minded souls.
The pendulum has now swung from utopianism to pessimism. There is increasing anxiety about social media’s impact on everything from privacy and well-being to politics and competition. This is understandable. We’re living through a period of division and disruption. It is natural to ask if social media is the cause of society’s ills or a mirror reflecting them.
Social media turns traditional top-down control of information on its head. People can make themselves heard directly, which is both empowering and disruptive. While some paint social media, and Facebook in particular, as bad for society, I believe the reverse is true. Giving people tools to express themselves remains a huge net benefit, because empowered individuals sustain vibrant societies.
Of course, with billions of people using our apps, all the good, bad and ugly of life is on display, which brings difficult dilemmas for social media companies—like where to draw the line between free expression and unsavory content, or between privacy and public safety.
Undoubtedly, we have a heavy responsibility to design products in a way that is mindful of their impact on society. But this is a young industry, and that impact isn’t always clear. That is why Facebook, now called Meta, conducts the sort of research reported on in The Wall Street Journal’s Facebook Files series. We want to better understand how our services affect people, so we can improve them. But if companies that conduct this sort of research—whether internally or with external researchers—are condemned for doing so, the natural response will be to stop. Do we want to incentivize companies to retreat from their responsibilities?
I think most reasonable people would acknowledge that social media is being held responsible for issues that run much deeper in society. Many of the dilemmas Facebook and Instagram face are too important to be left to private companies to resolve alone. That is why we’ve been advocating for new regulations for several years.
If one good thing comes out of this, I hope it is that lawmakers take this opportunity to act. Congress could start by creating a new digital regulator. It could write a comprehensive federal privacy law. It could reform Section 230 of the Communications Decency Act and require large companies like Meta to show that they comply with best practices for countering illegal content. It could clarify how platforms can or should share data for research purposes. And it could bring greater transparency to algorithmic systems.
Social media isn’t going to go away. If Facebook didn’t exist, these issues wouldn’t magically disappear. We need to bring the pendulum to rest and find consensus. Both tech companies and lawmakers need to do their part to preserve the best of the internet and protect against the worst.
Mr. Clegg is the Vice President for Global Affairs at Meta.
Frances Haugen, former Facebook employee and whistleblower, testifies to a Senate committee on Oct. 5
Clay Shirky: Slow It Down and Make It Smaller
We know how to fix social media. We’ve always known. We were complaining about it when it got worse, so we remember what it was like when it was better. We need to make it smaller and slow it down.
The spread of social media vastly increased how many people any of us can reach with a single photo, video or bit of writing. When we look at who people connect to on social networks—mostly friends, unsurprisingly—the scale of immediate connections seems manageable. But the imperative to turn individual offerings, mostly shared with friends, into viral sensations creates an incentive for social media platforms, and especially Facebook, to amplify bits of content well beyond any friend group.
We’re all potential celebrities now, where anything we say could spread well beyond the group we said it to, an effect that the social media scholar Danah Boyd has called “context collapse.” And once we’re all potential celebrities, some people will respond to the incentives to reach that audience—hot takes, dangerous stunts, fake news, miracle cures, the whole panoply of lies and grift we now behold.
The inhuman scale at which the internet assembles audiences for casually produced material is made worse by the rising speed of viral content. As the behavioral economist Daniel Kahneman observed, human thinking comes in two flavors: fast and slow. Emotions are fast, and deliberation is slow.
The obvious corollary is that the faster content moves, the likelier it is to be borne on the winds of emotional reaction, with any deliberation coming after it has spread, if at all. The spread of smartphones and push notifications has created a whole ecosystem of URGENT! messages, things we are exhorted to amplify by passing them along: Like if you agree, share if you very much agree.
Social media is better, for individuals and for the social fabric, if the groups it assembles are smaller, and if the speed at which content moves through it is slower. Some of this is already happening, as people vote with their feet (well, fingers) to join various group chats, whether via SMS, Slack or Discord.
We know that scale and speed make people crazy. We’ve known this since before the web was invented. Users are increasingly aware that our largest social media platforms are harmful and that their addictive nature makes some sort of coordinated action imperative.
It’s just not clear where that action might come from. Self-regulation is ineffective, and the political arena is too polarized to agree on any such restrictions. There are only two remaining scenarios: regulation from the executive branch or a continuation of the status quo, with only minor changes. Neither of those responses is ideal, but given that even a global pandemic does not seem to have galvanized bipartisanship, it’s hard to see any other set of practical options.
Mr. Shirky is Vice Provost for Educational Technologies at New York University and the author of “Cognitive Surplus: Creativity and Generosity in a Connected Age.”
Nicholas Carr: Social Media Should Be Treated Like Broadcasting
The problems unleashed by social media, and the country’s inability to address them, point to something deeper: Americans’ loss of a sense of the common good. Lacking any shared standard for assessing social media content, we’ve ceded control over that content to social media companies. Which is like asking pimps to regulate prostitution.
We weren’t always so paralyzed. A hundred years ago, the arrival of radio broadcasting brought an upheaval similar to the one we face today. Suddenly, a single voice could speak to all Americans at once, in their kitchens and living rooms. Recognizing the new medium’s power to shape thoughts and stir emotions, people worried about misinformation, media bias, information monopolies, and an erosion of civility and prudence.
The government responded by convening conferences, under the auspices of the Commerce Department, that brought together lawmakers, engineers, radio executives and representatives of the listening public. The wide-ranging discussions led to landmark legislation: the Radio Act of 1927 and the Communications Act of 1934.
These laws defined broadcasting as a privilege, not a right. They required radio stations (and, later, television stations) to operate in ways that furthered not just their own private interests but also “the public interest, convenience, and necessity.” Broadcasters that ignored the strictures risked losing their licenses.
Applying the public interest standard was always messy and contentious, as democratic processes tend to be, and it raised hard questions about freedom of speech and government overreach. But the courts, recognizing broadcasting’s “uniquely pervasive presence in the lives of all Americans,” as the Supreme Court put it in a 1978 ruling, repeatedly backed up the people’s right to have a say in what was beamed into their homes.
If we’re to solve today’s problems with social media, we first need to acknowledge that companies like Facebook, Google, and Twitter are not technology companies, as they like to present themselves. They’re broadcasters. Indeed, thanks to the omnipresence of smartphones and media apps, they’re probably the most influential broadcasters the world has ever known.
Like radio and TV stations before them, social media companies have a civic responsibility and should be required to serve the public interest. They need to be accountable, ethically and legally, for the information they broadcast, whatever its source.
The Communications Decency Act of 1996 included a provision, known as Section 230, that has up to now prevented social media companies from being held liable for the material they circulate. When that law passed, no one knew that a small number of big companies would come to wield control over much of the news and information that flows through online channels. It wasn’t clear that the internet would become a broadcasting medium.
Now that that is clear, often painfully so, Section 230 needs to be repealed. Then a new regulatory framework, based on the venerable public interest standard, can be put into place.
The past offers a path forward. But unless Americans can rise above their disagreements and recognize their shared stake in a common good, it will remain a path untaken.
Mr. Carr is an author and a visiting professor of sociology at Williams. His article “How to Fix Social Media” appears in the current issue of The New Atlantis.
In the early days of radio, Congress passed laws requiring broadcasters to serve the public interest.
Sherry Turkle: We Also Need to Change Ourselves
Recent revelations by The Wall Street Journal and a whistleblower before Congress showed that Facebook is fully aware of the damaging effects of its services. The company’s algorithms put the highest value on keeping people on the system, which is most easily accomplished by engaging users with inflammatory content and keeping them siloed with those who share their views. As for Instagram, it encourages users (with the most devastating effect on adolescent girls) to curate online versions of themselves that are happier, sexier and more self-confident than who they really are, often at a high cost to their mental health.
But none of this was a surprise. We’ve known about these harms for over a decade. Facebook simply seemed too big to fail. We accepted the obvious damage it was doing with a certain passivity. Americans suffered from a fallacy in reasoning: Since many of us grew up with the internet, we thought that the internet was all grown up. In fact, it was in its early adolescence, ready for us to shape. We didn’t step up to that challenge. Now we have our chance.
In the aftermath of the pandemic, Americans are asking new questions about what is important and what we want to change. This much is certain: Social media is broken. It should charge us for its services so that it doesn’t have to sell user data or titillate and deceive to stay in business. It needs to accept responsibility as a news delivery system and be held accountable if it disseminates lies. That engagement algorithm is dangerous for democracy: It’s not good to keep people hooked with anger.
But changing social media is not enough. We need to change ourselves. Facebook knows how to keep us glued to our phones; now we need to learn how to be comfortable with solitude. If we can’t find meaning within ourselves, we are more likely to turn to Facebook’s siloed worlds to bolster our fragile sense of self. But good citizenship requires practice with disagreement. We lose out when we don’t take the time to listen to each other, especially to those who are not like us. We need to learn again to tolerate difference and disagreement.
We also need to change our image of what disagreement can look like. It isn’t online bullying. Go instead to the idea of slowing down to hear someone else’s point of view. Go to images of empathy. Begin a conversation, not with the assumption that you know how someone feels but with radical humility: I don’t know how you feel, but I’m listening. I’m committed to learning.
Empathy accepts that there may be profound disagreement among family, friends and neighbors. Empathy is difficult. It’s not about being conflict-averse. It implies a willingness to get in there, own the conflict and learn how to fight fair. We need to change social media to change ourselves.
Ms. Turkle is the Abby Rockefeller Mauzé Professor of the Social Studies of Science and Technology at MIT and the author, most recently, of “The Empathy Diaries.”
Worries About Children, Democracy and Free Speech
In a recent poll, Americans expressed a range of concerns about the effects and power of social media.
PERCENTAGE OF RESPONDENTS WHO AGREE THAT:
We don't yet know enough about the long-term impact of social media on children's health and well-being: 83%
We don't yet know enough about the impact social media will have on our democracy: 74%
More government regulation is needed to address online content that is harmful or offensive: 74%
More government regulation is needed to ensure free speech on online platforms: 73%
PERCENTAGE OF RESPONDENTS WHO SAY IT IS A PRIORITY TO:
Hold social media companies accountable for the content on their platforms: 67%
Break up big tech companies: 49%
Source: Benenson Strategy Group Future of Technology Commission survey of 2,106 registered voters in the U.S. conducted July 20-29, 2021
Josh Hawley: Too Much Power in Too Few Hands
What’s wrong with social media? One thing above all—hyper-concentration. Big Tech companies have no accountability. They harm kids and design their data-hungry services to be addictive. They censor speech with abandon and almost always without explanation. Unlike phone companies, they can kick you off for any reason or none at all. The world would be a better place without Facebook and Twitter.
All these pathologies are designed and enabled by one principal problem: centralized control. Take Facebook. Three billion people use the Facebook suite of apps, yet just one person wields final authority over everything. Google, ultimately controlled by just two people, is no better.
Concentrated control of social media aggravates all other problems because it deprives users of the competition that could provide solutions. Consider content moderation and privacy. As Dina Srinivasan has shown in her influential research, social media companies used to have to compete on these metrics. Then Facebook got big. Now, switching from Facebook to Instagram still leaves you under Mark Zuckerberg’s control. There is no other social media option with comparable reach.
No single person should control that much speech. We ought to retain the benefits of a large communications network but dismantle centralized control over it. Imagine a Facebook where you can use an algorithm other than the one that Mr. Zuckerberg designs. Everybody gets the benefit of the large network, but nobody suffers the harm of centralized control. Or imagine a world where Mr. Zuckerberg can’t unilaterally kick you off the largest digital communications platform on the planet. Phone companies don’t get to deny service to law-abiding Americans. Neither should Big Tech.
Decentralizing social media can be accomplished in a few steps. First, social media companies must become interoperable. Just as you can call somebody who uses a different wireless carrier, you should be able to contact people on Facebook by using a different social media provider.
Second, as Justice Clarence Thomas recently pointed out, courts have grossly distorted Section 230 of the Communications Decency Act to protect tech companies from their own bad acts. There’s no accountability for bad algorithms, excessive data collection or addictive features. There’s no accountability for harming kids.
In all other industries, the prospect of liability helps to hold the powerful responsible and makes obtaining concentrated market power more difficult, but Section 230 now is a perpetual get-out-of-jail-free card. Today’s robber barons, in a company motto coined by Mr. Zuckerberg, get to “move fast and break things”—reaping the profits and sticking us with the bill.
Finally, we must update antitrust laws to prevent platforms from throttling innovation by buying and killing potential competitors. A world in which Facebook never acquired Instagram would look very different.
To fix social media, break up centralized authority and help regular Americans take back control over their lives.
Mr. Hawley, a Republican, is a U.S. Senator from Missouri.
YouTube, Facebook and Instagram are the social media apps that Americans use most.
David French: Government Control of Speech? No Thanks
The government should leave social media alone. For any problem of social media you name, a government solution is more likely to exacerbate it than to solve it, with secondary effects that we will not like.
Take the challenge of online misinformation and censorship. Broadly speaking, the American left desires a greater degree of government censorship to protect Americans from themselves. The American right also wants a greater degree of government intervention—but to protect conservatives from Big Tech progressives. They want to force companies to give conservatives a platform.
But adopting either approach is a bad idea. It would not only involve granting the government more power over private political speech—crossing a traditional red line in First Amendment jurisprudence—it would also re-create all the flaws of current moderation regimes, but at governmental scale.
Instead of one powerful person running Facebook, another powerful person running Twitter and other powerful people running Google, Reddit, TikTok and other big sites and apps, we’d have one powerful public entity in charge—at a moment when government is among the least-trusted institutions in American life.
American history teaches us that we do not want the government defining “misinformation.” Our national story is replete with chapters where the government used its awesome power to distort, suppress and twist the truth—and that’s when it knows what’s true.
Our history also teaches us that the government is never free of partisanship and that if those in office have the power to suppress their political opponents, they will. Grant the Biden administration the power to regulate social media moderation and you hand that same power to the next presidential administration—one you may not like.
How many tech panics must we endure before we understand that the problem with society is less our technology than our flawed humanity? Social media did not exist in April 1861 when a Confederate cannonade opened the Civil War. Does social media truly divide us, or do we divide ourselves? Does social media truly deceive us, or do we deceive ourselves?
Social media is a two-edged sword. The same technology that connects old classmates and helps raise funds for gravely ill friends also provides angry Americans with instant access to public platforms to vent, rage and lie. Social media puts human nature on blast. It amplifies who we are.
But so did the printing press—and radio and television. Each of these expressive technologies was disruptive in its own way, and none of those disruptions were “solved” by government action. In fact, none of them were solved at all. And thank goodness for that. Our nation has become greater and better by extending the sphere of liberty, not by contracting it. It’s the task of a free people to exercise that liberty responsibly, not to beg the government to save us from ourselves.
Mr. French is senior editor of the Dispatch and the author of “Divided We Fall: America’s Secession Threat and How to Restore Our Nation.”
A supporter of then-President Donald Trump at a rally in Beverly Hills Gardens, Calif., on January 9, after he was banned from Twitter and suspended from Facebook.
Renee DiResta: Circuit Breakers to Encourage Reflection
Social media is frictionless. It has been designed to remove as many barriers to creation and participation as possible. It delivers stories to us when they “trend” and recommends communities to us without our searching for them. Ordinary users help to shape narratives that reach millions simply by engaging with what is curated for them—sharing, liking and retweeting content into the feeds of others. These signals communicate back to the platform what we’re interested in, and the cycle continues: We shape the system, and it shapes us.
The curation algorithms that drive this experience were not originally designed to filter out false or misleading content. Recently, platforms have begun to recognize that algorithms can nudge users in harmful directions—toward inflammatory conspiracy theories, for instance, or anti-vaccination campaigns. The platforms have begun to take action to reduce the visibility of harmful content in recommendations. But they have run up against the networked communities that have formed around that content and continue to demand and amplify it. With little transparency and no oversight of the platforms’ interventions, outsiders can’t really determine how effective they have been, which is why regulation is needed.
The problem with social media, however, is not solely the algorithms. As users, we participate in amplifying what is curated for us, clicking a button to propel it along—often more by reflex than reflection, because a headline or snippet resonates with our pre-existing biases. And while many people use the platforms’ features in beneficial ways, small numbers of malign actors have a disproportionate effect in spreading disinformation or organizing harassment mobs.
What if the design for sharing features were reworked to put users into a more reflective frame of mind? One way to accomplish this is to add friction to the system. Twitter, for instance, found positive effects when it asked users if they’d like to read an article before retweeting it. WhatsApp restricted the ability to mass-forward messages, and Facebook recently did the same for sharing to multiple groups at once.
Small tweaks of this nature could make a big difference. Nudges could be programmed to pop up in response to keywords commonly used in harassing speech, for example. And instead of attempting to fact-check viral stories long after they’ve broken loose, circuit breakers within the curation algorithm could slow down certain categories of content that showed early signs of going viral. This would buy time for a quick check to determine what the content is, whether it’s reputable or malicious and—for certain narrow categories in which false information has high potential to cause harm—if it is accurate.
Circuit breakers are commonplace in other fields, such as finance and journalism. Reputable newsrooms don’t simply get a tip and tweet out a story. They take the time to follow a reporting process to ensure accuracy. Adding friction to social media has the potential to slow the spread of content that is manipulative and harmful, even as regulators sort out more substantive oversight.
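The circuit-breaker idea can be made concrete with a brief sketch. Everything here is hypothetical: the class name, the share-velocity metric, and the thresholds are illustrative choices, not a description of any platform's actual system. The sketch flags an item for slowed distribution when its shares within a sliding time window exceed a limit, buying time for review.

```python
from collections import deque
import time

class ViralCircuitBreaker:
    """Hypothetical sketch: pause amplification of items whose share
    velocity exceeds a threshold, pending a quick editorial check."""

    def __init__(self, max_shares_per_window=100, window_seconds=60):
        self.max_shares = max_shares_per_window
        self.window = window_seconds
        self.share_times = {}  # item_id -> deque of share timestamps

    def _prune(self, q, now):
        # Drop shares that have fallen out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()

    def record_share(self, item_id, now=None):
        now = time.time() if now is None else now
        q = self.share_times.setdefault(item_id, deque())
        q.append(now)
        self._prune(q, now)
        return len(q)  # current share count within the window

    def should_slow_down(self, item_id, now=None):
        """True when the item shows early signs of going viral."""
        now = time.time() if now is None else now
        q = self.share_times.get(item_id, deque())
        self._prune(q, now)
        return len(q) > self.max_shares
```

A real system would layer on the distinctions the essay describes, slowing only narrow, high-harm content categories rather than everything that trends, but the core mechanism is just this kind of rate check.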
Ms. DiResta is the technical research manager of the Stanford Internet Observatory.
Jaron Lanier: Topple the New Gods of Data
When we speak of social media, what are we talking about? Is it the broad idea of people connecting over the internet, keeping track of old friends or sharing funny videos? Or is it the business model that has come to dominate those activities, as implemented by Facebook and a few other companies?
Tech companies have dominated the definition because of the phenomenon known as network effects: The more connected a system is, the more likely it is to produce winner-take-all outcomes. Facebook took all.
The domination is so great that we forget alternatives are possible. There is a wonderful new generation of researchers and critics concerned with problems like damage to teen girls and incitement of racist violence, and their work is indispensable. If all we had to talk about was the more general idea of possible forms of social media, then their work would be what’s needed to improve things.
Unfortunately, what we need to talk about is the dominant business model. This model spews out horrible incentives to make people meaner and crazier. Incentives run the world more than laws, regulations, critiques or the ideas of researchers.
The current incentives are to “engage” people as much as possible, which means triggering the “lizard brain” and fight-or-flight responses. People have always been a little paranoid, xenophobic, racist, neurotically vain, irritable, selfish and afraid. And yet putting people under the influence of engagement algorithms has managed to bring out even more of the worst of us.
Can we survive being under the ambient influence of behavior modification algorithms that make us stupider?
The business model that makes life worse is based on a particular ideology. This ideology holds that humans as we know ourselves are being replaced by something better that will be brought about by tech companies. Either we’ll become part of a giant collective organism run through algorithms, or artificial intelligence will soon be able to do most jobs, including running society, better than people. The overwhelming imperative is to create something like a universally Facebook-connected society or a giant artificial intelligence.
These “new gods” run on data, so as much data as possible must be gathered, and getting in the middle of human interactions is how you gather that data. If the process makes people crazy, that’s an acceptable price to pay.
The business model, not the algorithms, is also why people have to fear being put out of work by technology. If people were paid fairly for their contributions to algorithms and robots, then more tech would mean more jobs, but the ideology demands that people accept a creeping feeling of human obsolescence. After all, if data coming from people were valued, then it might seem like the big computation gods, like AI, were really just collaborations of people instead of new life forms. That would be a devastating blow to the tech ideology.
Facebook now proposes to change its name and to primarily pursue the “metaverse” instead of “social media,” but the only changes that fundamentally matter are in the business model, ideology and resulting incentives.
Mr. Lanier is a computer scientist and the author, most recently, of “Ten Arguments for Deleting Your Social Media Accounts Right Now.”
Popular Platforms
Percentage of respondents who say they have used the following social media:
YouTube: 81%
Facebook: 69%
Instagram: 40%
Pinterest: 31%
LinkedIn: 28%
Snapchat: 25%
Twitter: 23%
WhatsApp: 23%
TikTok: 21%
Reddit: 18%
Nextdoor: 13%
Source: Pew Research Center survey of 1,502 U.S. adults conducted Jan. 25-Feb. 8, 2021, with a margin of error of +/-2.9 percentage points.
Clive Thompson: Online Communities That Actually Work
Are there any digital communities that aren’t plagued by trolling, posturing and terrible behavior? Sure there are. In fact, there are quite a lot of online hubs where strangers talk all day long in a very civil fashion. But these aren’t the sites that we typically think of as social media, like Twitter, Facebook or YouTube. No, I’m thinking of the countless discussion boards and Discord servers devoted to hobbies or passions like fly fishing, cuisine, art, long-distance cycling or niche videogames.
I visit places like this pretty often in reporting on how people use digital tools, and whenever I check one out, I'm struck by how un-toxic they are. These days, we wonder a lot about why social networks go bad. But it's equally illuminating to ask about the ones that work well. These communities share one characteristic: They're small. Generally they have only a few hundred members, or maybe a couple thousand if they're really popular.
And smallness makes all the difference. First, these groups have a sense of cohesion. The members have joined specifically to talk to people with whom they share an enthusiasm. That creates a type of social glue, a context and a mutual respect that can’t exist on a highly public site like Twitter, where anyone can crash any public conversation.
Even more important, small groups typically have people who work to keep interactions civil. Sometimes this will be the forum organizer or an active, long-term participant. They’ll greet newcomers to make them feel welcome, draw out quiet people and defuse conflict when they see it emerge. Sometimes they’ll ban serious trolls. But what’s crucial is that these key members model good behavior, illustrating by example the community’s best standards. The internet thinkers Heather Gold, Kevin Marks and Deb Schultz put a name to this: “tummeling,” after the Yiddish “tummeler,” who keeps a party going.
None of these positive elements can exist in a massive, public social network, where millions of people can barge into each other’s spaces—as they do on Twitter, Facebook and YouTube. The single biggest problem facing social media is that our dominant networks are obsessed with scale. They want to utterly dominate their fields, so they can kill or absorb rivals and have the ad dollars to themselves. But scale breaks social relations.
Is there any way to mitigate this problem? I’ve never heard of any simple solution. Strong antitrust enforcement for the big networks would be useful, to encourage a greater array of rivals that truly compete with one another. But this likely wouldn’t fully solve the problem of scale, since many users crave scale too. Lusting after massive, global audiences, they will flock to whichever site offers the hugest. Many of the proposed remedies for social media, like increased moderation or modifications to legal liability, might help, but all leave intact the biggest problem of all: Bigness itself.
Mr. Thompson is a journalist who covers science and technology. He is the author, most recently, of “Coders: The Making of a New Tribe and the Remaking of the World.”
Research has increasingly shown that social media has negative effects on children and teenagers.
Chris Hughes: Controlled Competition Is the Way Forward
Frances Haugen’s testimony to Congress about Facebook earlier this month shook the world because she conveyed a simple message: The company knows its products can be deeply harmful to people and to democracy. Yet Facebook’s leadership charges right along. As if on cue, the same week, Facebook, Instagram and WhatsApp went completely dark for over five hours, illustrating how concentration creates single points of failure that jeopardize essential communications services.
At the root of Ms. Haugen's testimony and the service interruption that hundreds of millions experienced is the question of power. We cannot expect Facebook—or any private, corporate actor—just to do the right thing. Creating a single company with this much concentrated power makes our systems and society more vulnerable in the long term.
The good news is that we don’t have to reinvent the wheel to “fix” social media. We have a structure of controlled competition in place for other essential industries, and we need the same for social media.
Our approach should be grounded in the American antimonopoly tradition, which dates back to the start of our republic. Antimonopoly is bigger than just antitrust; it is a range of policy tools to rein in private power and create the kind of fair competition that meets public and private ends simultaneously. These can include breakups, interoperability requirements, agreements not to enter ancillary markets or pursue further integration, and public utility regulation, among others.
The most talked-about antimonopoly effort—breakup—is already under way. In one of his last major actions, Joe Simons, President Trump's chair of the Federal Trade Commission, sued to force Facebook to spin off Instagram and WhatsApp. But breaking up large tech companies isn't enough on its own. Requiring Facebook to split into three could make for a more toxically competitive environment with deeper levels of misinformation and emotional pain.
For social media in particular, competition needs to be structured and controlled to create safe environments for users. Antitrust action needs to be paired with a regulatory framework for social media that prevents a race to the bottom to attract more attention and controversy with a high social cost. Calls to ban targeted advertising or to get rid of algorithmic feeds are growing, including from former Facebook employees. These would go to the root problem of the attention economy and are consistent with the kind of public utility regulation we have done for some time.
At the core of this approach is a belief that private, corporate power, if left to its own devices, will cause unnecessary harm to Americans. We have agreed as a country that this is unacceptable. We structure many of our most essential industries—banking and finance, air transportation, and increasingly, health care—to ensure that they meet both public and private ends. We must do the same with social media.
Mr. Hughes, a co-founder of Facebook, is co-founder of the Economic Security Project and a senior advisor at the Roosevelt Institute.
Siva Vaidhyanathan: A Social Network That’s Too Big To Govern
Facebook and WhatsApp, the company’s instant messaging service, have been used to amplify sectarian violence in Myanmar, Sri Lanka, India and the U.S. Facebook’s irresponsible data-sharing policies enabled the Cambridge Analytica political scandal, and many teenage girls in the U.S. report that Instagram encourages self-harm and eating disorders. As bad as these phenomena are, they are really just severe weather events caused by a dangerous climate, which is Facebook itself.
Facebook is the most pervasive global media system the world has ever known. It will soon connect and surveil 3 billion of the 7.8 billion humans on earth, communicating in more than 100 languages. Those members all get some value from the service; some are dependent on it to run business or maintain family ties across oceans. But Facebook’s sheer scale should convince us that complaining that the company removed or did not remove a particular account or post is folly. The social network is too big to govern, and governing it effectively would mean unwinding what makes Facebook Facebook—the commitment to data-driven, exploitative, unrelenting, algorithmically guided growth.
As Facebook executive Andrew Bosworth declared in a 2018 internal memo, “The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.” In another section of the memo, Mr. Bosworth acknowledged that growth can have negative effects: “Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.”
But to Facebook’s executives, the company’s growth appears to matter more than public relations, the overall quality of human life and even the loss of life. Those are all just externalities that flow from the commitment to growth. Even profit is a secondary concern: Make Facebook grow and the money will take care of itself. Mark Zuckerberg truly seems to believe, against all evidence, that the more people use Facebook for more hours of the day, the better most of us will live.
Facebook lacks the incentive to change, and we lack methods to make it change. The scale of the threat is so far beyond anything we faced in the 19th and 20th centuries that reaching for the rusty tools through which we addressed corporate excess in the past—such as antitrust law and civil liability—is another sort of folly. Reforming Facebook requires restricting what feeds Facebook: the unregulated harvesting of our personal data and the ways the company leverages it.
Short of that, we are just chasing tornadoes and hurricanes, patching up the damage already done and praying the next storm holds off a while longer. The problem with Facebook, after all, is Facebook.
Mr. Vaidhyanathan is a professor of media studies at the University of Virginia and the author of “Antisocial Media: How Facebook Disconnects Us and Undermines Democracy.”
Amy Klobuchar: No More Blind Trust in Big Tech
November 2, 2021 | By: ingppoo | IT, Politics, Columns |
Last weekend, the Wall Street Journal published a collection of expert columns under the title "How to Fix Social Media." Here we present the piece by Senator Amy Klobuchar, who competed in last year's Democratic presidential primary. Earlier this year, Senator Klobuchar published a book laying out her antitrust policy agenda in detail, "Antitrust: Taking on Monopoly Power from the Gilded Age to the Digital Age."
Do you have a Facebook account? If so, you earned Facebook $51 last quarter. More precisely, your data did.
Facebook constantly advertises itself as a convenient service that all of its customers use for free, but everything people do on its platform leaves a trail of data, and Facebook sells targeted ads against that data for enormous profit. Every moment users spend on the platform makes Facebook money, and because privacy laws and rules on data collection are so weak, the company has been free to pocket those profits unchecked. American users earn Facebook twice as much money as users in countries with stricter laws and regulations—a measure of how few constraints platform operators in the U.S. face in collecting and exploiting customer data.
There is an even more serious problem than Facebook monetizing its customers' personal information: to keep users on its platform, Facebook has quite literally stopped at nothing, as whistleblower Frances Haugen's disclosures revealed last month. (See the Wall Street Journal's Facebook Files coverage.) According to the internal documents Haugen submitted to Congress, Facebook designed its algorithms to show people more of the content that provokes the most intense reactions. It did so because it knew that the longer people stay on the platform, the more data it can collect and the more money it can make. Again and again, Facebook has chosen its own profits over people.
"Trust us—we've got this."
That is what social media companies have been telling us for years. But the era of taking them at their word is over. They have already forfeited our trust.
In Congress, there is already broad bipartisan agreement that social media and Big Tech need reform. Of course, these are not simple problems. They form a tangle of interlocking issues, so we have to come at them from multiple angles—starting with data privacy. The guiding principle should be that people get to decide how their own data is collected and used. When Apple began asking users for permission before tracking their data, more than 75 percent refused, telling Apple to keep its hands off "my data." That says a great deal. We need a national privacy law, and we need it now.
One in three children ages 7 to 9 uses social media apps, so our laws protecting children online must be stronger. American kids are already exposed to all kinds of harmful content, yet we barely understand what social media platforms are doing with their data. Putting corporate profits ahead of personal privacy should be stopped in any case, but when the people affected are children, action is all the more urgent.
One reason Facebook shrugs off even intense criticism is that consumers who dislike it have no real alternative to switch to. In a 2008 email, Mark Zuckerberg himself wrote that "it is better to buy than compete." Had Facebook been unable to acquire Instagram, Instagram might have grown into a rival platform that earned consumers' trust with far better privacy practices. In the digital economy as elsewhere, fair competition is essential to keeping markets vital and preventing the concentration of economic power. That means updating our antitrust and competition laws for the digital age and giving enforcement agencies the budgets and resources they need to do their jobs.
Finally, we need transparency and oversight for the algorithms that now effectively govern our daily lives. During the pandemic, Facebook failed to stop itself from becoming a breeding ground for health misinformation. Instagram has pushed teenage girls to hate their own bodies and fostered eating disorders. When flawed social media algorithms are left unchecked, the damage inevitably spills into the real world. Congress should investigate, starting now, how far harmful content is amplified and spread without ever being filtered.
Until Congress rolls up its sleeves and acts, these companies will keep making money off the data of consumers who cannot easily organize against them. Democrats and Republicans now agree not just on recognizing and understanding the problem but on the need to do something about it. Now is the time for Congress to act.
Nick Clegg: Facebook Can't Solve These Problems Alone
November 2, 2021 | By: ingppoo | IT, Politics, Columns |
Last weekend, the Wall Street Journal published a collection of expert columns under the title "How to Fix Social Media." Here we present the piece by Nick Clegg, vice president of global affairs at Facebook, which renamed itself Meta last week.
The conversation around social media has shifted dramatically in recent years. Social media was once celebrated as a technology that gave people a stage for self-expression, let them stay in touch across time and distance, and allowed like-minded people to build communities.
Today that conversation tends to be pessimistic and dark rather than hopeful and utopian. Many people worry about social media's effects on everything from privacy and well-being to politics and competition. Their concern is understandable: we live in a divided and turbulent age. Sometimes social media looks like the cause of our social ills; sometimes it simply looks like a mirror held up to a society that was already ailing.
Social media reversed the direction in which information has traditionally flowed. Valuable information used to be monopolized and controlled by the powerful; with the arrival of social media, ordinary people can now find and spread the information they want for themselves. Redistributing power once concentrated at the top to so many more people was nothing short of revolutionary. Many now portray social media, and Facebook in particular, as a menace to society, but I believe the opposite. Giving people the tools and the power to express themselves and to produce and share information is good for society, because individuals thus empowered build and sustain diverse, dynamic communities.
Of course, with billions of Facebook users, the good and the bad of life play out together on the platform, just as they do in life itself, and its ugly sides are sometimes magnified. This poses an extraordinarily hard question, not only for Facebook but for every social media company: where exactly should the line be drawn between free expression and harmful content? Or, for that matter, between privacy and public safety?
When Facebook builds a product, it must carefully weigh that product's impact on society as a whole. We know this responsibility naturally comes with being a company of Facebook's scale. At the same time, social media is not an industry with a long tradition behind it, and it is not always clear in advance what effects a product or service will have, or on which parts of society. That is why Facebook, now renamed Meta, conducted the many internal studies reported in the Wall Street Journal's Facebook Files. Only by understanding exactly how our services affect people can we improve them and fix what is wrong. But if a company becomes a target of attack simply for having done such research, companies will stop doing it—and may end up unable to meet their responsibilities. That is not the outcome anyone wants.
On some of the most consequential questions, social media is already subject to serious checks, and I think many people would agree on this point: the many problems facing Facebook and Instagram are too important for any one company to solve on its own. That is exactly why Facebook has for years called for a new regulatory framework covering social media and the digital age.
I believe Congress should use this moment to stop hesitating and act. Reform should begin, above all, with the creation of a new regulatory agency for digital markets. Congress should pass a comprehensive federal privacy law and reform Section 230 of the Communications Decency Act. The new agency could also directly oversee whether large companies like Meta are effectively blocking and filtering illegal and harmful content, and it should set rules for how much platform data can be opened up for research. That would dramatically increase transparency across social media algorithms.
Social media is not going away. Even if Facebook had never existed, it is hard to believe the problems now associated with it would have magically disappeared. Rather than wasting energy on unrealistic regulatory proposals, now is the time to find common ground and establish sensible rules. The internet is, in the end, a remarkable tool. Whether we use it well, to everyone's benefit, is ultimately up to us. To make the most of its strengths and effectively prevent its harms, tech companies and Congress alike must do their part.