Platform or Publisher?
With all the information coming your way all day long, it would be understandable if the name Frances Haugen meant nothing to you. In October, she experienced full media immersion for about a week, but what looked like saturation coverage to some was completely missed by most. There is just too much news competing for our attention for all of us to be taking part in the same conversation about public affairs.
Haugen is the former Facebook employee who leaked internal documents from the company. The documents show how Facebook manipulates what we see on its platforms in a way that is profitable for Facebook, but dangerous for the country. After her story first appeared in the Wall Street Journal, she sat down for an interview with the old-school news program 60 Minutes on CBS, and then testified before a Senate committee on Capitol Hill.
In the world before social media, being the subject of news coverage from a major national newspaper and a major American television network would have made anyone famous. In a world dominated by social media, she managed to achieve trending status for a day or two. Hopefully, the right people were listening and hopefully her decision to put her career at risk for the good of her fellow citizens will lead to change.
Haugen has provided the world with research done by the company showing that Facebook understands how the spread of misinformation and hateful information on its platforms affects society as a whole. Its algorithms are meant to drive engagement, and incendiary information does that better than any other kind. Even though Facebook understands this, it chooses to do very little about it, because making the necessary changes would drive profits down.
The company’s reaction was what you might expect. There was an immediate attempt to discredit Haugen: to claim her job at Facebook had nothing to do with the issues she was raising and to accuse her of drawing conclusions about subjects she did not understand. Facebook made claims about how aggressive its efforts to control hate speech online have been, but as Haugen pointed out, even using Facebook’s own numbers, the company’s attempt to manage the misuse of its platforms is marginal.
Haugen was prepared to be targeted. She knew what was coming and spent months preparing to make her claims public. She was unmoved by the personal attacks on her competence, and she did not use her moment in the spotlight to air grievances or disparage the company’s leadership. She went out of her way to say she does not believe Facebook ever set out to be a malevolent force, but its business model has pushed it in that direction. The system is out of control, and those in charge cannot tame it on their own, given the number of users and the imperative to increase profits. She knew she had the truth on her side, and especially during her testimony before the Senate, it was evident that knowledge gave her strength: the strength to stand up to one of the most powerful companies in the world today.
While Haugen’s testimony focused on Facebook and Instagram, it is important to point out that the question of how to control speech online is a problem facing any company providing such a service to its users. Facebook happens to be the biggest, and Haugen happened to work there, so Facebook has become the focal point when considering the issues and the target for lawmakers who are searching for easy, blunt-force solutions to a complicated problem.
At the core of the debate about the role of social media in public discourse is the question of whether social media companies are merely platforms users can access to publish their content, or whether those companies are publishers themselves. Should social media companies review and edit content the way a traditional newspaper would, or are social platforms the modern equivalent of a town square, where anyone can stand on a park bench and say anything they want?
It would be convenient for social media companies if everyone simply adopted the platform-only view of the world, but as it turns out, that view is extremely inconvenient for the rest of us and perhaps even destabilizing. This is a problem that cries out for intervention in the form of regulation if the social media world does not take steps to correct the imbalance on its own.
The “we are just a platform” position of social media leaders is disingenuous for several reasons.
Returning to the town square analogy, the difference between a town square and a social media platform is that the town square is owned by the people of the town. It is open to everyone and it is policed by the local government. If someone abuses their right to free speech, the government, in the form of the local police force, can step in to control the situation until the matter can be decided in the courts. Even in an open town square, in a country with freedom of speech, there are limits.
Unlike a town square, social media platforms are owned by a private concern. They are not owned by the public, they are not open to everyone, and there are user agreements that theoretically define how the platforms can and cannot be used. When defending themselves against charges that they are not doing enough to control the misuse of their platforms, social media companies often point first to their user agreements, as if the collection of clicks on an OK button absolves them of all responsibility. It does not. If you are going to set up rules for how your virtual town square is going to be used, then you have a responsibility to police your town square.
Social media companies claim they cannot be responsible for editing content posted by users, arguing that it would amount to censorship and that, given the size of their user bases, it is physically and financially impossible. Their own behavior and their own claims of responsible management show this is not true.
As Frances Haugen pointed out in her testimony, backed up by internal research from Facebook and previous testimony from Facebook executives, there are computer programs created by Facebook (and other platforms) that are designed to control how content is used, who sees it, and who does not. This is editing. When social media platforms create algorithms to manage content, they are carrying out an editing function that is similar to deciding which stories make the front page of a newspaper.
Haugen claims the editing function is currently biased toward content that generates user engagement and therefore revenue, but it could be designed to manage content based on quality, reliability, truthfulness, and good citizenship.
Even if you reject the idea that social media platforms are publishers and that they therefore have a responsibility to manage the content that appears on their channels, there remains the question of good corporate citizenship.
With or without the internal research made public by Haugen, anyone who has a social media account can experience for themselves how the content they see is manipulated. If you interact with one item in your feed, you are quickly handed more of the same kind of content. When it comes to political speech, this tends to keep people in their own information bubbles and adds to division within our society. We know this as users and the social media companies know it, because their business model is based on it.
Facebook knows, and other social media companies know, that their products are used in ways that motivate people for both good and ill. From the perspective of good corporate citizenship, and the long-term health of the companies themselves, it is the responsibility of leadership to address these issues before they foster a public crisis, as they did on January 6, 2021 at the U.S. Capitol.
Social media has been with us for less than 20 years. Until now, it has evolved almost unrestrained under the idea that it is a new way of communicating that cannot be subject to outside controls, or that loses its value if not allowed to develop free from oversight.
That simply can’t be true. Even the freest of free speech has never gone completely uncontrolled. Human beings have always sought to find a place of common ground where people can share ideas and disagree respectfully, in a manner that moves society forward. We have never sought to create a place in the public square where people with bad intent are allowed to burn it all down without suffering consequences. Social media companies cannot be allowed to dishonestly use free speech protections and the false threat of censorship as cover to hoard profits.