Facebook doesn’t care about you: tech giant chooses profits over people every time

The revelations of whistleblower Frances Haugen have shown that the all-enveloping social media giant has every problem a monopoly presents but on a terrifying new level — with our privacy and safety at risk like never before, writes CHAUNCEY K ROBINSON

WHEN it debuted on the internet in 2004, Facebook seemed like a revolutionary innovation. A way for millions of people to engage with one another on a central virtual platform. It was free communication at your fingertips. But as more bombshells pertaining to the mega-giant tech company have hit the news in recent weeks, we now see that nothing was free or revolutionary.

It’s clearer now that we, the users of Facebook, are actually the company’s product, sold to the highest advertising bidder and placed at the mercy of an algorithm that incites human hate in the name of increasing “engagement.” Judging by recently leaked evidence, Facebook doesn’t care about you, your grandmother, or your cat videos. And on top of that, its growing influence and monopoly power pose a real danger to democracy and our society as a whole.

Earlier this month, data scientist Frances Haugen came forward as the Facebook whistleblower. Having previously worked for the company, Haugen shared a trove of internal documents and has given several interviews over the past few weeks showing that whenever there was a conflict between the interests of the company and the public good, the social media giant would choose its own interests.

In an interview on 60 Minutes, Haugen stated, “Facebook, over and over again, has shown it chooses profit over safety.” She said, “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”

Some of her damning claims include the company deciding to dissolve a civic integrity unit she was part of and its failure to put up safeguards against misinformation after President Joe Biden defeated former president Donald Trump in the 2020 elections. Haugen believes this, along with the unregulated festering of dangerous groups on Facebook, helped to bring about the January 6 insurrection at the US Capitol.

In a Senate hearing on children and social media, Haugen testified: “The company intentionally hides vital information from the public, from the US government and from governments around the world. The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages.”

The data scientist’s words are explosive, but if we pay attention to the timeline of events when it comes to Facebook, then we know that the record of such behaviour has been accumulating for some time, simultaneous with the company’s rise to a nearly trillion-dollar market value.

 

Follow the money

 

You may not need to pull out your credit or debit card whenever you log on, but Facebook definitely comes with a price. The most obvious and immediate ones are your attention and privacy, but the way the company makes the lion’s share of its profit is through advertising revenue. Of course, this isn’t a new business model, as television and newspapers have been doing it for a long time. But Facebook (along with other tech companies like Google) has a means, through its algorithms, to control when and how users encounter and engage with advertisements.

The Facebook algorithm is a set of calculations the company uses to decide what you see on its platform. The timeline of posts from friends and groups on your Facebook feed is neither chronological nor complete because the company, based on its observations and records of your behaviour, puts posts in front of you that it believes you will be most likely to engage with. The longer you are on the platform, the more time Facebook has to show you advertisements.

Engagement is put above all else, even if what is engaged with incites violence or hate. Studies have shown – as has Facebook’s own research, if we are to go by Haugen’s claims – that the posts that get the most engagement are the ones that elicit a negative emotional response. The algorithm is not set up to distinguish between so-called good and bad engagement. Therefore, if a user is engaging with content steeped in racism, sexism, or other forms of bigotry, Facebook’s algorithm isn’t set up to dissuade the user from engaging with it. Quite the opposite: if it keeps the user on the site, Facebook’s algorithm will continue to peddle this kind of content to them.

This has resulted in a virtual world where many users live in echo chambers of political viewpoints and (mis)information that can harm their own wellbeing and that of others. Facebook claimed in a statement to 60 Minutes that the divisiveness in the United States existed long before Facebook did and that the company bears no blame for the growing divide. However, its claim to be blameless feels hollow.

To drive the point home, think of a situation involving a caregiver and a baby. The caregiver is found to be feeding the baby poison. When questioned, they respond that they continued to feed this to the baby because initially, the infant seemed to respond favourably to the taste. The caregiver claims the final result isn’t their fault, because it’s what the baby continued to go for. The caregiver is aware of the dangers of the toxin, but they seemingly ignore this as the baby keeps coming back for more. The caregiver is only concerned with the fact that the baby is eating their food, rather than what the baby is consuming.

That bleak analogy isn’t far off from Facebook’s dealings. Many users are naive as to the workings of the social platform they rely so heavily upon. The metaphorical poison with which millions of users have been radicalised, on top of the potential dangers to our privacy and security, has created a perfect storm fuelled by unchecked capitalism. This affects not only the individuals who use the platform but also the institutions we use to govern our society, such as elections.

 

Facebook’s transgressions

 

There are some key areas that display just how problematic Facebook’s influence is.

Privacy and Security

This was illustrated by the now infamous Facebook–Cambridge Analytica data scandal. Whistleblower Christopher Wylie exposed how millions of Facebook users had their personal data collected without their consent by the British consulting firm Cambridge Analytica.

This company, which harvested private information from more than 50 million Facebook user profiles, would go on to profile voters for Donald Trump’s 2016 presidential campaign based on their Facebook activity.

Although Facebook would go on to suspend Cambridge Analytica from the platform, the company contended that what Analytica did wasn’t a data breach, since technically users consented — via their privacy settings — to allow the harvesting of their data. Of course, many users were never made aware by Facebook that there was a privacy setting they needed to change in order to prevent such an intrusion.

The $5 billion fine Facebook was ordered to pay by the Federal Trade Commission was pennies in comparison to its net worth. The data the company holds, and seemingly refuses to take full responsibility for, could very well determine the outcome of national elections in the hands of bad actors.

Misinformation

From baseless election fraud claims to outright lies regarding the Covid-19 pandemic, Facebook has featured prominently in the spread of socially detrimental rhetoric. According to a study by researchers at New York University and the Université Grenoble Alpes in France, from August 2020 to January 2021, news publishers known for putting out misinformation got six times the number of shares, likes and engagement on the platform in comparison to news sources considered more trustworthy.

As reported in the Washington Post, Facebook officials disputed the study, claiming it didn’t give the full picture. Yet Facebook, which is in a position to give the full picture, refuses to do so. The company has increasingly limited the amount of data others can access regarding what happens on the platform. This was demonstrated by the frustration shown by the Biden administration when it tried to get information from Facebook regarding anti-vaccine propaganda on the website.

The world is still in the midst of a global pandemic. A recent survey found that people who rely exclusively on Facebook for news and information about the coronavirus are less likely than the average American to have been vaccinated. The platform plays a critical role in public safety, yet it chooses not to take the responsibility that comes with such outsized influence.

Eating up the competition

The major Facebook outage that occurred earlier in October might have felt overly dramatic, as some users complained about not being able to get their cat video fix, but the problem was more severe than that. It was the day the digital world stood still, as Facebook went down along with Instagram and WhatsApp, both of which it owns.

Millions of people in multiple countries were unable to communicate with loved ones or even run their businesses, since WhatsApp and Facebook Messenger serve as the dominant forms of communication in many developing countries. It’s not a matter of simply leaving the platform, as the popular #DeleteFacebook hashtag would have us believe. The company has strategically undermined or acquired potential competitors, leaving many users nowhere else to go.

The decimation of independent journalism

A less obvious casualty of the domination of Facebook’s business model is the local and independent press. Newspapers and other publications at one time relied heavily on advertising revenue. With Facebook and Google now commanding nearly 60 per cent of all digital advertising revenue, there is less money left to keep local newspapers, alternative publications and community news services afloat.

According to a 2020 Pew Research study, Facebook is a regular source of news for about a third of Americans. It is a platform rife with misinformation, governed by an algorithm that controls what every user sees. An engaged and independent free press is vital to the health of a society. Facebook’s practices and unchecked dominance put that in peril.

 

What is to be done?

 

Corporate monopolies have been broken up before, as was done with the railroads, oil and steel. It’s time to break up Big Tech. US senator Elizabeth Warren, who has been at the forefront of this effort, once pointed out that Facebook has “vast power over our economy and our democracy.” That may sound like hyperbole for a social media platform, but as the information above makes clear, it is not.

Google, Amazon and Facebook are allowed to dominate their markets in ways that seep into every aspect of life as we know it. This includes the information we are presented with, the products we choose to buy and the narratives that shape the perspectives of millions across the planet.

As long as companies like Facebook are treated like new ventures that can’t be properly defined, they will continue to evade regulation. What Facebook is doing is not new when it comes to its push for profit. It is not some kind of anomaly in history that has never been seen before. It is not some benevolent new media tool. It is a corporate entity whose mounting power has put it at odds with the public good.

This is dangerous. A call for regulation, such as the petition from the organisation Color of Change, is a step in the right direction.

Lastly, it is also important to be critical of the capitalist system that allows information to be bought and sold for profit. The perverse notion that we — in the form of our attention and information — should accept becoming products to be sold by companies in exchange for efficient forms of communication and human connection must be rebuked.

This article first appeared in Peoples World — www.peoplesworld.org.
