Let it be known: On the last day of the year of 2024, I received a class action settlement from Facebook worth $36.29. That’s right. I sued Facebook and won.
I’m probably taking too much credit. As a Saskatchewan resident and a Facebook member in good standing, I was invited to fill out a form to join a class action lawsuit. Apparently, Facebook was stealing some of our personal photos without consent to use in advertising campaigns. I know, can you believe it?!
I understand they harvest our data and know almost everything about us, including the most intimate parts of our lives, but... The nerve! To think they could take something that is ours (saved on their platform, mind you, so I’m not sure what is actually “ours”) and secretly profit off it without our consent. To think that our data, the data we collectively worked so hard to post to gain a dedicated tranche of followers, were theirs for the taking. The audacity!
I'd like to say I’m done with Facebook, but I’ve never really been into it. I’m only on it so people can reach me and send me funny memes. I’ve deleted most of my personal photos and rarely post anything. Sure, I check in on people every now and then to see what life has thrown at them, but I don’t make a habit of Facebook stalking.
I joined the class action lawsuit because I wanted at least some compensation. I wanted to say I took a tiny bit back from a $1.5 trillion company that reaps all its profits off our personal, posted lives.
We give and give, but for what? Status? “Friends”? A few more likes??
If you use Facebook regularly, don’t take this personally – I completely understand the role it plays in your personal happiness. But let’s recognize the immense power Facebook gains over us simply by monitoring our preferences. For every “like”, it gets to know you a little better. After 300 likes, from what I’ve read, Facebook knows your preferences as well as your spouse does. The algorithms have an uncanny ability to understand most things about us simply by monitoring our thumbs-ups.
We should be at least a little concerned. Social media algorithms are designed to keep us on their platforms, and unfortunately, as humans, we stay on for longer when we’re fed controversial, anger-inducing information.
In his book, Nexus: A Brief History of Information Networks from the Stone Age to AI, Yuval Noah Harari identifies the risks today's media platforms pose to humanity. It's more than the direct control some people, like Elon Musk, have over these platforms. Algorithms that promote certain material can have consequences just as harmful.
Facebook’s algorithms have impacted entire populations, most notably fueling the genocide against the Rohingya people in Myanmar in 2017. According to Amnesty International, Facebook’s parent company, Meta, knew that Facebook’s algorithmic systems were supercharging the spread of harmful anti-Rohingya content, but failed to act.
While I don’t believe in censorship (like the book-burning kind), the alternative can be downright scary. Can we at least make laws against algorithms that cause genocide?
Apparently, Facebook is moving in the opposite direction. CEO Mark Zuckerberg just announced the end of Facebook’s fact-checking program. Bowing to pressure from Donald Trump, Zuckerberg has succumbed to political bullying – and perhaps his true desires – by removing much of the oversight that prevents the spread of falsehoods and hate speech. I’m sure he understands this will also reduce costs and boost profits. Misinformation on the platform, according to technology author and professor Scott Galloway, will likely skyrocket.
Prior to the internet, newspaper and television news organizations were the primary gatekeepers of information. While they allowed for debate, they wouldn’t report something that couldn’t be verified, nor would they promote a theory that had no basis in reality – because doing so would erode trust. What we have today are social media platforms that are untethered from truth.
Identifying disinformation is not about stifling free speech; it’s about protecting people from dangerous ideas and manipulation. It’s about protecting populations from being bombarded with so much misinformation that the truth is no longer discernible.
Unless we impact tech companies’ bottom line – and I admit, a payout of $36.29 won’t do much – I’m afraid we’ll get more of the same.