Blog entry: December 17th 2020. Geneva, Switzerland.
Shortly after Biden’s win, Alexandria Ocasio-Cortez (AOC), U.S. Representative for New York’s 14th congressional district, gave a widely shared interview to the New York Times in which she pointed out the multilayered ways in which the Democratic Party could be much more effective than it currently is.
While I agreed with her arguments, one point she made deserves to be challenged by well-meaning folks around her: “I don’t think anybody who is not on the internet in a real way in the Year of our Lord 2020 and loses an election can blame anyone else when you’re not even really on the internet.”
The problem here is that she was referring to a politician only spending $2,000 on Facebook. As she clearly knows however, what we now call ‘the internet’ has turned, particularly in the past decade or so, into a conglomerate of a handful of giant corporations whose primary business model relies on data harvesting and manipulative behavior modification. Regardless of our best intentions, investing in these tools is investing in manipulation. It is true that the manipulation works, and that political candidates can benefit from it, but it remains manipulation. Spending $200,000 on manipulation is obviously more effective than $2,000.
If politicians have to campaign on social media, the platforms must be held accountable. Relatively strong democracies all around the world are being threatened by them, and quite a few weak democracies are in even worse shape. This isn’t a determinism argument. Social media companies aren’t solely responsible for what’s going on, but they are responsible for parts of it and absolutely responsible for making it extremely difficult for honest debates to even occur in the first place. They are sucking the air out of the room, pumping it with toxins and then selling us the protective gear.
To be clear, AOC is nowhere near the only one making that point. And to be both clear and fair, she has been at the forefront of actually trying to tackle the abuses of the social media giants. I also don’t have the answer as to what to do in the short term, so this isn’t a ‘here is what AOC must do’ piece. But as many more politicians are thinking along these lines too, and less honestly than she does, this belief needs to be actively challenged. It may be that there’s no other option but to participate in behavior modification technologies, because they have become the king- and queen-makers and because Facebook’s own scientists have proudly boasted of being able to manipulate people without anyone being any the wiser. That may be true for now, and it would be dishonest to say that we don’t all sometimes make these pragmatic calculations. But we should call it what it is, lest we fall into a much deeper cynicism than what is already found on these very platforms.
There is a silver lining to this story though, and it is relatively simple. We can force the tech giants to stop harvesting our data and to stop using manipulative algorithms. We can even start with the latter and find a (politically feasible) middle ground for the former, but unless the algorithms that are in place are rendered illegal, I can’t see the social ills that are often mentioned – polarization, extremism, misinformation and disinformation – becoming any easier to tackle. There is also the fact that Americans have a particular responsibility here, because these are their companies being unleashed on the world at a time when we all need to be tackling the urgent global climate emergency (and here too AOC has been on the right side of things). Outlawing these algorithms is more effective than ordering or pressuring the companies to occasionally pay attention to the hate crimes and attacks on reality that their algorithms are facilitating. If the game is rigged, we need to change it, not master it. Their very business model is the threat, and that’s what must be outlawed.
Activism on social media
Beyond electoral calculations though, there is another dimension to this problem that I think is even more urgent, namely the fact that so many activists, journalists and academics are themselves addicted to social media. This is often underappreciated, and it is something I used to do as well. We may recognize the dangers posed to society, but our misplaced belief in our supposed immunity to these manipulation algorithms leads us to excuse ourselves from our responsibility. When we do agree that we are being manipulated, cognitive dissonance enters the equation: “not this tweet, not this post, surely. I’m publishing these out of my own free will. I’m replying to this troll or this bot – literally replying to a fake account – because I want to.” The alternative – that we simply cannot know when it is free will and when it is the result of manipulation – is rarely acknowledged.
I come from a position of authority on this, unfortunately. I was extremely active on Facebook before I deleted my account a couple of years ago, and I was extremely active on Twitter until just a couple of months ago. My ‘impressions’ are in the millions, and I am as vulnerable to dopamine hits as anyone else. I now know better, thanks to a fair amount of research that revealed uncomfortable facts about how Facebook and Twitter changed my own behavior (I had Instagram and TikTok downloaded on my phone too, but I never used them as much, and they’re now deleted). This has led me to wonder whether specific moments on these sites were spent willingly, truly willingly, and the answer is simply no. The problem though is that as our attention span is also negatively affected by addiction, it is very difficult for me to even recount which specific tweets or posts I’m referring to. I am just certain that this happened, and that it happened multiple times.
I am also certain that the vast majority of people I know are social media addicts. In healthier times, this would have been a controversial statement. But now, the algorithms have gotten so advanced that we can both acknowledge our own manipulation and still do nothing, or very little, about it. This cynicism reminds me of Peter Pomerantsev’s book ‘Nothing Is True and Everything Is Possible’ about Putinist Russia (I had him on the podcast). The difference here is that there are very few humans involved, and that the humans creating the algorithms cannot know what the algorithms will do. They can only design them.
Once the algorithms are unleashed, the companies can only try their best to ‘improve’ them – and employees at these companies do try their best – but that’s like trying to improve gambling machines that are literally there to get you hooked. The best thing you can do short of abolishing them is to limit them, which is what many gambling institutions are legally required to do because we recognize that it ain’t exactly good for us. For these same reasons, Facebook and Twitter had to make their products worse on US Election Day because they can’t control the manipulation. They had to make political ads more difficult to buy, they had to hire people to manually remove fake news and disinformation, and they had to rely on reputable sources to separate what is true from what is false. In other words, Zuckerberg had to do exactly what he said he doesn’t want to do: intervene.
They minimized the damage, to a certain extent, when it comes to the US elections, but what about the rest of the world? What about non-election problems? Are they always going to go into emergency mode to minimize the damage they’re causing? No, they have neither the will nor the capacity to do so unless explicitly forced by the law. They also rarely have the incentives to even care (recent developments in the EU may change that). We were extremely lucky this time around, and still had to deal with the madness of QAnon, Anti-Vaxxers, Holocaust deniers and so on during a global pandemic and continuously worsening global warming. And that’s only in English. I didn’t even bring up the rest of the world’s languages commonly used on these platforms. These platforms need to be held legally accountable for what they have done, and this is why the US has an obligation to minimize the damage they are causing, fix what can be fixed, and abolish what can’t.
In the meantime, we need to ask ourselves what we – as activists, artists, academics etc – are contributing to by pretending that our activities on these platforms are neutral, that it all merely depends on whether we are personally nice or not nice on them, when the problem of data monopolies goes so far beyond what’s usually thought of as politics.
I strongly believe that my own activism and my own sense of self were impacted by my presence on social media. I am convinced, and we now have the evidence to back that up, that it has contributed to poorer relations between us, even as it has increased the number of people who count as ‘us’. I know more people now than I did before Facebook and Twitter. There’s no point denying it, but the costs have gotten too high, and we owe it to ourselves to do something about it. The fastest thing to do is to delete these accounts. If you feel you cannot do so immediately, then do so slowly. Actively seek alternatives, find other ways of getting the news (email or print subscriptions, actually paying for journalism, and so on), find other ways of sharing your thoughts (blogging, like what I’m doing right now), develop interests outside of what you get from social media.
Remove their dominance by knowing that you no longer need them. This to me is even more important than just deleting, although deleting is undoubtedly the surest way of actually achieving that for most people. You will likely experience withdrawal. I am currently experiencing withdrawal, but it has gotten easier. I went through a phase where I forced myself to only read the news via websites, print papers, email subscriptions and selected podcasts. It took some time and I did, for a period, actually know less of what was going on in the world. But that was only temporary, and I can now safely say that I am as well-informed as I used to be when I was hyper-active on Twitter, without the negative side-effects. I can think about the news and feel all the usual negative emotions (fear, exhaustion, anxiety etc) when the news is bad without also having to worry about competing for other people’s attention, about gaming the algorithm, about idiotic trolls and bots or about artificially-inflated pseudo-‘debates’ that do nothing to actually deal with core problems.
My own process
So take your time if you need to, but start the process. Here’s what I went through, and apologies in advance if this isn’t coherently catalogued as I’m writing from memory.
(Note: some of this is mentioned in an upcoming episode of the Fire These Times that I recorded with Katy Cook and which will be released on the 10th of January on Patreon and 11th of January on podcast platforms)
It started around 2015-2016 when the pro-Assad/Putin disinformation campaigns on Syria jumped from Facebook, where I had initially witnessed them, to Twitter, where I was becoming more active. This is what started planting doubts in my mind. Reasonable people, highly-educated people, were falling for what was evidently propaganda. This wasn’t clear to everyone at the time, but it was to those of us paying attention to Syria. This then got more attention when people in both the UK and the US ‘chose’ to screw themselves over (and Brazil, Hungary, Italy, India etc).
These seeds of doubt led me to question my relationship to my phone and, separately, my relationship to social media. I went through a lot – a lot – of trial and error, everything from putting my phone somewhere else, to removing the apps and only using browsers, to adding two-step verification, to adding time limits, to leaving my phone at home when I go on walks. Really, you name it. I did that. I did the same separately with my laptop.
I eventually started listening to podcasts and reading articles about technology and social media. Around 2017-2018 this became fairly common, and the problems got more and more attention. More people started taking this seriously, although this was mixed (and still is) with exaggerations and misunderstandings. (To name but two podcasts that were helpful: Their Own Devices and IRL Podcast.) Around the same time (2016-2019) I was very active in the digital privacy world as part of Global Voices and IFEX (I was an editor on both sites) and the networks around them. So I went to a dozen or so conferences in half a dozen countries where a whole lot of other nerds were nerding out about everything technology-related, from specific algorithms to human rights issues to how to deal with disinformation, and so on.
This then developed into a more sophisticated understanding of just how the algorithms work, why the business model needs to be changed and how we were being affected: us, myself, everyone. The incremental changes in behavior aren’t easy to wrap your head around, but you can get the gist of it by reading folks like Jaron Lanier (the book title is fairly straightforward: “Ten Arguments For Deleting Your Social Media Accounts Right Now”). There are dozens of others saying the same things, but I’ve found him especially useful in explaining just how the algorithms work. You can also watch The Social Dilemma on Netflix – yes fellow nerds, I know it has flaws, but it’s a good intro for the general public.
I won’t lie to you and say this is easy. It really isn’t, but the more people in your own circles do it, the easier it gets. The biggest difficulty wasn’t even quitting Facebook and Instagram and TikTok or limiting Twitter, but the fact that I am trying to deal with an addiction while being surrounded by other addicts. It’s like a law of gravity: it’s just inescapable unless there are enough people also doing it. And it’s also not fair: the world around us can be so empty and cruel that even occasional cheering up on social media can be needed. This is certainly true for friends in Lebanon at the moment.
So as I didn’t really have many people around me actively doing this, I considered just becoming that person. I wasn’t even trying to change other people; I was merely trying to prove to myself that I could do it. I wanted to remind myself that a productive and healthy life is possible without relying on manipulative behavior modification. I can’t stress just how important that was.
None of what is written above invalidates the good things being done on social media. The problem is that data is data, and manipulative algorithms can’t tell the good from the bad. That’s not what they’re for. So what I recommend is to start by looking at your own use critically. Do the research you need (I will post a very long list of articles and books to read soon) and set the pace you’re most comfortable with.
If any of this is relatable, I would love to hear from you. My email is available here.