It’s not simply Facebook’s fault
In this week’s extended blog, John Henry, an ordinand at Ripon College Cuddesdon, argues that we must take some of the responsibility for how our data is used.
My Facebook ID is 36,811,991. This means that if you represent Facebook’s 2.4bn active users as 1,000 people standing in a field, I would be roughly the 15th person to have signed up. This tells you a bit about my age, and a bit about my geekiness—I am normally quite quick to try new technologies.
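For the curious, here is the back-of-the-envelope arithmetic behind that claim, under the rough assumption that Facebook’s numeric IDs track signup order:

```python
# Rough check: scale 2.4bn active users down to 1,000 people in a field.
# Assumes numeric Facebook IDs roughly track signup order (only approximately true).
my_id = 36_811_991
active_users = 2_400_000_000
field_size = 1_000

print(round(my_id / active_users * field_size))  # prints 15
```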
When I signed up to Facebook, the general reaction of others was either a look of blank incredulity or disdainful eye-rolling. Many people, experts included, foretold Facebook’s eventual demise; surely just a silly fad.
Yet Facebook is now worth about 500 billion dollars—comparable to the GDP of Belgium or Sweden—and is used regularly by well over half of the world’s internet users. It also commands over a fifth of global digital ad spending. Facebook has ignited and driven a technological and social revolution.
Facebook’s growth and success have continued to be guided by pragmatic decision-making to maximise its user base and its advertising revenue. In economic and business terms, the power of Facebook comes from the ‘network effect’—the phenomenon whereby each additional participant increases the value of a good or service for everyone else. But you cannot just tap into the network effect for free—you also need excellent design and usability to attract and retain users in the first place. And, let’s face it, Facebook is useful. Almost everybody uses it, almost every day, for almost everything. And that’s the problem.
I am currently training at a theological college, surrounded by people who believe that everything we do has a moral significance. There is lots of talk about our response to the unfolding global ecological catastrophe; in our ethics classes we debate the morality of abortion and questions of sexual ethics; in the college bar we argue about the care and protection of the poor and the vulnerable.
At the same time, we regularly hear of the problems involving the usage of social media. Last year, the major political parties spent millions of pounds on unregulated Facebook advertising, tailoring each UK voter’s media consumption to prompt them to behave in the ‘right’ way in the General Election. Cambridge Analytica is gone, but it would be naive in the extreme to believe that a dozen other companies are not offering exactly the same mass-population manipulation services.
And yet, at theological college we are very much addicted to Facebook. The college community uses Facebook to create and share events; to update students on important issues; to chat with each other; even to manage band practice. It is this exclusive and compulsive usage, which keeps us continually inside a single networking and communication technology, that drives the continued power of the Facebook platform. It is what draws the millions of pounds of political advertising from the major political parties.
Equally importantly, by residing within the Facebook ecosystem and relying on it for the large majority of our news consumption and communication, we are wilfully allowing ourselves to be trapped inside digital ‘echo-chambers’. We only encounter beliefs or opinions that coincide with our own, so our existing views are reinforced and we rarely consider alternative ideas. People had their favourite newspaper long before the internet was even dreamt of. But the problem now is that echo-chambers can be far more personalised, far more invisible, and thus far more powerful.
Many have turned on Facebook accusingly. They demand more regulation. And they argue that Facebook should take more responsibility for the content that is posted to people’s news feeds. Sacha Baron Cohen, in a speech recently delivered to the Anti-Defamation League, described a handful of internet companies (including Facebook) as “amounting to the greatest propaganda machine in history” by giving precedence to “stories that appeal to our baser instincts”. Baron Cohen claimed that Mark Zuckerberg “decides what information” the world sees.
But this is only partially correct. The algorithms which provide us with the basic functionality of social media platforms (e.g. filtering the huge volume of potential information down to what is manageable for each of us to read), are the same algorithms which external organisations exploit. The algorithms that keep the service free also allow us to be targeted. Facebook may be a propaganda machine, but it is not simply Facebook that controls the machine; we are responsible too through our behaviours and our choices.
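To make that concrete, here is a deliberately simplified sketch of a feed-ranking function (the field names, weights and figures are my own illustrative assumptions, not Facebook’s actual system), showing how the same score that filters a feed down to a readable size also gives a paying advertiser a lever into it:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # how often the reader interacts with this author (0-1)
    predicted_clicks: float  # a model's guess at how engaging the post is (0-1)
    is_paid: bool            # an advertiser has paid to boost this post
    text: str

def feed_score(post: Post) -> float:
    """Toy relevance score: the formula that makes the feed readable
    is the same formula a paid post rides into the feed."""
    score = 0.6 * post.author_affinity + 0.4 * post.predicted_clicks
    if post.is_paid:
        score += 0.5  # paid placement travels through the same pipeline
    return score

posts = [
    Post(0.9, 0.2, False, "A friend's holiday photos"),
    Post(0.1, 0.8, True,  "A micro-targeted political advert"),
]
for p in sorted(posts, key=feed_score, reverse=True):
    print(round(feed_score(p), 2), p.text)
```

In this toy example the paid, micro-targeted advert (0.88) outranks the friend’s photos (0.62) precisely because it passes through the same scoring pipeline that makes the feed usable in the first place.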
Baron Cohen’s core recommendation for social media tech companies is to hire more monitors: people who are deemed to have both the knowledge and the wisdom to decide whether or not content is acceptable. Putting aside the fact that such decisions can only be unproblematic for the most extreme forms of online behaviour, Baron Cohen is, I suggest, focussing too much on the wrong people. You and I are also to blame.
The basic functionality of social media—the aspect of the technology that has revolutionised the way we communicate—is the same aspect that is dangerous. Social media allows us to share content with a theoretically infinite audience at virtually no cost. There are almost 300,000 Facebook status updates every minute: over 400 million a day. There is no way for Facebook, or any other social media platform, with its necessarily finite number of employees, to police the behaviour of the billions of people who take advantage of this free service. Furthermore, even if the potential to police this behaviour did exist, the dangers of implementing and controlling such a policy would be vast.
The truth is that you and I are at least partly responsible for the sad situation we are in, because we are not managing our usage of this technology responsibly. We are unwilling to take the time and energy to actively select and control our social networks and communication tools. We are all too happy to enjoy the effortless experience of a single social media and communications environment. And we are addicted. With every post, comment, scroll, share and like we get another little dopamine hit which draws us back for more. And those with selfish or malevolent intentions are now taking advantage of our addiction and our laziness.
There is a fair debate about just how responsible each individual can be expected to be when it comes to technology that is evolving so quickly. Many people have limited time or ability to actively manage their social toolkits. What we are talking about is not easy. So there is an important role for government and regulation. There are, no doubt, policy changes that would greatly decrease the propaganda power of social media tools like Facebook. For a start, political parties’ social media spending and advertising data should be fully transparent and publicly reported.
But as much as we might wish political advertising to be banned completely, no AI yet exists that is sophisticated enough to draw the line between a ‘social’ message and a ‘political’ message. More broadly, it is simply not possible to effectively police social-media disinformation. Instead of “purging lies and conspiracies”, as Baron Cohen argues tech firms should do, we must accept increased inconvenience in our social media experiences. We must consciously choose to participate in social networks that are responsible and trustworthy in their policy and design decisions—even when this means the experience is not quite as easy as the mainstream alternative.
The good news is that the innovation Facebook led is now a commodity. Previously, the barrier to change was both ‘usability’ (i.e. how well the tool works) and the ‘network effect’ (i.e. being where everyone else is). Today, only the latter is still significant. And although the power of the crowd is still a major challenge, history has shown that this particular barrier can crumble: MySpace, Bebo and Friends Reunited were each, in their day, where everyone was.
We must take control of, and diversify, the tools we use to consume and communicate. Ideally, we should not be using the same tools for both activities (e.g. Facebook and Facebook Messenger). We must be open to using new tools, particularly open-source or subscription services. New tools will come, and most will fail, but we must show our support for them by trying them.
MeWe.com is a recent, hopeful contender for an ad-free alternative to Facebook. But do not expect to log on and find all your friends there immediately. We have to be willing to be the first at the party and to hang around for a while in the hope that others will come. This is the key action each of us must take to play our part in curbing the power and influence of the social media giants. Regulation alone will not work.