Shaping debate on religion in public life.

Author Archives: John Henry

It’s not simply Facebook’s fault

In this week’s extended blog John Henry, an ordinand at Ripon College Cuddesdon, argues that we must take some of the responsibility for how our data is used.

My Facebook ID is 36,811,991. This means that if you represent Facebook’s 2.4bn active users as 1,000 people in a field, I would be the 15th person who signed up. This tells you a bit about my age, and a bit about my geekiness—I am normally quite quick to try new technologies.
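For readers who want to check the arithmetic behind that image, the calculation is simply my ID divided by the total user count and scaled to a crowd of 1,000 (it assumes, as the image implies, that Facebook issued IDs roughly in sign-up order): 36,811,991 ÷ 2,400,000,000 × 1,000 ≈ 15.3, or roughly the 15th person in the field.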

When I signed up to Facebook, the general reaction of others was either a look of blank incredulity or disdainful eye-rolling. Many people, experts included, foretold Facebook’s eventual demise; surely just a silly fad.

Yet Facebook is now worth about 500 billion dollars, comparable to the GDP of Belgium or Sweden, and is used regularly by about two thirds of the world’s internet users. It also commands over a fifth of global digital advertising spending. Facebook has ignited and driven a technological and social revolution.

Facebook’s growth and success have continued to be guided by pragmatic decision-making to maximise its user base and its advertising revenue. In economic and business terms, the power of Facebook comes from the ‘network effect’: the phenomenon whereby an increased number of participants improves the value of a good or a service. But you cannot just tap into the network effect for free; you also need excellent design and usability to attract and retain users in the first place. And, let’s face it, Facebook is useful. Almost everybody uses it, almost every day, for almost everything. And that’s the problem.
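A standard way to see why the network effect is so powerful (a textbook illustration, not one used in the original post) is to count connections: a network with n users contains n(n−1)/2 possible one-to-one links, so growing from 10 users to 100 takes the number of possible links from 45 to 4,950, roughly a hundredfold gain for a tenfold increase in users.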

I am currently training at a theological college, surrounded by people who believe that everything we do has a moral significance. There is lots of talk about our response to the unfolding global ecological catastrophe; in our ethics class we debate the morality of abortion and questions of sexual ethics; in the college bar we argue about the care and protection of the poor and the vulnerable.

At the same time, we regularly hear of the problems involving the use of social media. Last year, the major political parties spent millions of pounds on largely unregulated Facebook advertising, tailoring each UK voter’s media consumption to prompt them to vote the ‘right’ way in the General Election. Cambridge Analytica is gone, but it would be naive in the extreme to believe that a dozen other companies are not offering exactly the same mass-population manipulation services.

And yet, at theological college we are very much addicted to Facebook. The college community uses Facebook to create and share events; to update students on important issues; to chat with each other; even to manage band practice. It is this exclusive and compulsive usage, which keeps us continually inside a single networking and communication technology, that drives the continued power of the Facebook platform. It is what draws the millions of pounds of political advertising from the major political parties.

Equally importantly, by allowing ourselves to reside within the Facebook ecosystem and relying on it for the large majority of our news consumption and communication, we are wilfully allowing ourselves to be trapped inside digital ‘echo-chambers’. We only encounter beliefs or opinions that coincide with our own, so our existing views are reinforced, and we rarely consider alternative ideas. People had their favourite newspaper long before the internet was even dreamt of. But the problem now is that echo-chambers can be far more personalised, far more invisible, and thus far more powerful.

Many have turned on Facebook accusingly. They demand more regulation. And they argue that Facebook should take more responsibility for the content that is posted to people’s news feeds. Sacha Baron Cohen, in a speech recently delivered at the Anti-Defamation League, accused a handful of internet companies (including Facebook) of amounting to “the greatest propaganda machine in history” by giving precedence to “stories that appeal to our baser instincts”. Baron Cohen claimed that Mark Zuckerberg “decides what information” the world sees.

But this is only partially correct. The algorithms which provide us with the basic functionality of social media platforms (e.g. filtering the huge volume of potential information down to what is manageable for each of us to read) are the same algorithms which external organisations exploit. The algorithms that keep the service free also allow us to be targeted. Facebook may be a propaganda machine, but it is not simply Facebook that controls the machine; we are responsible too, through our behaviours and our choices.

Baron Cohen’s core recommendation for social media tech companies is to hire more monitors: people who are deemed to have both the knowledge and the wisdom to decide whether or not content is acceptable. Putting aside the fact that such decisions can only be unproblematic for the most extreme forms of online behaviour, Baron Cohen is, I suggest, focussing too much on the wrong people. You and I are also to blame.

The basic functionality of social media, the aspect of the technology that has revolutionised the way we communicate, is the same aspect that is dangerous. Social media allows us to share content with a theoretically infinite audience at virtually no cost. There are almost 300,000 Facebook status updates every minute. There is no way for Facebook, or any single social media platform, with its necessarily finite number of employees, to police the behaviour of the billions of people who take advantage of this free service. Furthermore, even if the potential to police this behaviour did exist, the dangers of implementing and controlling such a policy are vast.

The truth is that you and I are at least partly responsible for the sad situation we are in because we are not managing our usage of this technology responsibly. We are unwilling to take the time and energy to actively select and control our social networks and communication tools. We are all too happy to enjoy the effortless experience of a single social media and communications environment. And we are addicted. With every post, comment, scroll, share and like we get another mini dopamine hit which draws us back for more. And yet those with selfish or malevolent intentions are now taking advantage of our addiction and our laziness.

There is a fair debate about just how responsible each individual can be expected to be when it comes to technology that is evolving so quickly. Many people may have limited time or ability to actively manage their social toolkits. What we are talking about is not easy. So there is an important role for government and regulation. There are, no doubt, some policy changes that would greatly decrease the propaganda power of social media tools like Facebook. For a start, political parties’ social media spending and advertising data should be fully transparent and reported.

But as much as we might wish political advertising to be banned completely, the AI sophisticated enough to draw the line between a ‘social’ message and a ‘political’ message has yet to be developed. More broadly, it is simply not possible to effectively police social-media disinformation. Instead of “purging lies and conspiracies”, as Baron Cohen argues tech firms should do, we must accept increased inconvenience in our social media experiences. We must consciously choose to participate within social networks that are responsible and trustworthy in their policy and design decisions, even when this means the experience is not quite as easy as the mainstream alternative.

The good news is that the innovation that Facebook led is now a commodity. Previously, the barriers to change were both ‘usability’ (i.e. how well the tool works) and the ‘network effect’ (i.e. being where everyone else is). Today, only the latter remains significant. Although the power of the crowd is still a major challenge, history has shown that this particular barrier can crumble.

We must take control of, and diversify, the tools we use to consume and communicate. Ideally, we should not rely on a single ecosystem for both activities, as we do when we pair Facebook with Facebook Messenger. We must be open to using new tools, particularly open-source or subscription services. New tools will come, and most will fail, but we must show our support for them by trying them.

MeWe.com is a recent, hopeful contender: a non-ad-funded alternative to Facebook. But do not expect to log on and find all your friends there immediately. We have to be willing to be the first at the party and hang around for a while in the hope that others will come. This is the key action all of us have to take to play our part in curbing the power and influence of the social media giants. Regulation alone will not work.

More blogs on religion and public life…

Review of ‘Tragedies and Christian Congregations: The Practical Theology of Trauma’ by Rosie Dawson

Review of ‘Looking Beyond Brexit’ by Graham Tomlin by Greg Smith

Review of ‘Love in Action’ by Simon Cuff by Maria Power

Nobody is perfect: in the West, we are all climate hypocrites now by Tim Middleton

Lies and damned lies about statistics

John Henry, an ordinand at Ripon College Cuddesdon, makes a passionate moral case for increasing our statistical literacy.

Do you love statistics? Do you get excited at the idea of analysing a couple of million rows of data to understand what is going on? Or examining a complex set of charts with lots of correlation coefficients? Not likely, I suspect. And why bother anyway? After all, there are only lies, damned lies, and statistics. Right?

The origin of this well-known quip is often assumed, but in truth it is unknown. The validity of the statement is similarly assumed by most people, and this, on further reflection, is equally misguided. The tragic irony of this endlessly parroted adage is that statistics, properly used, are a critical tool for countering falsehood and a firm foundation for good decision-making and public policy.

The aim and function of statistics is to understand probability and uncertainty in our world. From the probability that the jet engine on your next flight won’t explode at 33,000 feet, to the probability that the amount of vaccine injected into your child won’t kill them, to the probability that global temperatures will rise by 2 degrees Celsius in the next fifty years—statistical methods keep billions of people safe and make life better.

Theologically, statistics (and our ability to understand and apply mathematics more generally) must be seen as one of the central gifts of mind that our Creator has given us. Like all our gifts and powers, we believe we have a moral obligation to use them for good.

But statistics, of course, can also be abused. Statistical findings, which often reach the public sphere as single bits of ‘data’, can form the foundation of corrupt initiatives by individuals and organisations whose objective is to convince, or obfuscate, or both. ‘Let’s spend £350m more on the NHS’ comes to mind.

But abuse through statistics requires two groups of conspirators: the creators of the false narrative, built on bad data and poor statistical analysis, and the recipients of that narrative. Unlike the victims of robbery or assault, the ‘victims’ of statistical abuse cannot avoid bearing some responsibility for the crime, for it is our lack of understanding of sound statistics and data analysis that makes us vulnerable to abuse.

Whether because of the terrible mathematics teacher we had at school or because the concepts are often abstract, the vast majority of us lack the understanding of basic statistical concepts required to make sense of the complexity of the modern world. This is especially so when data and statistics are explained to us via the media.

Statistical methods and data analysis are the tools we use to extract meaning from data. But our statistical capabilities have not kept pace with the volume, complexity and importance of data in our world. People frequently complain that too many statistics are bandied about. This could not be further from the truth: we need more and better statistics, not fewer.

So, what is the solution? Statistics is hard, and our time and capabilities are limited. It is obviously naive to expect us all to become data journalists or statistical experts. We clearly have no choice but to rely on organisations and individuals whom we trust to analyse and synthesise the complexity of the world and then communicate their insights to us. But given that complexity, we have no option but to raise the statistical standards we expect of ourselves and of each other. And I think we need to demand this improvement in two distinct ways.

Firstly, we must demand better statistical rigour from our day-to-day media. When a journalist or commentator uses data in an argument, we must demand a source and a statistical context. Failing that, we must use and support organisations like FullFact.org, whose purpose is to analyse the strength and validity of the data presented to us in the media. For our part, we ought to remind ourselves of, or learn for the first time, the basic concepts of statistical confidence. We should try to understand correlation coefficients, sample sizes, t-values and compound annual growth rates; it is all there on Wikipedia and YouTube. Then we should demand these from our media, and ignore those who refuse to offer the most basic level of statistical rigour.
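To make this less abstract, here is a minimal sketch, in Python and with invented figures, of some of the concepts just mentioned: a correlation coefficient, its associated t-value, and a compound annual growth rate. It is my own illustration, not something drawn from the original article.

    # A correlation coefficient, its t-value and a compound annual growth
    # rate (CAGR), computed from invented example data.
    import math

    def pearson_r(xs, ys):
        # Pearson correlation coefficient between two equal-length samples.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / math.sqrt(var_x * var_y)

    def t_value(r, n):
        # t-statistic for testing whether a correlation differs from zero.
        return r * math.sqrt((n - 2) / (1 - r ** 2))

    def cagr(start, end, years):
        # Compound annual growth rate between a start and an end value.
        return (end / start) ** (1 / years) - 1

    # Invented data: hours of revision versus exam score for eight students.
    hours = [2, 3, 5, 6, 8, 9, 11, 12]
    score = [48, 55, 57, 64, 66, 75, 80, 79]

    r = pearson_r(hours, score)
    print(f"r = {r:.2f}, t = {t_value(r, len(hours)):.2f}, n = {len(hours)}")
    print(f"CAGR of 100 growing to 180 over 5 years: {cagr(100, 180, 5):.1%}")

Even this toy example shows why sample size matters: the same correlation computed from eight data points produces a far smaller t-value, and so carries far less evidential weight, than it would if it came from eight thousand.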

Secondly, we must demand better data visualisation, and properly employ the massive visual processing power of our brains. Professor Edward Tufte of Yale University has spent his career developing the principles of how to do this well. Statistical analysis is not all about equations. And there are early signs of hope in this regard. ‘Data journalism’ is now a trendy topic. We frequently see ‘infographics’ in our media today. And we increasingly see ‘micro-charts’ used to give an immediate visual sense of the trend in a particular metric, rather than an absolute number or a single growth rate.
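As an illustration of the kind of ‘micro-chart’ described above, here is my own short sketch (not taken from the article) using Python and the freely available matplotlib library, which is assumed to be installed:

    # A sparkline: a tiny, axis-free line chart that shows the trend in a
    # metric rather than a single headline number. The values are invented.
    import matplotlib.pyplot as plt

    values = [102, 98, 105, 110, 108, 115, 121, 118, 126, 131, 129, 138]

    fig, ax = plt.subplots(figsize=(2.0, 0.4))  # small enough to sit beside text
    ax.plot(values, linewidth=1)
    ax.plot(len(values) - 1, values[-1], "o", markersize=3)  # mark the latest value
    ax.axis("off")  # no axes, ticks or labels: the shape of the trend is the point
    fig.savefig("sparkline.png", bbox_inches="tight", dpi=200)

A chart this small cannot carry labels or gridlines, and that is precisely the point: it conveys the shape of a trend at a glance, leaving the precise figures to the surrounding text.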

But we have a long way to go. Journalists, commentators and politicians continually get away with using single-number sound bites without any statistical context. The quality of ‘infographics’ is frequently so poor that you get little more insight than if the data was listed in a table.

At its heart, this is a profoundly moral issue of the responsibility that we all have to each other as citizens. As a society we have both rights and responsibilities. We live under laws and social constructs which make demands of us for the safety, security and wellbeing of our communities. Many of us also believe in a divine call to love one another. But for these ideals and standards to be met, we accept the need for training and education. Our safety and the safety of others are protected by demanding sufficient training and testing before we are allowed to participate in activities which impact others (e.g. driving).

The risks, consequences and potential losses of a car accident are clear and visceral, but the risks, consequences and potential losses of poor statistical understanding and data analysis are much more abstract. Yet the risk and potential damage of millions of people misunderstanding the facts about the world are far greater. Our political and social systems, and more importantly, the global ecosystem, are at stake.


More blogs on religion and public life…

Whose “bloody GDP” is it anyway? by Tim Howles

Come the Resurrection…? by Rosie Dawson

Chinese Christian Schools in the 21st Century by Oscar Siu

Tell the truth and act as if the truth is real by Matt Stemp
