William Temple Foundation Trustee Maria Power reviews ‘Zucked: Waking up to the Facebook Catastrophe’ by Roger McNamee. The concerns are real and valid, she suggests, but can we do any better with our solutions?
In The Great Hack, Carole Cadwalladr tells us that: “We literally can’t have a free and fair election in this country and we can’t have it because of Facebook.” Brexit, the 2016 US Presidential Election, and the elections in Trinidad and Tobago were all manipulated by digital media, and any attempts at regulation are met with avoidance tactics devised by teams of lawyers. Oil used to be the world’s most valuable commodity, but in recent years data has taken that crown. Whilst we’re very alive to the damage that our thirst for oil has done to the planet, are we yet to wake up to the dangers present in our hunger for data?
In the last couple of years, the digital world has been experiencing something of a ‘techlash’, with a series of polemics appearing which focus on the downsides of technology. These books frequently appear in non-fiction bestseller lists and offer an insight into where the popular debate is heading. They have covered: big data, for example Invisible Women; the potential of AI; the racism inherent in algorithms; and the risks associated with the large-scale use of social media. Whilst these books function as polemical tracts, and in some cases as mea culpas from early funders of this technology, they have the power to trigger a debate about the place of the digital in our lives, and the control (or potential control) that it has over us.
For me, the most striking of these books was Zucked, written by Roger McNamee, one of the earliest funders of Facebook. A former champion of social media who described himself as a ‘tech optimist’ and cited the Arab Spring as an unmitigated success, McNamee has since experienced a ‘Road to Damascus’ moment in his attitude to social media—and Facebook in particular. His book, part memoir, part polemic, claims that the organisational culture created at Facebook poses a fundamental threat to democracy through the microtargeting that is the foundation of its business model.
McNamee’s fears centre upon Russian influence in the 2016 election, which he estimates cost the Russian government $100,000, the Cambridge Analytica scandal, and the role that Facebook played in the fake news that brought us Brexit. Zucked claims that the company’s motto—move fast and break things—means that users are not people but metrics, and that the company feels no sense of civic responsibility. It was, according to McNamee, the perfect platform to allow such acts of espionage to occur.
However, the most interesting aspect of this book is the cult of personality surrounding Mark Zuckerberg, the corresponding insight it gives into Facebook’s organisational culture, and what this means for users and their relationship to the state and its democratic processes. This book tells us of an organisation run by a man who has lost the ability to see beyond the bubble in which he finds himself, leading to a mindset focussed entirely upon growth and the harvesting of user data.
For example, users of Facebook can quickly find themselves stuck within a filter bubble. Facebook is carefully curated to keep us hooked, and to appeal to our baser instincts, which are then carried over into our offline interactions.
“Once a person identifies with an extreme position on an internet platform, he or she will be subject to both filter bubbles and human nature.” (p.93)
The basic premise of the argument is that the more outraged you are, the more content you will share with your friends in the filter bubble, thereby creating hysteria that is mostly fuelled by fake news. It also creates a safe space in which like-minded people can ‘find’ one another:
“Expressing extreme views in the real world can lead to social stigma, which also keeps them in check. By enabling anonymity and/or private Groups, the platforms removed the stigma, enabling like-minded people, including extremists, to find one another, communicate, and, eventually, to lose the fear of social stigma.” (p.91)
Facebook’s main aim is user engagement: keeping us on the site for as long as possible, generating profit for shareholders. The means of achieving this are deliberate:
“It starts out giving users what they want but the algorithms are trained to nudge user attention in the directions that Facebook wants. […] When users pay attention, Facebook calls it engagement, but the goal is behaviour modification that makes advertising more valuable.” (p. 9)
Facebook therefore uses our ‘lizard brain’ emotions to trigger anger and fear, which means we’ll consume more content. Facebook is the fourth most valuable company in the US and, according to McNamee, “its value stems from its mastery of surveillance and behavioural modification” described above. (p. 9) It has very little motivation to change.
McNamee has a habit of posing rhetorical questions, then offering answers that fail to look outside the ‘filter bubble’ of Washington and regulation in which he has found himself. My favourite is:
“Now that we know that Facebook has a huge influence on our democracy, what are we going to do about it?” (p. 103)
The only solutions offered by McNamee are regulatory and focus on the top-down control of social media. This is where so much digital ‘ethics washing’ goes on, and, as he frequently points out, it is unlikely to work because companies employ lawyers whose sole purpose is to circumvent regulation. I came away from Zucked with far more questions than answers. And I was left pondering the following: How do we convince tech firms to programme a meaningful sense of civic responsibility into their algorithms? What should this civic responsibility look like? How can grassroots faith activists respond to this challenge? And is there a value in creating a theology of digital ethics?