John Reader reflects on the philosophical questions and theological challenges of robots, embodiment, and human identity in Joshua K. Smith’s ‘Robot Theology: Old Questions Through New Media’ (Eugene, Oregon: Resource Publications, 2022).
June 4th 2022 in the UK is the fourth day of the Queen’s Jubilee celebrations, but the news this morning is all about the events from which she will be absent, not surprising given her age! She had appeared for much of the event on screens projected at a great distance from the assembled crowds. Had there been the possibility of a robot or avatar replacement, would that have sufficed instead? What is so important about the presence of the “real person”?
It would seem that real people are in the process of becoming an anachronism. Joshua K. Smith’s new book, Robot Theology: Old Questions Through New Media (2022), addresses the question of how Artificial Intelligence and robots might become part of the theological vision.
Smith suggests that the lacuna in theological engagement with the field of human-robot interaction (HRI) exists firstly because Christians and theologians have a pessimistic view of human nature (3). This subject might be more appropriately approached under the heading of a critique of dualism and the division between mind or spirit and body. A second explanation for this lacuna could be the widely held Christian belief in human exceptionalism and the doctrine that humans are made in the ‘image of God’ (imago Dei). Smith suggests that humans are unique in terms of function, rather than ontologically distinct. I agree that these are crucial questions but would argue that they require deeper reflection, and could perhaps benefit from the concept of distributed agency, as well as ideas of human–non-human assemblages (see my “Theology and New Materialism”, 2017).
The main aim of this book is to examine how robots can serve as a new medium for theological and metaphysical discourse in an age of scientism (4). Humans and robots can work together for human flourishing if there is a balance between human involvement and the morality of the machine. In making moral machines, we must give careful attention to how robots are designed to interact with humans. With this consideration, the telos of the machine should be to aid the efforts of an array of religions and ideologies. Smith rightly suggests that technology is no mere tool but is in fact a profoundly religious ideology. What I feel is lacking, however, is a more robust philosophical perspective, for in his analysis there is a danger of subsuming “technology” beneath a theological banner, which could be regarded as a form of biblical imperialism.
In chapter one Smith argues that attempts to transcend human limitations through technology have been around for a long time. What, he asks, is different about robots? How, following Jesus, can we respond to “the other” in the figure of the robot (13)? There is a danger of idolatry in human attempts to manipulate and control robots in the search for identity and security, for there is a temptation to turn new technologies into objects of reverence and worship (15). AI and robots will not simply change the world around us, but will also change the way we relate to each other. Does this not then require a new conceptuality? Smith correctly points to the dangers of a reductive materialism that would suggest these issues will be resolved through the scientific study of the material world.
Although AI must be embodied in physical machines, and draws upon human user interaction and natural resources, there remains a danger of treating it as a disembodied agent, in which ethical values and responsible decisions are built into the machines beyond philosophical questioning. In the second chapter, Smith describes how Christians and theists believe that there are objective moral standards that point to actions or desires that are either right or wrong. While grounding moral values in the Bible, he nevertheless calls for a more nuanced investigation into ethical and regulatory guidelines (25). He writes: “We know there are God-given standards to what makes something good and just.” The problem of the Tower of Babel was, as Smith interprets it, not that humans were collaborating through technology, but rather that they desired to subvert their Creator through technological means.
Technology is, Smith insists, neither innately good nor bad. Rather, each instance must be assessed according to four criteria. First, technology must not conflict with God’s moral law. Second, technology must promote the Christian understanding of love. Third, technologies must foster the biblical concept of stewardship. Fourth and finally, they must not oppress or limit liberty and conscience (27).
Any advances will come at a cost, and no matter how tempting the benefits, “we cannot subvert the design and decrees of the LORD”. Smith further discusses familiar issues concerning responsibility, privacy, big data and the dangers of the control of technology (see Zuboff 2019 and Véliz 2021). Current approaches to AI and robot ethics typically assume a mechanical metaphysics in which there is no room for the nonmaterial or the supernatural (37). We should, I suggest, seek to move beyond reductive materialism and form-matter dualisms by developing the insights of New Materialism(s), as perhaps a more promising source of critique in conversation with elements from less traditional religious sources.
In chapter three, on Christian anthropology, patiency and personhood, Smith argues that there may be ethical reasons to grant certain qualified entities negative rights and protections, because to do so may positively impact conditions of human and environmental flourishing (45). There are, he suggests, four recognized forms of personhood. First, moral: this person is a moral actor and therefore a moral patient. Second, psychological: this person is sentient, can suffer and displays intentionality. Third, legal: this person can be the subject of law or exercise rights. Fourth, relational: this person is determined by the nature of the relationship or the character they play in the moral actor’s story (48).
The next section is on robots and moral patiency. The key question is whether moral agency can be attributed to robots. To do so, Smith argues, risks devaluing and dehumanizing humans through human-robot interaction. Yet not granting moral standing to social robots is also dangerous, as to neglect this scholarship is to risk skewing anthropology toward a non-biblical perspective on the human person (57). Social robots, he suggests, do not ever have to be moral agents or pass the Turing test. When a robot behaves in ways that are similar to humans, it is of no consequence whether it has an inner self, moral agency or consciousness; it deserves moral consideration simply because of the potential moral injury to its human counterpart. Is this not a watered-down version of human exceptionalism? Would not the conceptuality of human–non-human assemblages be a more effective means of dealing with this? Is it not a matter not simply of interactions but of the ways in which both evolve and develop through relationship?
It seems clear, from Chapter Four, that many current debates are struggling to catch up with these developing issues, and it is not immediately obvious that an explicitly Christian or biblical perspective should have anything distinctive to offer. Smith begins Chapter Five with a section on the biblical idea of friendship or companionship, supplemented with a study of modern philosophical and psychological views. Will human values and relationships be damaged or undermined by relationships with robots? There are issues of exploitation and manipulation for the benefit of the controlling party, and he further discusses the dangers of surveillance and external control. Friendships with robots can, he concludes, supplement human friendships, yet cannot be considered a substitute or replacement, as such a substitution would undermine or devalue those relationships. Could there not, though, be examples where “friendships” with the non-human are more reliable and secure than those with humans?
Smith begins Chapter Six with the biblical-theological understanding of race. Further brief sections include important yet well-documented concerns about inbuilt bias in the technologies. How and to what extent can governance and regulation counter these biases, and how can the processes of development be engaged early enough to make a real difference? (Evans and Reader address this in “Ethics after New Materialism”, Temple Tract, 2019, proposing a modest ethics.)
In Chapter Seven Smith presents an important section on embodiment. He argues that the Christian tradition has, in its disdain for disembodiment, effectively made an unholy alliance with physicalism (121). There should, he argues, be room in Christian metaphysics and ecclesiology for a qualified disembodied presence. Embracing strict theological physicalism seems to conflict with the biblical reality that presence is not merely about biological material or physical embodiment (123).
Experiences during the pandemic have raised issues of presence in a significant way and require further reflection, but attempting to interpret this through an exclusively biblical lens holds the debate in a straitjacket; this could be remedied by bringing in other sources and ideas. There are further concerns for the pastoral context, and for the role of the remote or virtual as well as the presence of robots. There are also environmental consequences of disembodiment that need to be taken into consideration.
In the brief conclusion, Smith reiterates his attempt to lay out some of the territory. While this is an important task, my main concern is that the subject requires a more fully developed interpretive framework, drawing on philosophical as well as theological sources.