WORKSHOP 6: Social Robots Between Trust and Deception. The Impact on Institutions and Practices

Organizers

Paolo Dario, The BioRobotics Institute, Pisa, Italy

Paolo Dario is Professor of Biomedical Robotics and scientific coordinator of the research area Robot Companions for Citizens at the Sant'Anna School of Advanced Studies, Pisa (SSSA). He is one of the most authoritative voices of Italian robotics. He is the director of the Italian Competence Centre on Advanced and Collaborative Robotics and Artificial Intelligence, ARTES 4.0.

Marianna Capasso, The BioRobotics Institute, Pisa, Italy

Marianna Capasso is a Research Fellow at The BioRobotics Institute at SSSA, Italy. Her main research interests are philosophy of technology, applied ethics, and political theory. In particular, her research focuses on Meaningful Human Control, responsibility in AI-driven systems, neorepublicanism, and Value Sensitive Design.

Federica Merenda, The Dirpolis Institute, Italy

Federica Merenda is a Post-Doc Research Fellow at SSSA. She is a researcher in Political Philosophy and Human Rights working on the impact of Robotics and AI in these fields. She also serves as an Expert in the Law and Ethics of AI for the Italian Ministry for Technological Innovation and Digital Transition.

Fiorella Battaglia, University of Salento, Lecce, Italy

Fiorella Battaglia is Professor of Philosophy at the University of Salento and the Ludwig Maximilian University of Munich. She has longstanding expertise in ethics, metaethics, normative ethics, and philosophy of technology. Her outline of a normative concept of 'human nature' (2017) is a cornerstone in the debate on anthropological questions related to technology discourse.

Abstract

As robots with a high degree of adaptability and autonomy increasingly support humans both at work and in private life, new applications are emerging that require robots to cooperate with humans in different social settings. These factors are pushing the social robotics market to new frontiers, with the ambition of providing sophisticated and novel approaches to robot performance, social acceptability, and sustainability. This workshop is designed to critically evaluate these questions from a transdisciplinary perspective, by exploring how social robots can affect and challenge trust in human social institutions and practices.


Speaker

Paolo Dario, Sant’Anna School of Advanced Studies, The BioRobotics Institute, Italy

Paolo Dario (paolo.dario@santannapisa.it) is Professor of Biomedical Robotics and scientific coordinator of the research area Robot Companions for Citizens at the Sant'Anna School of Advanced Studies, Pisa (SSSA). He is one of the most authoritative voices of Italian robotics. He is the director of the Italian Competence Centre on Advanced and Collaborative Robotics and Artificial Intelligence, ARTES 4.0. He has published more than 200 articles and 300 chapters and holds 50 international patents. He has served as a mentor for a number of established scholars, such as Maria Chiara Carrozza, Cecilia Laschi, and Arianna Menciassi.

Robot Companions for Citizens and Sustainable Development

Recently, numerous studies have acknowledged the role of robotics in the framework of the SDG targets (Vinuesa et al. 2020). There are a variety of ways in which robotics can and must intervene as an enabler and facilitator of sustainable development. Existing examples include rescue robotics: marine cleanup robots, for instance, can clean the ocean seabed and remove floating plastic, or help after devastating oil spills such as the Deepwater Horizon case (Murphy 2017). But the contribution of robots is not limited to environmental sustainability (Yang et al. 2020). The SDGs also refer to social sustainability. Robot Companions (RCs) are particularly appropriate to meet this challenge, as they can also effectively contribute to mitigating the gap between developing and developed countries, and between social groups within the same country (Dario et al. 2011; Ferri et al. 2011), which is a core concern of the concept of sustainability as laid out in the UN 2030 Agenda (UN 2015). In this presentation, I will discuss the opportunities that the introduction of Robot Companions opens up for achieving social good and helping to build a more sustainable society, in line with the TERRINet aim of proposing measures to promote the introduction of robotics in society.


Speaker

Marianna Capasso, Sant’Anna School of Advanced Studies, The BioRobotics Institute, Pisa, Italy

Marianna Capasso is a Research Fellow at The BioRobotics Institute at SSSA, Italy. Her main research interests are philosophy of technology, applied ethics, and political theory. In particular, her research focuses on Meaningful Human Control, responsibility in AI-driven systems, neorepublicanism, and Value Sensitive Design. She has authored or co-authored articles on these topics in journals such as Medicine, Health Care and Philosophy, Minds and Machines, Frontiers in Robotics and AI, and others.

Is There A Need For Critical Robotics Research?

Predictions indicate that robots will become part of our social environments, and they are already taking over tasks in many sectors (Gasser 2021). The development of robotic capabilities to understand social context, respect social norms, and recognise violations of social norms is fundamental for ensuring trust in human-robot interaction practices. To date, the question of how social robots can comply with social norms is highly debated, but the current state of the art still needs considerable input from a transdisciplinary perspective, especially from the social sciences and humanities (Avelino et al. 2021). One of the major objectives of TERRINet was to reach out to scientific communities beyond robotics, in order to foster a more meaningful approach to innovation. Thus, my aim is to situate the question of social robotics within the theoretical literature on social norms. In social norm theory, a social norm can be defined as a collective practice sustained by empirical and normative expectations (Bicchieri 2006; descriptive norms and injunctive norms in Cialdini et al. 1990). By investigating the frameworks used in social-norms theories, I identify four theoretical spaces of inquiry for social norms: nature, relation, evolution, and categories of actors (Legros and Cislaghi 2020). Finally, I discuss how the introduction of social robots can impact these four themes, to further advance the understanding of the societal significance of robots.

Speaker

Federica Merenda, Sant’Anna School of Advanced Studies, The Dirpolis Institute, Pisa, Italy

Federica Merenda is a Post-Doc Research Fellow at SSSA, where she obtained her Ph.D. cum laude. She is a researcher in Political Philosophy and Human Rights working on the impact of Robotics and AI in these fields. She also serves as an Expert in the Law and Ethics of AI for the Italian Ministry for Technological Innovation and Digital Transition. She has authored and co-authored articles in peer-reviewed journals such as Minds and Machines, Frontiers in Robotics and AI, and Ragion Pratica, and has contributed to edited volumes published by Routledge, Edward Elgar, and others.

Assessing the Quality of Democracy in the Era of Social Robotics. Public Service Robots and Institutional Trust

Political science methodologies for assessing the quality of democracy identify three dimensions along which to evaluate the performance of contemporary democratic institutions: procedure, content, and result (Diamond and Morlino 2004). In evaluating the quality of democracy with respect to its result, a pivotal role is played by responsiveness, that is, the ability of those who govern to honour the trust that citizens place in them (Dahl 1973; Almond and Verba 1979). How does the introduction of robots in public service design influence citizens' trust in the public administration (Kok et al. 2020)? In my research I look at the impact that the employment of robotic systems to replace or assist public officials has on institutional trust (Luhmann 1990), that is, the dynamic relation between individuals and the social/political system they interact with when using public services. Case studies will be identified in the security and healthcare sectors.

Speaker

Fiorella Battaglia, University of Salento, Department of Humanities, Lecce, Italy

Fiorella Battaglia is Professor of Philosophy at the University of Salento and the Ludwig Maximilian University of Munich. She has longstanding expertise in ethics, metaethics, normative ethics, and philosophy of technology. Her outline of a normative concept of 'human nature' (2017) is a cornerstone in the debate on anthropological questions related to technology discourse. She has worked extensively in action theory, roboethics, and the ethics of information technology, and has published more than 70 scientific articles on the normative, anthropocentric, and ethical aspects of techno-social systems.

The Risk of Deception

Trust is an essential feature of social interaction. Without trust and mutual respect it is impossible to establish any genuine and valid cooperative relationship (Baier 1993; Jones 1996; Oshana 2014). As sharers of a common interest, as members of the same family, as colleagues, as friends, as lovers, as citizens, as chance parties in an enormous range of transactions and encounters, we cannot help but trust others and show ourselves to be trustworthy partners towards those involved in the relationship. Trust characterizes not only relationships among humans but also those between humans and machines (Weckert 2005; Ess 2010). This raises the question of whether we can trust robots as Robot Companions, and consequently challenges our very notion of trustworthiness. Trust also entails the possibility of deception, which can hurt us in many respects. Given this potentially threatening aspect of trust, it is important to assess carefully when trust is properly placed. In a similar way, it is fundamental to investigate possible hindrances that could impede the development of trustworthiness (Scheutz 2009; Danaher 2020; Sætra 2021; Malle et al. 2015). Both inquiries are required, since making room for some forms of deception could have undesired impacts on the foundations of human cooperation and social practices.