
Highlights

Virtual Ethical Innovation Lecture

VEIL - Aidan Peppin

Deepening Participation in Data and AI Policy


Summarized by Sabrina Breyer


Highlights of the VEIL

This term’s first Virtual Ethical Innovation Lecture featured Aidan Peppin of the Ada Lovelace Institute, which he presented as “an independent research institute and deliberative body with a mission to ensure data and AI work for people and society”. Its interdisciplinary structure, combining academic, policy and industry expertise, aims at “convening diverse voices”, “building evidence”, and “shaping and informing good policy and practice” across five thematic areas: biometrics, health data, public sector use, ethics in practice, and the future of regulation.

Why do we need participation?

In his presentation, Aidan Peppin started out by elaborating why there is a need for participation in developing AI technologies in the first place. He refers to the argument that data-driven technologies are rarely disconnected from the societal and cultural contexts in which they are deployed (Peppin et al. 2020). Instead, data and technologies are shaped by value-based choices, which determine, among other things, what a product aims at, how it is designed, what it affects, how transparently it is developed, and which benefits it should bring (and for whom). The main concern is that these values are rarely determined by those most impacted by the AI technology, which creates an incongruence between the values that shape AI and the values of the society most affected by it. Hence, it should be ensured that the values of those affected and the values guiding the innovation are aligned, which requires creating space to amplify those voices, i.e. a deepening of participation. To highlight the intended benefits, Aidan Peppin refers to his colleagues at the Ada Lovelace Institute (Patel et al. 2021), who show what deepening participation in data and AI policy can strengthen: data use and data-governance models underpinned by a strong sense of trust and confidence; fairer and more equitable outcomes; broader social welfare outcomes supported by data; increased social license and active participation; and more representative, inclusive and proportionate uses of data.

What does public participation look like?

To demonstrate what public participation can look like, Aidan Peppin refers to a loose framework of five categories: inform, consult, involve, collaborate, and empower (Patel et al. 2021). To show how these five categories can be implemented, he highlights different examples and case studies.

The category “inform” is one-directional and is narrowly construed as telling people what they might need to know. For instance, to facilitate the transparent use of patient data, it is important to explain what data is being used, by whom, why, and how it is collected and looked after. This means that most of the time there is no need for in-depth knowledge on the part of the users and those affected. However, this does not mean that there is no need to co-shape the intended goals and the means employed. Another example drew on the insight that there are two dominant narratives about data: ‘as a natural force to be controlled and as a natural resource to be consumed’ (Puschmann and Burgess 2014). Being aware of this distinction can help identify where the power lies.

The category “consult” is bi-directional and focuses on gathering people’s views. The case study “Beyond Face Value” (Ada Lovelace Institute 2019) gathered what people are comfortable with when it comes to facial recognition. A further example, the “Location data dialogue” (Sciencewise, Traverse and the Ada Lovelace Institute 2021), gathered insights to understand public attitudes towards location data.

The category “involve” is much more conversational. An example project run by the Ada Lovelace Institute is “The Citizens’ Biometrics Council” (Ada Lovelace Institute 2020), a mini-public consisting of 50 members of the public from across the UK and more than 60 hours of deliberation. Different experts (campaigners, academics, tech developers) were brought in, with the aim of developing recommendations for an independent oversight body and for stronger regulation and standards that biometric technologies must adhere to.

The category “collaborate” aims at actually involving people in the writing of policy. The example of the “Camden Data Charter” (Camden 2021) explores and develops, through a deliberative approach, a policy framework and a set of principles to guide how data is used now and in the future.

The category “empower” gives people the power to directly decide how data is used or to shape policy. The example of “Salus Coop”, a citizen data cooperative that accelerates research and innovation in the healthcare sector, provides a place for citizens to govern how their own data is collected and used.

What can we learn from participation?

The biggest lesson that Aidan Peppin learned from engaging in participative methods is that we need to strive towards building trustworthiness, not trust. This means shifting away from asking the public to be more trusting towards asking the powerful to be more trustworthy (see O’Neill 2018).

The second lesson is that diversity is often more important and more useful than representativeness. Large surveys need to be representative, but in representative samples minority perspectives are easily underrepresented or completely ignored. It is not about having every view represented; it is about having as diverse a range of views as possible, so that no perspective is ignored when developing policy and generating outcomes (see Steel et al. 2020).

The third lesson highlights that there is no “one-size-fits-all” approach to participation. The recommendation is to identify the problem first and then choose a methodology that fits the objectives, the intended outcomes, and the target group, rather than trying to survey a topic with a prematurely picked method (see Arnstein 1969).

The last lesson Aidan Peppin shared tackles the common preconception that the general public is not able to handle the complexity inherent in technological issues. From his experience, Peppin disagrees: people can handle the complexity, and hence can meaningfully participate, provided they are given the space, the resources, the information, the support, the access to material, and the time to engage with these topics (see Dryzek et al. 2019).

Q&A

Questions during the Q&A session revolved around how to assure and implement diversity in participation and how to foster trustworthiness given fears about data and AI. Furthermore, participation and other approaches as a justification of the social license to operate were discussed.

Further literature

Peppin A., Patel R., Parker I. (2020): No Green Lights, No Red Lines. Public perspectives on COVID-19 technologies. Ada Lovelace Institute, https://www.adalovelaceinstitute.org/wp-content/uploads/2020/07/No-green-lights-no-red-lines-final.pdf

Patel R. et al. (2021): Participatory Data Stewardship. A framework for involving people in the use of data. Ada Lovelace Institute, https://www.adalovelaceinstitute.org/wp-content/uploads/2021/09/Participatory-data-stewardship_Final-report.pdf

Puschmann C., Burgess J. (2014): Big Data, Big Questions. Metaphors of Big Data. International Journal of Communication. 8, 1690-1709. https://ijoc.org/index.php/ijoc/article/view/2169/1162

O’Neill O. (2018): Linking Trust to Trustworthiness. International Journal of Philosophical Studies. 26(2), 293-300. https://doi.org/10.1080/09672559.2018.1454637

Steel D., Bolduc N., Jenei K., Burgess M. (2020): Rethinking Representation and Diversity in Deliberative Minipublics. Journal of Deliberative Democracy. 16(1), 46-57. https://doi.org/10.16997/jdd.398

Arnstein S. R. (1969): A Ladder of Citizen Participation. Journal of the American Institute of Planners, 35(4), 216-224. https://doi.org/10.1080/01944366908977225

Dryzek J. S. et al. (2019): The crisis of democracy and the science of deliberation. Science. 363(6432), 1144-1146. https://doi.org/10.1126/science.aaw2694