This paper discusses ELSA (Ethical, Legal, and Social Aspects of technology) as an emerging methodology for transdisciplinary AI research, characterized by anticipatory technology assessment through close collaboration with diverse (societal) stakeholders. We offer a methodological reflection based on a 1.5-year case study on public safety and AI in Lombardijen, a neighbourhood in Rotterdam, the Netherlands, where we engaged residents as citizen stakeholders. Lombardijen is paradoxically both under-resourced, i.e. historically neglected and stigmatized as a ‘problem district’, and over-researched, i.e. scrutinized by countless researchers who engage in what has been called ‘drive-by’ research: driving by, extracting data, and disappearing, often without benefits for the community. The community’s ensuing alienation from governmental and academic institutions means that citizens’ valuable contextual knowledge is often overlooked in public deliberation on AI. This raises our research question: how can citizens in low-trust neighbourhoods be meaningfully and reciprocally engaged in transdisciplinary AI research, and what does an ELSA approach offer in this regard? The paper details our experiences in Lombardijen from ethical, legal, social, and technological perspectives in turn. We candidly discuss our learnings, (modest) successes, and limitations, ultimately emphasizing the importance of situated responsibility as a precondition for transdisciplinary AI research.