About

Hi there! 👋 I’m currently a PhD researcher at Leiden University under the supervision of Prof Dorota Mokrosinska and Prof James McAllister, working on privacy and social issues in affective computing. My work spans topics in philosophy of technology, social psychology, pragmatic epistemology, information ethics, and privacy, and mainly addresses issues in the normative and practical ethics of new and emerging technologies, specifically those at play in human-computer interactions and in the development of emotional AI.

In my PhD research, I investigate the social implications of AI artefacts that detect, recognise, and analyse people’s emotions. The central aim is to go beyond current work that has pinpointed the consequences (and potential harms) arising directly from their technical limitations, such as gender bias and group discrimination, and to explore the fundamental issues raised by the exposure and commodification of affective life. To do so, I adopt an anticipatory approach informed by empirical research to develop technomoral scenarios that reveal the ethical risks of using emotion recognition technology (ERT) as a type of decision support system in human-human relations. I further explore the implications of this new practice for mainstream conceptions of (social) privacy and, drawing on conceptual engineering theories, suggest that the recognition of ‘emotional privacy’ as a sub-concept of privacy is necessary to provide adequate protection against the pervasive effects of ERTs.

My main objective is to analyse the social effects that ERTs will have on human relationships and the Self. What if we had easy access to real-time recognition of emotions? How would this affect social dynamics? In which contexts would this be desirable, and in which would it be detrimental? Do we have any legitimate claims to privacy over the emotions we display in public social settings? These, and more, are the questions I am currently trying to answer.

My Master’s thesis, funded by an OBVIA grant (Canada) and entitled Affective Computing: Is the Use of Emotion Recognition Technology Consistent With Social Justice?, examined the social and moral problems directly caused by the use of technically constrained ERTs. It’s available on the university website, or you can check it out here.

My PhD research is funded by the SSHRC of Canada.

You can download my CV here.


Articles, Podcasts, and News About my Research

https://github.com/AlexandraPregent/AlexandraPregent.github.io/assets/165220288/f362716c-61f5-4a93-9807-b0812a4f4c81

🗣️ Parlons Éthique (2021) / 🧔🏻‍♂️ Podcast Host: Keven Bisson

🎥 Contextual Integrity & Socially Disruptive Technology (2024) / 🔴 ▶️ YouTube Video

🎭 False Emotions (2021) / News Article by Raymond Poirier

🤖 Émotions, Contexte et IA: facile de se tromper [Emotions, Context and AI: Easy to Get It Wrong] (2021) / News Article by Thot Cursus