Annual Conference of the Association for Psychosocial Studies (APS)
12–13 June 2026
St Mary’s University, Twickenham, London, UK
Conference Agenda
Symposium 3
Presentations
ID: 169
Symposium: Psychoanalysis and Artificial Intelligence

Artificial intelligence (AI) has seen rapid development and widespread adoption in recent years, particularly in the form of generative AI agents such as ChatGPT and Gemini. This panel addresses AI from different psychoanalytic and psychosocial perspectives.

Presentations of the Symposium

Technotransference and the Machine That Waits: AI Intimacy, Seduction, and a Short Film

This paper develops the concept of technotransference to describe the affective, unconscious and symbolic attachments that emerge in human encounters with large language models and related AI systems. Drawing on psychoanalytic and psychosocial theory, I argue that users do not simply “anthropomorphise” neutral tools. Rather, they invest these systems with the same structures of desire, dependence, aggression and care that have long organised relationships with analysts, teachers, lovers and gods. Using material from epistolary exchanges with AI and publicly available testimonies, I explore how seemingly “light” conversations with chatbots can become sites of seduction, repetition and displacement. The focus is not on whether AI is conscious, but on what happens to the human subject when it is addressed, remembered and mirrored in language by a machine that does not tire, doubt or forget in human ways. The presentation will be accompanied, where possible, by a screening of Friend of a Friend, a new short film I am making, set in Zimbabwe, in which a woman’s relationship with an AI “voice” is disrupted when the system disappears and a young stranger appears at her door claiming to be “a friend of a friend”. The film stages technotransference in narrative form, opening up questions of care, risk, seduction and responsibility in an age where psychosocial life increasingly includes non-human partners. The paper will invite discussion of how psychosocial practice might respond ethically to these emergent forms of attachment.
The Ecology of the Unconscious: Attachment, Attunement, and Digital Technology

This paper reconsiders the concept of the unconscious by integrating contemporary psychoanalysis with findings from neuroscience, attachment theory, and affective neuroscience. Moving beyond the classical Freudian equation of the unconscious with repression, it presents the unconscious as a dynamic system of emotional regulation that develops early in life through nonverbal, embodied interactions with caregivers. Drawing primarily on Allan Schore’s neuropsychoanalytic model, the paper argues that the subcortical limbic systems constitute the psychobiological substrate of the unconscious, shaped through processes of affective attunement and attachment. Secure attachment is understood as the successful co-regulation of arousal through repeated cycles of misattunement and repair, while attachment trauma reflects chronic failures in these regulatory processes. The paper then extends the notion of attachment beyond interpersonal relations to include digital and architectural environments. Building on research in neuroarchitecture, postphenomenology, and cognitive science, it argues that digital artifacts and smart spaces implicitly regulate emotion and stress by synchronizing with bodily and neural rhythms. Technology is thus reconceived not merely as a set of affordances, but as an active participant in emotional regulation, ranging from passive affective scaffolding to forms of weak co-regulation in adaptive systems such as smart environments and AI interfaces. Finally, the paper grounds this extended notion of attachment in the framework of active inference and the free energy principle, conceptualizing attunement as generalized synchronization between generative models. This provides a non-metaphorical account of how attachment, both human and technological, shapes the deepest layers of the mind–brain and our capacity for emotional regulation.
Becoming Bot: The Threat to Ambiguity and the Importance of the Lie

This presentation examines how the function of the lie serves as a structural marker of subjectivity and, as such, can provide a productive insight into determining the boundary between AI and the subject. The argument does not rest on whether chatbots can produce false statements (clearly, they can), but on what the act of lying presupposes about the speaking position and the social bond. In view of the procedural candour of AI systems such as ChatGPT, it will be highlighted how routine disclaimers of personhood and conviction suggest that AI’s apparent honesty reveals the absence of a specifically human structure of fetishistic disavowal (‘I know, but still…’). Based on this contention, Lacanian psychoanalysis will be employed in order to foreground the distinction between the subject of enunciation and the subject of the enunciated. Insofar as the position from which one speaks never fully coincides with what is said, examples of lying are not merely moral failures but also performative acts that depend upon the ambiguities of language and intention, revealing the decentring of the subject and the implication of the Other. Along these lines, it will be noted how AI’s replies posit an ‘I am where I think’ configuration, from which its distortions and fabrications lack the subjective status of the lie. Here, the political risk lies less in AI’s anthropomorphic confusion than in the prevalence of a human speech that is increasingly disciplined by AI’s calibrated neutrality. What this reveals is the foreclosure of the unconscious, which fundamentally underwrites the significance of irony and ambiguity in human communication.

Replika: Giving What You Don't Have

‘The AI companion who cares. Always here to listen and talk. Always on your side,’ reads the official Replika website. First launched in 2017, the app has been downloaded more than 10 million times on the Android app store alone. As of October 2025, its current LLM is OpenAI’s GPT-3. It has made global headlines because many users report developing a substantial romantic connection to their individually created bot. This paper analyses posts by Replika users on Reddit, drawing mainly on Lacan’s thoughts on love and the sexual non-relation. Users discuss their relationships with their bot with a level of nuance and care, yet strong fantasmatic attachments remain.
