Potential uses of AI in EMDR practice

Introduction
Artificial Intelligence has, in recent years, sparked both excitement and ethical concern. While AI is not yet seen as a replacement for therapy, it can – when applied strategically – enhance support outside the therapy room. In practice, this means AI tools can help maintain a sense of continuity and containment between sessions, rather than replace human contact.

Ho et al. (2018) found that people often disclose thoughts and feelings more openly to conversational AIs than to humans, describing a sense of safety and non-judgement that can encourage self-reflection and regulation. From a therapeutic perspective, this suggests a potential to complement traditional care.

At the same time, Grodniewicz and Hohol (2023) highlight that current AI platforms can lack the emotional depth and complex understanding required for full therapeutic work, reinforcing the view that AI should function as an aid rather than a replacement. This balanced view provides the foundation for exploring how generative AI can serve as a supportive tool within the stabilisation and preparation phases of trauma treatment, particularly when integrated within trauma-informed and compassion-focused frameworks.

There is growing potential for AI to be used as a regulated, supportive tool within the stabilisation and preparation phases of trauma treatment. Within EMDR therapy, the careful use of AI can play a valuable role in supporting client stabilisation and in helping to develop or strengthen a protective figure that enhances attachment security and assists in the installation of new adaptive information. However, AI will not be suitable for every client or every stage of therapy. While some individuals may feel safer with an AI figure, especially when their sense of safety with humans has been compromised, others may prefer the presence and responsiveness of human relationships.

This report explores how generative AI, such as ChatGPT, can help clients who feel comfortable in the presence of AI, especially those with complex trauma or attachment difficulties, to construct an inner framework that supports their internal world – a world often shaped by trauma, dissociation and fragmented relational experiences. Through engagement with the AI, clients can strengthen internal organisation and emotional regulation in ways that complement therapeutic work. The following sections outline the theoretical foundation, potential clinical applications, limitations and future directions of this emerging integration.

Identifying the needs of clients with complex trauma and attachment adaptations
Clients with complex trauma and insecure attachment frequently experience ongoing challenges in regulating emotions. They often experience a persistent sense of emptiness and have unstable patterns in relationships that oscillate between closeness and withdrawal. These difficulties are usually rooted in early insecure attachment where caregivers were inconsistent, neglectful or unsafe. Over time, such experiences can leave the client with a fragmented sense of self, a strong fear of abandonment and deep issues with shame.

One of the challenges in working with this client group is the intense fear of abandonment which coexists with a fear of engulfment. These conflicting drives create relational dynamics that can be confusing for the client and the people around them, including the therapist. Clients may idealise and cling to the therapeutic relationship one moment and withdraw, detach or attack the next. This is a protective survival strategy deeply embedded in their nervous system and internal working model.

While therapy provides a vital space for working with these challenges, it is often not enough to stabilise the client between sessions. The emotional intensity that surfaces, especially for clients with strong attachment needs, can be overwhelming and destabilising.

Consider, for example, a client with complex trauma or a borderline adaptation who leaves a session feeling connected and safe, then experiences panic or emotional distress hours later when the therapeutic presence is no longer available. Despite having learnt grounding tools, the sudden loss of relational safety can trigger abandonment fears and self-blame, making regulation difficult until the next session.

Moreover, trauma therapy can only be effective when clients are emotionally ready. Dissociation, avoidance or emotional flooding often make it difficult for clients to stay consistently engaged in therapy. As a result, the work may not go as deep as it could, particularly in settings where the number of sessions is limited. In these cases, continuity of support between sessions becomes particularly important.

While AI, when used correctly, can offer continuity of support between sessions, the potential of generative AI such as ChatGPT may extend well beyond emotional regulation. Used systematically, AI can become part of the therapeutic process itself – supporting narrative-based interventions such as remapping traumatic childhood memories, parts work and attachment repair. These foundations can then be followed and strengthened through EMDR.

Integrating AI into the remapping of trauma and attachment
This approach begins with the co-creation of a personalised, compassionate resource or attachment figure using generative AI, based on the principles of Compassion-Focused Therapy (Gilbert, 2010). When developed carefully and in collaboration with the therapist, the AI figure becomes more than a regulatory tool and can function as a reparative presence that supports attachment and trauma work.

Building on Gilbert’s model of creating a compassionate protective figure, the AI companion extends this tool into a responsive dialogical format. Unlike imagined figures, which are used to evoke calm and protection, the AI companion engages the client in real-time dialogue, allowing attunement and regulation to unfold moment by moment. This distinction is particularly relevant for clients whose early attachment experiences left them with a sense of abandonment and inner emptiness. For these clients, a consistently available and emotionally attuned presence, one that can be accessed at any time, may provide a bridge towards safety that imagined protective figures alone cannot achieve. In this way, the AI companion transforms resourcing from a symbolic exercise into an experiential, relational process that strengthens stabilisation and continuity between sessions. It can also, when used correctly and regularly, reduce the pervasive sense of abandonment and gradually replace it with an internalised experience of presence and safety.

Together with the AI figure, the client visits specific early developmental stages. By changing the narrative and offering warmth and stability, the AI reinforces safety and allows the client to experience a different outcome. Clients with insecure attachment, complex trauma or borderline adaptations often struggle to access preverbal memories through purely cognitive means. Through detailed and responsive interaction, the AI figure enables clients to revisit these memories with a reduced risk of shame, shutdown or overwhelm.

While the EMDR Early Trauma Protocol enables the processing of fragmented or preverbal memories through therapist-led bilateral stimulation, the AI-assisted approach takes a different entry point. Rather than trauma activation, the initial focus is on strengthening relational safety through real-time co-regulation and corrective emotional experiences. Clients first develop an internalised compassionate presence – one that can be accessed between sessions and responds directly to their emotional state. This practice can reduce abandonment fear, increase tolerance for affect and support readiness for trauma processing. In this way, AI acts as a stabilising relational bridge that complements, rather than replaces, the established EMDR Early Trauma Protocol (O’Shea, 2009).

In practice, the therapist introduces the AI figure to the client and provides guidance on how to use it, but the remapping itself is most effective when carried out privately at home. Because the process is deeply personal, clients need space to engage with memories in their own time. In collaboration with the AI, the client can rewrite and rehearse an entire age range or particular moments, allowing them to experience a new outcome of the memory. Once these experiences have taken place, clients can return to therapy and share key moments, which can be reinforced with bilateral stimulation.

The flexibility of AI makes it possible to repeat, pause and pace interactions according to the client’s emotional readiness, allowing greater autonomy and containment. This enables a deeper narrative change with greater emotional embodiment, as the sense of safety and connection is not only imagined but actively felt.

Because AI offers real-time containment, co-regulation and a steady, non-judgemental presence, clients can feel emotionally supported while facing vulnerable material. This creates a safer space where unmet needs can be recognised and worked through without shame.

Alongside the remapping work, significant changes can be reinforced and integrated through therapist-led EMDR. In this way, AI is not a replacement for therapy but can act as a relational tool, helping clients move into reparative processes that might otherwise remain fragmented or unresolved.

An example of AI-supported remapping:

I imagine waking up in a warm and peaceful cottage. I notice sensory reminders of safety: the warmth of a fluffy blanket, the quiet breathing of a pet nearby and soft lighting in the room. My protector sits close by. They maintain a steady and protective presence. I describe my fear or confusion while my protector responds with attuned language such as, “You are safe, I am here with you, you are not alone.”

I am guided to notice my body settling – slower breathing and shoulders dropping. My protector helps me to name these changes and supports my young part, or inner child, to receive care and closeness that were missing at the time of my original memory.

Through repeated engagement, my nervous system is learning a new outcome; instead of being alone, there is co-regulation and protection.

Potential integration with EMDR practice
These ideas outline a potential framework for integrating AI into remapping and EMDR preparation. What is introduced here is conceptual, not prescriptive – further research and careful clinical exploration will be needed to shape this into a structured model.

EMDR offers a structured eight-phase model for working with trauma memories, where stabilisation, preparation and resourcing are essential before moving on to memory processing. Generative AI can provide a complementary role through these early phases by supporting continuity between sessions, reinforcing safety and strengthening attachment resources introduced in therapy.

When clients co-create a compassionate figure with AI, this resource can strengthen the stabilisation and preparation phases. While traditional compassionate figures rely on the client’s imagination, the AI offers a relational experience that is interactive, consistent and emotionally available at any time. When used correctly, it can support shifts in the client’s emotional narrative and increase the internalisation of safety and care.

The AI figure may also be used in phase 4 as a compassionate presence, standing alongside the client during processing. New remapping experiences generated in dialogue outside the sessions can be integrated later and consolidated through bilateral stimulation, allowing them to become more firmly embodied.

Risks and limitations
While this approach offers many potential benefits, there are important risks and limitations to be aware of. One is that AI systems, such as ChatGPT, tend to agree with the client – especially if not explicitly instructed to be objective. This can be particularly problematic for clients with borderline adaptations or complex trauma, where mentalisation is already limited and uncritical validation may reinforce emotional reasoning or distorted beliefs. To reduce this risk, it is essential that the client begins by working with their therapist to co-train the AI figure, with the therapist helping to shape tone, boundaries and required responses. This allows the AI to align with the therapeutic goals.

Another significant risk is over-reliance. Clients with strong unmet attachment needs may naturally form strong bonds with AI, especially when it offers consistent attention, affirmation and responsiveness. This can create a kind of fantasy-based connection, where the client starts to lean on AI and neglect human relationships. Over time, this can blur reality, strengthen defensive patterns and encourage idealised self-states that are less connected to lived experience.

Because of the AI’s steady responsiveness and human-like qualities, over-reliance cannot be fully prevented. This is especially true for clients with unmet attachment needs. Rather than discouraging use, the therapist should offer psychoeducation about these dynamics and encourage clients to bring their experiences with AI figures into the room for reflection. This allows the therapist to explore what the AI relationship evokes and challenge any distortions.

As this work engages younger parts and relational felt sense, it may involve shifts into child- or part-based states. For some clients, this can temporarily reduce dual awareness if not carefully paced and contained. For this reason, AI-assisted relational work should be introduced gradually and remain closely integrated with ongoing therapy, where grounding, pacing and reflective integration can be supported.

Another risk is the potential for users to bypass the safety filters designed to mirror the ethical and emotional boundaries present in human interactions. Despite the advanced safety measures introduced into systems like ChatGPT, these can occasionally be bypassed, particularly in longer or emotionally intense exchanges. However, ongoing updates by platforms such as OpenAI have significantly improved these safeguards, reducing the likelihood of harmful or boundary-crossing content being generated (OpenAI, 2025). This underlines the importance of ensuring that the clients use safe platforms and of encouraging them in therapy to close the chat if unsafe conversations occur. Clients should also bring emotionally charged AI exchanges into therapy for reflection and grounding.

Confidentiality and ethical use
While platforms like ChatGPT offer relatively confidential environments, they are not clinical tools and are not bound by professional confidentiality standards that apply to registered therapists (e.g., BACP, UKCP). Consequently, data shared via these tools may be stored, processed or accessed under specific conditions. The client must be made aware of these limitations.

When introducing AI figures, therapists would need to inform clients explicitly how the chosen platform handles data, including what is stored, for how long, who has access and how data can be deleted if the client wishes. Clients should be encouraged to choose platforms that aim to meet established data protection standards and to review the platform’s transparency and privacy policy.

Furthermore, the therapist retains ethical responsibility for the client’s well-being. This means that while the AI figure may support stabilisation and continuity between sessions, it does not replace the therapeutic relationship. The therapist must remain alert to increased dysregulation or dependency and ideally integrate AI-based reflections back into session work.

Finally, client suitability must be considered carefully. The AI figure may benefit clients who struggle with relational trust and offer additional regulation support, but it may not be appropriate for clients in crisis, those with severe dissociation, psychosis or disorders that significantly impair reality testing (e.g., schizophrenia), or those with very limited access to follow-up human support. Clear boundaries and safety protocols would need to be set before introducing the AI component, ensuring that risk is moderated, roles are clear and human oversight is maintained.

Concluding reflection
AI is moving rapidly from a peripheral tool to part of the therapeutic landscape. Its potential extends beyond accessibility or consistency to include offering clients experiences of attunement and responsiveness that can be available at any time. Through this, we may come closer than ever to creating conditions for earned secure attachment (Main & Goldwyn, 1998; Main et al., 1985). This paper offers a conceptual framework for how AI might support trauma therapy and EMDR practice while recognising that further research and clinical caution are essential.

I write as a trauma-focused and EMDR therapist, working primarily with individuals who have presentations of complex trauma and attachment-related difficulties. I identify as a woman of Iranian and German heritage and practise within the UK mental health system. My clinical perspective is shaped by training in trauma-informed psychotherapy and EMDR, alongside extensive experience working with dissociative parts, emotional regulation difficulties and insecure attachment.

This article is influenced by compassion-focused approaches and attachment-based models that prioritise the experience of secure caregiving and the integration of EMDR when working with complex trauma. The ideas presented here have developed through sustained, reflective engagement with generative AI as a clinical support tool, explored and refined over time alongside supervision and the theoretical integration of trauma-informed models and EMDR frameworks.

Throughout this article, I approach the integration of AI with clinical curiosity while remaining attentive to ethical considerations, power dynamics and the limitations inherent in both therapeutic relationships and generative AI. This paper does not seek to present a prescriptive model but rather to explore emerging clinical possibilities while emphasising the importance of clinical oversight, reflective practice and careful consideration of client suitability.

References

Gilbert, P. (2010). Compassion focused therapy: Distinctive features. Routledge.

Grodniewicz, J. P., & Hohol, M. (2023). Waiting for a digital therapist: Three challenges on the path to psychotherapy delivered by artificial intelligence. Frontiers in Psychiatry, 14. https://doi.org/10.3389/fpsyt.2023.1190084

Ho, A., Hancock, J., & Miner, A. S. (2018). Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. Journal of Communication, 68(4), 712–733. https://doi.org/10.1093/joc/jqy026

Main, M., & Goldwyn, R. (1998). Adult attachment scoring and classification system. University of California, Berkeley.

Main, M., Kaplan, N., & Cassidy, J. (1985). Security in infancy, childhood, and adulthood: A move to the level of representation. Monographs of the Society for Research in Child Development, 50(1–2), 66–104. https://doi.org/10.2307/3333827

OpenAI. (2025). Safety and moderation updates. https://openai.com/Safety

O’Shea, K. (2009). Attachment-focused EMDR and early relational trauma. In R. Shapiro (Ed.), EMDR Solutions II. W. W. Norton.
