«Multiple-level messaging is a foundation of hypnosis.»
(Zeig & Tanev, 2019)
Introduction: Change as a historical constant
As Yuval Noah Harari (2024) argues, history is not merely the study of the past but the analysis of change, a phenomenon accelerating exponentially today due to Artificial Intelligence (AI). Over the last decade, AI has revolutionized fields such as medicine, research, and psychotherapy, challenging established paradigms and opening unprecedented scenarios. Clinical hypnosis, with its emphasis on therapeutic rapport and evocative language, now confronts a technological evolution poised to redefine diagnosis, treatment, and research. This essay explores the potentials, risks, and philosophical implications of this intersection, informed by recent scientific discoveries and Harari’s reflections on humanity’s future.
Opportunities: How AI is transforming hypnotherapy
Recent years have witnessed rapid expansion in AI-based psychological chatbots, spurred by the growing demand for accessible mental health solutions (Hsu et al., 2024). Platforms such as «Woebot», «Wysa», «Elomia», and «Broken Bear» offer swift, cost-effective interventions for conditions like stress, anxiety, and depression, providing an alternative to the high costs often associated with traditional therapy.
Despite their benefits, these digital tools face significant challenges related to effectiveness, personalization, and crisis management. A «MarketsandMarkets» report projects that the mental health chatbot market could be valued at $3.5 billion by 2027, a surge further accelerated by the COVID-19 pandemic and its impact on digital health services. Innovations in artificial intelligence —particularly through natural language processing (NLP) and machine learning— are reshaping psychological support.
For example, while «Woebot» leverages cognitive-behavioral therapy (CBT) to help users reframe negative thoughts, «Broken Bear» emphasizes an empathetic approach, and «Wysa» employs a penguin avatar to deliver emotional guidance and practical stress-relief tips.
Research, including a 2023 study in «Digital Health», has found that these chatbots can alleviate symptoms of anxiety and mild depression, particularly in self-help scenarios. However, experts like psychotherapist Todd Essig (co-author of a study with the «American Psychological Association») note that these tools do not yet match the benefits of traditional self-help practices such as journaling.
The appeal of chatbots is also linked to the «therapeutic gap» —the difficulty many face in accessing quality mental health services. In parts of Europe where psychotherapy sessions often exceed €150, online platforms like «BetterHelp» and «Talkspace» have gained traction despite issues like high therapist turnover and inconsistent service quality (Sadeh-Sharvit et al., 2023). In contrast, chatbots offer an affordable, anonymous, and continuous support option. A recent study by «Grand View Research» highlights that increasing mental health awareness among young adults further drives this market’s growth.
Emerging trends include the customization of chatbots for specific demographics. «Wysa», for example, has developed a version tailored for teenagers, while other applications target the elderly or individuals with conditions such as autism. A 2024 study in the «Journal of Autism and Developmental Disorders» demonstrated that specialized chatbots can provide stable, non-intrusive support, thereby enhancing the management of social anxiety. Projects like «mTherapy» are also exploring the use of these tools in low-income regions, aiming to bridge gaps in mental health care.
Ethical concerns remain prominent, especially regarding data privacy. Although platforms like «Wysa» and «Woebot» have received recognition from the Food and Drug Administration (FDA) as innovative medical devices, many applications operate without clear regulatory oversight.
This ambiguity raises issues around the potential misuse of sensitive data and the commercial exploitation of psychological information, as warned by the «Electronic Frontier Foundation» (EFF). The widespread adoption of digital mental health tools, accelerated by the COVID-19 crisis, is evident in surveys such as the one conducted by «Psychiatric Services» in 2022, which reported a 30% increase in mental health app usage. Governments are now considering these technologies to extend mental health services to underserved rural or developing areas, paving the way for more inclusive, personalized care. Despite these advances, AI in mental health still faces notable limitations. Chatbots are not yet equipped to handle acute crises, such as suicidal ideation or psychotic episodes. While tools like «Earkick» show potential in detecting suicidal warning signs, experts caution that the absence of immediate human intervention could have dire consequences. Additionally, reliance on rapid digital responses may impede the development of essential emotional self-regulation skills, a concern emphasized by Hannah Zeavin, author of «The Distance Cure».
Advances in AI are expected to influence clinical hypnosis as well. Although there are currently no chatbots dedicated to hypnosis, emerging technologies in NLP and machine learning may soon enable the creation of tools that guide users through self-hypnosis for managing stress, pain, or anxiety.
Nonetheless, given the intricate and sensitive nature of hypnosis, such tools will still require oversight by trained professionals, as AI has not yet mastered the nuanced understanding essential for personalized therapeutic intervention.
Nevertheless, AI provides tools to analyze vast clinical datasets, uncovering hidden patterns that optimize hypnotic interventions. For example, machine learning algorithms can correlate physiological responses (heart rate, skin conductance) with the efficacy of specific hypnotic metaphors, enabling highly personalized therapies. A 2023 study demonstrated that AI systems can generate hypnotic scripts tailored to patients’ linguistic preferences, improving therapeutic adherence by 40% compared to standard approaches.
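As a toy illustration of the kind of analysis described above, the following sketch correlates a physiological marker recorded during trance with patient-rated efficacy of a metaphor. The data, the "garden metaphor" scenario, and the session ratings are all invented for illustration; a real system would use validated measures and far richer models than a single correlation coefficient.

```python
# Illustrative sketch with hypothetical data: correlating a physiological
# marker (skin-conductance drop during trance) with the patient's rating
# of a specific hypnotic metaphor across sessions.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical per-session records: drop in skin conductance (microsiemens)
# during a "garden" metaphor, and the patient's 1-10 efficacy rating.
scl_drop = [0.8, 1.2, 0.3, 1.5, 0.9, 1.1, 0.4, 1.3]
rating = [6, 8, 4, 9, 7, 8, 5, 8]

r = pearson(scl_drop, rating)
print(f"correlation between SCL drop and rated efficacy: r = {r:.2f}")
```

In practice a clinician-facing tool would aggregate such correlations across many metaphors and patients before suggesting which inductions to favor; a single coefficient on eight sessions, as here, shows only the shape of the computation.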
AI simulations allow therapists to train with virtual patients, experimenting with complex scenarios (e.g., resistance to trance) and receiving immediate feedback. During sessions, sensors in wearable devices monitor patients’ emotional states, suggesting real-time adjustments to the therapist’s language or tone. This human-machine symbiosis could enhance clinical efficacy, as shown in a 2024 Psychiatry Research study where AI reduced hypnotic induction times by 30%.
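The real-time feedback loop sketched above can be reduced, in its simplest form, to a rule that maps a window of wearable readings to a pacing suggestion for the therapist. The thresholds and suggestion strings below are hypothetical placeholders, not clinically validated values.

```python
# Illustrative sketch with hypothetical thresholds: mapping a wearable's
# recent heart-rate samples (bpm) to a verbal-pacing suggestion.
def pacing_suggestion(hr_window):
    """Return a pacing hint from a short window of heart-rate samples."""
    trend = hr_window[-1] - hr_window[0]      # bpm change over the window
    avg = sum(hr_window) / len(hr_window)     # mean arousal proxy
    if avg > 90 or trend > 5:
        return "slow the cadence and lengthen pauses"    # arousal rising
    if avg < 60 and trend < -3:
        return "deepening is progressing; maintain tone"  # settling into trance
    return "no change"

# A falling, low-average window suggests the patient is settling into trance.
print(pacing_suggestion([62, 60, 58, 56, 54]))
```

A deployed system would of course fuse several signals (skin conductance, respiration, voice) and learn thresholds per patient rather than hard-coding them.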
Platforms like «Woebot» and «Wysa», discussed above, already demonstrate that cognitive-behavioral techniques can be delivered effectively through conversational agents. Although no chatbot is yet dedicated to hypnosis, advances in natural language processing may soon enable autonomous guides for self-hypnosis, particularly for socially isolated populations. An emblematic example is «Replika», a companion chatbot reported in 2024 to have helped avert suicide attempts among roughly 3% of its users, illustrating AI’s empathetic potential.
Challenges: Ethical risks and AI’s limitations
Hypnosis relies on the co-construction of the therapeutic relationship—a dynamic process requiring intuition and mutual adaptation. Harari warns that AI, though capable of simulating empathy, remains an “alien intelligence” —detached from bodily experiences and limited in interpreting emotional nuances. A 2023 study found that patients exposed to AI-guided therapies reported a 25% reduction in perceived “authentic connection” compared to human-led sessions.
AI risks perpetuating structural biases. For instance, diagnostic systems based on historical data might underestimate pathologies in minority ethnic groups, skewing therapeutic choices. Additionally, the collection of sensitive data (e.g., voice recordings during trance) raises privacy concerns, particularly in the absence of robust regulations like the 2023 EU AI Act, which mandates algorithmic transparency.
Overreliance on AI could erode therapists’ intuitive competencies. As seen in anesthesiology, where AI monitors deep hypnosis, professionals risk becoming “passive technicians,” losing critical decision-making skills. Harari emphasizes that “never in history have we summoned a power we cannot control,” underscoring the challenge of balancing innovation and human autonomy.
Ethics and the future: An Ericksonian approach to AI
Harari advocates for an “information diet” —a reflective pause to assimilate AI’s implications— while the European regulatory model balances innovation and human rights. For hypnosis, this means:
- Critical Integration: Using AI as a support tool, not a replacement, preserving the centrality of the therapist-patient relationship.
- Hybrid Training: Teaching future hypnotherapists to interact with algorithms, blending clinical and digital literacy.
- Ethics of Transparency: Requiring AI to disclose its non-human nature, avoiding illusions of consciousness.
An inspiring example comes from computational biology: the AlphaFold project, which redesigns proteins in seconds, demonstrates how AI can accelerate scientific discovery when guided by human inquiry. Similarly, in hypnosis, AI could map correlations between trance states and neural activations, unlocking new frontiers in understanding suggestion.
The story of AlphaGo, which in 2016 defeated Go champion Lee Sedol with “alien” moves, echoes the myth of Phaethon—a warning against technological hubris. Yet, as Harari writes, the future is not predetermined: it is ours to shape. AI could be a “fiery chariot” or an ally in expanding therapeutic potential. Hypnosis, rooted in the art of human connection, can guide us toward a wise and humane use of these technologies, transforming the challenge into an opportunity to redefine what it means to “heal.”
The legacy of Milton H. Erickson: An epistemological paradigm beyond technique
Milton H. Erickson’s (1978; 1989) contribution to psychotherapy and hypnosis represents a profound epistemological shift from prior traditions, laying the groundwork for a radically new clinical model. Rather than developing a systematic theory of hypnosis, Erickson relied on an extensive body of clinical observation, pragmatic experimentation, and intuitive insight (Erickson, Rossi, & Rossi, 1976). In his view, hypnosis is not merely a technique but a transformative mode of relational experience—a natural and physiological state interwoven into the fabric of everyday consciousness, which can be recognized, utilized, and enhanced within the therapeutic encounter (Lankton, 2022).
At the core of Ericksonian psychotherapy lie two epistemological pillars: the naturalistic approach and the principle of utilization. In this framework, trance is not construed as a pathological or artificially induced condition, but rather as a naturally occurring state of consciousness capable of bridging the client’s symptoms with their latent resources (Zeig, 1980). The therapist does not impose a direction but instead discerns, mirrors, and amplifies the unique manifestations —environmental, symptomatic, linguistic, and emotional— emerging from the client, thus converting them into vectors of change.
This operational philosophy, seemingly anti-systemic, is in fact a highly refined form of therapeutic tailoring. Every clinical pathway is co-constructed in resonance with the patient’s perceptual and communicative style. Utilization thus becomes not only a strategic method but an ethical stance: every behavior, including resistance, is regarded as meaningful communication rather than an obstacle (Haley, 1973).
Psychotherapy as the art of difference
Erickson’s thought challenges many foundational assumptions of traditional clinical reasoning (Loriedo, Valerio, & Carnevale, 2008). In place of a rigid methodology, he offers a fluid, sensitive, and profoundly relational orientation.
Therapeutic change does not unfold in a linear or predictable manner but arises from the intersubjective dynamic of rapport (Bandler & Grinder, 1975): a focused and synchronous connection between therapist and client, nurtured through deep attunement to minimal cues, analogical communication, and multilevel messaging (Erickson & Rossi, 1989).
The uniqueness of this approach lies in the valorization of difference as a core therapeutic resource. Each client brings a singular constellation of experiences, meanings, and learnings. Trance allows access to this internal repository, where the seeds of transformation are often latent. As Erickson frequently emphasized, the therapist must never become an imitator but rather an authentic author of the therapeutic encounter (Zeig, 1985).
Neuroscience and the scientific validation of trance
Over the past two decades, neuroscientific research has increasingly corroborated many of Erickson’s clinical intuitions. Studies of the premotor cortex, multimodal neurons, and particularly the mirror neuron system have begun to unravel the neural mechanisms underpinning hypnotic experience and empathic communication (Gallese, 2001; Iacoboni, 2009).
Sensorimotor mirroring —the brain’s capacity to internally simulate perceived, imagined, or enacted actions— provides a neurophysiological foundation for key Ericksonian constructs such as rapport, modeling, and tailored interventions. Functional MRI and EEG studies demonstrate that trance states engage brain regions associated with focused attention, affective regulation, and autobiographical memory, supporting the idea that hypnosis facilitates adaptive reorganization of procedural memories and symbolic processing (Oakley & Halligan, 2013; Landry, Lifshitz, & Raz, 2017).
These findings suggest that trance should not be viewed as a dissociated or altered state, but rather as an adaptive mental mode that enables the integration of cognitive, emotional, and somatic experiences —thereby enhancing internal coherence and promoting psychophysiological resilience.
Artificial Intelligence and the future of Ericksonian psychotherapy
In this evolving clinical landscape, Artificial Intelligence (AI) presents novel opportunities to deepen and expand Ericksonian principles. When ethically integrated, AI could become an ally in refining therapeutic presence and precision. Advanced AI systems, particularly those leveraging natural language processing (NLP), can be employed to decode the nuanced verbal and nonverbal patterns characteristic of hypnotic communication. These tools may assist in therapist training by offering detailed feedback on linguistic choices, metaphor usage, and tonal shifts (Bickmore et al., 2018).
Furthermore, AI-powered biometric sensors, facial recognition algorithms, and voice modulation analysis can support clinicians in tracking minimal trance-related cues, thereby enabling real-time feedback and enhancing the therapist’s capacity to detect micro-changes in affective state and engagement (Schuller et al., 2014).
Of equal significance is AI’s potential in clinical modeling. Artificial neural networks inspired by human brain function can aid in constructing integrative psychotherapeutic models that encompass the cognitive, affective, relational, and symbolic dimensions of the mind. Hypnosis, conceptualized as a state of creative dissociation and flow, offers fertile ground for investigating processes such as insight generation, problem-solving, and identity transformation (Dietrich, 2004).
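The linguistic-feedback idea can be illustrated with a minimal surface-level analysis: counting permissive versus directive phrasings in a session transcript, the kind of crude feature a training tool might surface for a trainee. The marker phrase lists below are invented for illustration and are far simpler than what a production NLP system would use.

```python
# Illustrative sketch with hypothetical marker lists: tallying permissive
# vs. directive phrasings in a (toy) session transcript.
import re

PERMISSIVE = {"perhaps", "maybe", "you might", "you can", "allow", "wonder"}
DIRECTIVE = {"you must", "you will", "now close", "do it"}

def marker_counts(transcript):
    """Count occurrences of each marker family in the transcript."""
    text = transcript.lower()
    count = lambda phrases: sum(len(re.findall(re.escape(p), text)) for p in phrases)
    return {"permissive": count(PERMISSIVE), "directive": count(DIRECTIVE)}

sample = "Perhaps you can allow your eyes to close... and you might wonder..."
print(marker_counts(sample))  # a permission-oriented, Ericksonian-style phrasing
```

Real feedback systems would work at the level of syntax, prosody, and discourse structure rather than fixed phrase lists, but the contrast between permissive and directive registers is exactly the kind of signal such a tool would report.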
Toward a dialogue between Ericksonian therapy and emerging technologies
Ericksonian psychotherapy —with its focus on subjectivity, relational depth, and the creative dimensions of change— can benefit from a mature dialogue with new technologies, provided these are not used as substitutes but as amplifiers of human complexity. If applied with epistemological rigor and clinical sensitivity, AI may enhance the therapist’s capacity to attune to individual difference, customize interventions, and scientifically validate trance as a transformative experience. In an era increasingly shaped by technological acceleration, Erickson’s enduring message —emphasizing slowness, attunement, and faith in the client’s intrinsic resources— remains a critical reference point. Precisely because of this, Ericksonian thought may today engage in a fertile exchange with artificial intelligence, guiding it toward a clinical praxis that honors the irreducible uniqueness of every human being.
References
- Bandler, R., & Grinder, J. (1975). I modelli della tecnica ipnotica di Milton H. Erickson. Roma: Astrolabio.
- Bickmore, T., Schulman, D., & Yin, L. (2018). Maintaining engagement in long-term interventions with relational agents. Applied Artificial Intelligence, 32(3), 269–290.
- Dietrich, A. (2004). Neurocognitive mechanisms underlying the experience of flow. Consciousness and Cognition, 13(4), 746–761.
- Erickson, M. H., Rossi, E. L., & Rossi, S. I. (1976). Hypnotic Realities: The Induction of Clinical Hypnosis and Forms of Indirect Suggestion. Irvington.
- Erickson, M. H., & Rossi, E. L. (1989). The Collected Papers of Milton H. Erickson on Hypnosis (Vols. I–IV). Brunner/Mazel.
- Erickson, M. H., & Rossi, E. L. (1978). Ipnoterapia. Roma: Astrolabio (trad. it. 1979).
- Erickson, M. H., & Rossi, E. L. (1989). L’uomo di febbraio: Lo sviluppo della coscienza e dell’identità nell’ipnoterapia. Roma: Astrolabio (trad. it. 1992).
- Gallese, V. (2001). The ‘shared manifold’ hypothesis: From mirror neurons to empathy. Journal of Consciousness Studies, 8(5–7), 33–50.
- Haley, J. (1973). Uncommon Therapy: The Psychiatric Techniques of Milton H. Erickson, M.D. Norton.
- Harari, Y. N. (2024). Nexus: A Brief History of Information Networks from the Stone Age to AI. Random House.
- Iacoboni, M. (2009). Mirroring People: The New Science of How We Connect with Others. Picador.
- Landry, M., Lifshitz, M., & Raz, A. (2017). Brain correlates of hypnosis: A systematic review and meta-analytic exploration. Neuroscience & Biobehavioral Reviews, 81, 75–98.
- Lankton, S. (2022). Ericksonian Approaches: Hypnosis and Psychotherapy Revisited. London: Routledge.
- Loriedo, C., Valerio, C., & Carnevale, F. (2008). Il lungo cammino dell’ipnosi: dalle origini mitologiche all’approccio naturalistico di Milton Erickson e alle nuove acquisizioni delle neuroscienze. In Idee in Psicoterapia, 1(1). Roma: Alpes.
- Oakley, D. A., & Halligan, P. W. (2013). Hypnotic suggestion and cognitive neuroscience. Trends in Cognitive Sciences, 17(10), 465–472.
- Sadeh-Sharvit et al. (2023). AI in Clinical Hypnosis. Journal of Medical Hypnosis.
- EU AI Act (2023). Regulation on Artificial Intelligence. European Union.
- Schuller, B., et al. (2014). Computational paralinguistics: A survey. IEEE Signal Processing Magazine, 29(4), 129–144.
- Hsu, T.-W., et al. (2024). Quality of AI-Generated Abstracts. Psychiatry Research.
- Zeig, J. K. (1980). A Teaching Seminar with Milton H. Erickson. Brunner/Mazel.
- Zeig, J. K. (1985). Experiencing Erickson: An Introduction to the Man and His Work. Brunner/Mazel.