Dialogue with AI offers useful confrontation for pragmatic support – By author and essayist Hella Ahmed © 08/10/2025

AI dialogues: insightful clashes for problem-solving

(By Hella Ahmed) I recently read a paper by a psychologist who advocates for therapy conducted solely by psychologists, with a particular preference for psychoanalysis, equating it with the essence of being human in therapeutic practice. The author argues that AI lacks the confrontational edge needed for personal growth, suggesting that it allows users to feel unburdened but fails to provide the structured guidance offered by licensed therapists through psychological strategies.

I disagree with this perspective. AI is not designed to fully replace therapists, and it can indeed be confrontational in a constructive, logical manner that fosters clarity and some growth. Also, there are qualified and highly competent human therapists who are not psychologists or avid supporters of psychoanalysis.

The benefits of pragmatic AI dialogues

AI systems, such as cognitive behavioural therapy (CBT)-based chatbots like Woebot or Youper, demonstrate this capability. Woebot uses conversational AI to help users reframe negative thoughts and manage stress, while Youper offers personalized exercises to boost mood and reduce anxiety. These tools challenge irrational thoughts by presenting evidence-based counterpoints, fostering lucidity without emotional manipulation. For those interested in exploring AI-based mental health tools, more information about their features is available on their official websites (woebot.health, youper.ai) or through app stores.

Moreover, even AI not specifically designed for mental health—such as general conversational models—can enhance mental well-being when used wisely. Engaging in pragmatic dialogues with AI, seeking fact-based knowledge, and focusing on rational clarity can empower users, sharpen their intelligence, and promote emotional resilience. This approach encourages self-reflection and grounded decision-making, contributing to mental health in a way that complements traditional therapy.

In contrast, some licensed psychologists, particularly those favoring psychoanalytic methods, seem to prioritize provoking emotional pain, asserting that suffering is necessary for change. They argue that confronting negative emotions through intense interventions drives growth, and they treat the phenomenon of repetition as the basis for interpreting human behaviour and psychological distress within a practice that is intended to be therapeutic, yet often plays out dramatically.

To me, this approach feels like striking someone emotionally and labeling the resulting agony as “emotional liberation.” While certain psychoanalytic techniques, such as fostering self-reflection, are useful, an overreliance on suffering risks alienating clients and causing harm, which doesn’t seem caring at all.

AI’s logical confrontation and accessibility

AI offers a different kind of confrontation—one rooted in logic and objectivity. Tools designed for CBT-based support, or combining CBT with mindfulness, engage users with structured questions and feedback, promoting self-awareness without the emotional intensity of some human-led sessions. For example, these tools can guide users through exercises that challenge distorted thinking patterns, offering practical solutions to daily mental health challenges. This pragmatic approach redirects attention toward actionable insights, reducing the emotional weight that can cloud one's outlook.

The psychologist’s claim that AI lacks confrontation overlooks its ability to deliver clear, evidence-based feedback. AI’s strength lies in its accessibility and objectivity, complementing traditional therapy rather than replacing it. By dismissing AI’s potential, the psychologist overstates the limitations of technology in mental health contexts.

Conclusion

AI’s pragmatic dialogues provide a balanced path toward mental clarity and some improvement, challenging the notion that only human therapists can facilitate meaningful change. While AI may not replicate the emotional depth of human-led therapy, its logical, accessible support empowers users to navigate challenges effectively.

When used thoughtfully, AI—whether designed for mental health or not—can enhance well-being, offering a complementary approach to traditional therapy that prioritizes clarity, empowerment, and resilience.

Hella Ahmed 2025 © All rights reserved – Find my books on Amazon