AI therapists might flatten humanity into patterns of prediction, and so sacrifice the intimate, individualized care that’s expected of traditional human therapists. “The logic of PAI leads to a future where we may all find ourselves patients in an algorithmic asylum administered by digital wardens,” Oberhaus writes. “In the algorithmic asylum there is no need for bars on the window or white padded rooms because there is no possibility of escape. The asylum is already everywhere—in your homes and offices, schools and hospitals, courtrooms and barracks. Wherever there’s an internet connection, the asylum is waiting.”

Chatbot Therapy: A Critical Analysis of AI Mental Health Treatment
Eoin Fullam
ROUTLEDGE, 2025
Eoin Fullam, a researcher who studies the intersection of technology and mental health, echoes some of the same concerns in Chatbot Therapy: A Critical Analysis of AI Mental Health Treatment. A heady academic primer, the book analyzes the assumptions underlying the automated therapies offered by AI chatbots and the way capitalist incentives could corrupt these kinds of tools.
Fullam observes that the capitalist mentality behind new technologies “often leads to questionable, illegitimate, and illegal business practices in which the consumers’ interests are secondary to strategies of market dominance.”
That doesn’t mean that therapy-bot makers “will inevitably conduct nefarious activities contrary to the users’ interests in the pursuit of market dominance,” Fullam writes.
But he notes that the success of AI therapy depends on the inseparable impulses to make money and to heal people. In this logic, exploitation and therapy feed each other: Every digital therapy session generates data, and that data fuels the system that profits as unpaid users seek care. The more effective the therapy seems, the more the cycle entrenches itself, making it harder to distinguish between care and commodification. “The more the users benefit from the app in terms of its therapeutic or any other mental health intervention,” he writes, “the more they suffer exploitation.”
This sense of an economic and psychological ouroboros, the snake that eats its own tail, serves as a central metaphor in Sike, the debut novel from Fred Lunzer, an author with a research background in AI.
Described as a “story of boy meets girl meets AI psychotherapist,” Sike follows Adrian, a young Londoner who makes a living ghostwriting rap lyrics, in his romance with Maquie, a business professional with a knack for spotting successful technologies in the beta phase.

Sike
Fred Lunzer
CELADON BOOKS, 2025
The title refers to a splashy commercial AI therapist called Sike, uploaded into smart glasses, that Adrian uses to interrogate his myriad anxieties. “When I signed up to Sike, we set up my dashboard, a huge black panel like an airplane’s cockpit that showed my daily ‘vitals,’” Adrian narrates. “Sike can analyze the way you walk, the way you make eye contact, the things you talk about, the things you wear, how often you piss, shit, laugh, cry, kiss, lie, whine, and cough.”