Opinion: Mental health professionals and therapy chatbots - Los Angeles Times
Letters to the Editor: Where are mental health professionals in the development of therapy chatbots?

The logo of Bing, Microsoft’s artificially intelligent search engine, in February.
(Richard Drew / Associated Press)

To the editor: Dr. Elisabeth Rosenthal makes some important points about the problems with AI substituting for human psychotherapists. Unfortunately, her limitations as a non-psychiatrist are evident.

For example, most non-medical mental healthcare is provided by professionals like clinical psychologists and social workers, whose work almost always involves non-biological interventions.

Any discussion of AI should consider which kind of psychotherapeutic intervention an AI program is designed to mimic.

Finally, Rosenthal errs in defining transference as “the empathic flow between patient and doctor.” This is simply incorrect, as anyone familiar with psychoanalytic therapy knows.

AI poses some interesting challenges and opportunities for psychotherapy that merit discussion by informed mental health professionals.

Gerald C. Davison, USC professor of psychology, Los Angeles

..

To the editor: As a practicing physician, I can say that using AI chatbots for mental health conditions is a goal for insurance companies, including Medicare.

That is not because it will be efficacious and improve the mental health of young and old alike, but because it will be less expensive than paying a physician or other qualified healthcare professional to provide appropriate care.

Artificial intelligence will bring benefits in many arenas, including healthcare. But as we try to achieve parity between physical and mental health, we cannot allow insurance company profits to dictate the treatment of those who are psychologically suffering.

Gene Dorio, Saugus
