Your Daily digest for Clinical Social Work Association Announcements
Article Digests for Psychology & Social Work
article-digests at lists.clinicians-exchange.org
Thu Aug 17 21:14:50 PDT 2023
(https://www.clinicalsocialworkassociation.org/Announcements/13242283) Artificial Intelligence and Psychotherapy - 8-17-23
Aug 17th 2023, 15:08
Artificial Intelligence and Psychotherapy
LCSWs who use the Zoom and Simple Practice platforms for their practices were recently informed that these platforms may be planning to use aggregated, anonymized information that psychotherapists enter in patient files as material for artificial intelligence (AI). In essence, the platforms would be creating metadata to support the way they provide services and to analyze how psychotherapy works. A brief summary of AI may help clarify what these platforms are doing and how it may interfere with patient confidentiality or other aspects of the work we do as therapists.
(https://www.ibm.com/topics/artificial-intelligence) According to IBM, “In its simplest form, artificial intelligence is a field which combines computer science and robust datasets, to enable problem-solving. It also encompasses sub-fields of machine learning and deep learning, which are frequently mentioned in conjunction with artificial intelligence. These disciplines are comprised of AI algorithms which seek to create expert systems which make predictions or classifications based on input data."
John McCarthy offers the following definition in his 2004 paper (McCarthy, 2004): "It is the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable."
(https://www.weforum.org/agenda/2021/12/ai-mental-health-cbt-therapy/#:~:text=AI%20is%20helping%20doctors%20to%20spot%20mental%20illness,of%20therapy%20would%20work%20best%20for%20an%20individual) According to the World Economic Forum, using AI to review patient data can keep therapy standards high through quality control, refine diagnoses, and match patients with the right therapist. With demand for services increasing and workloads stretched, some mental health clinics are investigating automated ways to monitor quality control among therapists. Technology firms have taken note and are providing some clinics with tools to better understand the words spoken between therapists and clients. In the UK and US, the software company Lyssn provides clinics and universities with technology designed to improve quality control and training. Like all companies using AI, Lyssn claims the data is completely redacted and de-identified; this may or may not be entirely true.
AI is also seen by some as helping doctors spot mental illness earlier and make more accurate treatment choices. Researchers believe they can use insights from data on successful therapy sessions to match prospective clients with the right therapists and to determine which type of therapy would work best for an individual. Machine learning – a form of AI that uses algorithms to make decisions – is also being harnessed to identify forms of post-traumatic stress disorder (PTSD) in veterans. CSWA questions whether AI can really determine whether a patient would be a good fit for a given therapist any better than the patient reading our websites or internet descriptions. It is also debatable whether AI can raise standards of care by building metadata databases. What is clear is that AI is a way for companies to make money. Whether by building huge databases to sell to others or by providing services more cheaply, platforms have a financial incentive to push for the use of AI.
Some platforms see AI as a way to replace human therapists. (https://www.npr.org/sections/health-shots/2023/01/19/1147081115/therapy-by-chatbot-the-promise-and-challenges-in-using-ai-for-mental-health) Many chatbot apps are already available. While this is concerning, most patients at this time still strongly prefer human therapists. Providing patient data to outside parties, even in redacted, de-identified form, creates the potential for confidentiality violations by LCSWs. Another consideration, one recommended by HIPAA, is to put only medically necessary information in the Medical Record.
Our documentation of patient encounters in the Medical Record should be minimal, which may differ from the way we have been trained to document patient information. Keeping our records minimal protects patient information as much as possible. Both (https://support.simplepractice.com/hc/en-us/articles/18351059584141?utm_medium=email&utm_source=sp-cst&utm_campaign=20230809-paid-customer-comms-email-tos-updates-follow-up) Simple Practice and (https://blog.zoom.us/zooms-term-service-ai/) Zoom recently indicated in new "Terms and Conditions" that they plan to use patient data in AI algorithms, causing great consternation among LCSWs who use these platforms. Both companies have since walked back the language describing their intent to use patient information for AI.
The final outcomes are unknown at this point, but there appears to be minimal risk to data provided on Zoom platforms (except at the free Basic level) or to Simple Practice data. CSWA will continue to review what these and other platforms do with patient data and keep our members informed. Let me know if you have any questions about this issue.
Laura Groshong, LICSW, Director, Policy and Practice
(mailto:lwgroshong at clinicalsocialworkassociation.org) lwgroshong at clinicalsocialworkassociation.org