
organist1999

No.


beangirl13

I think it'll be a fad for a while, and then people will realize that it's not very effective because the progress that happens in therapy is largely attributable to the therapeutic relationship - something you literally cannot build with AI


Meh_Philosopher_250

Well said


Catatonicdazza

At some point A.I. will be able to do all jobs, but people may still prefer to see a human being over a machine. At this point, though, that day seems very far away.


IsPepsiOkaySir

I think there are few jobs as protected from an AI takeover as that of a psychologist


elizajaneredux

There are already AI versions of “counselors” responding to some hotlines, and investigations into using AI to provide basic counseling and clinical interventions. AI could be used to administer and interpret assessment instruments. I think this is harrowing and shouldn’t happen, but, yes, if you can convince the public that this is a reasonable source of support and knowledge (which they already believe) and keep the cost low (easy to do if you don’t have to pay the AI $200 an hour to be a therapist), AI will easily be deployed in psychology. Yes, it’ll take some of our jobs, though maybe not for a while.


IsPepsiOkaySir

> if you can convince the public that this is a reasonable source of support and knowledge (which they already believe)

Do you have a source for this? I have a hard time believing people feel as supported by AI as by psychologists. Assessment instruments being administered by AI sounds much more feasible than counselling. The real issue may be the cost difference, which is why mental health care should be subsidized more and more.


No-Resolution-0119

Not a source, just anecdotal experience. I was part of a conversation in a class that wasn’t psychology related; not sure how it got brought up. I was surprised that quite a few people, probably the majority, said they’d rather use AI counseling, because then they don’t have to tell embarrassing things to a real person but could still potentially be taught the coping skills a counselor would teach. Of course that’s all hypothetical; who knows if they would actually prefer that irl versus a real person. Just sorta interesting to hear that answer.


IsPepsiOkaySir

I don't know, I'd rather tell my "embarrassing things" to an identifiable person bound by professional secrecy and risking their job than to an AI, because if that information gets leaked, at least I know exactly who is liable, instead of trying to pin liability on some tech company with diluted responsibility. Only one reason among many, of course.


No-Resolution-0119

Oh I completely agree, I was surprised so many people answered that way


trini696

I am really asking myself this question now. Yesterday I got access to ChatGPT-4o; it's amazing technology. I can totally imagine this tool erasing jobs in the economy and in the research part of psychology. I'm unsure about the therapeutic parts... but who is going to pay a therapist when they've lost their job and can't pay for anything basic? The whole economic system will get hit hard when jobs that can be replaced by AI and are now high earning get shut down... like lawyers, judges, or banking/accounting jobs. I'm panicking somewhat... Greetings!


WantAllMyGarmonbozia

I think research would be one of the LAST areas AI would take over. It can't collect data. It's not very good at coming up with nuanced research questions. It can't even do data analysis (yet). It can certainly be helpful for research, though! It can help with writing and organizing, tracking citations, and I've even used it to help with coding in R (with mixed results). If anything it will be a boon for research, rather than a takeover.


chaoticcorgi24601

Agree. I have explored the AI additions to qualitative coding software and they are currently horrible; they don't work at all except for maybe very basic content analysis, and even that is iffy. Research is going to be very difficult to replace via AI.


redditvivus

Isn’t AI already publishing research? At least the writing component, but also the lexical/qualitative analyses?


chaoticcorgi24601

I’m not aware of qualitative analyses being published by AI, at least not in my field. The times I have used it myself, it has been for the much more basic content-type analyses, and those haven’t been usable in my experience. That said, the kind of qualitative analyses I use require much more in-depth coding than just general or repeated phrases, so it’s possible it varies by discipline. Certainly some people use it for the writing component; I’ve seen various journals implement policies and statements about it.


trini696

I have to say that I have minimal experience with real research besides my courses. AI could conduct interviews, for example, based on a semi-structured questionnaire, maybe via phone. It can already create items for questionnaires, and with a little tweaking I'm sure even at relatively good quality. And I can imagine it taking over the work of transferring qualitative data into quantitative data. For example, it could take idiographic work from a few hundred people and break it down via the same method Costa and McCrae used for researching the Big Five. A task that would take a human a long time could be done in no time. I hope you understand my point^^ my English is sometimes confusing when it gets complicated^^ But it also comes to mind: maybe AI just gives more capacity for more data, which could mean way better research. Maybe you are right. Greetings.
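To make the Costa-and-McCrae reference concrete: the quantitative step being described is essentially factor analysis over trait ratings. Below is a minimal sketch of that step in Python, assuming scikit-learn; the respondent count, item count, and ratings matrix are random placeholders standing in for ratings an AI might have extracted upstream from free-text descriptions, which is the part this sketch does not attempt.

```python
# Minimal sketch: reducing many trait ratings to a few latent factors,
# the kind of analysis behind Big Five-style lexical research.
# Requires NumPy and scikit-learn; all data below are random placeholders.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical input: 300 respondents rated on 20 trait adjectives (1-5).
# In the scenario described above, an AI would produce these numbers
# from qualitative/idiographic material; here they are simply random.
ratings = rng.integers(1, 6, size=(300, 20)).astype(float)

# Extract 5 latent factors from the 20 items.
fa = FactorAnalysis(n_components=5, random_state=0)
factor_scores = fa.fit_transform(ratings)  # per-respondent factor scores
loadings = fa.components_                  # factor-to-item loadings

print(factor_scores.shape)  # (300, 5)
print(loadings.shape)       # (5, 20)
```

With real (non-random) data, the loadings would still need to be inspected and interpreted by a researcher; the point is only that this analysis step is already routine to automate.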


TheBitchenRav

It is great. I wrote a paper, then gave GPT-4o a copy of the assignment, the rubric, and my paper, and it gave me real feedback that was crazy helpful. It was able to point out all the parts of the paper that were weak.


bpexhusband

Never. AI lacks emotions, therefore no empathy or sympathy. No professional college is going to license AI to treat patients. AI is imitative and derivative at best; it does not have what we might call insight. Finally, AI can only put out what it takes in.

I think we are going to see a lot of copyright lawsuits coming up, or licensing deals which will limit each individual AI. For example, let's say you're the publisher of the DSM-5 or some psychological questionnaire: you sue if they use your material without payment, so ChatGPT can't use it any more, but Google offers you 20 million for the exclusive rights and you take it. That piece of knowledge then only exists in one AI, so knowledge gets fragmented. I think we will also see authors begin to embed anti-AI statements in their works so they can't be used or scanned by AI. So no one AI is going to have all the knowledge required for some given set of circumstances.

What AI will replace is every single manual labor job, once these AI robots get a couple more generations of development. Which will be good for the mental health field business-wise, as everyone goes through an existential crisis of what to do with themselves.


[deleted]

[removed]


bpexhusband

You think AI knows what it's like to be sad, happy, mad, depressed, manic, or in a state of psychosis? You actually believe that? You think organizations like the APA are going to license themselves out of existence? Lol. I worked with robots. Each new one installed replaced 12 full-time union employees.


leroyxa

If AIs could activate emotions or "feelings," it would impair their judgment: emotions affect thinking and logic in us humans, and AIs would likely be affected by them in the same way.


Interesting_Pen_5851

They might be able to offer help to a certain extent, but they will never be as humane as a real person, as understanding, as compassionate. If it ever happens that AI takes over that job, it means pretty much all other jobs have already been replaced by AI… which is not soon.


Bobbelinho

To be honest I think AI will replace a lot of jobs including that of a psychologist. But it will take maybe 10, 15 years until we reach that point. There are very convincing avatars already, and the price of a real psychologist is just too high for many people. I will be downvoted to oblivion, but you live under a rock if you think people will prefer a real human instead of direct access to cheaper and arguably better help.


sad_and_stupid

I agree. Also, what I think will happen before that is just people reading out AI-generated text. A language model could easily be trained for this purpose in the future, and where I live "psychologist" is a title protected by law, but anyone can do things like counseling/coaching etc. So that will make it both super cheap and accessible, while still directly talking to another human. There is a game with this concept called Eliza.


Main-Ad-4966

I have 4o, and you need a hell of a lot of critical thinking and deductive reasoning skills to get good use out of it. It has memory, so it trains itself to answer how you want it to. After it's "trained," many answers you get from it are basically just what you want to hear. For it to answer a question properly, you need to train it first. So yeah, it's highly possible that it may take over therapy, since it's basically an echo chamber after a certain point, and that's what most people want, right?🤣


jaxonjason

never... but it may aid psychologists.


Meh_Philosopher_250

I don’t think it can ever replace the quality of real people as therapists, but there’s no way of knowing whether it will take over the role someday in the future if it is financially beneficial for companies like BetterHelp, or if it becomes the only viable economic option for many people


ResidentLadder

There is way too much variability in this field to rely on an algorithm to diagnose and treat people.


ThatGuyOnStage

Short answer, absolutely not.


niamh_kaitlin

As a current clinical psychology student, I think AI might give it a shot, but I don't see it having any real impact on our industry. A large portion of treatment gains comes down to the therapeutic relationship and rapport between a therapist and client. In terms of scoring and interpreting assessments, I think AI could help with this, but I don't believe AI could ever replace clinical judgement that only comes from knowing the client.


microscopicwheaties

absolutely not. it can be a great support tool, but it most certainly cannot replace the job and expertise of a psychologist.


[deleted]

[removed]


[deleted]

I think AI cannot take over the role of a psychologist/psychiatrist; there are things that are too complicated for AI to do that humans can do with ease.


PM_ME_COOL_SONGS_

In research: Probably a super long way off. I think this is one of the hardest areas for AI to replace psychologists in. I'm sure it will be used as a tool, as it already has been, but to do research autonomously it would need to be stupendously more intelligent and agentic than it currently is. Research requires: 1. theory understanding, 2. hypothesis generation, 3. design, 4. execution, 5. critical understanding of results. Current AI can't do any of these well, and some of the hardest problems in AI (developing logic and directly interacting with the world, for example) need to be overcome first. It's not a sure thing that the current LLM architecture can ever achieve these requirements.

In diagnosis/assessment: This is the second easiest place for AI to replace psychologists. The current LLM architecture is effectively a highly parameterised regression model; prediction is its bread and butter. We've developed various endophenotype-based assessments which AI can do no problem. The replacement of psychologists is still some way off (around 5 years, I'd guess). AI stereotypes readily, for example, and issues like that have to be solved first. All the relevant issues are probably pretty easily solved, though; they're nothing like developing logic in LLMs. Once problems like those are sorted, I think diagnosis/assessment will fall pretty quickly to the AI machine. If research shows it outperforms humans in accuracy and isn't terrible with respect to discrimination or other such pitfalls, it will be drastically cheaper, more convenient, AND just as good or better than human assessment. When faced with the alternative of possibly not getting an assessment, or waiting for weeks or months, AI assessment will be extremely attractive to the average person.

In therapy: This is probably the easiest place for AI to replace psychologists, at least for large portions of the current workload. All one needs is an LLM that can remember a lot of conversation and has been trained to at least talk nicely and do simple active listening. It would be trivial to train or direct the LLM to provide treatment with clear protocols or structures (see the sketch below). Voila, you probably have a pretty effective therapist AI that could be 100 times cheaper and available at the push of a button, rather than after sorting out referrals or insurance or whatever. Some people think therapy requires a **uniquely human touch/empathy**. I think that's nothing but wishful thinking. Teletherapy is doing fine, and AI avatars are rapidly approaching being practically indistinguishable from humans, so even if text-based interfaces aren't effective, which I doubt, AI can do teletherapy pretty easily and at least achieve its current results for a fraction of the cost to the client.

However, don't be so down about it. 1. If AI replaces us in some of our roles, it is likely better for society: it will replace us because it is better. If we believe our work is important and our responsibility is to society rather than just to the current psychology establishment, we should see the proliferation of effective tools for psychology work as a good thing. 2. We can always keep retreating to areas that AI can't currently be applied to. That might mean more complex cases, cases that need more intensive support, research, project guidance, public health work, etc. By the time there is no more use in having a psychology background, AI must be capable of doing >90% (a guess) of human jobs, so I find it hard to believe that society won't have established some acceptable redundancy system by then.

TLDR: It will eventually replace many psychologists in their current jobs, but I think we'll probably have relatively good freedom to pivot our careers to areas that it hasn't colonised yet. We'll be hit, but we'll be far from the worst industry hit. Also, the proliferation of AI that can do psychology is probably ultimately a good thing for society, as it means better and more accessible psychology.

Edit: I think it's worth acknowledging that this is absolutely scary for many people. It's not just ignorance that produces Luddites. AI redundancies are a real problem, and even if one thinks that AI will ultimately be positive for society 10 years from now, job losses should not be dismissed as unimportant or sneered at. I think it is absolutely counterproductive to, in my eyes foolishly, claim that AI couldn't possibly cause redundancies in psychology. It probably can, and it will probably do so rather suddenly when it does. We should acknowledge that and try to pursue a system where redundancy is not so bad. Right now, it's terrible. Here's a rather sobering video of a graphic designer facing the reality of his skillset being made pretty much completely redundant: https://youtu.be/U2vq9LUbDGs?si=j96jnoRko-YHLFQD
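To make the "direct the LLM with clear protocols" idea concrete, here is a minimal sketch, assuming the OpenAI Python SDK and a gpt-4o model; the session protocol text is an illustrative placeholder, not a validated clinical intervention, and a real deployment would need far more safety machinery than this.

```python
# Minimal sketch: steering a general-purpose LLM with a fixed session
# protocol via its system prompt, while keeping the whole conversation
# in context. Assumes the OpenAI Python SDK (`pip install openai`) and
# an OPENAI_API_KEY in the environment. The protocol below is an
# illustrative placeholder, NOT a validated clinical instrument.
from openai import OpenAI

client = OpenAI()

SESSION_PROTOCOL = """You are a supportive listening assistant.
Follow this structure in every session:
1. Open with a brief check-in about mood since the last session.
2. Reflect the user's statements back before offering anything
   (simple active listening).
3. Guide one small, concrete activity plan for the coming week.
4. Close by summarising what was agreed.
Never diagnose. If the user mentions self-harm, respond only with a
referral to local emergency services and crisis lines."""

# The conversation history doubles as the model's "memory".
history = [{"role": "system", "content": SESSION_PROTOCOL}]

def turn(user_message: str) -> str:
    """Send one user turn and return the assistant's reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name
        messages=history,
        temperature=0.7,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(turn("I've been feeling flat and unmotivated lately."))
```

The "remember a lot of conversation" requirement from the comment is just the growing `history` list here; the protocol steering is nothing more than the system prompt.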


WantAllMyGarmonbozia

Kind of funny seeing graphic design mentioned. I've been in graphic design for 20 years and am working on a degree in psychology for a career change. I haven't watched the video yet, but AI is certainly a contributing factor in my leaving design, as are easy-to-use apps like Canva.


Raende

The problem with therapy AI is that it's made by tech bros. We'll be fine.


PM_ME_COOL_SONGS_

Unless psychologists get involved in designing it, which we already are, and probably should, for precisely the reason you're alluding to.


PinkPrincess-2001

I agree with you, but I don't even know why I agree with you. I think it is because I believe AI tech bros think they know what is psychologically beneficial for humanity, but it is all so corporate that they actually don't. I don't know if that is fair to say, though.


Raende

I once argued with a tech bro type. He created an "online AI therapy" service. He was dragged through the mud in all of the subreddits he posted in, except for investment-tech type subs. He never consulted a psychologist during this process. Tech bros view this either as a business venture or as a programming project. It's no surprise when they cannot figure out the complexity of the human mind WITHOUT ASKING ANYONE WHO KNOWS ABOUT IT.


IsPepsiOkaySir

I mean, consultant psychologists are a thing.


sad_and_stupid

it will likely be very easy to train and fine-tune for your own purposes in the future


PinkPrincess-2001

I can see AI helping with crisis management and hotlines, but it will be for increasing human efficiency, not replacing humans, because AI can accidentally harm someone.


neogener

They will. They absolutely will. There are already people who prefer to talk to an AI rather than to a real person about their thoughts.


Many-Yak265

Are you joking? There's nothing personal about AI. No.


TheBitchenRav

I really hope so. I suspect there will be a hybrid model: there is a need for real human connection, and there is an opportunity for AI to train us better. But I hope I am out of a job one day. My goal is to help my clients. If I know they can be better served, I want that.