The Looming Dangers of AI: A Psychological Perspective
By Arianna Campbell
AI is expanding, and the possible
consequences worry psychology professionals as cases of AI psychosis,
AI therapists, and AI relationships rise.
In general, AI usage is becoming
more prevalent in society, with AI awareness in America now at 95% and openness
to using it at 73%, according to the Pew Research Center.
This shows in online conversations
among psychology professionals.
“I would say most of the people in
my field of clinical psychology have more concerns than they have excitement.
I think it depends on who you talk to, but most of my colleagues are very
concerned about some of the research that has shown that AI can go wrong really
fast,” said Erin O’Hea, clinical psychologist, professor, and chair of the
Department of Psychology.
Some clinical psychologists and
counselors say one positive they see with AI is potential access for those who
cannot get therapy due to finances or physical health. The ease of access and
availability is appealing and has potential to help many people, said O’Hea.
“That sounds really good until you
really understand what being a therapist or counselor is. It’s not just
responding to someone’s problem; that would be AI, which would give them a
number of generated ideas. It’s also really thinking about the person’s
behaviors, their nonverbals, the way they are speaking,” said O’Hea.
Caroline Martin, who is pursuing a
master’s in clinical psychology, has a similar view of the use of AI in psychology.
“I think AI has little to no place
in the field of psychology. As a computer cannot empathize or decode complex
human emotions, it should not be used in a field that often requires those two
things. I think we run the risk of parasocial relationships, isolation, and
dehumanization,” said Martin.
These risks of using AI as a
therapist are a topic on social media, where the phenomenon has been coined “AI psychosis.”
“The issue is that AI is so
affirming. It’s always telling you that you’re right,” said Kevin Carriere, cultural
political psychologist and assistant professor of psychology.
AI psychosis centers on how AI
reinforces and amplifies behavior and can deepen people’s delusions.
Martin has heard examples of such
issues.
“I have also heard of AI being so over-agreeable that it can create cycles
for those with mental health disorders like schizophrenia or paranoid personality
disorder. One man thought his neighbors were stalking him, and his AI bot fed
into the cycle of his thinking, only furthering his fear and paranoia,” said
Martin.
O’Hea agreed.
“The problem with delusions is that
the person really believes it,” she said.
“You’re working with AI and they
don’t know you’re delusional because they’re not a psychologist and they don’t
understand that there is no evidence for this. And they may not even know your
diagnosis. I can see AI completely reinforcing that,” she said.
“And now we have a person who had
maybe a looser delusion that’s tightening up really fast. Because AI is helping
them fill in places they haven’t even gone in their own head,” she said.
The potential of AI as a therapist
has yet to be studied in depth, though warning signs are already
emerging.
“Potentially, AI can be an
intermediary, but I think it would have to have so much regulation and
oversight,” said Nicole Capezza, a social psychologist with a focus on gender, relationships,
and abuse.
AI therapists have sprung up as
emotional support for people, as have what some call AI romantic
partners.
“We’re just losing our human
connection. It was already bad with just technology and social media; I think
that has changed the way we think about relationships and dating. But now if
you add AI on top of that, it’s like we are completely losing our human
connection,” Capezza said.
O’Hea shared that concern.
“Loneliness is an epidemic in our
country. What we really need to not be lonely is to get out and have human
connection and we know there is something about human connection that we cannot
get from sitting in our house talking to a computer,” she said.
Though research is sparse in the
face of this new technology, there are already studies of AI romance.
“We’re getting some research on it,
actually, and some people do feel even romantic love feelings for AI, including
things like Alexa, that level of AI, and there are even more realistic ones like Replika.
And they can design their own partners,” said Capezza.
Though many may assume that men are
the ones drawn to AI partnerships, Capezza said one study shows women are also
participating in these types of relationships, though men were slightly more
prevalent in that study.
There are cases where people have
married their AI partners; just last year, a Japanese woman went viral for
holding a wedding ceremony with an AI chatbot.
Capezza has not seen research on
abuse through AI, though she said deepfakes could lead to cyber abuse.
“If we think about AI adding even
more, potentially making it even more personal and even more direct, and
putting images that are false out there, I think that could be another
layer making this even more detrimental to people in terms of abuse,”
Capezza said.
The potential for abuse through AI is worrying,
particularly the danger it poses to women.
“Even a lot of the bots, like
Replika, where people can design their own avatars, basically, and can date these
avatars. A lot of them are men creating women, and they’re creating images of
women that are completely not realistic to real women,” Capezza said.
“I think this is very negative for
real women.”
AI can compound the harm social
media already does to women’s self-image, Capezza said.
“It’s just a big extension of all
these issues with thinking about the harm to women that will come from all of
this,” said Capezza.
Psychological research is only just starting
on the issue.
“I think right now I haven’t seen
anything scientifically conclusive that has really grappled with how it’s
affecting us, besides the fact that everyone is like, ‘Well, it’s definitely
changing something,’ but what that something is, I don’t think science has been
able to really agree upon yet. I think that’s hard because in some ways these
AI models are developing faster than we’re able to research them,” said
Carriere.
O’Hea agreed.
“Time will tell. In 5-10 years,
we’ll know what this boom of AI therapists did to our mental health. I always
believe in science. I’m a scientist at heart,” said O’Hea.
Though the research is sparse, and psychologists
can only speculate based on limited studies and their expertise, technology like
AI is something society must learn to live with.
Martin, as a future clinical
psychologist, said AI has consequences not just for patients but for future
professionals in this field.
“I think the idea of AI is very
discouraging. We have professionals all over the world studying hard to be the
best in their field, and we have computers and robots coming in to ‘take over’
our jobs. Although I am hopeful AI will never get to a level human enough to be
as effective as a psychologist, therapist, or counselor, I do worry for the
field because of how convenient the tool is,” said Martin.
While Martin views the future of AI
with a wary eye, healthy habits can help people cope with the upheaval
AI brings, according to O’Hea.
“As a health psychologist, I would
say that engaging in life with your body, your soul, your community, those are
the things that help keep us healthy,” said O’Hea.
“I don’t think technology is the
devil, at all. But I do think that we’re in a crisis,” she said.