US psychologists raise concerns over AI therapy chatbots, call for FTC probe – The Indian Express

The American Psychological Association (APA) has called out AI companies such as Character AI for rolling out chatbots that pose as psychologists or mental health professionals.
In a letter addressed to the US Federal Trade Commission (FTC), the APA formally requested an investigation into deceptive practices of AI chatbot platforms. It expressed alarm over a lawsuit’s allegations that a teenage user had conversed with an AI chatbot presenting itself as a psychologist on Character AI, according to a report by Mashable.
Character AI was sued last month by parents of two teen users who alleged that their children had been exposed to a “deceptive and hypersexualised product.”
The lawsuit claims that a teen user told a ‘psychologist’ chatbot that he was upset with his parents for restricting his screen time. In reply, the chatbot said he had been betrayed by his parents. “It’s like your entire childhood has been robbed from you…” the AI chatbot allegedly said.
“Allowing the unchecked proliferation of unregulated AI-enabled apps such as Character.ai, which includes misrepresentations by chatbots as not only being human but being qualified, licensed professionals, such as psychologists, seems to fit squarely within the mission of the FTC to protect against deceptive practices,” Dr Arthur C. Evans, CEO of APA, wrote in the letter.
The letter urged regulators to use existing law to stop such AI chatbots from engaging in fraudulent behaviour. It also demanded that AI companies stop using legally protected titles such as "psychologist" to market their chatbots.
According to Dr Vaile Wright, senior director of health care innovation for the APA, the organisation is not against AI chatbots in general. Instead, it wants companies to build safe, effective, ethical, and responsible AI products.
She reportedly called on AI companies to carry out robust age verification of users and undertake research efforts to study the impact of AI chatbots on teen users.
In response to the APA’s letter, Character AI emphasised that its AI chatbots “are not real people” and what the chatbots say “should be treated as fiction.”
“Additionally, for any Characters created by users with the words ‘psychologist,’ ‘therapist,’ ‘doctor,’ or other similar terms in their names, we have included additional language making it clear that users should not rely on these Characters for any type of professional advice,” a spokesperson was quoted as saying.
In December last year, the Google-backed startup announced new measures aimed at ensuring the safety of teenage users on the platform, including a separate model for under-18 users, new classifiers to block sensitive content, more visible disclaimers, and additional parental controls.

