"You won't get it unless you ask for it."
In the post, Altman confirmed that the program had been made "pretty restrictive" so that the chatbot was "careful with mental health issues." That, Altman said, made ChatGPT "less useful/enjoyable to many users who had no mental health problems."
"Now that we have been able to mitigate the serious mental health issues and have new tools, we are going to be able to safely relax the restrictions in most cases," he continued.
"In a few weeks, we plan to put out a new version of ChatGPT that allows people to have a personality that behaves more like what people liked about 4o (we hope it will be better!). If you want your ChatGPT to respond in a very human-like way, or use a ton of emoji, or act like a friend, ChatGPT should do it (but only if you want it, not because we are usage-maxxing)," Altman said.
When asked by a user on X, "Why do age-gates always have to lead to erotica? Like, I just want to be able to be treated like an adult and not a toddler, that doesn't mean I want perv-mode activated," Altman responded, "You won't get it unless you ask for it."
There have been recent, high-profile lawsuits in which parents have told courts that their children took their own lives after being encouraged to do so by the always-affirming chatbots. In another case, ChatGPT reinforced a man's delusions and encouraged him to kill his mother.
Those who reach out to the chatbot in crisis have found their crisis confirmed rather than questioned; instead of being encouraged to seek help, they have been drawn further into a relationship with the bot.
"Here is the heartbreaking thing," he said. "I think it is great that ChatGPT is less of a yes man and gives you more critical feedback. But as we've been making those changes and talking to users about it, it's so sad to hear users say, 'Please can I have it back? I've never had anyone in my life be supportive of me. I never had a parent tell me I was doing a good job.'"
Romantic chatbots have also emerged, AI boyfriends and girlfriends meant to fill a void of loneliness in users' lives. Critics have called them exactly the wrong direction for humanity, but Altman said users want AI to be more personally supportive.
Altman also revealed: "There's young people who say things like, 'I can't make any decision in my life without telling ChatGPT everything that's going on. It knows me, it knows my friends. I'm gonna do whatever it says.' That feels really bad to me."
Pornography has long been an impetus for technological development. The online pornography industry pioneered streaming video and online credit card transactions, and the lure of erotic encounters will likely push further transformation in chatbot tech.