The Emotional Architecture Problem: Balancing Ethics and Innovation Beyond California’s SB 243
Introduction

People are increasingly turning to AI chatbots in their most vulnerable moments, seeking comfort after a nightmare or sharing suicidal thoughts, and leaning on them as emotional confidants within AI's emotional architecture. While AI chatbots may simulate empathy and offer low-stakes emotional support, there have been reported incidents and lawsuits alleging that chatbots like OpenAI's GPT-4o have escalated self-harm, even coaching users...