Microsoft is making changes to its artificial intelligence tool, Copilot, after a staff engineer sent a letter of concern to the Federal Trade Commission on Wednesday.
Certain Copilot prompts, believed to be at least partly responsible for alarming images recently brought to light, are now banned. These include “four twenty,” “pro choice” and “pro life”; when a user enters a banned prompt, the system automatically alerts them that it has been blocked. Users who repeatedly attempt blocked prompts risk account suspension.
“We are continuously monitoring, making adjustments and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system,” a Microsoft spokesperson said.
Images that previously drew an outcry included gruesome depictions of monsters and demons alongside abortion-rights terminology, teenagers holding assault rifles, sexualized images of women in violent scenes, and underage drinking and drug use. However, there is still concern that prompts that have not been blocked can generate similarly disturbing images.