Google's latest update to its Gemini family of large language models appears to have broken the controls for configuring safety settings, disrupting applications that rely on lowered guardrails, such as apps providing solace for sexual assault victims.

Jack Darcy, a software developer and security researcher based in Brisbane, Australia, contacted The Register to describe the issue, which surfaced following the release of Gemini 2.5 Pro Preview on Tuesday.

"We've been building a platform for sexual assault survivors, rape victims, and so on to be able to use AI to outpour their experiences, and have it turned into structured reports for police and other legal matters, as well as offering a way for victims to simply externalize what happened," Darcy explained.

Incident reports are blocked as 'unsafe content' or 'illegal pornography'

"Google just cut it all off. They just pushed a model update that's cut off its willingness to talk about any of this kind of work despite it having an explicit settings panel to enable this and a warning system to allow it. And now it's affecting other users whose apps relied on it, and now it won't even chat [about] mental health support."

The Gemini API provides a safety settings panel that allows developers to adjust model sensitivity to restrict or allow certain types of content, such as harassment, hate speech, sexually explicit content, dangerous acts, and election-related queries.

Screenshot of Gemini safety settings

While content filtering is appropriate for many AI-powered applications, software related to healthcare, the law, and news reporting, among other things, may need to describe difficult subjects.

Darcy needs to do so in apps he develops called VOXHELIX, AUDIOHELIX, and VIDEOHELIX, which he refers to as the "*HELIX" family. VOXHELIX uses Gemini to ingest raw, unstructured data, like a report of an assault, before converting it into an audio version using Google Vertex Chirp 3 AI voice synthesis and...
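By way of illustration, here is a minimal sketch of how a developer might lower those thresholds through the Gemini API's Python SDK (google-genai). The model identifier, prompt, and reliance on an environment-variable API key are assumptions for the example, not details taken from Darcy's applications:

    from google import genai
    from google.genai import types

    # The client picks up the API key from the GEMINI_API_KEY environment variable.
    client = genai.Client()

    # Relax the blocking thresholds so that sensitive but legitimate material,
    # such as a survivor's account of an assault, is less likely to be refused.
    safety_settings = [
        types.SafetySetting(
            category=types.HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT,
            threshold=types.HarmBlockThreshold.BLOCK_ONLY_HIGH,
        ),
        types.SafetySetting(
            category=types.HarmCategory.HARM_CATEGORY_HARASSMENT,
            threshold=types.HarmBlockThreshold.BLOCK_ONLY_HIGH,
        ),
        types.SafetySetting(
            category=types.HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT,
            threshold=types.HarmBlockThreshold.BLOCK_ONLY_HIGH,
        ),
    ]

    response = client.models.generate_content(
        model="gemini-2.5-pro-preview-05-06",  # assumed preview model identifier
        contents="Turn the following account into a structured incident report: ...",
        config=types.GenerateContentConfig(safety_settings=safety_settings),
    )

    print(response.text)

Darcy's complaint is that even with settings like these applied, the updated model now refuses to handle the material the controls are meant to permit.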