My new AI voice assistant went rogue during a demo for my boss
I was showing off a custom voice model I built using ElevenLabs at our office in Denver last week. Halfway through, it started mixing my boss's voice with random audio clips from its training data, saying weird stuff like 'quarterly reports taste like chicken.' I had to pull the plug on the whole demo and explain it was a data contamination issue. Has anyone else had a voice model glitch out that bad during a live test?
3 comments
gibson.mark · 8h ago
The weak link is that nobody really knows what data these models actually learn from.
5
caleba64 · 20d ago
Oh man, that's wild but honestly not the first time I've heard something like that. I read a blog post last month where someone's custom assistant started pulling lines from old radio ads during a family dinner. It seems like the data cleaning step is super easy to mess up, and then you get these crazy mashups. Your boss must have been so confused! Makes me wonder if we're moving a bit too fast with some of this voice tech.
3
So what exactly goes wrong in the data cleaning step that lets old ads slip through? It seems like a basic filter should catch that stuff, but maybe the training data is just a huge, messy pile. I'd love to know what the actual weak link is.
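For what it's worth, a "basic filter" over transcripts is often just a keyword blocklist, and this minimal sketch (the phrase list and function name are hypothetical, not from any real pipeline) shows why it's brittle: it only catches clips whose transcripts contain a known ad phrase, so reworded ads, mislabeled clips, or anything without a transcript slips straight through into training.

```python
# Hypothetical sketch of a naive transcript-based blocklist filter.
# The blocklist entries and clip texts are made-up illustrations.

AD_PHRASES = ["call now", "limited time offer", "visit our showroom"]

def is_probable_ad(transcript: str) -> bool:
    """Flag a clip if its transcript contains a known ad phrase."""
    text = transcript.lower()
    return any(phrase in text for phrase in AD_PHRASES)

clips = [
    "Welcome back to the morning show.",
    "Call now and save big on winter tires!",   # caught: matches "call now"
    "Save big this weekend only at Hank's!",    # missed: reworded ad
]

kept = [c for c in clips if not is_probable_ad(c)]
print(kept)  # the reworded ad survives the filter
```

A filter like this depends entirely on the blocklist and on accurate transcripts, which is one plausible reason old ad lines end up in a "clean" training set.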
4