Imagine opening Spotify and finding “your” new songs waiting for listeners, songs you never wrote, never recorded, and never uploaded. That is the nightmare Murphy Campbell says she lived through, and it turns AI cloning someone’s voice without permission from a scary idea into something much more real.
Campbell, a folk musician, reportedly found songs on her Spotify profile that sounded like her but were not hers at all. According to the account, someone had used AI to copy her voice from YouTube videos and release fake tracks under her identity.
This is not just weird. It is deeply personal.

For most people, music is not just content. A voice carries personality, history, vulnerability, and years of work.
So when someone uses AI to imitate a singer’s voice and uploads fake songs under their name, it is not just copyright trouble. It feels more like identity theft with a melody attached.
That is what makes this story so unsettling even for people who are not professional musicians. If your voice, face, style, or words live online, they can potentially be copied, remixed, and presented as “you” without your consent.
Why this hits artists where it hurts most

For an artist, a Spotify profile is not just a page. It is reputation, discovery, and income.
If fake AI songs appear there, a few bad things can happen at once. Fans may get confused. Real listeners may drift away. Streams can be diverted. Money can leak out in small amounts that still matter.
And then there is the emotional damage. Imagine having to explain to your own audience that the music on your profile is not actually yours.
The technology part is simple, and that is the problem

Voice cloning means using software to copy the sound and style of a real person’s voice. If there is enough audio online, the system can create new speech or singing that sounds close enough to fool many listeners.
That used to sound futuristic. Now it is cheap, fast, and accessible enough that a bad actor does not need a giant lab or a movie studio budget.
That is why AI cloning someone’s voice without permission is becoming such a serious issue. The barrier to doing harm has dropped, while the systems to stop it are still weak.
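To make that concrete, here is a deliberately hypothetical sketch of what such a pipeline looks like. The library `voice_cloning_toolkit` and its functions do not exist; they are invented names standing in for real tools, which differ in detail but share this basic shape: public audio in, convincing fake audio out, with no consent check anywhere in between.

```python
# Hypothetical sketch only: `voice_cloning_toolkit`, `clone_voice`, and
# `synthesize` are invented names illustrating the shape of real tools.

from pathlib import Path

from voice_cloning_toolkit import clone_voice, synthesize  # not a real library

# Step 1: gather public audio of the target, e.g. pulled from YouTube videos.
samples = list(Path("downloaded_clips").glob("*.wav"))

# Step 2: fit a voice model. Modern systems can work from minutes,
# sometimes seconds, of clean audio.
voice = clone_voice(samples)

# Step 3: generate brand-new audio in that voice. Nothing in this
# pipeline asks whether the real person agreed to any of it.
fake_vocal = synthesize(voice, text="Words the real artist never sang")
fake_vocal.save("fake_song_vocal.wav")
```

Three steps, a folder of downloaded clips, and no gatekeeper. That is the whole barrier.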
Platforms are not ready for this
One of the most frustrating parts of stories like this is how ordinary the platform response often feels. Many major platforms still do not have a reliable way to catch, label, or quickly remove fake AI content before damage is done.
That leaves creators in a terrible position. They have to notice the fake, prove it is fake, report it, wait for a response, and hope the platform treats the issue with urgency.
Meanwhile, the fake content may keep collecting plays, attention, and algorithmic momentum. Algorithmic momentum just means a platform keeps pushing something because it is already getting activity.
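To see why that is dangerous, here is a toy feedback loop in code. The starting numbers and the promotion rule are invented for illustration; no platform publishes its actual ranking formula. The point is only that activity attracts promotion, which attracts more activity.

```python
# Toy illustration of algorithmic momentum. This is NOT any real
# platform's ranking system, just a feedback loop where existing
# activity earns promotion, which generates more activity.

plays = 100           # the fake track's plays on day 0 (invented number)
promotion_rate = 0.2  # fraction of current plays the recommender "adds" daily

for day in range(1, 8):
    recommended_plays = int(plays * promotion_rate)  # promotion scales with existing plays
    plays += recommended_plays
    print(f"day {day}: {plays} total plays")

# By day 7 the fake has grown roughly 3.5x without a single new fan
# discovering it organically; the system amplified it simply because
# it was already moving.
```

The longer a fake sits unreported, the more of that compounding it collects, which is exactly why slow takedowns hurt so much.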
This fear is spreading far beyond music
Murphy Campbell’s experience feels extreme, but the emotion behind it is becoming very familiar. Writers worry that their style will be copied. Photographers worry their images will be scraped and remade. Voice actors worry that one clean sample is all someone needs.
Even hobbyist creators feel this now. A person posting covers on TikTok, sharing drawings on Instagram, or uploading narration to YouTube may suddenly wonder whether they are also training an impersonator.
That changes how people share. Not because they become less creative, but because they become less sure the internet will treat their work as theirs.
Proving you are yourself is getting harder
This may be the most unsettling part. For years, the internet taught people to document everything, publish everything, and build a public presence.
Now that same public trail can be turned against them. The more voice clips, photos, writing samples, and videos you have online, the easier it may be for someone to imitate you.
And when a fake appears, the burden often lands on the real person. They have to prove authenticity instead of the fake being blocked automatically.
That is a backwards system, but right now it is often the one creators are stuck with.
Why everyday people should care
This is not only a celebrity problem. It is not even only a professional artist problem.
If you have ever posted a song cover, a podcast clip, a spoken video, a headshot, a poem, or original artwork, you already have a digital identity. Digital identity simply means the online trail of things that look and sound like you.
Once AI tools can mimic that identity, the risk spreads to anyone with enough online material to copy. That includes students, freelancers, hobby musicians, teachers, streamers, and regular social media users.
What creators can do right now
There is no perfect defense yet, which is frustrating. But there are still sensible steps that can help reduce risk and make response easier.
- Regularly check your artist pages, social profiles, and search results for fake uploads or impersonation
- Keep records of your original files, upload dates, and publishing history (a simple hashing script, sketched after this list, can help)
- Use consistent official links so fans know where your real work lives
- Tell your audience how to spot your verified accounts and releases
- Report fake content quickly and document every step
- If possible, work with distributors or platforms that offer stronger identity controls
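For the record-keeping step above, even a small script goes a long way. The sketch below is one way to do it, assuming your originals live in a local folder (the `masters` path and the `.wav` pattern are just examples): it fingerprints each file with a SHA-256 hash and a timestamp, giving you dated evidence that you held the originals if you ever need to dispute a fake.

```python
# Minimal provenance log for original recordings: hash each master file
# and record when you logged it. The folder name and file pattern below
# are examples; adjust them to wherever your originals live.

import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def fingerprint(path: Path) -> dict:
    """Return a SHA-256 hash and a UTC timestamp for one original file."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return {
        "file": path.name,
        "sha256": digest,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }

masters = Path("masters")  # example folder holding your original recordings
records = [fingerprint(p) for p in sorted(masters.glob("*.wav"))]

Path("provenance_log.json").write_text(json.dumps(records, indent=2))
print(f"Logged {len(records)} originals to provenance_log.json")
```

Storing that log somewhere with its own independent timestamp, such as an email to yourself or a dated cloud backup, makes the evidence much harder to argue with later.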
What fans and platforms should do
Fans can help by paying attention when something feels off. If a song sounds strange, appears out of nowhere, or does not match an artist’s usual style, it is worth checking before sharing it around.
Platforms need to do much more. They should have better verification, better labeling, faster takedowns, and clearer rules around AI cloning someone’s voice without permission.
Right now, too much of the burden sits on the person being copied. That is unfair, and it is not sustainable.
The hopeful part is that people are finally paying attention
Stories like Murphy Campbell’s are alarming, but they also make the problem harder to ignore. Once people see that this is happening to a real artist on a real platform, the conversation changes.
It stops being a thought experiment and starts becoming a rights issue, a trust issue, and a platform responsibility issue. That matters because public pressure is often what forces better rules.
AI cloning someone’s voice without permission should not become one of those problems society shrugs at until everyone has been burned at least once.
The bigger lesson
The internet made sharing easy. AI is making control much harder.
That does not mean people should stop creating, posting, singing, or showing up online. But it does mean we need stronger norms and stronger protections around identity, consent, and originality.
Because if a musician can wake up and find fake AI songs on her own Spotify profile, the question is no longer whether this can happen. The question is how quickly the rest of us decide that it should not be allowed to keep happening.
Frequently Asked Questions
Can AI really clone someone’s voice without their permission?
Yes, AI can now replicate someone’s voice with just a few seconds of audio, and it doesn’t require their consent. This technology is becoming easier and cheaper to access, which is why it’s important to understand the risks and know how to protect yourself.
Is it illegal to clone someone’s voice with AI?
Laws are still catching up with AI technology, but in many places it is illegal to impersonate someone or use their voice without permission, especially for fraud. However, enforcement varies by country and state, so protections aren’t universal yet.
How can you protect your voice from being cloned?
Limit sharing voice recordings online, be cautious about who you give audio samples to, and consider using voice authentication tools if available. If you suspect your voice has been cloned, report it to platforms where it appeared and contact local authorities if it was used for fraud or harm.