The Bands and the Fans Were Fake. The $10M Was Real
North Carolina musician Michael Smith faces fraud charges for allegedly using AI to create fake music and earning $10 million over seven years by manipulating streaming services; each charge carries up to 20 years in prison.
A North Carolina musician, Michael Smith, has been charged with fraud for allegedly using artificial intelligence to create fake music and manipulate streaming services to earn $10 million in royalties. Federal prosecutors claim that Smith generated hundreds of thousands of fictitious songs attributed to non-existent bands, which he then streamed using bots to simulate real listeners. This scheme reportedly spanned seven years, during which Smith created thousands of fake streaming accounts and used software to loop his music, giving the illusion of genuine engagement. The indictment details how he initially uploaded his own music but later expanded his catalog by collaborating with an AI music company, producing a vast array of bogus songs. By 2019, Smith was reportedly earning around $110,000 monthly from this fraudulent activity. He faces charges including wire fraud and money laundering, with potential sentences of up to 20 years for each count. This case marks the first criminal prosecution related to streaming manipulation by the U.S. Attorney's Office in Manhattan, highlighting the growing concern over digital music fraud as the industry increasingly relies on streaming metrics for success.
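For a rough sense of scale, here is a back-of-envelope sketch. The half-cent-per-stream payout and the exact number of bot accounts are assumptions for illustration only; the $110,000 monthly figure and the "thousands of accounts" claim come from the reporting.

```python
# Back-of-envelope sketch of the scale of the alleged scheme.
# ASSUMED_RATE_PER_STREAM and FAKE_ACCOUNTS are illustrative assumptions,
# not figures from the indictment.
MONTHLY_ROYALTIES = 110_000          # reported monthly earnings by 2019 (USD)
ASSUMED_RATE_PER_STREAM = 0.005      # assumed blended payout per stream (USD)
FAKE_ACCOUNTS = 10_000               # "thousands" of bot accounts (assumed count)

streams_per_month = MONTHLY_ROYALTIES / ASSUMED_RATE_PER_STREAM
streams_per_day = streams_per_month / 30
streams_per_account_per_day = streams_per_day / FAKE_ACCOUNTS

print(f"Streams needed per month: {streams_per_month:,.0f}")            # ~22,000,000
print(f"Streams needed per day:   {streams_per_day:,.0f}")              # ~733,000
print(f"Per bot account per day:  {streams_per_account_per_day:,.0f}")  # ~73
```

Spread across thousands of accounts and hundreds of thousands of tracks, each individual account and song sees only a modest play count, which is presumably what let the activity pass for genuine engagement for so long.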
- Michael Smith is charged with using AI to create fake music for fraudulent royalties.
- He allegedly earned $10 million over seven years through streaming manipulation.
- Smith created thousands of fake accounts and used bots to simulate listeners.
- The case is the first of its kind prosecuted by the U.S. Attorney's Office in Manhattan.
- If convicted, Smith faces up to 20 years in prison for each charge.
Related
Record Labels Sue Two Startups for Training AI Models on Their Songs
Major record labels sue AI startups Suno AI and Uncharted Labs Inc. for using copyrighted music to train AI models. Lawsuits seek damages up to $150,000 per infringed work, reflecting music industry's protection of intellectual property.
RIAA of Six Years Ago Debunks RIAA of Today's AI Lawsuit Claims
The RIAA is suing AI music services Suno and Udio for alleged copyright infringement, sparking debate over fair use and implications for the AI industry and copyright law. Critics question the RIAA's motives.
The music industry is engineering artist popularity
The music industry faces criticism for manipulating artist popularity on streaming platforms, particularly Spotify, raising concerns about transparency, fairness, and the authenticity of music recommendations amid perceived industry manipulation.
Man Arrested for Creating Child Porn Using AI
Phillip Michael McCorkle was arrested in Florida for creating and distributing AI-generated child pornography, facing 20 obscenity counts, highlighting concerns over generative AI's role in child exploitation.
North Carolina Musician Accused of $10M Streaming Fraud with AI-Generated Songs
Michael Smith, a North Carolina musician, has been indicted for a $10 million streaming fraud scheme involving AI-generated songs, creating fake accounts to exploit streaming royalties since 2017.
Fascinating. Comments in the OP (presently 18) question whether Smith’s scheme is illegal or merely an unethical exploitation of a loophole.
One commenter raises the point that humans are coerced into producing meaningless content only to be exploited by large corporations. Other commenters question whether streams need be listened to by humans at all.
What if the stream is being “listened” to by an algorithm, or used as training data for a neural network? What if the stream is being played to people engaged in other tasks, to pre-verbal children, or to people in their sleep?
What qualifies a stream to be non-fraudulent?
> Sleepify is an album by the American funk band Vulfpeck, released March 2014. The release consists solely of ten roughly 30-second-long tracks of silence. The album was made available on the music streaming service Spotify, where the band encouraged consumers to play the album on a loop while they slept. In turn, royalties from the playing of each track on the "album" were to be used to crowdfund a free concert tour by the band.
They basically took advantage of how Spotify's royalty calculation worked and made about $20k off of it, which they then used to fund a small, free-admission tour of the US.
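The stunt worked because a play generally counts toward royalties once it passes the 30-second mark, which is presumably why the silent tracks were roughly 30 seconds long. A quick sketch of the arithmetic, assuming a ballpark payout of about half a cent per qualifying stream (only the ~$20k total and the 30-second track length come from the story):

```python
# Rough math on the Sleepify stunt. ASSUMED_RATE_PER_STREAM is an assumed
# ballpark figure; the payout total and track length come from the story.
TOTAL_PAYOUT = 20_000                 # reported total royalties (USD)
ASSUMED_RATE_PER_STREAM = 0.005       # assumed payout per qualifying stream (USD)
TRACK_LENGTH_SECONDS = 30             # each silent track is roughly 30 seconds
SLEEP_HOURS = 8                       # one looped "listen" per night

streams_needed = TOTAL_PAYOUT / ASSUMED_RATE_PER_STREAM
plays_per_night = SLEEP_HOURS * 3600 // TRACK_LENGTH_SECONDS
listener_nights = streams_needed / plays_per_night

print(f"Qualifying streams needed:        {streams_needed:,.0f}")   # ~4,000,000
print(f"Plays per sleeping fan per night: {plays_per_night}")       # 960
print(f"Listener-nights to reach $20k:    {listener_nights:,.0f}")  # ~4,167
```

In other words, a few thousand fans looping the album overnight could plausibly account for the payout, no bots required.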
The fake listeners, though, aren't the same thing. Bots don't buy products.