Credit
Article. Seo Seongdeok (Music Critic)
Photo Credit. Shutterstock
What’s the first thing that comes to mind when you hear expressions like “streaming fraud” and “fake streaming”? For most people, it will be what’s commonly referred to as chart manipulation, where mass streaming with fake accounts boosts a song’s chart performance. When this happens, rather than the charts acting as a reflection of the popularity of a song, it works in reverse: A song’s position on the chart causes it to trend. While listeners are quick to see through this kind of artificial popularity, it’s just the tip of the iceberg; there are far worse nightmares that hound the music industry.

Streaming services like YouTube and Spotify pay a certain percentage of the revenue they earn from paid subscriptions and ad-supported streams to record companies and artists as royalties. In simple terms, the percentage is predetermined, so the more a song is streamed, the more profits there are to go around. So what if someone outside those groups gets a cut of the money? When that happens, they’re siphoning off a portion of the profits that rightly belong to the labels and artists. While the same can be said of chart manipulation, when a bad actor is focused purely on money rather than name recognition, the damage done is far greater and more varied than it is with click farms.
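This pro-rata payout model can be sketched as a toy calculation. Everything below is hypothetical — the artist names, stream counts, and pool size are made up for illustration and don’t reflect any real service’s figures:

```python
# Toy sketch of a pro-rata streaming payout pool (all numbers hypothetical).
def payout_shares(stream_counts, royalty_pool):
    """Split a fixed royalty pool in proportion to each party's stream count."""
    total = sum(stream_counts.values())
    return {name: royalty_pool * n / total for name, n in stream_counts.items()}

# Legitimate streams only: the pool is divided between two artists.
honest = payout_shares({"artist_a": 900_000, "artist_b": 100_000},
                       royalty_pool=10_000)

# A fraudster injects fake streams. The pool is fixed, so every legitimate
# artist's share shrinks even though their real play counts didn't change.
with_fraud = payout_shares(
    {"artist_a": 900_000, "artist_b": 100_000, "fraudster": 250_000},
    royalty_pool=10_000,
)
```

The point of the sketch: the fraudster’s cut doesn’t come from the platform, it comes directly out of the shares of everyone who earned their streams honestly.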

First of all, how much greater are we talking? According to a report published in France by the Centre National de la Musique (CNM) earlier this year, between 1% and 3% of all streaming in the country was detected to be fraudulent. When accounting for undetected fraud, it’s estimated that the actual figure is closer to 10%. Now let’s extrapolate this to a worldwide scale. According to a 2022 report put out by the International Federation of the Phonographic Industry (IFPI), the streaming market is worth $17.5 billion globally. That means that around $1.8 billion goes straight into the wrong pockets. For reference, Warner Music Group made $590 million in revenue and $55.1 million in profits that year.
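The back-of-the-envelope math behind that figure is simply the CNM’s roughly 10% estimate applied to the IFPI’s global streaming revenue number:

```python
# CNM estimates roughly 10% of streams are fraudulent once undetected fraud
# is accounted for; IFPI put global streaming revenue at $17.5 billion (2022).
streaming_revenue_usd = 17.5e9
estimated_fraud_rate = 0.10

# Rough share of revenue going into the wrong pockets: about $1.75 billion,
# i.e. the "around $1.8 billion" cited above.
misdirected = streaming_revenue_usd * estimated_fraud_rate
```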

Second, how different are other types of fraud from fake streaming? Let’s use a headline case from last year involving a business called MediaMuv. Using fake contracts, the people behind MediaMuv swindled YouTube and a company that handles copyright claims for music used in YouTube videos. They stole about $23 million in royalties meant to be distributed to rights holders of about 50,000 songs by different musicians, including famous artists like Daddy Yankee and Julio Iglesias. Most of the victims were unaware the copyright fraud even took place. Unfortunately for MediaMuv, they got too brazen with their scheme, claiming 100% of the rights. It’s presumed that there are far more scams taking place more covertly, claiming smaller percentages.

Then there are cases where scam artists pose as someone else. One relatively innocuous example is when people upload their song to a streaming service under an artist name and song title that are nearly identical to someone far more famous. In such cases, they can fool some listeners by claiming to be releasing the full version of a song that was only partially released on social media or that was played live but never officially released. What about when they reupload popular songs, especially those that have gone viral on social media? Often, they won’t even change the artist name or the album cover. The most common approach lately is for a remix of a song to go viral on TikTok and for an impostor to upload a copy of that version to their own account. Sped-up versions are the most notable example: many became popular and can be found in their entirety elsewhere, but most have nothing to do with the original artists. Try searching “Cupid sped up” on Spotify and you’ll find millions of streams going to songs that have no connection to FIFTY FIFTY whatsoever. Given that this fraud is happening right under their noses, you can’t blame artists for releasing their own official sped-up versions or accuse them of lacking creativity.

What all these scams have in common is that they allow people who have nothing to do with the music industry, let alone creating a song, to take a share of a musician’s deserved profits. Artists under smaller labels suffer the most because they don’t have the resources to properly exercise their copyrights. Even if someone never intended to defraud the artist, their actions still nibble away at the royalties that the artist deserves. That’s what places streaming fraud on a level above music piracy through MP3 sharing. The rise of illegal MP3 sharing was a consequence of consumer habits, and consumers eventually became aware of the structural problems they were contributing to. With the advent of streaming, however, consumers now pay a fair price and have no way of knowing, and no real need to know, where exactly their money goes. But all artists, popular and otherwise, are being hurt by fraud.

And then there’s the serious wake-up call that is music generated by artificial intelligence. On April 4, a song called “Heart on My Sleeve” appeared on Spotify and YouTube. It was an entirely original song that mimicked the voices of Drake and the Weeknd using AI. The song went viral in mid-April and had over 15 million streams on TikTok alone. Then, on April 17, it was announced that the song would be removed from all streaming services, including Apple Music and Spotify, at the request of Universal Music, who represents both Drake and the Weeknd. In a statement, Universal said the incident is a reminder that everyone in the industry now has to decide which side of the debate they’re on: whether to back artists, fans and creative human expression, or continue down a road that allows fake music to deprive artists of what they deserve.

AI-generated voices are being used in many different ways. As in the case above, it’s possible to have them sing songs that don’t actually exist: you could invent a new Oasis song and have an AI Liam Gallagher sing the lyrics. The Internet is already awash in AI cover songs; there are countless AI covers of K-pop songs alone, from Bruno Mars doing NewJeans songs to Michael Jackson covering FIFTY FIFTY. For now, people seem focused strictly on whether AI vocals should be allowed at all, given the sticky legal issues they introduce, such as the right of publicity.

It would be shortsighted to believe that simply disclosing when something was created by AI will be enough to solve the problem. And voice is just the beginning. Don’t be surprised when we start seeing songs that have been created from scratch entirely by AI. The biggest concern surrounding AI as far as the music industry is concerned isn’t whether it’s plausible for machines to invent music that reaches a certain standard of quality, but how much it can produce and how quickly. We already live in a time when 100,000 new songs pop up every single day. AI might be able to mimic the latest trends as they happen and flood streaming services with its output. If streaming fraud unjustly takes a share of profits from artists, AI could dilute the pool altogether. And while AI isn’t exactly the same thing as fraud, it gives people with malicious intent a brand-new tool to play with. Who could have guessed that streaming would be thrown such a curveball? Nothing gold can stay.

The market evolves rapidly and continues to move forward, never taking time to ponder philosophical questions like the place of human creativity in the creation of music. But at some point, we will have to consider such questions for ourselves. Have you ever looked at a painting generated by AI and wondered whether we really need people for such work at all? How amazing was the AI-generated Star Wars trailer in the style of Wes Anderson, for example? ChatGPT can already write you a perfect resumé, too. So what about music created by AI? What grounds would there be for us to stop here?