AI songs that mimic popular artists raise alarms in the music industry
"I think artists should be more afraid," one producer says.
A New York-based songwriter and music producer who goes by the name SIXFOOT 5 has been tinkering with "The Cape," a song he has been working on for a while.
After some practice, he croons the lyrics into his computer.
In a few minutes his tune plays through the speakers, but instead of his voice, it sounds like Adele.
"There will be thousands of people that will think, 'Oh, it's a great Adele song,'" SIXFOOT 5 told ABC News Live.
The producer recorded his vocals and uploaded them to an online server where users around the world share famous singers' isolated vocals to build vocal profiles of those artists' voices, without the artists' knowledge or consent.
SIXFOOT 5 demonstrated how he could take Adele's vocal profile from the server and swap her voice in for his own on his original song.
The result was a song that sounded like Adele singing SIXFOOT 5's original lyrics.
He said he has no intention of ever releasing or selling the song; rather, he says it could serve as a proof of concept he could one day use to pitch the track to Adele herself.
However, SIXFOOT 5 said that this technology and its growing capabilities should serve as a serious warning to all artists and producers.
"I think artists should be more afraid because I could see the music industry saying, 'We don't really need you anymore. We have your vocal profile,'" he said.
Representatives for Adele did not respond to ABC News Live's requests for comment.
Other artists and producers are also raising their voices over the issue, but some contend that the technology could be used ethically to promote artists.
The AI vocal programs are created by people across the globe who upload thousands of samples from artists such as Adele, Beyoncé and Taylor Swift. The more recordings an artist has in circulation, the more thorough the AI vocal profile will be.
Anyone with access to the vocal program can then use it to transform songs they sing into ones that sound like the artist, without the artist or their music studio knowing or giving permission.
The technology stirred controversy over the summer when TikTok user "Ghostwriter" released an AI-generated song that was made to sound like it was "performed" by Drake and The Weeknd.
The song was taken down after Universal Music Group filed a copyright lawsuit.
A person claiming to be "Ghostwriter" responded to ABC News Live stating, in part, "AI is going to play a major role in the future of everything, and music is no exception."
"I want to do what I can to spark conversation and move the industry forward so we are taking the necessary steps to be ready for it as it comes," the person who claims to be Ghostwriter told ABC News Live in a statement.
Among those conversations is the question of the technology's legality.
Karl Fowlkes, an entertainment attorney, told ABC News Live that copyright laws were never written with AI in mind, but right of publicity laws do provide protections to people regarding the commercial use of their likeness, including their voice.
However, there is no national right of publicity law; the protections vary from state to state.
"If you're a record label, you're like, 'OK, sure. Maybe we can make our own AI tracks,' but what about the thousands of artists that we sign on a day-to-day basis? We have to protect their interests," Fowlkes said.
Fowlkes warned that the technology could lead the music industry to put its backing behind an AI-generated avatar of a hit singer and release songs generated by a computer program.
"I think the fake Drake and fake Weeknd record was indicative of that," he said.
Some artists are adding clauses to their contracts that address the use of their voices by AI programs, according to Fowlkes. Universal Music Group also has been lobbying Congress to come up with a legislative solution.
ABC News Live reached out to the big three music record labels as well as Apple Music and Spotify for comment on how they plan to handle any AI-generated music that may end up on their platforms, but none responded to the requests.
Even as technology has helped up-and-coming artists break into the industry, many are calling out the problems that AI-generated music poses.
Johnny Venus and Doctur Dot of the hip-hop duo EARTHGANG said they were able to get their first albums produced because of cheap, available recording software that allowed them to produce their songs without an expensive studio.
However, the duo said they were not happy about the rise of "FN Meka." While the CGI rapper did not perform the duo's songs, FN Meka's AI-generated songs went viral online.
The "artist" was nearly signed by a record label until pushback from fans canceled the deal.
"You're using this thing to make money when you actually just hire somebody for real and just put money behind somebody who has a real story as a real talent," Johnny Venus told ABC News Live.
EARTHGANG's latest album "R.I.P. Human Art" is a satirical look at the role humans play in creativity and plays up the theme of human art.
"[AI] is going to force a lot of people to actually be better and be different," Doctur Dot said.
And while SIXFOOT 5 has been exploring the creative side of AI in music and music production, he emphasized that the technology still has hiccups and, most importantly, that AI vocals cannot bring life to a record in the same way a human can.
In the end, he said there is one big difference between a copy and the real deal: a soul.
"I want to hear the human element in music, in arts. So, I'd be curious if Adele actually sang the song," SIXFOOT 5 said. " I would love to hear what she would do and how her voice would sound on it, because there are just some things that machines cannot do."