Sony Music Entertainment CEO Rob Stringer. Photo Credit: Josh Cheuse

Having “sent close to 10,000 takedowns” over artificial intelligence “deepfakes,” Sony Music Entertainment (SME) is now preparing to release a “first of its kind” AI product for consumers.

The noteworthy information and several other interesting details emerged in remarks that SME president of global digital business and U.S. sales Dennis Kooker delivered as part of the Senate’s recently held AI Insight Forum.

Against the backdrop of AI’s rapidly evolving role in the music space – referring not only to voice cloning and a steady stream of auto-generated tracks but to high-profile partnerships and more – Spotify, the National Association of Broadcasters, and SAG-AFTRA also submitted comments for the seventh AI forum.

But the longtime Sony Music exec Kooker disclosed especially significant particulars about artificial intelligence’s prevalence on the infringement front and how his company intends to capitalize on the unprecedented technology.

Sony Music’s Sent Approximately 10,000 Takedown Notices Targeting AI-Powered Soundalikes

While unauthorized soundalike tracks remain largely absent from dedicated streaming services like Spotify and Apple Music, Kooker said that Sony Music has sent approximately 10,000 takedown notices targeting the releases on other platforms.

“To date, SME has sent close to 10,000 takedowns to a variety of platforms hosting unauthorized deepfakes that SME artists asked us to take down,” stated Kooker.

“In this context, platforms are quick to point to the loopholes in the law as an excuse to drag their feet or to not take the deepfakes down when requested,” he continued, proceeding to use the opportunity to reiterate his company’s support for the No Fakes Act that lawmakers introduced last month.

SME’s Exploring Hundreds of Potential Partnerships With AI Companies

Though the AI-focused partnerships of Warner Music and Universal Music have perhaps dominated related headlines as of late, including tie-ups involving artist estates, YouTube, creation platforms, and others, Kooker provided a look at the scope of Sony Music’s own discussions.

“We currently have roughly 200 active conversations taking place with start-ups and established players about building new products and developing new tools,” specified Kooker.

“These discussions range from tools for creative or marketing assistance, to tools that potentially give us the ability to better protect artist content or find it when used in an unauthorized fashion, to brand new products that have never been launched before.

“They also include potential equity investments which would accelerate the development of these companies,” he added.

Sony Music’s Teeing Up An AI Project That’s “Very Much An Experiment”

Bearing in mind the 200 “active conversations” in which SME is said to be engaged, Kooker also teased a forthcoming project that’s “very much an experiment and a first of its kind.”

Although he didn’t divulge an abundance of information (or a release window), it’s worth highlighting the conspicuous absence of Sony Music artists from the initial list of participants in the pilot for YouTube Shorts’ Dream Track.

“It will have a visual element and a separate audio element,” Kooker said of the as-yet-unnamed offering. “The consumer can combine both experiences to create a new visual and audio experience.

“The training model for the visual element has been developed, with the artists’ enthusiastic blessing, using the artwork and other graphics from the album project,” he continued. “The audio is developed from excerpts of the music from the album.

“Fans will be able to input prompts while listening to the music that will transform the visual experience. They will also be able to play with aspects of the audio to create a new remixed excerpt of the music.

“Consumers will be able to download the visuals and 30-second clips of the music. Eventually, they will be able to purchase a longer version of the song they remixed.”

SME’s Push for AI Regulation

Predictably, given the nature of the forum, Kooker concluded by laying out his company’s position on legislation governing AI.

Largely aligned with other industry players’ prior statements, these points center on clarifying that training AI models with protected works sans permission doesn’t constitute fair use, compelling developers to maintain comprehensive records of training data, and stopping unapproved soundalike releases via bolstered NIL protections.