Oct. 30, 2023
By Douglas L. Johnson and Daniel B. Lifschitz
The recent boom in public-facing artificial intelligence tools has ushered in a wave of disruption for the entertainment industry on a level not seen since the dawn of the public internet. Within this wave, a particularly vexing form of generative software has been the celebrity voice synthesizer, which allows users to seamlessly appropriate the vocal stylings of well-known musicians to create original audio. The resulting explosion of unauthorized “deepfake” songs populating social media platforms has sent industry lawyers rummaging through their desk drawers for any legal doctrines that may help to set guardrails for this unprecedented technology.
As with most legal quandaries related to media, copyright tends to be the first place the industry looks for answers. This has already proven to be the case with visual and textual art, as multiple lawsuits have been filed against services such as ChatGPT, Stable Diffusion, and Midjourney based on their ingestion of copyrighted works to train the algorithms powering these tools. It was therefore unsurprising when Bloomberg's Lucas Shaw reported on Oct. 22, 2023, that YouTube was prophylactically negotiating with record labels to bless the rollout of a voice synthesizer tool – one presumably trained on the labels' copyright-protected music catalogs.
The industry's focus on copyright law as the north star for resolving generative AI issues overlooks the pivotal role (and stumbling block) that the right of publicity may play in the peculiar realm of vocal imitation. For many decades, states have granted public figures the right to control the commercial exploitation of their identifying attributes, including name, image, likeness, signature, and – as relevant here – voice. See, e.g., Midler v. Ford Motor Co., 849 F.2d 460, 462 (9th Cir. 1988); Cal. Civ. Code § 3344(a) (providing protection against misappropriation of one's “name, voice, signature, photograph, or likeness”) (emphasis added).
There is some doctrinal tension between the right of publicity and copyright law, the latter of which generally permits intentionally imitative performances under 17 U.S.C. § 114(b). See VMG Salsoul, LLC v. Ciccone, 824 F.3d 871, 884 (9th Cir. 2016) (“Congress intended to make clear [under § 114(b)] that imitation of a recorded performance cannot be infringement so long as no actual copying is done.”). Imitations, however, require an “independent fixation” of sounds, whereas AI programs derive their output from the ingestion of a performer's actual voice, raising novel questions regarding § 114(b)'s application. Cf. Capitol Records, LLC v. BlueBeat, Inc., 765 F. Supp. 2d 1198, 1204 (C.D. Cal. 2010).
Similarly, while copyright law generally preempts state law claims that attempt to prohibit the reuse of one lawfully fixed copyrighted work within another – see, e.g., Laws v. Sony Music Entm't, Inc., 448 F.3d 1134 (9th Cir. 2006) – it can hardly be said that vocal synthesizers, which create output substantially dissimilar to the ingested works in everything but vocal tone, are being charged with mere reproduction of an artist's voice as embodied in her recordings. Laws, 448 F.3d at 1144 (holding recording artist “gave up the right to reproduce her voice – at least insofar as it is incorporated in [a particular recording] – when she contracted with [her record label]” and granted it the copyright to said recording).
It would thus be a mistake to assume that record labels, simply by dint of controlling certain copyrighted works implicated in a prospective licensee's activities, necessarily have the authority to bargain away the personal rights of the artists on their rosters who created those works. Indeed, this issue has arisen at least once in litigation regarding anti-bootlegging protections. See ABKCO Music, Inc. v. William Sagan, Norton, LLC, 2018 U.S. Dist. LEXIS 60026 (S.D.N.Y. Mar. 30, 2018), rev'd on other grounds, ABKCO Music, Inc. v. Sagan, 50 F.4th 309 (2d Cir. 2022).
In ABKCO Music, a consortium of music publishers sued the operator of a website, Wolfgang's Vault, over the latter's decision to exploit a collection of bootlegged musical performances (chiefly derived from the archives of concert promoter Bill Graham) without the consent of the featured artists. One of the website's defenses was that it had signed agreements with the artists' record labels that purportedly blessed the exploitation. Id. at *19. However, the anti-bootlegging right was personal to the artists under 17 U.S.C. § 1101, and none of the agreements “provide[d] any written consent from the artists themselves, nor [did] they purport to state that the artists consented to the recording of their performances.” Id. (emphasis added).
So, too, is the right of publicity personal to artists. See Midler, 849 F.2d at 463 (“To impersonate [one's] voice is to pirate [their] identity.”). And while it would certainly be preferable for Congress to take action and provide a uniform set of protections for individuals against the misappropriation of their identities via generative AI (as the recently introduced bipartisan “NO FAKES Act” seeks to do), until such legislation passes, the extant role of state law in providing this protection should not be overlooked.