
AI under fire from lawsuits

Helena Hughes looks at the growing pressure on AI companies over intellectual property law and the use of Scarlett Johansson's voice.
Image: OpenAI's website (Jonathon Kemper via Unsplash)

In a digital age where AI is growing in prominence, there is mounting pushback from companies and individuals seeking to protect their intellectual property from being absorbed into AI companies' evolving models. Over the past month, Sony Music Group, Elon Musk, Scarlett Johansson and a variety of other key players in the creative industries have spoken out against OpenAI and other AI companies over the alleged plagiarism of their material.

Scarlett Johansson's case is particularly interesting, given that it has played out publicly on an international stage. At OpenAI's launch event for GPT-4o, hosted by co-founder Sam Altman, an AI-generated voice closely resembling Johansson's was used, despite Johansson having expressly declined Altman's request to use her voice. While Altman professes to want OpenAI to be a force for "all of humanity", the Johansson episode looks like a clear ethical violation, in addition to potentially crossing various legal boundaries.

NIL, or "name, image, likeness", laws exist to protect celebrities from being replaced by cheaper look-alikes or "sound-alikes" hired by large corporations. The use of Johansson's voice without her permission could be a direct infraction of these laws, even though the voice was an AI generation rather than a recording of her, a "Scarlett Faux-hansson". This could plausibly fall within the "likeness" portion of NIL law, with Johansson receiving no payment for her intellectual property or for the use of her (AI-synthesised) voice.

Sony Music Group has also been heavily involved in the AI debate. Companies may now only use its music when given explicit permission, and SMG has sent out over 700 letters in recent weeks threatening legal action should this requirement not be met. The letters made clear that AI infractions had already occurred at a variety of companies and would not be tolerated going forward. SMG CEO Rob Stringer has stated that "Artificial intelligence represents a generational inflection point to music and content in general. A sustainable business rights model needs to be established and respected." This desire for ethical business practice appears to be becoming a priority for many companies considering AI usage, with the AI industry branching so broadly into areas such as law, music and visual art.


There is a complexity in setting limits on the information AI can process: the more data a model is able to process, the higher the quality of the intelligence it can produce. Although creators must be entitled to the rights to their own material, the integration of high-quality information naturally improves the overall workings of AI. Equally, if an AI draws on material that would normally sit behind a paywall, it does increase the accessibility of information to the general public; however, that material does not belong to the AI companies, which is where a clear conflict arises. The legalities surrounding AI are ever evolving, but the next few months look set to be transformative in how companies treat AI and how individuals react to being used for AI purposes.
