Is Microsoft stepping up to the plate? It says it will defend customers who are sued over copyright issues with AI-generated material.
Generative AI tools are frequently trained on copyrighted content, and as a result, AI developers, AI users, and organisations that publish AI-generated material are facing lawsuits. Microsoft has now said it will stand behind these users and organisations.
Microsoft has taken a big step to address worries about copyright violations arising from its AI-powered business software, including the AI features in Word, PowerPoint, and its coding tools.
The tech giant has promised to take legal responsibility for any intellectual property infringement involving content made by its artificial intelligence software. This promise includes paying the legal fees of commercial customers who may be sued over their use of its AI tools or the material those tools generate.
Microsoft’s promise also covers users of GitHub Copilot, which uses generative AI to produce computer code, and Microsoft 365 Copilot, which brings AI into products such as Word, Teams, and PowerPoint. A small number of companies are currently testing 365 Copilot.
The move has been welcomed because it lowers the legal risk businesses face when adopting AI software, making it easier for them to use.
The intersection of generative AI and copyright has already produced disputes and lawsuits. Content owners, artists, media companies, and publishers argue that their copyrighted material has been used to train AI models without permission or compensation.
Following a similar pledge from Adobe for its Firefly AI tool, Microsoft wants to reassure paying customers by taking on the legal risks of using its AI-powered tools and the material they produce.
Hossein Nowbar, Microsoft’s General Counsel for Corporate Legal Affairs, said he understood that customers were worried about claims of intellectual property (IP) infringement arising from AI-generated material.
He stressed that Microsoft was committed to addressing these concerns and said the company would assume the legal risks if customers faced copyright claims over its Copilot tools.
Under the commitment, Microsoft will defend customers against lawsuits brought by third parties and pay any adverse judgments or settlements, provided customers have used the content filters and other safeguards built into Microsoft’s products.
These safeguards include content filters and tools that can identify third-party material that might be infringing.
Ilanah Fhima, a professor of intellectual property law at University College London, noted that the legal landscape around AI and copyright is still evolving, suggesting Microsoft is taking a calculated risk while AI-related laws and legal precedents are still being established.
She also pointed to the public interest in technological progress, arguing that strict copyright enforcement may not always be warranted.