GitHub launches Copilot for Business plan as legal questions remain unresolved

GitHub Copilot, GitHub's AI-powered code suggestion service, is now available in a plan for businesses, months after it launched for individuals and educators.

Called GitHub Copilot for Business, the new plan costs $19 per user per month and includes all the features of the single-license Copilot tier, plus corporate licensing and policy controls. For example, IT administrators can prevent Copilot from showing developers suggestions that match public code on GitHub, likely a response to the intellectual property controversies surrounding the service.

Copilot is a downloadable extension for development environments including Microsoft Visual Studio, JetBrains IDEs and Neovim. It's powered by Codex, an AI model developed by OpenAI that was trained on billions of lines of code to suggest additional lines and entire functions in context. Copilot, which had more than 400,000 subscribers as of August, can surface a programming approach or solution in response to a description of what a developer wants to accomplish (e.g., "Say hello world"), drawing on its knowledge base and the current context.

Image Credits: GitHub
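For a sense of what that looks like in practice, here is a hypothetical exchange in the style Copilot supports; the comment prompt and the suggested completion below are illustrative examples, not output captured from the service:

```python
# Developer types a natural-language comment describing the goal:
# say hello world

# Copilot might then suggest a completion along these lines:
def say_hello_world():
    print("Hello, world!")


say_hello_world()
```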

At least a portion of the code Codex was trained on is copyrighted or under a restrictive license that some advocacy groups have criticized. Users have been able to prompt Copilot to generate code from Quake, code snippets from personal codebases and example code from books like "Mastering JavaScript" and "Think JavaScript." GitHub itself admits that, about 1% of the time, Copilot suggestions contain code snippets longer than roughly 150 characters that match the training data.

GitHub asserts that fair use, the U.S. legal doctrine that allows copyrighted material to be used without first obtaining permission from the rights holder, protects it to the extent that Copilot was trained, knowingly or unknowingly, on copyrighted code. Not everyone agrees, however. The Free Software Foundation, a nonprofit that advocates for the free software movement, has called Copilot "unacceptable and unjust." And Microsoft, GitHub and OpenAI are being sued in a class action lawsuit that accuses them of violating copyright law by allowing Copilot to regurgitate sections of licensed code without providing credit.

GitHub's own liability aside, some legal experts argue that Copilot could put companies at risk if they unwittingly incorporate copyrighted suggestions from the tool into their production software. As Elaine Atwell notes in a piece on Kolide's corporate blog, because Copilot strips code of its licenses, it's difficult to tell which code is permissible to deploy and which might have incompatible terms of use.

GitHub's attempt at rectifying this is a filter, first introduced to the Copilot platform in June, that checks code suggestions along with about 150 characters of surrounding code against public GitHub code and hides suggestions if there's a match or "near match." But it's an imperfect measure. Tim Davis, a Texas A&M University computer science professor, found that Copilot would emit large chunks of his copyrighted code with no attribution and no license text.
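GitHub hasn't published how the filter works internally. The sketch below is a toy illustration of the general idea only (compare a suggestion plus its surrounding context against an index of public code and suppress it on a match); the corpus, function names and 150-character window are all assumptions made for illustration:

```python
# Toy sketch of a "matches public code" filter; not GitHub's implementation.
# It suppresses a suggestion if the suggestion, together with roughly 150
# characters of surrounding context, appears verbatim in a hypothetical
# public-code corpus. A real filter would also need near-match detection.

PUBLIC_CODE_CORPUS = [
    # In reality this would be an index over billions of lines of public code.
    "def quicksort(arr):\n    if len(arr) <= 1:\n        return arr",
]

CONTEXT_CHARS = 150  # assumed window size, mirroring the ~150 characters cited above


def should_block(suggestion: str, preceding_code: str) -> bool:
    """Return True if the suggestion (with its surrounding context) matches public code."""
    window = (preceding_code[-CONTEXT_CHARS:] + suggestion).strip()
    return any(
        window in snippet or suggestion.strip() in snippet
        for snippet in PUBLIC_CODE_CORPUS
    )


# A verbatim match against the corpus would be hidden from the developer.
print(should_block("def quicksort(arr):\n    if len(arr) <= 1:\n        return arr", ""))
```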

@github copilot, with “public code” blocked, emits large chunks of my copyrighted code, with no attribution, no LGPL license. The simple prompt “sparse matrix transpose, cs_”, produces my cs_transpose file in CSparse. My code is on the left, github is on the right. Not OK. pic.twitter.com/sqpOThi8nf

— Tim Davis (@DocSparse) October 16, 2022

GitHub plans to introduce additional features in 2023 aimed at helping developers make informed decisions about whether to use Copilot's suggestions, including the ability to identify strings matching public code with a reference to the repositories they come from. And GitHub says that Copilot for Business won't retain code snippets, regardless of whether the data comes from public repositories, private repositories, non-GitHub repositories or local files.

But it's not clear whether these steps will be enough to put companies' legal concerns to rest.
