Disclaimer: This article is for informational purposes only. Don’t use it in place of legal advice. If you have any concerns about using AI and plagiarism, consult an attorney.
ChatGPT and other artificial intelligence (AI) writing tools have exploded into the market, creating controversy around their use. So much so, in fact, that schools have been creating policies on navigating AI use in the classroom, content marketers are turning to AI detection tools to help ensure original, human content, and even authors are wondering if and how AI-generated content can have a place in their work.
While the exact reasons for AI-related concerns can vary by industry, there is one topic that comes up time and time again: plagiarism.
More specifically, two questions keep surfacing: is using AI to write plagiarism, and does AI itself plagiarize?
Note: Since widespread use of AI technology is still so new, the laws and ethics surrounding AI plagiarism (and the answer to "does AI plagiarize?") remain a little murky.
In this article, we’re going to explore what we know so far about the use of AI writing tools from a legal perspective to help you stay on the right side of the law.
Simply put, plagiarism occurs when you try to pass off someone else’s work as your own. For example, a student copies another student’s essay (whether it’s just a section or the entire thing), puts their name on it, and then hands it in as their own original work. They don’t acknowledge the source material or author at all, as they want to keep the credit for themselves.
This is, of course, highly unethical, which is why schools and workplaces often have strict policies surrounding plagiarism. Here's the thing: can you really apply this traditional definition to AI writing tools?
So, is using AI to write plagiarism? It’s a bit of a gray area for now, but generally speaking, no, the use of AI tools isn’t considered plagiarism. When you consider how AI tools work, the traditional definition of plagiarism doesn’t really apply.
AI doesn’t usually copy from other sources word-for-word when it’s generating content. Its goal isn’t even to paraphrase specific pieces of content. These machine learning programs do just what their name implies: learn.
Large language models like GPT-3 are trained on massive datasets that allow them to recognize patterns, structures, and styles of text. They then try to replicate these elements, rather than copy the exact sentences and phrases they were trained on, to generate what sounds like original, human-written content.
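To make "learning patterns rather than copying" concrete, here is a toy sketch. A word-level Markov chain is vastly simpler than a large language model, and this is only an analogy, not how GPT actually works, but it shows the same basic idea: the program records which words tend to follow which, then walks those learned transitions to produce new text instead of reproducing a stored passage.

```python
import random
from collections import defaultdict

def train(text):
    """Record which words were observed to follow each word."""
    words = text.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length=8):
    """Walk the learned transitions to produce a new sequence."""
    word, output = start, [start]
    for _ in range(length):
        followers = model.get(word)
        if not followers:
            break  # dead end: no word was ever seen after this one
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

# Tiny "training corpus" for illustration only.
corpus = "the cat sat on the mat and the dog sat on the rug"
model = train(corpus)
print(generate(model, "the"))
```

Even this toy version can emit sentences that never appear in its source text (e.g. "the dog sat on the mat"), which is the crux of why AI output rarely matches the traditional definition of copying.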
So does AI plagiarize? Since the intention of AI isn’t specifically to pass off someone else’s work as its own (and it’s difficult to prove if AI is directly copying from other sources), it’s not considered plagiarism. At least, in the traditional sense.
So why is using AI viewed as plagiarism by some? They argue that it should count, because AI-generated content depends almost entirely on the original work of others. And this is where some of the legal issues come into play.
Let’s start off by clarifying something: in most cases, plagiarism isn’t illegal in the United States. A plagiarizer may face serious repercussions due to a school or work policy, sure, but it’s more of an ethical issue than a legal one. Copyright infringement, on the other hand, is a different story.
While plagiarism is about using someone else’s work and not giving credit where credit is due, copyright infringement is a little different. It involves not obtaining permission to use an original, copyrighted work in the first place.
You can often fix a plagiarism issue by simply citing the original source, but it's more complicated with copyright infringement. If you don't get permission from the copyright holder, you can face legal action.
And in the world of artificial intelligence, no one knows this better than the AI companies themselves.
While the US Copyright Office (part of the Library of Congress) has issued a statement of policy on AI-generated content, there are still some major legal battles going on regarding AI and copyrighted works. And they don't just involve big companies — authors are getting in on the action too.
For example, OpenAI, the creator of ChatGPT, and Microsoft are being sued by nonfiction book authors and the New York Times in similar copyright infringement cases. Both suits allege that the companies used copyrighted work without permission to train their large language models. This comes after the Authors Guild, whose members include author George R.R. Martin, had already sued OpenAI for copyright infringement.
These are just a few examples of the current legal challenges to AI, so it will be interesting to see how they affect the definition of AI plagiarism and copyright infringement in the future. But in the meantime, if you’d like to use AI in your own content creation process, it’s important to do so responsibly.
The conversation and laws surrounding AI plagiarism and copyright infringement may be murky, but that doesn’t mean you should avoid using this technology entirely. After all, there are some real benefits to using AI in the content creation process — you just need to do so responsibly.
Here are some best practices for responsible AI use in the content creation process: cite your sources, use AI as an assistant rather than a primary author, and run your work through a plagiarism checker before publishing.
So, is using AI plagiarism? From a legal perspective, the answer is no — at least, not yet. While various cases involving AI and plagiarism's legal cousin, copyright infringement, are working their way through the courts, the laws surrounding this situation are still murky at best.
In the meantime, you can help avoid any plagiarism-related issues by following the best practices for using AI tools responsibly in the content creation process. By citing your sources, using AI as an assistant instead of a primary author, and taking advantage of plagiarism checkers, you can ensure that you’re reaping the benefits of AI tools in an ethical way.