
73% of Abstracts in AI Journals are AI-Generated in 2025

How much of AI research in scholarly abstracts is now written by AI? Discover the growth of AI-generated scientific abstracts in artificial intelligence journals.


In recent years, the rapid advancement of artificial intelligence (AI) tools capable of generating human-like text has impacted a number of fields, including scientific publishing.

The emergence of large language models (LLMs), such as ChatGPT, has made it increasingly easy for researchers to draft scientific content, including abstracts.

As these tools become more accessible and integrated into academic workflows, questions arise about their impact on the authenticity, originality, and transparency of scientific communication.

This study focuses on the rate of AI-generated content in journal abstracts from publications devoted to that same topic: artificial intelligence.

Abstracts are often the only portion of a paper read by policymakers, media, and the general public; as a result, their integrity is paramount. 

Using a dataset of thousands of journal abstracts from AI-related publications spanning 2018 to 2025, this study examines the prevalence of AI-generated content over time. 

The findings of this research will contribute to ongoing discussions about the role of AI in academia, the potential need for transparency in AI-assisted writing, and the ethical implications for scientific authorship.

Objectives of the Study

This study investigates the prevalence and trajectory of AI-generated content in scientific abstracts within the field of artificial intelligence from 2018 to 2025. 

Our analysis is guided by these central questions:

  • What proportion of AI-related scientific abstracts are likely to be AI-generated versus human-written?
  • How has the prevalence of AI-generated abstracts evolved from 2018 to 2025? 
  • What are the implications of increasing AI-generated content for scientific integrity, authorship transparency, and peer review?

Together, these objectives aim to provide a data-driven foundation for understanding how generative AI is reshaping the academic discourse in artificial intelligence, highlighting both the opportunities and risks posed by its integration into scientific writing.

Key Takeaways (TL;DR)

  • In 2018, 11.70% of abstracts were identified as likely AI-generated.
  • In 2019, this rose slightly to 13.04%.
  • The rate remained relatively stable at 12.50% in 2020.
  • After a dip to 6.12% in 2021, the rate returned to 12.37% by 2022.
  • From 2022 onwards, the rate of AI abstracts in AI journals accelerated.
  • By 2025, Likely AI abstracts in AI journals climbed to 73%.

These findings highlight that AI tools are changing how research is communicated, especially in AI scholarship, a field at the forefront of these technological developments.

524% Increase in AI Abstracts in AI Journals: 2018 to 2025

The proportion of AI-generated abstracts in scientific journals focused on artificial intelligence has shown fluctuations over the time period studied, with an overall upward trajectory, especially since 2022.

This is noteworthy because it reflects broader trends in the adoption of AI technology within AI scholarship itself.

  • In 2018, 11.70% of abstracts were identified as likely AI-generated. 
  • This rose slightly to 13.04% in 2019 and remained relatively stable at 12.50% in 2020. 
  • However, in 2021, the rate unexpectedly dropped to 6.12%, suggesting a temporary decline in the use of AI-generated content during that period.
  • By 2022, the rate rebounded to 12.37%, nearly returning to its pre-2021 levels. 

2022 marked a turning point: from that year on, the use of AI to write abstracts within AI journals accelerated.

From 2022 onwards, the rate of Likely AI abstracts surged, reaching 73% by 2025.

This marks a substantial shift in writing practices within the field of artificial intelligence itself; from 2018 to 2025, it represents a 524% increase in the rate of AI-generated abstracts. 
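The 524% figure follows directly from the reported endpoint rates; as a quick sanity check of the arithmetic:

```python
# Relative increase in the likely-AI abstract rate, 2018 vs. 2025,
# using the percentages reported above.
rate_2018 = 11.70
rate_2025 = 73.0

increase_pct = (rate_2025 - rate_2018) / rate_2018 * 100
print(round(increase_pct))  # 524
```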

This trend underscores the accelerating influence of AI on both the content and the medium of scientific discourse in the very field it seeks to transform.

73% of Abstracts in AI-Related Journals Are Likely AI in 2025

The growing presence of AI-generated abstracts from 2018 to 2025 within the field of artificial intelligence reflects a noteworthy evolution in how scientific content is created and presented.

Unlike other domains where AI-generated content may still be emerging, the field of AI presents a unique case: researchers are not only developing AI technologies but are also among the earliest adopters of these tools in their writing practices.

The data shows a modest and fluctuating pattern between 2018 and 2022. 

However, following the launch of popular tools like ChatGPT in late 2022, the rate of AI-generated abstracts began to climb sharply.

By 2025, the rate jumped to 73%, indicating that the majority of abstracts in AI journals were likely composed with generative AI.

How Does This Impact Readers?

This trend raises nuanced questions about the integration of AI tools in academic publishing. 

On the positive side, AI-assisted writing tools offer efficiencies in scientific communication, such as helping researchers draft clear, grammatically sound abstracts and reducing the language barrier for non-native English speakers. 

However, in a domain so closely tied to the development of these technologies, there are added concerns: how do we ensure accountability when the tools being studied are also influencing the presentation of the research itself? 

Questions around disclosure, intellectual authorship, and the validity of peer-review processes must be revisited as AI-generated content becomes more prevalent, even in the field that builds these tools.

Final Thoughts

This study offers a lens into the evolving landscape of scientific writing within the artificial intelligence community. 

These results suggest that AI tools are changing how research is communicated and published, especially in artificial intelligence, a field at the forefront of these technological developments.

As the use of generative AI becomes increasingly normalized, academic institutions, journals, and researchers must establish clear frameworks that define appropriate use, ensure transparency, and maintain the integrity of scholarly communication. 

In a field where the subject and the medium are becoming increasingly intertwined, maintaining a strong ethical foundation will be essential to fostering trust and accountability in the next generation of academic publishing.

Wondering whether a post, review, or abstract you’re reading might be AI-generated? Use the Originality.ai AI detector to find out.


Methodology

We investigated the prevalence of AI-generated abstracts in AI-related scientific literature from 2018 to 2025 using publicly available academic metadata and the Originality.ai AI detection tool.

Data Collection

Using the OpenAlex API, we retrieved up to 500 abstracts per year containing the keyword “artificial intelligence” and a non-empty abstract. A custom function reconstructed the original text from OpenAlex’s inverted index. Metadata such as title, journal, and publication date were retained.
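The reconstruction step can be sketched as follows. OpenAlex returns each abstract as an `abstract_inverted_index` mapping every word to the list of positions where it occurs; the helper name below is illustrative, not the study's actual code:

```python
def reconstruct_abstract(inverted_index):
    """Rebuild abstract text from an OpenAlex-style inverted index.

    The index maps each word to the list of positions where it appears,
    e.g. {"deep": [0], "learning": [1]} -> "deep learning".
    """
    positions = {}
    for word, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = word
    # Emit words in position order, separated by single spaces.
    return " ".join(positions[i] for i in sorted(positions))


example = {"models": [0, 3], "beat": [1], "baselines": [2]}
print(reconstruct_abstract(example))  # models beat baselines models
```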

AI Detection

Reconstructed abstracts (minimum 50 words) were analyzed using the Originality.ai API, which provided a confidence score indicating the probability of AI authorship and a binary classification (is_ai_generated) flag. Scans were retried up to three times if errors occurred.
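The retry behavior described above (up to three attempts on error) might look like the sketch below. The detection call itself is abstracted as `scan_fn`, since the exact Originality.ai API request format is not shown in the text:

```python
import time


def scan_with_retries(scan_fn, text, max_attempts=3, delay=1.0):
    """Call a detection function, retrying up to max_attempts on error.

    scan_fn is any callable that takes the abstract text and returns a
    result (e.g. a dict with a confidence score); on repeated failure
    the last exception propagates to the caller.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return scan_fn(text)
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(delay)  # brief pause before retrying
```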

Output and Storage

Results were compiled into a master dataset using pandas and exported as a CSV file, including fields like year, title, abstract, journal, AI score, and scan status.
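A minimal sketch of that compilation step, with hypothetical rows standing in for the real scan results (the field names mirror those listed above):

```python
import pandas as pd

# Hypothetical per-abstract scan results; values are illustrative only.
rows = [
    {"year": 2018, "title": "Paper A", "ai_score": 0.10,
     "is_ai_generated": False, "scan_status": "success"},
    {"year": 2018, "title": "Paper B", "ai_score": 0.92,
     "is_ai_generated": True, "scan_status": "success"},
    {"year": 2025, "title": "Paper C", "ai_score": 0.88,
     "is_ai_generated": True, "scan_status": "success"},
]
df = pd.DataFrame(rows)

# Share of likely-AI abstracts per year, as a percentage.
yearly_rate = df.groupby("year")["is_ai_generated"].mean() * 100

# Export the master dataset for downstream analysis.
df.to_csv("ai_abstracts_master.csv", index=False)
```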

This approach enabled consistent, scalable tracking of generative AI use in scientific abstracts over time.

Madeleine Lambert

Madeleine Lambert is the Director of Marketing and Sales at Originality.ai, with over a decade of experience in SEO and content creation. She previously owned and operated a successful content marketing agency, which she scaled and exited. Madeleine specializes in digital PR—contact her for media inquiries and story collaborations.




