We analyzed 29,000 reviews of 123 products on TrustRadius, a B2B software review site. Our AI detector examined the text of each review to determine how much of the content was likely written by a human and how much by AI.
More Than 10.7% of Reviews on TrustRadius Since the Launch of ChatGPT Are Suspected of Being AI-Generated
We calculated what percentage of reviews posted each month were AI-generated and tracked how that figure changed over time. The share of AI-generated reviews grew after GPT-2 was introduced in February 2019 and after GPT-3 followed in June 2020. However, there was a decline around the time ChatGPT became available in November 2022. In the first 10 months of 2022, an average of 13.5% of reviews were flagged as AI-generated; from November 2022 through September 2023, that average fell to 10.5%, a decrease of 22.5%. This decline stands in direct contrast to the steep post-ChatGPT increase in AI-generated reviews that Originality.AI found in its study of Capterra, a similar software-review site.
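For readers who want to see how this kind of monthly trend can be computed, here is a minimal sketch of the aggregation. The file name, the `posted_at` and `ai_score` columns, and the 0.5 "likely AI" threshold are assumptions for illustration only, not details taken from the study.

```python
# Illustrative sketch of the monthly aggregation described above.
# Column names, the 0.5 threshold, and the date cutoffs are assumptions.
import pandas as pd

reviews = pd.read_csv("trustradius_reviews.csv", parse_dates=["posted_at"])
reviews["is_ai"] = reviews["ai_score"] >= 0.5  # detector verdict per review

# Share of AI-flagged reviews per calendar month, as a percentage
monthly = (
    reviews
    .groupby(reviews["posted_at"].dt.to_period("M"))["is_ai"]
    .mean()
    .mul(100)
)

pre = monthly.loc["2022-01":"2022-10"].mean()   # Jan-Oct 2022 average
post = monthly.loc["2022-11":"2023-09"].mean()  # Nov 2022-Sep 2023 average
print(f"pre: {pre:.1f}%  post: {post:.1f}%  relative drop: {(pre - post) / pre:.1%}")
```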
The drop could be due to fewer AI-generated reviews being submitted to TrustRadius, or to the site using AI-detection tools or other procedures to reduce the number of such reviews that get published. TrustRadius states that it rejects 47% of the reviews it receives for a number of reasons, including being too short, lacking useful details, or coming from a “suspicious user.”
Note that the ~5% of reviews identified as AI-generated before 2020 represent the false positives that occur with AI detectors. See our accuracy study for more details: https://originality.ai/ai-content-detection-accuracy/
High Ratings of 8 to 10 Had 75% More AI-Generated Reviews
TrustRadius ratings are on a scale of 1 to 10. We found that, since GPT-2 became available in early 2019, 13.8% of reviews with a rating of 8 to 10 were written by AI, compared with 7.7% of reviews with a rating of 1 to 3. In other words, reviews with higher ratings were 75% more likely to be AI-generated than reviews with lower ratings.
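The "more likely" figure is a relative difference between the two detection rates. Continuing the hypothetical `reviews` DataFrame and `is_ai` flag from the sketch above, and assuming a `rating` column for illustration, the comparison looks like this:

```python
# Relative-difference comparison between rating bands, reusing the
# hypothetical `reviews` DataFrame and `is_ai` flag from the earlier sketch.
# The `rating` column name is an assumption for illustration.
high = reviews.loc[reviews["rating"].between(8, 10), "is_ai"].mean()
low = reviews.loc[reviews["rating"].between(1, 3), "is_ai"].mean()
print(f"rated 8-10: {high:.1%} AI-flagged | rated 1-3: {low:.1%} AI-flagged")
print(f"high-rated reviews are {high / low - 1:.0%} more likely to be flagged")
```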
Anonymous, “Verified User” Reviews Were 8.7% More Likely to Be AI-Generated
TrustRadius allows users to post reviews anonymously, listed as a “Verified User” along with a job title. Anonymous reviews were slightly more likely to be written by AI than reviews with the user’s name attached, at 13.6% vs. 12.5%. This TrustRadius finding contrasts with what was found at G2.com, another software-review site, where anonymous reviews were less likely to be AI-generated. Another difference: in the period since ChatGPT became available, the gap between anonymous and named TrustRadius reviews has disappeared. That equalization could be because the same processes that have lowered the number of AI-generated reviews at TrustRadius overall have worked equally well on both types of reviews.
Using AI to generate review text presents challenges for anyone trying to use those reviews to make decisions, whether for personal or business purposes. How do you feel about a review that was AI-generated? The site may be making an effort to keep such reviews from appearing, but some persist. The findings in this study show how AI detection can change both how we use sites like TrustRadius and how those sites curate their content.
If you have any questions about this study, please contact us.
We believe it is crucial for the reported accuracy of AI content detectors to be open, transparent, and accountable. The reality is, each person seeking AI-detection services deserves to know which detector is the most accurate for their specific use case.