Perplexity AI Faces Legal and Ethical Concerns Over Content Usage
Illustration: Perplexity's AI search engine and the legal and ethical concerns surrounding its use of media content.

Author: Alicia Shapiro

Estimated Read Time: 3 minutes

Perplexity, an AI search startup, is currently under fire for allegedly copying media content directly for its "Perplexity Pages" feature. This practice has raised legal and ethical questions regarding how the company uses and attributes content.

Perplexity Pages: Controversy and Content Usage

Perplexity Pages allows users to compile information on specific topics. However, reports indicate that Perplexity itself may be misusing the tool. Some Perplexity Pages reportedly contain articles lifted verbatim from publications such as Forbes, CNBC, and Bloomberg, including content behind paywalls. For instance, a post about former Google CEO Eric Schmidt's secret drone project includes sections allegedly copied from a Forbes article, complete with a graphic. These pages are indexed by Google Search and surfaced in Google AI Overviews, significantly increasing their visibility.

The issue is compounded by how sources are credited: they are identified only by small logos linking to the original articles, which are easy to overlook. According to Forbes, the outlet names are not mentioned in the text itself, which can obscure the content's origin.

CEO's Response to the Allegations

Aravind Srinivas, CEO of Perplexity, addressed these concerns on social media, admitting that the Pages feature has "rough edges" and promising improvements based on user feedback. He acknowledged that it should be easier to locate and highlight contributing sources. Srinivas also defended Perplexity by stating that, unlike many other AI platforms, Perplexity does make an effort to credit sources.

Srinivas argued that Perplexity’s core search product displays sources more prominently. He suggested that Perplexity’s approach to citing sources is similar to traditional journalism, where new articles reference primary sources from other publications.

“Take journalism where you're writing a new article. What do you do? You say, according to the New York Times, you cite others. That's what we are also doing,” Srinivas explained.

Misconceptions About Journalism and AI

Srinivas’s comments highlight a fundamental misunderstanding of journalism and the role of AI in content aggregation. Journalism involves curating stories, making editorial decisions, and presenting diverse human perspectives, all of which shape public opinion. In contrast, AI content aggregators like Perplexity compile information without the editorial oversight that is a hallmark of traditional journalism.

Furthermore, traditional news outlets do not claim to have all the answers, in stark contrast to Perplexity's marketing, which suggests comprehensive knowledge. That stance undermines the plurality and accountability inherent in journalism.

Legal and Ethical Implications

To avoid potential legal issues, Perplexity could rewrite web content more thoroughly. However, that strategy increases the risk of inaccuracies and misinformation. The broader dilemma facing AI search engines is that their success depends on web content while simultaneously rendering the original articles redundant, undercutting the incentive to produce the very material they rely on. This creates a sustainability problem.

Even with improved source citations, users may not click through to the original articles if the content is already available on the AI platform. This diminishes traffic to the original publishers, impacting their revenue and sustainability.

Google’s AI Overviews have also been criticized for similar practices, often using content from Reddit or lesser-known sites to avoid legal disputes. Google has been boosting Reddit in search results and investing heavily in the platform as part of this strategy.

OpenAI, the developer behind ChatGPT, takes a different approach by partnering with select publishers, ensuring their content is displayed preferentially and linked within ChatGPT. While this method mitigates some issues, it gives OpenAI considerable control over media diversity and which outlets benefit from partnerships.

The Future of AI and Journalism

The legal and ethical challenges posed by AI search engines like Perplexity, Google, and OpenAI need to be addressed by lawmakers and courts. Control over information access is a powerful tool that influences publishers and the broader information ecosystem.

As AI platforms continue to evolve, there is an urgent need for a sustainable AI ecosystem that values journalistic work and balances the interests of AI companies, media outlets, and the public. Policymakers must act swiftly to ensure that AI’s integration into content aggregation does not undermine journalism or reduce information diversity.