Google’s integration of AI-generated summaries into its search engine has placed publishers in a precarious position. They must choose between allowing their content to be used for AI-driven summaries or risking a significant drop in visibility on the world’s most dominant search engine.
While convenient for users, this move by Google poses a potential threat to the web traffic many sites rely on for survival.
The dilemma stems from Google’s AI system leveraging the same web-crawling tool, Googlebot, that powers its traditional search results. Blocking Google’s AI from accessing a site’s content would also mean sacrificing the traffic driven by the search engine, a risk many publishers cannot afford to take.
With Google commanding a substantial share of the search market — a monopoly recently underscored by a federal court — publishers are at a disadvantage in the emerging AI-driven digital landscape.
Smaller publishers are especially worried. As AI gets better at generating content, the fear is that people will find everything they need right on the results page without ever clicking through to the original website.
Fewer visitors means less advertising revenue, and that drop in income could make it hard for many online publishers to stay in business.
“These are two bad options. You drop out and you die immediately, or you partner with them and you probably just die slowly because eventually, they’re not going to need you either,” Joe Ragazzo, publisher of the news site Talking Points Memo, told Bloomberg.
Google maintains that its AI Overviews, the summaries displayed at the top of search results, are part of its commitment to delivering high-quality information and supporting publishers.
A Google spokesperson emphasised the ongoing ‘value exchange’ between the tech company and websites, asserting that Google drives billions of clicks to sites daily. However, the benefits of this arrangement are increasingly questioned by those who rely on Google for traffic.
While some tech companies are paying to license content from publishers, Google has largely refrained from entering into such agreements. This stance and its dominant market position leave publishers with little leverage.
For publishers like Kyle Wiens, CEO of iFixit, the decision to allow or block AI crawlers is fraught with consequences. While smaller AI companies like Anthropic can be blocked without significant impact, blocking Googlebot could result in a catastrophic loss of traffic and customers.
The debate over Google’s AI practices has broader implications for the industry. Search startups trying to compete with the search giant face formidable challenges in building their own web indexes.
Companies like Perplexity and Kagi, for example, find it nearly impossible to match Google’s financial and infrastructural power, particularly when it comes to licensing valuable content from platforms like Reddit.
Another barrier is the ubiquitous robots.txt file, the plain-text file that tells web crawlers which parts of a site they may access. While not legally binding, these files often favour established players like Google — whose crawler publishers cannot afford to block — further entrenching its dominance. Many publishers now advocate for ‘crawl neutrality’ to level the playing field for emerging search companies.
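The asymmetry described above is visible in the robots.txt files themselves. A minimal sketch of how a publisher might block standalone AI crawlers while leaving Googlebot untouched (the user-agent tokens shown are the ones these companies publicly document; treat the exact set as an assumption):

```
# Block Anthropic's standalone crawler
User-agent: ClaudeBot
Disallow: /

# Block OpenAI's crawler
User-agent: GPTBot
Disallow: /

# Googlebot is left alone: blocking it would also remove the site
# from search results, since the same crawler feeds both Search
# and AI Overviews.
User-agent: Googlebot
Allow: /

# Default rule for all other crawlers
User-agent: *
Allow: /
```

Because these directives are voluntary conventions rather than enforceable rules, the arrangement works only to the extent crawlers choose to honour it — which is part of what the ‘crawl neutrality’ advocates want changed.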
The stakes are high as the U.S. Department of Justice considers remedies to address Google’s monopolistic practices, including potentially forcing the company to share its search index data with competitors. The ongoing legal battle underscores the critical importance of publishers diversifying their traffic sources and reducing reliance on any single platform.