AI overviews, bot traffic & the end of clicks - SEODAY 2025 recap
From apocalyptic warnings to dismissive shrugs—the expert assessments at this year's SEODAY spanned the entire spectrum. The core debate? Whether AI is the undisputed future of online search, or if Google can maintain its dominance in this new arena. One thing was certain: no topic was as omnipresent as AI search. In this article, we’ll dive into the most exciting findings on SEO and GEO (Generative Engine Optimization) from one of Germany’s premier industry conferences.
Recently hosted at the RheinEnergieStadion in Cologne—home of Bundesliga newcomers 1. FC Köln—SEODAY 2025 felt electrified. Almost every presentation focused on AI. It’s no surprise, given that the industry is facing its biggest shift since the launch of Google AdWords in 2000. While some speakers warned of a "digital agency apocalypse," others pointed to ChatGPT’s user data to suggest the threat might be overstated.
The event once again boasted a stellar lineup, featuring industry giants like Sistrix CEO Johannes Beus, EVNext MD Dominik Wojcik, and PEEC AI CMO Malte Landwehr. The Expo area was equally impressive, with booths from ReachX, SEOCATION, and PerformanceLiebe. SEODAY CMO Fabian Rossbacher and his team delivered an excellently organized event packed with highlights. We at hurra.com were thrilled to be there, connecting with partners and friends like Annett and Nils from Robinson GmbH and Nico Kavelar from SE Ranking.
The End of Clicks: Is Visibility the New SEO KPI?
Moving on to the highlight presentations: Sistrix CEO Johannes Beus took the stage to walk the audience through the impact of AI Overviews (AIOs) on Google Search. According to Sistrix’s latest data, AI Overviews are already appearing for 20% of all keywords in the German Google index. The composition of these AI-generated answers is far from uniform. While they almost always include at least one paragraph of text and feature list elements in 98% of cases, images only appear in one out of every four results. Headlines are present in 92% of these snapshots, yet more complex elements like carousels, cards, or video overlays are found in fewer than 11% of all instances.
A critical point of discussion was the attribution of sources. This marks the biggest divergence from competitors like OpenAI; while ChatGPT only provides citations for roughly one-fifth of its answers, Google—true to its search DNA—includes source references in 97% of all AIO results. However, much like the ChatGPT interface, these are often tucked away behind a link icon, typically as a block of four sources. Traditional "blue inline links" are only included in about 19% of cases—a discouraging figure for companies that have historically relied on organic search traffic.
The situation becomes even more challenging: 60% of these AI answers mention no brand names at all. In the remaining 40%, you’re forced to share the spotlight with an average of 2.87 competitors, as Google rarely highlights a single brand in isolation. The importance of E-E-A-T, a recurring theme in the current GEO hype, was clearly evident. Beyond giants like Wikipedia and YouTube, the most cited sources are household names like Amazon, ADAC, and Vodafone.
Website operators in the DACH region must also contend with a new local disadvantage. German-language content is no longer just competing with itself, but also with English-language content that Google translates on the fly. We are witnessing a "globalization of competition" for visibility. Currently, this trend is a one-way street (English to German), though a reciprocal move is expected in the medium term to maintain cultural relevance.
Ultimately, Beus gave SEOs little reason to celebrate, suggesting that visibility—rather than clicks—should be the primary KPI of the future. Conveniently, his company offers the industry-standard "Visibility Index" to measure exactly that. Perish the thought that there might be a connection there! 😉

Analyzing bot traffic: How often does my page appear in AI searches?
Juliane Bettinga from SEOSOON offered conference participants a much brighter outlook. I’m happy to say that her presentation was not only my personal highlight of the day but clearly resonated with the entire room, as she deservedly took home the audience prize for the best keynote.
Bettinga took the audience deep into the technical side of crawling control for AI bots, providing numerous helpful tips for monitoring bot activity. First, she clarified that not all AI bots are the same; operators (similar to Google) have a wide variety of crawlers in their repertoire. Specifically, regular spiders like the GPTBot are used to gather training data and can be classified as "AI trainers." These must be distinguished from the real-time calls made by AI assistants, which can be evaluated in your log files. These are actual ChatGPT user hits, triggered every time the AI performs a web search in "thinking mode" and visits a page as a result.
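The distinction between trainer bots and real-time assistant fetches comes down to the user-agent string in each log line. As a minimal sketch of that classification step (the agent names below reflect what operators have publicly documented, but they change over time, so treat the list as an assumption to verify against current documentation):

```python
# Map known AI user-agent substrings to their role.
# GPTBot gathers training data ("AI trainer"), while ChatGPT-User
# marks a real-time fetch triggered by an actual user query.
AI_AGENTS = {
    "GPTBot": "trainer",
    "OAI-SearchBot": "search-index",
    "ChatGPT-User": "assistant",
    "ClaudeBot": "trainer",
    "PerplexityBot": "search-index",
}

def classify_log_line(line: str):
    """Return (bot_name, role) if the log line matches a known AI agent, else None."""
    for name, role in AI_AGENTS.items():
        if name in line:
            return name, role
    return None

# Example: a combined-log-format line recording a ChatGPT-User hit
sample = ('203.0.113.7 - - [12/Nov/2025:10:15:32 +0100] "GET /blog/ HTTP/1.1" '
          '200 5123 "-" "Mozilla/5.0; compatible; ChatGPT-User/1.0; '
          '+https://openai.com/bot"')

print(classify_log_line(sample))  # -> ('ChatGPT-User', 'assistant')
```

Counting these matches per agent over a day of logs gives you exactly the "how often does my page appear in AI searches" signal the talk was about.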
Attention should also be paid to query fanouts—the process of splitting a single user query into multiple sub-queries. In many ways, these are comparable to the long-tail keywords of classic search, except that LLMs merge the individual search results from these sub-queries back together to generate a final answer. Finally, there are AI crawlers that use asynchronous calls specifically to refine and improve data.
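Conceptually, the fanout-and-merge loop looks like this. All three helper functions below are toy stand-ins for illustration only; in a real system the fan-out and synthesis steps are performed by the LLM itself:

```python
def fan_out(query: str) -> list[str]:
    """Toy stand-in: derive narrower sub-queries, akin to long-tail variants."""
    return [f"{query} definition", f"{query} examples", f"{query} vs alternatives"]

def web_search(sub_query: str) -> str:
    """Toy stand-in: return a result snippet for one sub-query."""
    return f"result for '{sub_query}'"

def answer_query(query: str) -> str:
    sub_queries = fan_out(query)                     # 1. split into sub-queries
    snippets = [web_search(q) for q in sub_queries]  # 2. search each independently
    return " | ".join(snippets)                      # 3. merge into a final answer

print(answer_query("query fanout"))
```

The practical consequence: your page only needs to rank for one of the sub-queries, not the original prompt, to end up in the merged answer.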
Beyond showing how to identify and evaluate AI bots in log files, Bettinga shared several best practices to ensure a site is captured as effectively as possible. For instance, clean source code is just as vital as regularly checking whether your servers can handle the massive traffic spikes caused by bot clusters. It is also fundamental that content can be crawled without JavaScript and that structured data is utilized. While these tips are already well-known in the SEO world, they are still poorly implemented in many places. Unsurprisingly, Bettinga also offers a dedicated tool for log file analysis, as well as a checker to see if AI bots have access to your domain at all. Both can be found in a free version on seosoon.de.
Is your website AI-ready or invisible to users in ChatGPT & Co.?
This was precisely where Bettinga touched on a critical point I addressed in my own keynote at this year’s OMT in Mainz: the accessibility of your website for AI model crawlers. Unlike Google, ChatGPT and its peers are designed as response or dialogue engines and do not rely solely on a static index to feed results. The more complex a user’s request, the more likely the system is to perform a real-time query. While Google primarily uses information already crawled and stored on its servers (periodically checking for updates), LLMs often rely on these "live" crawls.
A domain that blocks these bots will remain invisible in AI results, regardless of how well the content is optimized. Anyone who thinks this is a rare occurrence is mistaken. Just recently, I came across the website of an FTSE 250 company that was—completely unintentionally—blocking every single AI bot.
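Checking for exactly this mistake takes a few lines with Python's standard-library `urllib.robotparser`. A minimal sketch, assuming the AI agent names listed (verify them against each operator's current documentation) and using an inline robots.txt in place of a live fetch:

```python
from urllib.robotparser import RobotFileParser

# AI user agents to audit (assumed current names).
AI_BOTS = ["GPTBot", "ChatGPT-User", "OAI-SearchBot", "ClaudeBot", "PerplexityBot"]

# Example robots.txt that unintentionally shuts out one AI crawler.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /intern/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for bot in AI_BOTS:
    allowed = rp.can_fetch(bot, "https://example.com/")
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

For a production check you would point `RobotFileParser.set_url()` at the live robots.txt instead, and also verify that your CDN or bot-protection layer is not blocking these agents at the network level, which robots.txt analysis alone cannot reveal.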
If you want to ensure your site is not only indexed by Google but also correctly surfaced by AI systems to gain both visibility and customers, let’s talk. We provide the strategic and technical support to make it happen. Feel free to reach out and arrange a free initial consultation.