AI in SEO: Belgium 2025

Dec 23, 2025

Intro

Think Big Act Personal… or one super day in Belgium with amazing people from OMcollective and JetOctopus.

I woke up, and the first thing I did was write this article and my LinkedIn post. The amount of positive emotions and excitement didn’t let me sleep properly all night!

I’m hugely impressed by what OMcollective (https://omcollective.com/) and JetOctopus (https://jetoctopus.com/) did!

It was a one-day “boutique” conference (as I call it) in Ghent, Belgium, about SEO and AI. And it was spot on topic, with no filler and no fluff.

I absolutely loved this format. There were about 200 people, very friendly and relaxed, and it felt like home. I didn’t have to run around with FOMO, stressed that I’d picked the wrong talk 😂 as happens to me every time at huge conferences.

The organisation was great, the venue was great, and the people were amazing! And even the weather and the city of Ghent, which I had a chance to walk around a bit, were absolutely great!

The decision to go was immediate, the moment I saw Julia’s post on LinkedIn. And I didn’t even have to convince my boss; he let me go right away, even in this intense period. So thank you for that, Aleksandr Lilik!

Right before that, I had a weekend with my friends in Marbella, so I had to wake up at 3:50 am to fly from Malaga to Brussels and then take the train to Ghent (not to mention that before that it was Caen-Paris-Malaga-Marbella in 4 days 🤪, so I was a bit tired 😂).

And I have to say - it was absolutely worth it!

The Speakers and Main Insights:

Pieter Serraris with “Tactics on improving your visibility in AI”

(LinkedIn https://www.linkedin.com/in/pieter-serraris-%F0%9F%8D%89-71237438/ )

Pieter gave a very practical and well-structured talk about tactics that might help increase visibility in LLMs.

Some of these principles I have been preaching myself for quite some time, and I totally agree with Pieter on them. For example:

- Following the E-E-A-T concept is helpful for AI visibility. The content has to be unique, with precise data and examples, come from a recognised expert, and use authoritative sources. And it has to be up to date and secure.

- The page and content have to be well structured: an H1 and other headings and subheadings, lists, tabs, FAQs, comparisons, a summary at the beginning, conclusions at the end, schema markup, and relevant, proper elements in each section of the page. And all of that has to be served with “contextual thinking” in mind, from an “angle of questions and answers” (see the small schema sketch after this list).

- Good loading speed, proper server response, and avoiding serving important content with JavaScript definitely help all types of bots, including generative AI and LLM crawlers. 
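To make the schema-markup point concrete, here is a minimal sketch (my own illustration, not from Pieter’s slides) of generating FAQPage JSON-LD, one of the structured-data types mentioned above. The question and answer are made-up placeholders; the schema.org vocabulary itself is standard.

```python
import json

def build_faq_schema(faqs: list[tuple[str, str]]) -> str:
    """Build schema.org FAQPage JSON-LD from (question, answer) pairs."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in faqs
        ],
    }
    return json.dumps(data, indent=2)

# Hypothetical example content; embed the output in the page
# inside a <script type="application/ld+json"> tag.
print(build_faq_schema([
    ("What is AI visibility?", "How often LLMs cite or mention your brand."),
]))
```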

And here are the most insightful ideas, which gave me little Eureka moments because I had never looked at these things from the AI performance perspective:

- Fresh content attracts LLMs and enhances the trustworthiness of the website: 95% of ChatGPT citations are younger than 10 months. So keeping content up to date is essential for visibility in AI.


- Be the answer, and try to be the ultimate source of all the additional information the user might need, to fully satisfy the intent. But keep in mind both the main and the additional intents of the user in AI search.

- Be the easy choice: LLM bots prefer to take all the needed information from one website rather than from multiple different websites.


- Personas are back: we have to think of the “avatar(s)” of our typical user(s) and make different content (types of pages/blocks/CTAs, etc.) for different personas.

- Check the sources for your topic and create your own: research, collaborations, case studies, announcements, etc., published on authoritative websites (maybe by your partners) so they have a chance to be picked up by LLMs as a source.

- Make your content multimodal: it gives more value to users and AI bots and makes your content more attractive to them.


- Create your own sources (this deserves repeating). For search bots, we work on building authority and earning backlinks from other websites, using digital PR, viral content, outreach, etc. With AI bots, we also have to think about the sources a bot can use in its answers… and we can and should create them ourselves, by collaborating with other brands, doing research, sharing case studies, etc.

- AI, SEO, and logs insights (a minimal log-parsing sketch follows at the end of this section):

  • In the log files, we can see what people have searched for in AI and whether we could optimise our content to perform better.

  • In the log files, we can see whether AI generates (“hallucinates”) nonexistent pages and decide how to use this information.

  • If you see “RAG” bots in your log files, it means your website has been used as a source in ChatGPT.

- Here are the websites that appear as sources more often than others, so being mentioned there might increase your chances of getting into AI results.

ChatGPT – Sources Featured in Prompts

 1. reddit.com – 13.5%

 2. wikipedia.org – 11.9%

 3. techradar.com – 2.8%

 4. forbes.com – 2.7%

 5. investopedia.com – 1.7%

 6. tomsguide.com – 1.5%

 7. amazon.com – 1.4%

 8. nerdwallet.com – 1.2%

 9. businessinsider.com – 1.1%

 10. bankrate.com – 0.94%

Google AI Mode – Sources Featured in Prompts

 1. amazon.com – 24.4%

 2. google.com – 20.7%

 3. youtube.com – 16.7%

 4. bankrate.com – 15.3%

 5. nerdwallet.com – 14%

 6. reddit.com – 11.4%

 7. linkedin.com – 11.4%

 8. quora.com – 10.2%

 9. investopedia.com – 10%

 10. forbes.com – 9.5%

- We are still figuring out the most precise and optimal ways of measuring performance in LLMs and AI. At the moment, we can use log files, brand trackers (simulated data), and web analytics (real data).

Brand trackers (Rankscale, Profound, Promptwatch) can be used to check the visibility of your brand by sending multiple relevant prompts to LLMs, but keep in mind that this is still simulated data. Real data comes from GA and log files.
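To make the log-files point practical, here is a minimal sketch that counts hits from common AI crawler user agents in a raw access log. The user-agent substrings are the bots’ published names (GPTBot for training, OAI-SearchBot for search, ChatGPT-User for the RAG/user bot, plus the Perplexity and Anthropic crawlers); the log path and the simple substring matching are my assumptions, so adapt them to your setup.

```python
from collections import Counter

# Published user-agent substrings: training, search, and user (RAG) bots.
AI_BOTS = ["GPTBot", "OAI-SearchBot", "ChatGPT-User",
           "PerplexityBot", "ClaudeBot", "Claude-User"]

hits = Counter()
with open("access.log", encoding="utf-8") as log:  # path is an assumption
    for line in log:
        for bot in AI_BOTS:
            if bot in line:
                hits[bot] += 1
                break  # count one bot per log line

for bot, count in hits.most_common():
    print(f"{bot}: {count} requests")
```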

Julia Nesterets with “AI bots: what they crawl and why”

(LinkedIn: https://www.linkedin.com/in/julia-nesterets-bab6a68/ )

Julia Nesterets (co-founder of JetOctopus, Superwoman, my one and only mentor for life) talked about “How and why AI bots crawl our websites, and how we can use this knowledge”.

It is something that very few people in the industry are talking about, and it’s also very important to highlight that the conclusions Julia shares are drawn from the real data that JetOctopus’s log analyzer has been collecting for years.

Here are some takeaways that stood out for me:

- There are different types of AI bots: training bots, search bots, and user bots, and each of them has its own function. The most important for us are:

  • AI Search Bots: send requests to a website (HTTP requests), retrieve content like text, tables, or structured data, and collect the information needed to answer the question.

So, when you see them in the logs, it means AI bots have requested content from your site to answer a query.

  • AI User Bots: read the data retrieved by the search bot, summarize or interpret it in natural language, and send the response back to the user.

So, when you see them in the logs, it means AI bots are reading the retrieved content and generating answers.

- AI bots have their preferences, which we can track. They also have a crawl budget, and just as with search engine bots, it can be wasted, and we can fix that.


- We can analyse the AI logs and try to understand what impacts AI bots' behaviour, like the amount of content on the page, the number of internal links, etc.

- Through the log files, we can track LLMs' URL hallucinations and work with them! We can create and optimise new URLs if they better match what people are searching for, or we can set up redirects so users land on the pages they need (see the sketch after this list)!


- AI bots don’t visit llms.txt 😂 so there is no point in creating it or bothering about it!
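Here is a minimal sketch of the hallucinated-URL workflow, under assumptions about the log format: find 404s requested by AI bots and rank them by frequency, so you can decide which ones deserve a real page and which just need a 301. The regex assumes a combined-format access log; field positions may differ on your server.

```python
import re
from collections import Counter

# Request path and status code in a combined-format access log (assumption).
LINE_RE = re.compile(r'"(?:GET|HEAD|POST) (\S+) [^"]*" (\d{3})')
AI_BOTS = ("ChatGPT-User", "OAI-SearchBot", "PerplexityBot", "ClaudeBot")

missing = Counter()
with open("access.log", encoding="utf-8") as log:  # path is an assumption
    for line in log:
        if not any(bot in line for bot in AI_BOTS):
            continue
        match = LINE_RE.search(line)
        if match and match.group(2) == "404":
            missing[match.group(1)] += 1  # a URL the LLM made up

# The most-requested nonexistent URLs are candidates for a new page or a 301.
for url, count in missing.most_common(20):
    print(f"{count:>5}  {url}")
```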

Sergei Bezborodov with “AI-driven internal linking”

(LinkedIn https://www.linkedin.com/in/sergebezborodov/ )

Sergei (the technical guru of bots and logs, co-founder of JetOctopus) gave a very fun and straightforward talk, explaining in simple words (as simple as HE could) AI automation for improving internal linking.

His presentation and examples focused on e-commerce websites. But some of the principles are universal, and here is what I think is most insightful and useful for iGaming as well:

- If your internal linking is not efficient enough, you don’t have to rebuild it from scratch; building a layer on top can be enough.

- Even with an automated AI internal-linking process, don’t try to fix the internal linking of the entire website at once; work in small batches of pages, page types, directories, etc. So: crawl, analyse, interlink, deploy, repeat (a rough sketch of such a loop follows below).
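Sergei didn’t prescribe a single implementation, so what follows is purely my own rough sketch of one batch of that loop, assuming you already have the page texts and using cosine similarity of embeddings to suggest link candidates. fetch_batch, get_embedding, and deploy_links are hypothetical placeholders for your own crawler, embedding model, and CMS integration.

```python
from itertools import combinations

def cosine(a: list[float], b: list[float]) -> float:
    """Plain cosine similarity, no external dependencies."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def suggest_links(pages: dict[str, list[float]], threshold: float = 0.8):
    """Yield (url_a, url_b) pairs whose pages look similar enough to interlink."""
    for (url_a, vec_a), (url_b, vec_b) in combinations(pages.items(), 2):
        if cosine(vec_a, vec_b) >= threshold:
            yield url_a, url_b

# One batch of the loop. fetch_batch / get_embedding / deploy_links are
# hypothetical stand-ins for your crawler, embedding model, and CMS:
#
# for batch in fetch_batch("/blog/", size=200):                  # crawl
#     pages = {url: get_embedding(text) for url, text in batch}  # analyse
#     deploy_links(list(suggest_links(pages)))                   # interlink + deploy
```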
