Some Takeaways from Google Search Central Live Zurich Event (plus my thoughts)

written by Gagan Ghotra


On 12th December, Google held a live Search Central event at its Zurich office. I wasn't at the event; I'm based far from Zurich, in Melbourne, Australia (the land down under, as some say). But I noticed this post on X from Jonathan Jones (SVP of Strategy at Forbes), who was there.

I found the post pretty interesting. Although Jonathan works for Forbes, which has been abusing its site reputation, I still respect him. In that post he primarily talked about what Googler Danny Sullivan said.

But in a recent post on his own site, he also covered what Googlers Martin Splitt and John Mueller said during the event. The topics included SEO, Gemini and Search Console, AI-first content creation, disallowing AI crawlers, and people-first (not Google-first) content. I suggest reading Jonathan's post on his site.
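On the disallowing-AI-crawlers topic, for anyone wondering what that looks like in practice: blocking is done in robots.txt using each crawler's published user-agent token. A minimal sketch (GPTBot is OpenAI's crawler token and Google-Extended is Google's token for controlling AI training use; whether you want to block them is your call, and blocking Google-Extended does not affect normal Googlebot indexing):

```
# robots.txt — sketch of opting out of common AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```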

First takeaway – Expanding Topics Without Losing Focus

The first takeaway Jonathan mentioned from the event is about expanding topics without losing focus.

Googler Danny Sullivan said, “Start small, gauge audience reaction, and test adjacent areas.” Danny illustrated this with an example: a site known for skiing content starts publishing about snowboarding. That's an adjacent topic that makes sense to readers. A leap from skiing to car mechanics, however, would likely confuse users and search engines alike.

Expanding beyond what a site is about has been a topic of discussion recently, in the context of big publishers literally writing about anything they want and, in some cases, abusing ranking systems too. Google has introduced some changes in response, and its Search Central documentation now says:

We also have systems and methods designed to understand if a section of a site is independent or starkly different from the main content of the site. By treating these areas as if they are standalone sites, it better ensures a level playing field, so that sub-sections of sites don’t get a ranking boost just because of the reputation of the main site. As we continue to work to improve these systems, this helps us deliver the most useful information from a range of sites.

Which means that if some pages are starkly different from the main content of the site, Google will treat them as standalone, and those pages will not get a domain-level ranking boost.

I think this change is great because it curbs the obvious spam from big publishers who publish literally anything and, even with unhelpful content on the page, show up in top positions in search results.

But as of now I don't know where things stand in terms of Google actually implementing systems to demote starkly different content in search rankings. I've asked Danny on X, but unfortunately he never replies to me 😔. Anyway, I'm excited about Google executing this at scale, hopefully in early 2025. (Why am I excited about this? I recently wrote a whole LinkedIn post explaining it.)

Second takeaway – Freelancers vs In-House Content

Googler Danny Sullivan said, “Freelance content isn't inherently an issue. The problem arises when it's used to take advantage of a site's signals without building something new.”

I think Danny said this in the context of the site reputation abuse policy. When Google added nine FAQs to that policy on 6th December, one of the questions was “Does freelance content violate the site reputation abuse policy?” and in its answer Google clearly said that, no, freelance content by itself is not abuse.

But in the question explaining what Google considers “third-party content” in the context of site reputation abuse, it clearly mentioned that freelancers are also third parties. That's what I think Danny was clarifying: freelance content isn't inherently an issue. Using freelancers to abuse domain-level signals and manipulate Google's search results is what violates the site reputation abuse policy; otherwise it's fine.

Site reputation abuse is not about whether freelancers or in-house writers produce the content. It's about producing content aimed at gaming the search systems by leveraging the site's reputation, and thereby manipulating Google's search results.

That's why I posted this on X a couple of days back …

Third takeaway – Core Updates: Incremental and Continuous

Googler Danny Sullivan said, “Thousands of small changes happen annually, but core updates refine how we assess relevance and quality.” He further mentioned that Google would like to reach a point where updates are so regular that people stop noticing them altogether.

Even right now, as I'm writing this (on 15th December), Google is rolling out the December core update, and just a couple of weeks back, on 6th December, the rollout of the November core update finished. I think back-to-back updates are fine; if the Search team chooses to roll out frequent updates, that's probably for the better, and it may help fix the algorithms they seem to have messed up.

And I hope these frequent updates will lead to truly first-hand, helpful content rising in search results and unhelpful pages from branded sites going down in the rankings!

Fourth takeaway – AI in Content Production

Aside from what the Googlers said about AI content, which you can read in the screenshot above, my personal strong recommendation to site owners is not to use AI content.

My final thoughts

Things are changing rapidly in the SEO world, and continuously improving your SEO strategy to align with Google's current recommendations, or testing your own strategies, is the only way forward to succeed in Google Search. As Head of Search Elizabeth Reid said recently, Search is going to evolve into a system that is more multimodal, and accessing information through it will feel more effortless.
