On 5th March, Google started rolling out the March Core Update, and alongside it they also rolled out a Spam Update targeting scaled content abuse, expired domain abuse and site reputation abuse (the last one came into effect from 5th May).
The Spam Update finished rolling out on 20 March, while the Core Update finished on 19 April – although Google only announced it was done on 26 April. Seems like they forgot, or it's a classic example of what happens in large corporations: one part of the company is not at all aware of what's going on in another part of the business.

With the rollout done, Google also shared a feedback form asking for feedback on Search Results for specific queries. If you're seeing something ranking for a query when it shouldn't, I would recommend filling this form out and submitting it. Also, don't do this for your competitors' sites LOL.
I closely monitored what was going on and which sites were getting affected by this core update. Below is a summary of some important points about this update, along with some of my personal observations.
Reduction of Low Quality & Unhelpful Content in Search Results
Google initially said that this update would reduce low-quality, unoriginal content in search results by 40%, but on 26 April they announced that it's actually 45%.
Which means, according to Google, unhelpful content in search results was reduced by 45% due to this update. But to me this seems dishonest, because Google hasn't even shared what they mean by low-quality, unhelpful content – they just came out saying this without any explanation or any other relevant data point.
SEO professional Marie Haynes has shared a story of her friend's recipe site getting negatively impacted by this update, losing traffic & revenue. Another SEO professional, Lily Ray, mentioned in her YouTube video that Google went a bit too far with this update. She said she has seen 5 to 10 examples of sites which were hit brutally and lost 70–80 percent of their traffic – not justified at all, as those sites closely align with Google's guidelines.
Personally, I'm also seeing spammy content & sites ranking everywhere – Google is saying one thing and the actual reality on the Search Results page is totally different.
Updated Spam Policies
Spam Policies – Google said that scaled content abuse, expired domain abuse and site reputation abuse will become part of Google's spam policies. Out of these three new additions, I'm specifically interested in expired domain abuse. I've been tracking how Google just lets expired domains rank for everything, and I've shared most of my research about this on my Twitter (now X) account.
Here is what Google now says about Expired Domain Abuse:

Google describes 'expired domain abuse' as purchasing and repurposing an expired domain name to manipulate search rankings by hosting content that provides little to no value to users – affiliate content on a site previously used by a government agency, commercial medical products being sold on a site previously used by a non-profit medical charity, casino-related content on a former elementary school site.
Out of these examples, I've specifically shared many cases where people bought an expired domain of a government agency and then started publishing content targeting high-search-volume keywords (mostly newsy kind of content).
The most recent example is a couple of expired domains getting hit by the March Core Update, including a domain which used to belong to the Animal Welfare Board of India. The board later moved to another domain name, leaving awbi .in free. Someone bought that domain and started publishing a lot of spammy content on it, but fortunately Google caught this one and hit it.
Funny enough, this site was ranking for the high-volume search query 'taylor swift age' (checked on 1 April), but as of 9th May it's not ranking anywhere for the same query. This site was also ranking really well for another high-volume query, 'lok sabha elections dates', but now that too is gone.
I think Google is getting better at catching expired domain abuse and not ranking these sites, but they still need to do a lot more, because even today I see lots of expired domains ranking in Search Results.
I'm saying Google needs to do a lot more because I know there are still many potential expired domains available out there which can be bought by spammers and abused.
Also, around 3 or 4 years back, the Government of India decided to move most of their sites from .in, .org and .com domains to subdomains of .gov.in. This left many domains open, which over the years have been bought by spammers and abused as well.
Google needs to do more to control expired domain abuse, and they have to be clear in their external communication about it. Back in September last year, Google's Gary Illyes did a quick Q&A with Danny Goodwin, and one of the points Danny raised was 'Expired domain signals are not inherited', to which Gary replied: if a domain expires and somebody buys it, any signals the site had accumulated will not be transferred to the new domain owner. Google knows when a domain expires.
After reading this, I tweeted that 'expired domain signals NOT getting passed if the owner changes doesn't seem correct from what I have observed', and I even shared an example of an expired domain which was ranking because Google was clearly transferring accumulated signals and giving the site an insane ranking boost in Search Results.

What ranking signals does a site have? – Google answered this question
In the announcement, Google addressed the question 'What ranking signals does a site have?' and answered: Our core ranking systems are primarily designed to work on the page level, using a variety of signals and systems to understand how to rank individual pages. We do have some site-wide signals that are also considered. Please note: some third-party services provide “reputation” or “authority” scores for sites. These don't correspond to any of Google's own signals nor come from Google.

I found it really interesting that Google clearly stated their ranking systems are primarily designed to work on the page level, using different signals to understand how individual pages should be ranked – yes, they did mention that some site-wide signals are also considered while ranking pages.
But that 'primarily', I think, says a lot about the importance of on-page SEO and what's actually on the page. For years I've focused my SEO strategy extensively on on-page optimisations before focusing on things like backlinks – to be frank, this has worked out really well for all of the clients I've worked with since starting SEO consulting back in 2019.
That's why even today I recommend business owners focus on optimising their pages first, whether that means adding more content or adjusting UI/UX elements – do that first, and then focus on building backlinks either through Digital PR or traditional link building methods.
If you're looking to build backlinks through Digital PR or even traditional link building methods, I would highly recommend following Google's link spam policies, because excessively building links may lead to a manual action.
Going back to exactly what Google said in this announcement, which I've quoted above – 'Our core ranking systems are primarily designed to work on the page level' – I think building links still works and helps to improve the ranking of a site (or its pages), even though Google's Gary Illyes has been saying otherwise. Last September, while speaking at PubCon, Gary said that links are no longer a top-three ranking signal.
But anyway, going back to the main point, 'what ranking signals does a site have?' – I totally agree with what Google said: their ranking systems (signals) are designed to work at the page level. So if you're looking to improve the ranking of your site's pages for relevant keywords, I would recommend fixing the ON PAGE SEO elements first and then looking into other optimisations like link building.
AI Content – Google’s recommendation about it And Publishing content at Scale
Google’s viewpoint on AI content – In the March Core Update announcement Google answered a question about AI content ‘Is this a change in how Google views AI content in terms of spam?‘
Google answered this question as Our long-standing spam policy has been that use of automation, including generative AI, is spam if the primary purpose is manipulating ranking in Search results. The updated policy is in the same spirit of our previous policy and based on the same principle. It’s been expanded to account for more sophisticated scaled content creation methods where it isn’t always clear whether low quality content was created purely through automation.
Google clearly said that using automation techniques, including generative AI, is spam if the primary purpose is manipulating rankings in Search Results. While reading this, I noticed Google's use of the word 'primary' – what does this even mean? Does it mean that if someone uses AI as a filler, or perhaps to generate templates for their pages and then fills them in, pages generated this way would be considered spam? Perhaps NOT.
But through the word 'primary', I think Google is saying that if a site is just using AI content generation tools (like ChatGPT) and publishing the output without any editing or additional content, then those pages would be considered spam and will not be ranked in Google's Search Results.
Also Google mentioned that “It’s been expanded to account for more sophisticated scaled content creation methods where it isn’t always clear whether low quality content was created purely through automation“
I think by these words Google means that even if a site is creating a lot of content (many pages) on a regular basis and the content of those pages is written by humans (but is of low quality), Google will still consider it spam if the primary purpose is to manipulate rankings.
Similar to 'Is this a change in how Google views AI content in terms of spam?', Google answered another, almost identical question about generating content at scale: What's different from the old policy against “automatically-generated content” and the updated policy against “scaled abuse”?
To this question, Google answered: Our new policy is meant to help people focus more clearly on the idea that producing content at scale is abusive if done for the purpose of manipulating search rankings, and that this applies whether automation or humans are involved.
I think by saying 'producing content at scale is abusive if done for the purpose of manipulating search rankings', Google wanted to clarify their own announcement from February last year about using AI to generate content, in which they said that using AI content with the primary purpose of manipulating search rankings is against their guidelines.
But with this new announcement, Google is extending that advice to say that creating content at scale, whether using AI or humans, is spam (earlier it was just automated methods like AI LLMs) if the primary purpose of that content is to rank in Search Results and get traffic without offering any value.
Google also clarified that there is no single system for identifying helpful content – referring to what Google announced back in August 2022 and updated in September 2023. With the March Core Update, Google is saying that their core ranking systems use a variety of signals and systems to judge the helpfulness of individual pages. This further emphasizes the importance of on-page SEO and creating content which is valuable, unique, user-centric and actually helpful for users.
The helpful content classifier used to work at the site level, but it's now baked into the core ranking systems and is more focused on page-level signals. Well-known SEO professional Glenn Gabe has written an amazing blog post about this which I'd recommend reading as well – How the transition of Google's helpful content system to its core ranking system is supposed to work.
I think Google here is trying to tackle certain publications based in India or other Asian countries, where labour costs are relatively cheap, so publications are able to hire lots of writers and churn out hundreds of pages every day whose primary purpose is just to rank & get traffic, without offering real value to readers.
To be honest, most of these human-churned-out pages are not useful and are clearly not written by experts. An example off the top of my head is Javatpoint, which started as a programming tutorials site in 2011 but now churns out thin, not-at-all-useful content on random topics – and from Semrush analytics it seems Google isn't hitting this site; instead, its pages' rankings are improving continuously.
But with the rollout of these new spam policies, and Google clearly mentioning in their announcement that scaled content, whether created by AI or humans, primarily to rank in search results is against their policies, I expect Google to hit this and other similar sites moving forward.
If your site is also publishing content at scale that's completely off your site's main (or primary) topic, I would recommend being cautious, because in the March Core Update announcement pages Google has clearly stated that they consider these kinds of practices spam – so sooner or later Google will be hitting these kinds of sites.

Site Reputation Abuse
Another thing Google mentioned in their announcement is site reputation abuse, which they defined as follows: site reputation abuse is when third-party pages are published with little or no first-party oversight or involvement, where the purpose is to manipulate Search rankings by taking advantage of the first-party site's ranking signals.
I think this is specifically aimed at sites which have gone too broad over time – they started out writing about one specific topic, but over time began writing about other topics, specifically ones which tend to have higher search volume.
The proof is in the pudding: news sites like Forbes have published hundreds of articles about puppies. Guess what – the primary purpose of these articles is to leverage the authority of Forbes as a domain, rank for puppy-related queries, and ultimately earn affiliate commissions through clicks on the product links Forbes has placed in these articles.
I think Google's purpose in introducing site reputation abuse is to control the ranking of content which is published just to get traffic and not to actually help users. Although the rollout was announced back in March, Google gave publishers almost two months to clean things up.
This policy started kicking in on 7th May, as confirmed by Google in this tweet. The rollout involves two components: manual actions and an algorithmic component. Although it started rolling out on 7th May, Google confirmed they're doing manual actions first and the algorithmic component will come later.
I'm not sure whether the algorithmic component of site reputation abuse has started rolling out (as of 20th May) – I've asked Google but they haven't replied. From the recent volatility in search rankings it seems the algorithmic component is rolling out, but it could be something else as well.
Considering that site reputation abuse is now officially part of Google's spam policies, my personal recommendation for site owners would be to not publish pages with little or no first-party oversight or involvement, and to avoid publishing these kinds of pages just for the sake of manipulating search rankings by taking advantage of your site's authority.
As with any other change or update to a spam policy, Google's goal is to improve the quality of search results and to rank content which is helpful & created for people, not just for search engines.
Helpful Content and Systems
Google also answered a couple of questions about helpful content. The first one is: “Is there a single helpful content system that Google Search uses for ranking?”
Google answered this as “There is no one system used for identifying helpful content. Instead, our core ranking systems use a variety of signals and systems.”
Which is clearly saying: nah, there isn't one specific system which checks whether content is helpful – rather, the core ranking systems as a whole use different signals/systems to judge whether content is helpful.
I would also recommend reading through this Google document on its different ranking systems, like BERT, Crisis Information Systems, Deduplication Systems and so on. Note: this page about different ranking systems still mentions the Helpful Content System, but I asked Google and they confirmed that it's not accurate and they will be updating the page in the future.
Moving to another question, Google answered 'How can I check if my content is helpful?' with: 'Our help page on how to create helpful, reliable people-first content has questions that you can use to self-assess your content.'
Although Google has provided some specific questions which can be used for assessing content, to be honest it's really hard to judge whether a piece of content is helpful or not.
A couple of other questions Google provided answers for:
Do Google’s core ranking systems assess the helpfulness of content on a page-level or site-wide basis?
Our core ranking systems are primarily designed to work on the page level, using a variety of signals and systems to understand the helpfulness of individual pages. We do have some site-wide signals that are also considered.
Will removing unhelpful content help my other content rank better?
Our systems work primarily at the page level to show the most helpful content we can, even if that content is on sites also hosting unhelpful content. This said, having relatively high amounts of unhelpful content might cause other content on the site to perform less well in Search, to a varying degree. Removing unhelpful content might contribute to your other pages performing better.
If I remove unhelpful content, do I have to wait for a core update to see potential ranking improvements?
Ranking changes can happen at any time for a variety of reasons. We regularly update our core ranking systems. Content on the open web changes, which our systems process. Because of this, there’s no set timeline as to how long it might take for potential improvement to be reflected in ranking.
Some FAQs about March Core Update 2024
What should I do if my website was negatively affected by the March Core update?
The first thing I recommend is to check Search Console and see if there's a manual penalty. If there is, read through why it was applied, clean up the situation, and then submit a reconsideration request to Google for removal of that penalty.
But if there’s no penalty then I recommend to just do analysis of whole site’s content and find out pages which you deem not helpful for your potential users – either update or get rid of those pages. Then wait and hopefully ranking will improve also meanwhile, make sure that there’s no technical issue on site, your site’s is aligned with Google’s EEAT recommendations, less intrusive ads and not too many affiliate links, don’t overly broad topic and just write focused content about whatever main topic of your site is, keep an eye on changing user needs & interesting accordingly keep on updating content. Also remember to not chase just algorithms focus on people and try to meet their needs.
Which types of websites were most affected by the March 2024 Core Update?
| Site Type | Issue | Characteristics of Impacted Sites |
|---|---|---|
| Gaming | Excessive ads; intrusive interstitials; excessive affiliate links; insufficient E-E-A-T; excessive SEO-driven patterns; low-quality product reviews; stock photography; excessive AI-generated content | Sites with disruptive ads; use of “People Also Ask” question generators; lack of E-E-A-T in YMYL topics |
| Niche Blogs & Product Review Sites | Intrusive ads; excessive affiliate links; non-original imagery; AI-generated content; lack of niche specialization; excessive keyword stuffing | Sites using images from social media; AI content for reviews; covering a wide range of subjects with low expertise |
| MP3 Downloads, Ringtones, Lyrics & More | AI-generated content; lack of main content; confusing navigation; poor UX; deceptive download options; aggressive advertisements; artificially refreshed content | Sites with excessive longtail keywords; unrelated content; poor user experience |
| Travel | AI-generated content; generic stock photography; aggressive advertising; thin content; artificially edited dates | Sites providing travel guidance without evidence of visits; overly broad topics; misleading updates |
| Information & Research | Excessive AI-generated content; overly optimized pages; poor user experience; slow load times | Sites with one page per question, using public data without adding value |
| Technology and Software | Spam manual actions; automatic redirects to affiliate sites; excessive ads; unoriginal images | Sites with VPN affiliate links; deceptive download options; lack of author transparency |
Did the March Core Update target AI-generated content specifically?
The March Core Update did not specifically target AI-generated content. However, it had a significant impact on websites that relied heavily on AI-generated content with little human oversight.
Many affected sites showed high scores for AI-generated content (when checked using AI content detection tools, which themselves aren't perfect either).
Having AI content may have contributed to the decline in visibility and rankings of these sites.
This update underlines the importance of quality, original content and human oversight, aligning with Google's long-standing policies against excessive automation without human intervention.