How can we fix “crawled - currently not indexed” error messages in our Google Search Console account?
Date: 24/04/2025
Introduction
Many businesses will have noticed “crawled - currently not indexed” notifications appearing in their Google Search Console account. This can be rectified; we shall explain how you can do that easily.
You may need to improve the content marketing quality (Google EEAT)
Improving the quality of the page
Sometimes the content marketing is not of a high enough quality for Google to index it. Googlebot may have crawled the page, yet after evaluating it, Google may have deemed it low quality. This can happen for a whole host of reasons; for example, the page could be considered “content thin”.
If the page is deemed “content thin”, then Google won’t add it to its index.
Put yourself in Google’s shoes for just a second.
Google can select from thousands, if not millions, of pages to appear on its first page. Therefore, with such an abundance of choice, why would Google want to put a weak page on the first page? That’s right, it would not.
So, instead, you must put your best effort into your business’s content marketing; it’s only then that the page will most likely have a higher Google EEAT score.
When you talk to any good marketing agency, they will tell you that for your business’s content marketing to appear in the index, it must be high quality with a high Google EEAT score.
Website architecture and using internal links
Getting the page layout right is essential, as this can have a significant impact on the ability to improve the on-page SEO at a later date.
For example, if you group all laundry cleaning appliances into one page, this will make the SEO agency’s job a lot harder later on.
So, instead, you should split the pages into categories, giving each separate product its own page. For example, washing machines should be kept separate from tumble dryers and have their own tab and page.
Is your website suffering from duplication issues?
Duplication issues can cause havoc regarding your business’s organic search engine optimisation. Try to find duplicates and work closely with a marketing agency to rectify such issues.
As you would expect, Google and Googlebot have a hard enough job. Daily, they index millions of pages, so what they don’t need is for existing pages to be duplicated because, well, things get messy.
Google then has to decipher which version is the original.
Why is there duplication?
Is one company copying another business’s content to try and fool Google into thinking it is the original author, and therefore be rewarded with stronger SEO?
Duplication can cause issues and damage your SEO, and therefore, it needs to be sorted.
Have you improved the page and then requested that it be reindexed?
When a car fails an MOT, you can work to rectify the problem areas, then resubmit it for a new test.
This is the same with content marketing; it might not have been indexed before, but you can improve the page and ask for reindexing later.
This does not mean that a page that was not indexed in the first instance will never be indexed. Once you have made improvements to the page, you can resubmit it for manual indexing by pasting the URL into your Google Search Console account and requesting that it be reindexed.
So, how do you go about fixing crawled but not indexed pages? We provide step-by-step instructions:
First of all, decide whether the content’s quality is high or not.
Google’s Search Quality Rater Guidelines.
We would highly recommend reading this advice that has been offered by Google.
You can then gain insight into improving your business’s content marketing quality.
Therefore, regardless of the size of the business, whether it’s a multibillion-pound e-commerce business or you are a sole trader, you should consider the quality of each page.
Is the quality good enough?
If it’s not, the page will need to be improved further. To improve the content quality, do read about Google EEAT.
Google can change how it evaluates your content marketing over time.
This means that a page might never get indexed, or it could be indexed but then later deindexed. Therefore, you do need to check whether the pages of your website have been indexed or not.
It’s good to regularly check your business’s Google Search Console account to see which pages are indexed. This can act as an indicator as to which pages may need improvement.
Internal linking can sometimes help.
The search engine bots, such as Googlebot, use internal links and the layout of your website to understand which pages are the most important.
It is important to realise that there is only a limited crawl budget that Google places on each website.
This means that Googlebot can only spend a limited amount of time crawling each website.
Therefore, businesses need to use the crawl budget efficiently.
Thus, by using XML sitemaps, good website architecture, and internal links, you can help Google and its algorithms understand the most important pages on your company website.
It also helps Googlebot to discover new pages and add them to its index.
Duplicate content can be a significant problem.
Ideally, you want just original pages, no duplicates on your website.
If you have duplication issues, then this can damage your company’s organic SEO.
Instead, you want unique, well-written text for your company website. For example, if you’re writing a product description for, say, a blender, copying it from the manufacturer’s website will cause duplication issues. Instead, write your own unique, well-written text to describe that product.
Canonical tags:
Canonical tags are simply HTML tags that tell search engines, such as Google and Bing, which version of a page is the original one.
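As a minimal sketch, a canonical tag sits in the page’s head section; the domain and URL below are placeholder examples, not a real site:

```html
<!-- Placed in the <head> of a duplicate or variant page. -->
<!-- example.com is a made-up domain for illustration only. -->
<link rel="canonical" href="https://www.example.com/washing-machines/" />
```

Search engines then treat the address in `href` as the original version and consolidate ranking signals there.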
Internal links
Please ensure you add internal links from your blog posts to the main pages, and vice versa, to help shoppers navigate the site.
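In practice, an internal link is simply an ordinary anchor tag pointing at another page on the same site; the paths below are hypothetical examples:

```html
<!-- A hypothetical internal link from a blog post to a category page. -->
<p>Once you have read our buying guide, browse our range of
<a href="/washing-machines/">washing machines</a>.</p>
```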
XML Sitemaps
Adding an XML sitemap to a WordPress website is both free and straightforward, as plugins can help you do this. The sitemap then gives Googlebot and the other bots a list of pages to crawl and index. Only the canonical version of each page should appear in your XML sitemap.
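For reference, a bare-bones XML sitemap following the sitemaps.org format looks like this; the URLs are placeholder examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the canonical version of each page. -->
  <url>
    <loc>https://www.example.com/washing-machines/</loc>
    <lastmod>2025-04-24</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/tumble-dryers/</loc>
  </url>
</urlset>
```

A WordPress SEO plugin will typically generate and update a file like this for you automatically.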
How we can help
We are a well-known SEO agency serving the glorious areas of both Bristol and Bath. Whether you need help improving your technical SEO, on-page SEO, or off-page SEO, our team can help.