By Aayush

Google Search Console’s “Discovered – currently not indexed” message indicates that although Google is aware of the URL, it hasn’t yet been crawled and indexed.
It does not mean that the page won’t be processed in the future. According to Google’s documentation, the page may be revisited later without requiring further work from you.
However, there are other reasons why Google might not crawl and index the page, such as:
- Problems with servers and on-site technology that limit or prohibit Google’s ability to crawl.
- Problems with the page itself, such as its quality.
Additionally, you can query URLs for their coverage state (as well as other helpful data points) in bulk using the Google Search Console URL Inspection API.
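For bulk checks, the URL Inspection API can be scripted. Below is a minimal sketch of the request body and response handling, assuming you already have authenticated access to the property; the helper names and `SITE_URL` are illustrative, not part of any library.

```python
# Sketch: bulk-checking coverage state via the Search Console URL Inspection API.
# SITE_URL and the helper functions are assumptions for illustration; the actual
# call requires OAuth credentials with access to the GSC property.

SITE_URL = "https://example.com/"  # your GSC property (assumption)

def build_inspection_request(url: str, site_url: str = SITE_URL) -> dict:
    """Build the JSON body for the urlInspection.index.inspect endpoint."""
    return {"inspectionUrl": url, "siteUrl": site_url}

def coverage_state(response: dict) -> str:
    """Extract the coverage verdict from an inspection response."""
    result = response.get("inspectionResult", {})
    return result.get("indexStatusResult", {}).get("coverageState", "unknown")
```

With `google-api-python-client`, the request body above would typically be passed to something like `service.urlInspection().index().inspect(body=...)`; check the API reference for your client library, and note the API enforces daily quotas per property.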
Request indexing using Google Search Console
This is a straightforward approach that will fix the problem in most situations.
Occasionally, Google simply takes a long time to crawl new URLs. Other times, though, deeper problems are the cause.
When you submit an indexing request, one of two things may occur:
- The URL changes to “Crawled – not indexed”
- The URL is indexed only temporarily
Both are signs of more severe problems.
The second occurs because your URL may receive a brief “freshness boost” when you request indexing, which may raise the URL beyond the necessary quality threshold and result in temporary indexing.
Why is Google taking so long to crawl my pages?
Google may take longer to crawl your pages, depending on how frequently it visits a website.
The following are some variables that affect how often Google crawls a website:
How relevant does Google deem the website?
There’s a reasonable risk Google won’t index and rank your content if it is unrelated to your website and business. The ‘Discovered – currently not indexed’ status may therefore appear.
Suppose, for instance, you are an SEO agency and start writing about venture capital and company investments. Google might decide not to index this content, even though it has found your URLs through internal links or your sitemap, unless this topic is a documented part of your business.
Make sure that the material you are promoting aligns with the search intent of the queries you are aiming for at all times.
How often is fresh content posted on the website?
It’s no secret that Google favors websites with frequent content updates and new content. If your website doesn’t regularly release content, your material may take longer to index than that of other websites, and this error may appear while you wait. Google knows your content exists, but because your website doesn’t publish consistently, it takes longer to crawl and index it.
Although you don’t have to publish on a weekly or monthly schedule, you should make sure you are consistently adding new content and updating pages that have been inactive for some time. This is something you can put on your SEO checklist as a reminder. To assist with this, we also offer a standard operating procedure for content refresh that your company can utilize for free!
Too many URLs on the site to crawl.
Only large websites with thousands, hundreds of thousands, or even millions of URLs will be impacted by this possible reason.
Google’s crawlers operate on a ‘crawl budget’: each website gets a limited number of page crawls within a given period.
If you have a very large website and are struggling to manage your crawl budget, Google will postpone crawling and indexing some pages until crawl budget is available for them.
For news websites that upload hundreds or even thousands of new pages per day, this is a typical issue.
Site errors wasting crawl budget.
It’s crucial to manage your crawl budget if your website is large. Multiple faults can eat into your website’s crawl budget, so new content you publish may take longer for Google to crawl and index. These errors include inactive webpages, server problems, keyword cannibalization, and more. If these crawl problems affect your website, you may see the ‘Discovered – currently not indexed’ message.
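One way to see how Googlebot actually spends your crawl budget is to tally its requests in your server access logs. Here is a rough sketch, assuming the common combined log format; the regex and log layout are assumptions, so adjust them to your server.

```python
# Sketch: estimating how often Googlebot crawls your site from access logs.
# Assumes the Apache/Nginx combined log format; adapt the regex to your setup.
import re
from collections import Counter

GOOGLEBOT_LINE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "GET ([^ ]+)[^"]*".*Googlebot'
)

def crawl_counts(log_lines):
    """Count Googlebot GET requests per day and per URL."""
    per_day, per_url = Counter(), Counter()
    for line in log_lines:
        m = GOOGLEBOT_LINE.search(line)
        if m:
            day, url = m.group(1), m.group(2)
            per_day[day] += 1
            per_url[url] += 1
    return per_day, per_url
```

Comparing these counts against how many new URLs you publish per day gives a rough sense of whether crawl capacity, rather than quality, is the bottleneck. (Verify the user agent via reverse DNS if you need precision, since the string alone can be spoofed.)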
Note that statuses such as “Excluded by ‘noindex’ tag” are not the same as “Discovered – currently not indexed.” The former means your page was crawled but not indexed because a noindex directive appears somewhere on the page or in the sitemap; the latter means Google has not crawled your page at all.
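To rule out the noindex case before digging further, you can check a page’s HTML and response headers for noindex directives. A minimal sketch using only the Python standard library:

```python
# Sketch: detecting "noindex" signals, which produce "Excluded by 'noindex' tag"
# rather than "Discovered - currently not indexed".
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags <meta name="robots" content="...noindex..."> in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            content = (a.get("content") or "").lower()
            if name in ("robots", "googlebot") and "noindex" in content:
                self.noindex = True

def has_noindex(html: str, headers: dict = None) -> bool:
    """True if a robots meta tag or the X-Robots-Tag header blocks indexing."""
    if headers and "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex
```

If this returns `True` for an affected URL, you are looking at a noindex exclusion, not a crawl-priority issue, and the fix is to remove the directive.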
How to solve the issue?
Step 1. Make an indexing request
It’s a good idea to ask Google to index your pages if your website has a limited number of “Discovered – currently not indexed” pages. This can be completed directly in GSC.
- To begin, click the row and then select Inspect URL from the menu on the right.
- When the URL has been examined, click Request Indexing.
- GSC may need a minute or two to test the URL. Once complete, a notice will appear confirming that indexing has been requested.
- Although simple, this method is frequently a little finicky. Google will only permit you to submit a maximum of 10 to 15 URLs per day for indexing.
- Based on my experience, it’s common to have to submit repeated requests for indexing for specific pages. If this occurs, be aware that you probably have a more severe problem that needs to be addressed.
Step 2. Enhance Internal Link Structure
A link from one page of your website to another is called an internal link. Internal links are an essential element of technical SEO, and they play a crucial role in informing Google which pages on your website ought to be indexed.
Internal links tell Google which pages on your website are most important, and they help Google determine the relationships between your pages.
If you’ve just added a lot of new content, these pages likely don’t have many internal links from other pages on your website. Worse, some of them may have no internal links pointing to them at all, leaving them orphaned.
If your pages still aren’t indexed after you submit an indexing request in GSC, the best course of action is to add internal links to them from other pages on your website. You can add contextual links within the body of related pages, such as blog posts, and you can also link to these pages from navigational components like your site’s header or footer.
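Orphan pages can be spotted by comparing your sitemap against a link graph built from your own crawl data. A small sketch, where `internal_links` is assumed to map each source URL to the set of URLs it links to:

```python
# Sketch: finding orphan pages - sitemap URLs with no inbound internal links.
# The link graph would come from your own crawl (e.g. a site crawler's export).

def find_orphans(sitemap_urls, internal_links):
    """internal_links maps source URL -> set of target URLs it links to."""
    linked_to = set()
    for targets in internal_links.values():
        linked_to.update(targets)
    return sorted(u for u in sitemap_urls if u not in linked_to)
```

Note that the homepage will show up as an "orphan" in a toy graph like this unless something links to it; in practice you would exclude the homepage and other entry points before reviewing the list.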
Step 3. Address Crawl Budget Issues
John Mueller from Google’s search team was asked directly about “Discovered – currently not indexed” issues during a 2021 Google SEO office hours hangout. He was asked whether this issue was usually caused by a website’s crawl budget, that is, the amount of resources Google dedicates to crawling and indexing the website. According to John, this wouldn’t be an issue for the majority of smaller websites, but he acknowledged that it might be for websites with millions of pages.
Step 4. Improve Content Quality
Problems with content quality are frequently the cause of Google finding your pages but not indexing them.
John was asked whether content quality issues on a website could be the cause of the “Discovered – currently not indexed” issue during the Google SEO office hours that were previously discussed. In response, he explained that although Google probably wouldn’t be evaluating the quality of each URL and choosing not to index it, Google would consider the overall quality of a website. Its crawlers might decide not to index pages from a website if there are a lot of low-quality pages on it.
John emphasized that tiny websites should concentrate on potential quality concerns rather than technical ones if a significant portion of their pages are not getting indexed.
There are several kinds of problems with the quality of content:
Content Produced by AI
Businesses that produce a lot of poor-quality AI-generated material could have trouble getting indexed.
Google’s remarks regarding the use of AI-generated material are contradictory and ambiguous. While there is undoubtedly a place for AI-generated content, Google would argue that using AI to manipulate search results is not the best use of the technology. Instead, you should use AI as a starting point in the editorial process rather than mass publishing large amounts of content to your website.
Google urges website publishers to produce original, high-quality material. A piece of content can stand out in many ways: expert quotes, first-party data, excellent visuals, or videos, among other strategies.
Thin Content
Thin content is low-quality content that offers searchers little to nothing. Google believes displaying thin content in search results degrades its users’ experience.
A low word count is the standard definition of thin content, but it is only one kind; others include doorway pages, AI-generated content, low-quality programmatically generated content, duplicate content, and excessively promotional content.
In search engine optimization, there is no substitute for quality. If you have many unhelpful pages, your published content may be deemed thin.
Additionally, Google’s algorithms may view your website as having a lower overall quality if it has a lot of thin pages. As a result, Google may be hesitant to crawl and index new pages on your website.
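A rough first pass at auditing for thin pages is a simple word-count scan. This sketch treats word count as a crude proxy only; flagged pages are candidates for review, not automatic removal, and the 300-word threshold is an assumption to adjust per site.

```python
# Sketch: flagging potentially thin pages by word count.
# Word count alone is a crude proxy - a short page can still be helpful -
# so treat results as a review queue, not a cull list.
import re

def flag_thin_pages(pages: dict, min_words: int = 300):
    """pages maps URL -> plain-text body; returns (url, words) under threshold."""
    thin = []
    for url, text in pages.items():
        words = len(re.findall(r"\w+", text))
        if words < min_words:
            thin.append((url, words))
    return sorted(thin, key=lambda t: t[1])
```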
Duplicate Content
Duplicate content is content that appears in more than one place, whether across pages of your own website or on a third-party website. Duplicate content can cause several SEO problems, such as keyword cannibalization and indexing issues.
Google may consider your website to be of low quality if it finds many pages that are duplicate or substantially similar. A programmatic SEO strategy can easily produce large amounts of the same material across multiple pages.
If you duplicated material from one page to another to move more quickly and are now having indexing problems, go back to those previously published pages and substantially rewrite them.
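To find near-duplicate pages on your own site, comparing word-shingle sets with Jaccard similarity is a common technique. A sketch, with the 0.8 similarity threshold as an assumption to tune on your own content:

```python
# Sketch: detecting near-duplicate pages with word-shingle Jaccard similarity.
# The 5-word shingle size and 0.8 threshold are assumptions to tune.

def shingles(text: str, k: int = 5) -> set:
    """Set of k-word shingles from the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets (1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def near_duplicates(pages: dict, threshold: float = 0.8):
    """pages maps URL -> text; yields URL pairs whose shingles overlap heavily."""
    urls = list(pages)
    sets = {u: shingles(pages[u]) for u in urls}
    for i, u in enumerate(urls):
        for v in urls[i + 1:]:
            if jaccard(sets[u], sets[v]) >= threshold:
                yield (u, v)
```

The pairwise comparison is fine for a few hundred pages; for very large sites you would switch to MinHash or a similar approximation.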
Step 5. Boost Domain Authority
Your domain authority measures the quantity and quality of external links pointing to your website. Large, well-established websites frequently have hundreds or even thousands of backlinks, which contributes to their high domain authority. In contrast, newer or smaller websites typically have lower domain authority.
Google’s own measure of the strength of links pointing to a page is called PageRank; “domain authority” is the term third-party tools now commonly use for a similar site-level metric.
By raising your website’s domain authority, you signal to Google that it is a reliable, high-quality resource. Google wants to index websites that are reputable and authoritative.
You may influence how quickly Google indexes your new web pages by creating backlinks to your website.
Backlinks come in various forms, and there are numerous approaches to building them. Generally speaking, quality matters more than quantity. Strive to earn backlinks from respectable, authoritative websites in your field, and avoid purchasing large numbers of backlinks from low-quality or user-generated websites. Exercise extreme caution if you hire a link-building company to handle this task.
Last Words
This is not an isolated incident; several other users have also reported seeing “Discovered – currently not indexed” in their Google Search Console reports. This is a common issue with smaller, more recent websites.
Using Search Console, you can manually request indexing for a handful of these pages. Think of this as a starting point.
If your pages are still not indexed despite your indexing requests, you can be fairly sure there is a more significant issue with your website’s content. Prioritize internal linking above all else so that Google can determine which pages matter most to you.
Once you reach that stage, start paying close attention to page quality. Duplicate content, thin pages, and heavy reliance on AI-generated content can all be fixed. After these improvements are applied, your pages should be indexed appropriately.
Increasing the domain authority of your website and creating hyperlinks to it can help to speed up crawling and indexing.