If you have an old, established site, chances are you’ve done something in the last few years that is no longer effective or, potentially, detrimental to your SEO. It might be anything from an excessively high keyword density to too many links from poor sources, code techniques that have since accrued a penalty, or duplicate content; the list goes on.
The typical advice is to make the changes necessary to comply with modern Google standards. That’s all well and good, but once you’ve made the changes, you still don’t see their effects. You run a few experimental searches and, sure enough, Google still has your removed pages indexed and your old content cached. How long does it take for your changes to propagate through the index so you can be free of their shadow?
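Those experimental searches typically use Google’s own query operators; for example (example.com stands in for your domain):

```text
site:example.com/old-page     is the removed page still in the index?
cache:example.com/old-page    which version of the page does Google have cached?
```

If the `site:` query still returns the page, or the cached copy still shows the old content, your changes haven’t propagated yet.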
Unfortunately, there’s no one clear answer to this question. When you make changes to your site, you’re at the mercy of the Google crawlers to discover those changes and reflect them in the overall search index. Sometimes this takes multiple passes, particularly for removed pages. Google wants to make sure that a missing page was actually removed, and that it’s not just missing due to a down server or site maintenance.
So, on the absolute short end, you might get lucky. You might upload the changes to your site mere moments before Google finds a link to that page and crawls it. That crawl notes the changes and, if they’re minor, immediately indexes them. The time from making the change to it being reflected in the index could be under five minutes.
On the other hand, consider the long end. You make a change just after Google has crawled your page. It’s a major change, involving over half of the content of the page, including back end code. Google doesn’t find it right away, and when it does, it notes the changes but makes a note to crawl again later to make sure it wasn’t just a read error on the part of the bot. After a few days have passed, the Google bot crawls again, notes that the changes are still there, and verifies them in Google’s processes. The total time from making the change to it being reflected in the index could stretch as long as four or more weeks.
Some things can cause complications in the Google bot’s ability to crawl and reindex your site.
• If you removed the entire page – rather than using NOINDEX and NOFOLLOW to effectively remove it without deleting the content – it can take longer for Google to verify that the page is intentionally gone and not accidentally missing. This problem is exacerbated if you still have live internal links pointing to the content as though it still exists.
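As a sketch, the meta tag approach looks like this, placed in the page’s head; the page stays live and reachable, but drops out of the index:

```html
<!-- Tells crawlers not to index this page or follow its links,
     without actually deleting the content -->
<meta name="robots" content="noindex, nofollow">
```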
• You removed the page from the index entirely using the Google URL Removal tool. This tool is not to be used lightly; it’s a scorched-earth option. It’s used mostly for removing pages you want no association with, such as pages that were created when your site was hacked. It’s also used to quickly remove confidential information from the index. It should never be used on a page you intend to put back up.
• Your code is malformed in some way. Bad or broken code can cause errors with the Google bot, leaving it unable to parse your site. If it can’t read your pages, it won’t index them. Be careful to test each page when you make a change. To see what Google sees, use the “Fetch as Google” option in Webmaster Tools. This lets you view your site through the eyes of the search engine.
• You have the page blocked in your robots.txt file. It’s fairly common to block the page while you make changes, particularly if those changes are made on the live page and will take more than a single editing session. Just remember to remove the block when you want the page to be indexed again!
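A typical temporary block looks like this (the path is a hypothetical example):

```text
# robots.txt -- remove the Disallow line once editing is finished,
# or Google will keep skipping the page
User-agent: *
Disallow: /page-under-construction.html
```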
• The page has a rel="canonical" reference to an older version that’s live and not edited. The canonical tag tells Google that the linked page is the real version, so it will index that instead of the one it sees.
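For example, if the edited page still carries a tag like this (the URL is illustrative), Google will keep treating the old version as authoritative:

```html
<!-- In the <head> of the edited page: this declares the OLD version
     canonical, so the edits here may never be indexed -->
<link rel="canonical" href="https://example.com/old-version/">
```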
There are other possible problems as well. The Fetch as Google tool gives you a good idea of what Google sees, and can help you diagnose other potential issues.
If you find that you need changes indexed as quickly as possible, or that you have waited an exceptionally long time without an index pass, you can take a few steps to encourage the process.
• Obtain links from pages that are indexed daily. A new link on a page that is regularly indexed will itself be indexed quickly. This leads the bots to your page more quickly than they otherwise would arrive, giving you a bit of an edge. This is a good idea for publishing content normally, too.
• Use the Fetch as Google tool. Google doesn’t confirm this in the tool’s description, but many users report that fetching a page makes Google aware of it and can help get it discovered and indexed. If nothing else, it’s an educational exercise in how the search engine sees the world.
• Use the Google Ping function. The purpose of a ping is to tell Google there’s new content on a given page. It has been largely supplanted by XML sitemaps, but pinging can still be useful. When you post new content, or when edited content is updated, ping Google about it and it will know to index the page soon.
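The ping itself is just an HTTP GET against Google’s sitemap-ping endpoint. A minimal sketch in Python, assuming your sitemap lives at example.com (the domain and path are placeholders):

```python
from urllib.parse import urlencode

# Placeholder: substitute your own sitemap URL here.
SITEMAP_URL = "https://example.com/sitemap.xml"

# Google's sitemap ping endpoint, as documented at the time.
PING_ENDPOINT = "http://www.google.com/ping"

def build_ping_url(sitemap_url: str) -> str:
    """Return the GET URL that notifies Google the sitemap changed.

    Fetching this URL (e.g. with urllib.request.urlopen) performs
    the actual ping; here we only construct it.
    """
    return PING_ENDPOINT + "?" + urlencode({"sitemap": sitemap_url})

print(build_ping_url(SITEMAP_URL))
```

Note that the sitemap URL is percent-encoded into the query string, so the full sitemap address survives as a single parameter.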
• Submit a sitemap to Google. This is by far the best way to have new content discovered and old content reindexed. An XML sitemap gives you the ability to specify when a page was last changed, which tells Google it should be indexed again. Submit a sitemap and keep it up to date; Google will use it to discover new content as well as reindex old content. Make sure the change date for each page stays current – many CMS plugins can generate and update these automatically.
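A minimal sitemap entry with a change date looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/edited-page/</loc>
    <!-- lastmod signals that the page changed and should be recrawled -->
    <lastmod>2014-06-01</lastmod>
  </url>
</urlset>
```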
• Publish enough new content that your site is indexed regularly. This is similar to the first tip, though it’s not really valuable for having your current content reindexed. If you’re a large, frequently-updated blog, you can leverage that content flow to entice Google to crawl your site more often.
• Submit your edited links to various social media sites, particularly Google+. More exposure means more chances for Google to discover and follow a link. It also helps your SEO in a general way and gains you more social interaction on your content; good to have all around.
• Submit your new content through an indexed RSS feed. Google will follow certain RSS feeds and index content when it appears. All you need to do is use RSS and tell your edited content to republish as new.
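In RSS terms, “republishing as new” generally means giving the edited item a fresh publication date, something like this (all values are placeholders):

```xml
<item>
  <title>Edited Post Title</title>
  <link>https://example.com/edited-post/</link>
  <!-- A new pubDate makes feed consumers (including Google) treat
       the edited item as fresh content -->
  <pubDate>Sun, 01 Jun 2014 12:00:00 GMT</pubDate>
</item>
```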
• Give it more time. Really, Google is a massive edifice. Sometimes, it just doesn’t get around to checking out your site as quickly as you would like. With millions of pages seeing updates, changes or creation every day, it’s easy to get lost in the shuffle. Just give it time; sooner or later, Google will find you.
Hi, I regularly submit new posts on my website, but I’ve found that re-indexing isn’t happening the way I want; it takes much longer than it does for new content.
I submitted my sitemap over a week ago and my posts still aren’t getting indexed on Google. Where is the problem, exactly? Can you shed some light on it?
I had trouble with my robots.txt file, so I fixed it, created and uploaded a sitemap, and then had Google “fetch” the site using their tool. That was yesterday. There’s no mention in your article of whether you’re talking minutes, hours, or days before I can expect results or determine if something is still amiss. Any experience with the fetch process and when I should expect results (or go back in to find more problems)?
I just had my blog recrawled and nothing has changed – Google still shows me the old title and description. Why?
Fetch as Google seems to work for me fairly quickly; it takes a little more time for the page to land in its new place on the results page, though.