Saturday, January 16, 2021

How to Get Pages Indexed by Google, Quickly?

If a page isn’t in Google’s index, there’s a 0% chance it will receive organic traffic.

Indexation, in an oversimplified nutshell, is step 2 in Google’s ranking process:

  • Crawling
  • Indexing
  • Ranking

This article will focus on how to get Googlebot to index more pages on your site, faster.

How to check if your pages are indexed by Google

The first step is understanding what your website’s indexation rate is.

Indexation rate = # of pages in Google’s index / # of pages on your site
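
For example, if your site has 1,000 pages and 900 of them are in Google’s index, your indexation rate is 90%.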

You can review how many pages your website has indexed in Google Search Console’s “Index Coverage Status Report”.

[Image: checking indexation in Google Search Console]

If you see errors or a large number of pages outside of the index:

  • Your sitemap might contain URLs that are non-indexable (e.g. pages set to NOINDEX, blocked via robots.txt, or requiring user login)
  • Your site might have a large number of ‘low quality’ or duplicate pages that Google deems unworthy
  • Your site might not have enough ‘authority’ to justify all the pages

You can dig into the specifics in the table underneath (this is an awesome new feature in Google’s updated Search Console).

[Image: finding indexation issues in Search Console]


How to get pages on your site indexed

I hate to be cliché, but you really need to deliver the right experience to get Google’s attention. If your site doesn’t meet Google’s guidelines with regard to trust, authority and quality, these tips likely won’t work for you.

With that being said, you can use these tactics to improve your site’s indexation rate.


1. Use Fetch As Google

Google Search Console has a feature that allows you to input a URL for Google to “Fetch”. After submission, Googlebot will visit your page and index it. (In the updated Search Console, the same functionality lives in the URL Inspection tool’s “Request Indexing” button.)

[Image: Fetch as Google]

Here’s how to do it…

  • Log into Google Search Console
  • Navigate to Crawl → Fetch as Google
  • Paste the URL you’d like indexed into the search bar
  • Click the Fetch button
  • Once Google has found the URL, click Submit to Index

Assuming the page is indexable, it will be picked up within a few hours.


2. Use internal links

Search engines crawl from page to page through HTML links.

[Image: search engines crawl from page to page through links]
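
Crawling really is just link-following. Here’s a minimal sketch in Python of how a crawler discovers the same-site links on a page (requests and beautifulsoup4 are assumed to be installed, and the URL is a placeholder):

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    def find_internal_links(page_url):
        # Fetch and parse the page, the same way a crawler would
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        site = urlparse(page_url).netloc
        links = set()
        for a in soup.find_all("a", href=True):
            url = urljoin(page_url, a["href"])  # resolve relative links
            if urlparse(url).netloc == site:    # keep same-site links only
                links.add(url)
        return links

    print(find_internal_links("https://example.com/"))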

We can use authoritative pages on your site to push equity to others. I like to use Ahrefs’ “Best pages by links” report.

[Image: Ahrefs “Best pages by links” report]

This report tells me the most authoritative pages on my site – I can simply add an internal link from here to a page that needs equity.

It’s important to note that the two interlinked pages need to be relevant – it’s not a good idea to link unrelated pages together.

Read my guide about internal linking silos


3. Block low quality pages from Google’s index

While content is a cornerstone of a high-quality website, the wrong content can be your demise. Too many low-quality pages can decrease how often Google crawls, indexes and ranks your site.

For that reason, we want to periodically “prune” our websites by removing the garbage pages.

Pages that serve no value should be dealt with in one of four ways (the first two are sketched after the list):

  • Set to NOINDEX. When the page still has value to your audience, but not search engines (think thank you pages, paid landing pages, etc).
  • Blocked from crawling via the robots.txt file. When an entire set of pages has value to your audience, but not search engines (think archives, press releases).
  • 301 redirected. When the page has no value to your audience or search engines, but has existing traffic or links (think old blog posts with links).
  • Deleted (404). When the page has no value to your audience or search engines, and has no existing traffic or links.
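
For reference, here’s roughly what the first two options look like (the /press-releases/ path is just an illustration):

    <!-- NOINDEX: placed in the <head> of a page you want out of the index -->
    <meta name="robots" content="noindex">

And the robots.txt rule to block crawling of an entire section:

    User-agent: *
    Disallow: /press-releases/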

We’ve built a content audit tool to help you with this process.


4. Include the page in your sitemap

Your sitemap is a guide to help search engines understand which pages on your site are important.

Having a page in your sitemap does NOT guarantee indexation, but failing to include important pages will decrease indexation.

If your site runs on WordPress, it’s incredibly easy to set up and submit a sitemap using a plugin (I like Yoast).

Read more about how to build a sitemap
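
For reference, a sitemap is just an XML file listing the URLs you want crawled. A minimal sketch (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
            <loc>https://example.com/important-page/</loc>
            <lastmod>2021-01-16</lastmod>
        </url>
    </urlset>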

Once your sitemap is built and submitted in GSC, you can review it in the Sitemaps report.

[Image: XML sitemap indexation rate in the Sitemaps report]

Double check to make sure all pages you want indexed are included. Triple check to make sure all pages you DON’T want indexed are NOT included.


5. Share the page on Twitter

Twitter is a powerful network that Google crawls regularly (they index Tweets, too).

[Image: Google indexes Tweets]

It’s a no brainer to share your content on social media, but it’s also an easy way to give Google a nudge.


6. Share the page on high traffic sites

Reddit and Quora are popular sites that allow you to drop links. I make it a regular practice to promote recently published pages on Quora – it helps with indexation and can also drive a ton of traffic.

[Image: promoting a post on Quora]

If you’re feeling lazy (and grey hat), you can buy “social signals” on sites like Fiverr.


7. Secure external links to the page

As previously mentioned, Google crawls from page to page through HTML links.

Getting other websites to link to yours is not only a huge ranking factor, but also a great way to speed up indexation of your website.

The easiest ways to get links:

  • Guest post on a relevant, authoritative website
  • Find relevant bloggers or media sites and reach out with an advertising request

This is grossly oversimplified – you can check out my top link building tactics for more ideas.


8. “Ping” your website

Sites like Ping-O-Matic send “pings” to search engines to notify them that your blog has been updated.

[Image: website pinging services]

Honestly, it’s not the greatest method – but it’s fast, free and easy to use.
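
If you’d rather script it, these pings are plain XML-RPC calls. A minimal sketch in Python, assuming Ping-O-Matic’s public XML-RPC endpoint at rpc.pingomatic.com (the same endpoint WordPress pings by default):

    import xmlrpc.client

    # Ping-O-Matic speaks the standard weblogUpdates XML-RPC interface
    server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")

    # ping(blog name, blog URL) - both values here are placeholders
    response = server.weblogUpdates.ping("My Blog", "https://example.com/")
    print(response)  # typically a dict like {'flerror': False, 'message': '...'}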

Source: https://webris.org/google-index/

Tuesday, August 11, 2020

Google Update 10th August 2020 – what we know so far.

From what we can tell, Google has begun rolling out an enormous Google Search ranking algorithm update from the 10th of August. While it is yet to be officially confirmed by Google’s Search Liaison, the chatter amongst the SEO and search community is loud and clear: so far, it does not look good.

The update appears to have begun rolling out at around 2 pm ET on the 10th of August. The most significant changes seem to be to rankings, with no clear algorithmic pattern. Many are speculating that the update looks more like a bug or a bad algorithm test: page-one rankings for many authoritative and successful websites seem to have tanked.

[Image: 10 August Google Update]

In the case of SEO in Australia, we’ve seen all of our competitors and the big players in our industry suffer on page one. A couple of our clients are being outranked by forums, dynamically generated Amazon listings, random Facebook posts and even job listings. What is going on here?

We’re still not sure what’s happening. From what we can tell, the algorithm shift has adjusted search engine rankings to have poor-quality pages gaining the top ten spots on search results. Our early research into the algorithm shifts has also uncovered that many local search results are being completely outranked by eBay, Amazon, directory listings and cloaking websites that are not only irrelevant but incredibly spammy for these types of searches.

From what we have analysed, the results that were ranking on page one yesterday now all seem to be sitting on pages six, seven and eight of Google. Page one is mysteriously cluttered with spammy, cloaking, phishing websites. Ecommerce websites seem to have slipped in favour of forum, directory and social media listings.

[Image: 10 August Google Update]

There is plenty of speculation happening on Twitter, Webmaster World, and SEO forums. While a couple of people mention that their sites have benefitted from the SERP changes, the majority are reporting a brutal shift in their traffic and rankings. Worldwide, many have reported that their sites are being de-ranked in favour of spammy websites and directory listings. There has also been a lot of talk about drastic changes within short spaces of time – with results being updated every 30 minutes or so for some.

It makes little sense for Google to fill the first pages of search results with unrelated forums, cloaking websites, social media pages and directory listings. Google’s success over competing search engines comes from an algorithm that provides the most logical and pleasing user experience. Logically, it would seem that this update is not in line with providing high-quality, useful organic search results.

[Image: 10 August Google Update]

Again, it is far too early to conclude anything, but from regular checks on SERP trends it’s pretty clear that something huge is happening. Google is yet to confirm any changes. There’s every chance we’re witnessing an enormous glitch or bug. But there’s also a chance that this may be part of a new Google search ranking algorithm update.

Continue to monitor your rankings and watch for any changes. If we’ve learnt anything from previous Google algorithm updates, it’s important to wait it out until the update has fully rolled out or Google has confirmed the suspicions. Don’t do anything drastic, and if your website has suddenly tanked in the search engine results, you’re not alone.

Monday, July 27, 2020

How to Prevent Search Engines from Indexing WordPress Sites?


Site owners will do anything to get their websites indexed. However, you might not want search engines to crawl through your website if it’s still in development. In a case like this, it’s recommended to discourage search engines from indexing your site. Stick with us if you want to learn more about this topic!

  1. Discouraging Search Engines From Indexing WordPress Sites
     • Using the WordPress Built-In Feature
     • Editing robots.txt File Manually
  2. Password Protecting Your WordPress Website
     • Using Hosting Control Panel
     • Using WordPress Plugins
  3. Removing Indexed Page From Google

Why Would You Want To Stop Search Engines From Indexing Your Site?

There are some cases where people want to discourage search engines from indexing their sites:

  • Unfinished websites — at this stage of trial and error, it’s best not to have your website available to the public.
  • Restricted websites — if you plan to have an invite-only website, you do not want it to get listed on SERPs.
  • Test accounts — web owners create a duplicate of a site for testing and trial purposes. Since these duplicates are not designed for the public, don’t let them get indexed by search engines.

So how do you block search engines from indexing your site? Well, take a look at the options below and try them yourself.

1. Discouraging Search Engines From Indexing WordPress Sites

The simplest way to stop search engines from indexing your website is by preventing them from crawling it. To do it, you need to edit your website directory’s robots.txt file. Here are a few ways to achieve that:

Using the WordPress Built-In Feature

Editing WordPress robots.txt is quite easy as you only need to use a WordPress built-in feature. Here’s how:

  1. Log in to the WordPress admin area and go to Settings -> Reading.
  2. Scroll down and locate the Search Engine Visibility option.
  3. Check the option that says Discourage search engines from indexing this site.
  4. Save Changes, and that’s it! WordPress will automatically edit its robots.txt file for you.

Editing robots.txt File Manually

If you prefer the manual option, you can use File Manager or an FTP client to edit the robots.txt file.

In this article, we’ll show you how to do it through hPanel’s File Manager:

  1. Log in to hPanel and locate File Manager under the Files area.

  2. Go to your WordPress root directory folder (in most cases, it’s public_html) and find the robots.txt file. If you can’t find it, create a new blank file.
  3. Right-click on the file and select Edit.

  4. Enter the following syntax:

    User-agent: *
    Disallow: /

The code above will prevent search engines from indexing your whole site. If you want to apply the disallow rule to a specific page only, write the page’s subdirectory and slug. For example: Disallow: /blog/food-review-2019.
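
Put together, a robots.txt that blocks crawlers from that one page only would look like this:

    User-agent: *
    Disallow: /blog/food-review-2019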

Paths in robots.txt files are case-sensitive, so be careful when editing.

2. Password Protecting Your WordPress Website

Search engines and web crawlers don’t have access to password-protected files. Here are a few methods to password protect your WordPress site:

Using Hosting Control Panel

If you are a Hostinger client, you can password protect your website using hPanel’s Password Protect Directories tool:

  1. Access hPanel and navigate to Password Protect Directories.
  2. Enter your root directory into the first field.
  3. Once the directory is selected, enter your username and password and click Protect.

If your root directory is public_html, leave the directory column blank.

The process in cPanel is also quite similar:

  1. Log in to your cPanel account and head to Directory Privacy.

  2. Select your root directory. In our case, it’s public_html.
  3. Check the Password protect this directory option, and name the protected directory. Press Save.
  4. Create a new user to log in to the protected website, and that’s it!
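
Under the hood, these control panel tools typically generate Apache Basic Auth rules. A minimal hand-rolled sketch, assuming an Apache server (the file paths are hypothetical):

    # .htaccess in the directory you want to protect
    AuthType Basic
    AuthName "Restricted Area"
    AuthUserFile /home/user/.htpasswds/site
    Require valid-user

The password file itself is created with Apache’s htpasswd utility, e.g. htpasswd -c /home/user/.htpasswds/site yourusername.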

Using WordPress Plugins

There are tons of plugins that can help password protect your site, but the Password Protected plugin might just be the best one out there. It’s been tested with recent WordPress releases, and it’s pretty straightforward to use.

After installing and activating the plugin, head to Settings -> Password Protected and configure the settings to match your needs.

3. Removing Indexed Page From Google

Don’t worry if Google has indexed your site. You can remove it from SERPs by following these steps:

  1. Set up Google Search Console for your website.
  2. Access Google Search Console of your newly added website and scroll down to Legacy tools and reports -> Removals.
  3. Click the Temporarily hide button and enter the URL you want to remove from Google.
  4. On a new window, choose Clear URL from cache and temporarily remove from search, then Submit Request.

And that’s it! Google will temporarily remove your site from search results (temporary removals typically last around six months). Make sure to apply the previous methods to prevent Google from indexing your site again.

Conclusion

There you have it! Quick and easy ways to discourage search engines from indexing your sites. Here’s a quick recap of the methods we’ve learned today:

  • Edit the robots.txt file, which can be performed automatically or manually.
  • Password protect your website by using a plugin or your hosting control panel.
  • Remove indexed pages from Google via Google Search Console.

If you have any other methods, or if you have any questions, please do let us know in the comments. Good luck!