Tuesday, August 11, 2020

Google Update 10th August 2020 – what we know so far.

From what we can tell, Google has begun rolling out an enormous Google Search ranking algorithm update from the 10th of August. While it is yet to be officially confirmed by Google’s Search Liaison, the chatter amongst the SEO and search community is loud and clear: so far, it does not look good.

From what it looks like, the Google Update began rolling out at around 2 pm ET on the 10th of August. The most significant changes seem to be in rankings, with no clear algorithm pattern. With many speculating that the update looks more like a bug or a bad algorithm test, page one rankings for many authoritative and successful websites seem to have tanked.

10 August Google Update


In the case of SEO in Australia, we’ve seen all of our competitors and the big players in our industry suffer on page one. A couple of our clients are being outranked by forums, dynamically generated Amazon listings, random Facebook posts and even job listings. What is going on here?

We’re still not sure what’s happening. From what we can tell, the algorithm shift has adjusted search engine rankings so that poor-quality pages are gaining the top ten spots on search results. Our early research into the algorithm shifts has also uncovered that many local search results are being completely outranked by eBay, Amazon, directory listings and cloaking websites that are not only irrelevant but incredibly spammy for these types of searches.

From what we have analysed, the results that were ranking on page one yesterday now all seem to be sitting on page six, seven and eight of Google. Page one is mysteriously cluttered with spammy, cloaking, phishing websites. Ecommerce websites seem to have slipped in favour of forum, directory and social media listings.



There is plenty of speculation happening on Twitter, Webmaster World, and SEO forums. While there are a couple of people mentioning that their sites have benefitted from the SERP changes, the majority are reporting on a brutal shift in their traffic and rankings. Worldwide, many have reported that their sites are being de-ranked in favour of spammy websites and directory listings. There has also been a lot of talk about drastic changes within short spaces of time – with results being updated every 30 minutes or so for some.

It makes little sense for Google to fill the first pages of search results with unrelated forums, cloaking websites, social media websites and directory listings. Google’s edge over competing search engines is that its algorithm provides the most logical and pleasing user experience. Logically, this update does not seem in line with providing high-quality, useful organic search results.



Again, it is far too early to conclude anything, but regular checks on SERP trends make it pretty clear that something huge is happening. Google is yet to confirm any changes. There’s every chance we’re witnessing an enormous glitch or bug. But there’s also a chance that this may be part of a new Google search ranking algorithm update.

Continue to monitor your rankings and watch for any changes. If we’ve learnt anything from previous Google algorithm updates, it’s important to wait until the update has fully rolled out or Google has confirmed the suspicions. Don’t do anything drastic, and if your website has suddenly tanked in the search engine results, you’re not alone.

Monday, July 27, 2020

How to Prevent Search Engines from Indexing WordPress Sites?


Site owners will do anything to get their websites indexed. However, you might not want search engines to crawl through your website if it’s still in development. In a case like this, it’s recommended to discourage search engines from indexing your site. Stick with us if you want to learn more about this topic!

  1. Discouraging Search Engines From Indexing WordPress Sites
     • Using the WordPress Built-In Feature
     • Editing robots.txt File Manually
  2. Password Protecting Your WordPress Website
     • Using Hosting Control Panel
     • Using WordPress Plugins
  3. Removing Indexed Page From Google

Why Would You Want To Stop Search Engines From Indexing Your Site?

There are some cases where people want to discourage search engines from indexing their sites:

  • Unfinished websites — at this stage of trial and error, it’s best not to have your website visible to the public.
  • Restricted websites — if you plan to have an invite-only website, you do not want it to get listed on SERPs.
  • Test accounts — web owners create a site duplicate for testing and trial purposes. Since these sites are not designed for the public, don’t let them get indexed by search engines.

So how do you block search engines from indexing your site? Well, take a look at several options below and try them yourself.

1. Discouraging Search Engines From Indexing WordPress Sites

The simplest way to stop search engines from indexing your website is by preventing them from crawling it. To do it, you need to edit your website directory’s robots.txt file. Here are a few ways to achieve that:

Using the WordPress Built-In Feature

Editing WordPress robots.txt is quite easy as you only need to use a WordPress built-in feature. Here’s how:

  1. Log in to the WordPress admin area and go to Settings -> Reading.
  2. Scroll down and locate the Search Engine Visibility option.
  3. Check the option that says Discourage search engines from indexing this site.
  4. Save Changes, and that’s it! WordPress will automatically edit its robots.txt file for you.
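Under the hood, this setting makes WordPress emit a robots meta tag on every page (the exact output varies slightly between WordPress versions); the generated markup looks roughly like this:

```html
<head>
  <!-- Added automatically when "Discourage search engines" is checked -->
  <meta name='robots' content='noindex, nofollow' />
</head>
```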

Editing robots.txt File Manually

If you prefer the manual option, you can use File Manager or an FTP client to edit the robots.txt file.

In this article, we’ll show you how to do it through the hPanel’s File Manager:

  1. Log in to hPanel and locate File Manager under the Files area.

  2. Go to your WordPress root directory folder (in most cases, it’s public_html) and find the robots.txt file. If you can’t find it, create a new blank file.
  3. Right-click on the file and select Edit.

  4. Enter the following syntax:

    User-agent: *
    Disallow: /

The code above will prevent search engines from indexing your whole site. If you want to apply the disallow rule to a specific page, write the page’s subdirectory and slug. For example: Disallow: /blog/food-review-2019.

The syntax in robots.txt files is case-sensitive, so be careful when editing.
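For reference, a complete file that blocks crawlers from only one section while leaving the rest of the site crawlable would look like this (the path is illustrative):

```
User-agent: *
Disallow: /blog/food-review-2019
```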

2. Password Protecting Your WordPress Website

Search engines and web crawlers don’t have access to password-protected files. Here are a few methods to password protect your WordPress site:

Using Hosting Control Panel

If you are a Hostinger client, you can password protect your website using hPanel’s Password Protect Directories tool:

  1. Access hPanel and navigate to Password Protect Directories.
  2. Enter your root directory into the first field.
  3. Once the directory is selected, enter your username and password and click Protect.

If your root directory is public_html, leave the directory field blank.

The process in cPanel is also quite similar:

  1. Log in to your cPanel account and head to Directory Privacy.

  2. Select your root directory. In our case, it’s public_html.
  3. Check the Password protect this directory option, and name the protected directory. Press Save.
  4. Create a new user to log in to the protected website, and that’s it!
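Both panels generate standard Apache access rules behind the scenes. If you manage your own server, a minimal manual equivalent is a .htaccess file in the directory you want to protect (the .htpasswd path is an example; generate that file with the htpasswd utility):

```apache
# Require a username/password for everything in this directory
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /home/user/.htpasswd
Require valid-user
```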

Using WordPress Plugins

There are tons of plugins that can help to password protect your site. However, the Password Protected plugin might just be the best one out there. It has been tested with recent WordPress releases, and it’s pretty straightforward to use.

After installing and activating the plugin, head to Settings -> Password Protected and configure the settings to match your needs.

3. Removing Indexed Page From Google

Don’t worry if Google has indexed your site. You can remove it from SERPs by following these steps:

  1. Set up Google Search Console for your website.
  2. Access Google Search Console of your newly added website and scroll down to Legacy tools and reports -> Removals.
  3. Click the Temporarily hide button and enter the URL you want to remove from Google.
  4. On a new window, choose Clear URL from cache and temporarily remove from search, then Submit Request.

And that’s it! Google will temporarily remove your site from search results. Make sure to apply the previous methods to prevent Google from indexing your site again.

Conclusion

There you have it! Quick and easy ways to discourage search engines from indexing your sites. Here’s a quick recap of the methods we’ve learned today:

  • Edit the robots.txt file, which can be performed automatically or manually.
  • Password protect your website by using a plugin or your hosting control panel.
  • Remove indexed pages from Google via Google Search Console.

If you have any other methods, or if you have any questions, please do let us know in the comments. Good luck!

Monday, February 10, 2020

16 On-Page SEO Factors You Must Update On Your Blog At All Times


Why is on-page SEO so important?

Of course, your classic on-page SEO tweaks still work. Even better than before, actually. And I’m not the only one who says that. Take it from Google too.

Behind those fancy AI-based algorithm updates lie your usual keyword optimization hacks. Without keyword input and relevant related words, Google’s bots simply wouldn’t be able to understand your content and place it where relevant.

Other studies like this one from Backlinko also justify the use of on-page SEO methods. Just run any search for a competitive keyword and you’ll notice most websites try to keep their on-page factors clean and relevant.

When done the right way, optimizing your pages for optimal ranking can also:

  • Boost, even double, your website traffic
  • Bring in more leads
  • Improve your click-through-rates
  • Increase time on page
  • Reduce bounce rates
  • Match reader intent
  • Position you as a thought leader in your industry
  • And so much more!

On-page SEO factors to optimize right away

But you have so many factors to optimize, where do you start?


Below are all the on-page SEO factors that are worth your time:

1. SEO-friendly URL

Use a short URL that includes the keyword.

As an example:

www.domain.com/blog/on-page-seo

is better than a default URL

www.domain.com/blog/sfeogytytuyjyj.html

or a long one

www.domain.com/blog/on-page-seo-factor-to-optimize-this-year.html

or the like.

Make sure you think this through before you publish the article. Changing the URL after will make you lose your links unless you add a redirect.
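If you do have to change a published URL, a permanent (301) redirect preserves most of the link value. On an Apache server this can be a one-line .htaccess rule (the paths here are illustrative):

```apache
# Permanently redirect the old URL to the new one
Redirect 301 /blog/on-page-seo-factor-to-optimize-this-year.html /blog/on-page-seo
```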

Another thing to pay attention to: make sure you’re not already using the same keyword in the URL of a more profitable page.

For instance, if you’re an SEO agency you might want a page like:

www.domain.com/on-page-seo

But if you later decide to also put together a guide for the same keyword, you won’t be able to use the same URL so you’ll have to publish it on your blog as www.domain.com/blog/on-page-seo or change the URL.

2. Title Tag

Your main keyword should ideally be placed at the beginning of your title, especially in the case of the SEO title. You can set the SEO title separately with the Yoast SEO WordPress plugin.

Here are 4 examples for the “blogging mistakes” keyword where the 1st result is optimal:


Unlike the URL, the SEO title is not final, so you can change it at any time after publishing the post.

If you’ve got a bit more time, do some A/B testing on your SEO title. Change it every 3-4 months to see which one works best for your CTR.

3. Headings! Headings!

Nailing the keyword optimization of your headings is so important, yet so many writers seem to skip this part.

You’ve got multiple options here:

  1. Take your main keyword and create natural headings around it. This means your keyword will appear in 2-3 headings.
  2. Place your main keyword in the 2-3 headings mentioned at option 1 and optimize the rest of your headings for secondary keywords.

Above all, remember to include at least H2s and H3s in your text. [like this article, btw] Ideally, you’d have separate designs for these so they are easily distinguishable by readers.
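In the page template, that hierarchy is just nested heading tags (the titles here are illustrative):

```html
<h2>Common blogging mistakes</h2>
<h3>Mistake #1: Skipping keyword research</h3>
<h3>Mistake #2: Ignoring reader intent</h3>
```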

4. The first 100 words

Another ignored on-page SEO factor is including your keyword in the first 100 words of your article. I don’t always do this because sometimes it doesn’t seem natural to shove a keyword in the first few words since you might want to set the scene first.

But if you can manage to add it in the first sentence, way to go! Google will automatically consider this topic to be of top importance to the article and crawl it accordingly.

5. Frequent keywords, but no stuffing!

Stuffing is actually quite hard to do these days without readers reporting your content.

Keyword stuffing looks something like:

These are our blog blogging blogger tips for bloggers who blog on blogs for bloggers…

Not cool. I know.

Instead, natural keyword frequency looks more like:

We’ve put together this list of tips for first-time bloggers who are looking to improve the quality of their blog posts…

And then just use your keywords sparingly and in a natural way throughout the entire article.

6. Outbound links

These are the links you add to relevant content on other websites. The general rule (or best practice if you want to) is to only link to materials that will be of value to your readers or support your claims.

You can try my trick and create a strategy to always follow for this. For instance, I only link to reports or studies and occasionally to external tools readers might want to test.

Don’t add too many, though. Google hasn’t disclosed an acceptable number of outbound links, but most blog guidelines [and my own experience] will accept a maximum of 3 links.

Also, try not to link to content that targets the same keyword you’re aiming for. Google will automatically assume that even you consider that content better, so it will be much more difficult to outrank that competitor.

7. Internal links

We’ve got two situations here.

The first case is when you add links to your other blog posts or web pages in this article you’re currently putting together. By all possible means, make sure the links are relevant to your topic.

The second instance happens after you publish your article. Try to find 2-4 of your other posts that are relevant to your new post and place a link to this new article on relevant keywords only.

Disclaimer: Avoid link stuffing. This means you shouldn’t use your top-performing article to link to every possible post of yours.

For all external and internal links, check them regularly to make sure no links are broken and the content behind them hasn’t fundamentally changed in a way that no longer matches your needs.

8. Page speed

Use smaller images, enable file compression, reduce redirects, minify CSS, JavaScript, and HTML, improve your server’s response time, and fix anything else PageSpeed Insights tells you to change.
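As an example of one of these fixes: on an Apache server, file compression can often be enabled with a few .htaccess directives (a sketch, assuming the mod_deflate module is available on your host):

```apache
<IfModule mod_deflate.c>
  # Compress text-based assets before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```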

9. Responsive design

Google has been seriously penalizing websites that are not responsive.

Mobile traffic is still growing so even if you don’t believe Google will have a say, your readers will when they won’t be able to click your call-to-action button.
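Responsiveness mostly comes down to your theme’s CSS, but it starts with the viewport meta tag in the page’s head:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```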

10. Meta description

This is the small snippet of content that users will see under your SEO title in the search results.

Two secrets here:

  1. Include your keyword for Google.
  2. Include a CTA or an enticing fact or incentive to make people want to click on your post. This will also boost your click-through-rate.

Yes, the meta description too can be changed even years after first publishing the article.

Go way back in your blog’s history and check all meta descriptions. You’ll be surprised to discover missing ones too.
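In the page source, the SEO title and meta description are two plain tags in the head (the text here is illustrative):

```html
<head>
  <title>15 Blogging Mistakes to Avoid | Example Blog</title>
  <meta name="description" content="Avoid these common blogging mistakes: a checklist with quick fixes to boost your CTR.">
</head>
```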

11. Review the readers’ intent

So you have this post that ranked well for 2 years, but then it died. Run SERP research again to see if readers’ intent has changed, or if your competitors have managed to answer their needs better.

This is also a perfect time for you to review the entire structure of the article and run a new keyword research to check for new potential secondary keywords to target. Keyword volumes and difficulty can change often. Every week even. So keeping an eye on the evolution of the keywords that are highly valuable for your business is vital to ensure you maintain your position.

12. Remove duplicate content in all its forms

Canonical links will be your best friend here, especially for e-commerce websites, which commonly have duplicate content on their category pages.

But even if you’re not an e-commerce website, I recommend making sure you have the canonical link set for every page of yours. Yes, that includes articles.

A much-too-common mistake beginner marketers make is adding their blog posts to multiple categories or tags on their blog. This inevitably creates duplicate content, so resist the temptation and stick to one category.
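A canonical link is a single tag in the page’s head that tells Google which URL is the master copy of the content (the URL here is illustrative):

```html
<link rel="canonical" href="https://www.domain.com/blog/on-page-seo" />
```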

13. ALT tags and file names

You’re probably already aware that the keyword you want to rank for should also be part of your ALT text in at least one image. [Still, try to add ALT tags to all images and include secondary keywords in them.]

Disclaimer: Don’t do keyword stuffing here either. A good ALT tag is “blogging mistakes for beginners”. A bad ALT tag looks like this: “blogging mistakes bloggers blogs beginner mistakes”.

What many writers are still not aware of is the importance of having keywords in the file name of your images as well. You know, blogging-mistakes.png instead of screenshot56.png.
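Putting both tips together, an optimized image tag might look like this (the file name and alt text are illustrative):

```html
<img src="/images/blogging-mistakes.png" alt="blogging mistakes for beginners">
```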

14. Data markup

This only applies to specific websites where you post news, reviews, recipes, and the like.

With the markup in place, your results can appear in search as rich snippets instead of plain blue links.

There are many options here that can be added and tested at any time. Head to Schema.org for all the details and see if there’s anything right for your blog type.
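As an illustration, the markup is usually added as a JSON-LD script in the page’s head; a minimal Article example (all values are placeholders) looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "16 On-Page SEO Factors You Must Update On Your Blog",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-02-10"
}
</script>
```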

15. Got social media?

If you don’t yet have social media sharing buttons on your posts, go right now and get a plug-in. Many tools let you customize the text readers will share (or at least the suggestion) so they can bring in more views via their own networks.

16. No more black hat techniques!

Finally, make sure your website is free of any black hat SEO techniques. These include spammy links, cloaking, doorway pages, hidden text and links, spam comments, duplicate content, link farms, and even paid links.

Surprisingly or not, Google is starting to pick up on paid links. That’s why many large websites strictly prohibit selling links on their website. Unfortunately, you’ll still occasionally receive emails from writers who are willing to provide such links. Just say no. It’s cheaper, more valuable, and easier to become an author on that website yourself anyway.

Where to take your on-page SEO efforts next?

Bookmark this article or create your own checklist of everything you need to change. If possible, make sure you analyze all of these on-page SEO factors and how they perform on your own pages.

I won’t lie to you and tell you the process is easy or quick. It can take months if you have a year’s worth of content or more.

But it’s worth it!

Got any extra tips on optimizing the on-page SEO factors for your website? What has worked for you and where are you still experimenting? Let us know!

Read more at: business2community.com