
Showing posts with label Google SEO Updates. Show all posts

Friday, October 06, 2023

What are Zombie pages in SEO: chase them away from your site!

Zombie Pages

Zombie pages are pages on a website that generate little or no traffic and are difficult or impossible to access through search engine results.

In this article, we will give you our advice on how to detect these pages and how to treat them so that they do not affect the visibility of your entire site.

Summary:

  1. Why do we have to deal with the zombie pages?
  2. The different types of zombie pages
  3. How to locate these pages?
  4. How to deal with zombie pages?

Why do we have to deal with the Zombie Pages?

Detecting and treating zombie pages allows you to:

  • Improve the user experience. Removing or fixing a site's zombie pages provides a better experience for visitors and improves the site's bounce and conversion rates.
  • Improve the quality score awarded by Google. The search engine judges a site as a whole: offsetting the negative effects of zombie pages raises its overall score and therefore improves its positioning.
  • Optimize the crawl budget. Removing zombie pages, or blocking them from indexing, redistributes the crawl time allocated to a site toward its most significant pages.

The different types of Zombie Pages.

1 – Unindexed Pages.

These pages usually have technical problems, such as loading times that are too long or scripts that fail to execute. Google divides the time its crawlers spend on a site according to the number of pages it contains, so it will choose not to index pages that slow down its task and that would, in any case, also have a high chance of being abandoned by visitors.

These pages are absent from Google's results index; they are not visited, or at least receive no direct traffic from the search engine.

2 – “Non Responsive” pages.

Pages that are not optimized for mobile, or that take too long to navigate on a phone, are also at a disadvantage. Google penalizes them because it considers them to offer a degraded user experience.

These pages are present in Google’s results but their ranking is penalized.

3 – Pages with obsolete or low-quality content.

There are two types:

  • Published pages that have not been updated for several years. Google may downgrade such pages, considering them no longer current.
  • Pages with thin content (fewer than 300 words) or without real value are also penalized.

These pages are gradually downgraded in the results.

4 – Pages not (or insufficiently) optimized for SEO.

These pages can be quite useful and interesting for visitors, but they do not follow SEO best practices: missing alt, h1, h2, or h3 tags, a bad or overly long title, no keywords, and so on.

These pages are downgraded in search engine results.

5 – Ancillary pages.

These are often pages linked from the site's footer: contact, legal notices, terms and conditions, GDPR... Even if they are of little interest to visitors, they contain legal information and their presence on a site is an SEO requirement.

The absence of these pages negatively affects a site's SEO.

6 – Orphan pages.

These pages are simply not found by crawlers: they are not connected by any internal link to other pages of the site, and they are not accessible through the site's menu. They float in a parallel universe with almost no chance of being visited.

Several techniques exist to identify a site's orphan pages. One of the simplest (unless your site has thousands of pages) is to compare your XML sitemap with the Google index of your site (which you can obtain with a search such as “site:mywebsite.com” in Google). Compare the two lists to identify the pages present in your sitemap but absent from the Google index, then link those orphan pages to the rest of your site through internal linking.
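For a small site, this sitemap-versus-index comparison can be scripted. Below is a minimal Python sketch using only the standard library; the URLs are made-up examples, and the list of indexed pages would still be collected by hand from a "site:mywebsite.com" search.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Collect every <loc> URL listed in an XML sitemap string."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def orphan_candidates(sitemap_xml, indexed_urls):
    """Pages in the sitemap but missing from the indexed list:
    candidates to re-link from the rest of the site."""
    return sitemap_urls(sitemap_xml) - set(indexed_urls)
```

Feed `orphan_candidates()` your sitemap XML and the hand-collected indexed URLs; whatever comes back is worth linking internally.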

How to locate the Zombie Pages?

If you want to make the diagnosis yourself without going through an agency, we advise you to use Google Search Console. There you will find the tools to detect pages with low or declining performance.

The Performance tab (+ New + Page), very easy to use (especially if your site is only a few pages long), lets you compare how traffic to each of your pages evolves over time and thus detect those experiencing a sharp drop.

The Excluded tab (Coverage -> Excluded) lets you analyze two types of zombie pages:

  • The “Crawled - currently not indexed” pages

These are pages that Google decided not to index during its last crawl considering their content too weak, duplicated, or containing information already present on many other sites. It is therefore advisable here to first complete and/or rewrite the content of these pages and wait for Google’s robots to come and explore them again.

  • The “Discovered - currently not indexed” pages.

These are pages that Google has chosen not to index due to technical problems (e.g. when the server response time is too long).

How to deal with the Zombie Pages?

Some pages just need to be updated or optimized, while others really need to be removed and redirected.

Improve these pages.

As zombie pages are often pages with overly long loading times, missing from the site's internal link structure, or with unsuitable content, you need to rehabilitate them in the eyes of Google as well as in the eyes of your visitors.

  • Update and enrich the content of these pages;
  • Check that they contain the right keywords and that the semantic richness of the text suits the subject matter;
  • Improve UX and loading time;
  • Add links from them to other related pages on the site;
  • Add internal inbound links from other pages on your site;
  • Share them on your social networks;
  • Do not change their URLs.

Delete them!

Don’t launch into this delicate operation without checking the pages you are going to delete on a case-by-case basis.

If there are zombie pages on your site that have outdated content and do not generate any conversions, then it is possible to delete them.

On the other hand, pages that interest only a few visitors but which have a very “profitable” conversion rate should be kept.

Of course, some zombie pages are essential, such as the legal notices, the General Conditions of Sale, and the GDPR pages... These generate little or no visits but are to be kept.

Once your zombie pages have been deleted, don’t forget to redirect (301) the URLs of these pages to the pillar pages of the appropriate category or to other pages that deal with a similar theme.
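If your site runs on code you control, that redirect map can be kept explicit. Below is a minimal Python sketch with hypothetical URLs; in practice the same mapping would usually live in your server configuration (Apache or Nginx rewrite rules).

```python
# Hypothetical mapping from deleted zombie-page URLs to the pillar or
# related pages that should receive their traffic and link equity.
REDIRECTS = {
    "/old-zombie-page": "/category/pillar-page",
    "/outdated-review-2015": "/category/reviews",
}

def redirect_for(path):
    """Return (status, location): a 301 for mapped paths, a 404 otherwise."""
    target = REDIRECTS.get(path)
    return (301, target) if target else (404, None)
```

A request handler would call `redirect_for()` and send the 301 with a `Location` header when a target exists.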

Tuesday, May 31, 2022

Google Launching May 2022 Broad Core Algorithm Update

 Google is releasing a broad core algorithm update on May 25, 2022. It will take roughly two weeks to fully roll out.

Google May 2022 Broad Core Algorithm Update

 Google confirms a broad core algorithm update, called the May 2022 core update, is rolling out today.

Core updates are designed to make search results more relevant for users. Though the update is launching today, it will take 1-2 weeks for this update to fully roll out.


Danny Sullivan, Google’s Public Liaison for Search, notes that changes to site performance in search results are to be expected.

“Core updates are changes we make to improve Search overall and keep pace with the changing nature of the web. While nothing in a core update is specific to any particular site, these updates may produce some noticeable changes to how sites perform…”

When a core update rolls out, Google is known for pointing to the same guidance it published in 2019.


Nothing has changed there, as Google references the same document with respect to this update.

To sum it up, Google’s general advice regarding core updates is as follows:
  • Expect widely noticeable effects, such as spikes or drops in search rankings.
  • Core updates are “broad” in the sense that they don’t target anything specific. Rather, they’re designed to improve Google’s systems overall.
  • Pages that drop in rankings aren’t being penalized; they’re being reassessed against other web content that has been published since the last update.
  • Focusing on providing the best possible content is the top recommended way to deal with the impact of a core algorithm update.
  • Broad core updates happen every few months. Sites might not recover from one update until the next one rolls out.
  • Improvements do not guarantee recovery. However, choosing not to implement any improvements will virtually guarantee no recovery.
It has been six months since the last Google core update, which rolled out in November 2021.

Those who have been working hard on their site during that time may start to see some noticeable improvements to search rankings.

On the other hand, those who have left their site sit idle may begin to see themselves outranked by sites with more relevant content.

It’s too early to start assessing the impact, however, as this update will take a week or longer to roll out.

Your rankings may fluctuate during that time, so don’t be alarmed if you notice changes right away.

Google will confirm when the update is finished rolling out, then it will be time to start doing your analysis.

Saturday, November 20, 2021

Google is testing the IndexNow protocol for sustainability

After both Microsoft Bing and Yandex announced IndexNow, Google promises to give it a try.

Index Now


A Google spokesperson has confirmed that the search company will be testing the new IndexNow protocol first introduced by Microsoft Bing and Yandex a few weeks ago. Google said while its crawling efforts are efficient, Google will test to see if it can improve its overall sustainability efforts by leveraging the IndexNow protocol.

Google’s statement.

Google told Search Engine Land “we take a holistic approach to sustainability at Google, including efforts to develop the most efficient and accurate index of the web.” “We have been carbon neutral since 2007 and we are on pace to be carbon-free by 2030 across both our data centers and campuses. We’re encouraged by work to make web crawling more efficient, and we will be testing the potential benefits of this protocol,” a Google spokesperson added.

What is IndexNow?

IndexNow provides a way for website owners to instantly inform search engines about the latest content changes on their sites. IndexNow is a simple ping protocol that lets search engines know that a URL and its content have been added, updated, or deleted, allowing them to quickly reflect the change in their search results.

How does it work?

The protocol is very simple: all you need to do is create a key on your server, and then submit a changed URL to one search engine to notify all IndexNow-participating search engines of the change.

The steps include:
  1. Generate a key supported by the protocol using the online key generation tool.
  2. Host the key in a text file named with the value of the key at the root of your website.
  3. Start submitting URLs when your URLs are added, updated, or deleted. You can submit one URL or a set of URLs per API call.

Submitting one URL is as easy as sending a simple HTTP request containing the changed URL and your key:
https://www.bing.com/IndexNow?url=url-changed&key=your-key and the same would work by using https://yandex.com/indexnow?url=url-changed&key=your-key
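That ping URL can be assembled programmatically. Here is a small Python sketch using only the standard library; the endpoint and key follow the pattern above, and the page URL is a made-up example.

```python
from urllib.parse import urlencode
# from urllib.request import urlopen  # uncomment to actually send the ping

def indexnow_ping_url(endpoint, changed_url, key):
    """Build the one-URL IndexNow GET request shown above."""
    return endpoint + "?" + urlencode({"url": changed_url, "key": key})

ping = indexnow_ping_url("https://www.bing.com/IndexNow",
                         "https://example.com/new-post", "your-key")
# urlopen(ping) would notify every participating engine via this one request
```

`urlencode` percent-escapes the page URL, which is required since it is passed as a query parameter.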

How does it work with Google?

We asked Google if https://www.google.com/IndexNow?url=url-changed&key=your-key would work with the search engine and we are awaiting a reply. The protocol works such that if you submit a URL to Bing or Yandex (and, one would now assume, Google), that single submission is shared with all the search engines participating in the protocol. So submitting to Bing would essentially also submit to Yandex, Google and other participating search engines.

Google changed its mind?

When this was first introduced, we were told that Google was aware of the IndexNow initiative and had been asked to participate, but declined. I guess Google had a change of heart?

Google crawling is efficient.

Google’s crawling mechanism is supposedly very efficient, and Google has continued to improve its crawling efficiency. Last year Googlebot began supporting HTTP/2, the next generation of the web’s fundamental data transfer protocol. HTTP/2 is significantly more efficient than its predecessors and saves resources for both Google and websites. Google uses HTTP/2 in over half of all crawls.

Why we care.

Like we said before, instant indexing is an SEO’s dream when it comes to giving search engines the most updated content on a site. Google has been very strict about its Indexing API, which is currently used only for job postings and livestream content. Google’s change of heart here can be an exciting change for SEOs and site owners.

The protocol is very simple and it requires very little developer effort to add this to your site, so it makes sense to implement this if you care about speedy indexing.


Friday, November 19, 2021

Google November 2021 Core Update

Google is rolling out a new broad core update today named the November 2021 Core Update. This is the third core update Google released in 2021.

Google Nov Core Update

The announcement. Google announced this rollout on the Google Search Central Twitter account, not the Search Liaison account, which had handled all previous core update announcements.

Rollout started at about 11am ET. Google confirmed that the rollout began at about 11am ET, saying: “The November 2021 Core Update is now rolling out live. As is typical with these updates, it will typically take about one to two weeks to fully roll out.”

Timing before holidays. It is a bit shocking to see Google roll out this update before, and likely during (assuming this is a normal two-week rollout), the biggest online holiday shopping season. Black Friday and Cyber Monday are less than two weeks away, and Google is rolling out this update starting today.

Previously, Google took breaks before the holiday shopping season; former Googler Matt Cutts called it Google’s gift to webmasters.

Previous core updates.

The most recent core update was the July 2021 core update, preceded by the June 2021 core update, which was slow to roll out but a big one. Before those came the December 2020 core update, which was very big, bigger than the May 2020 core update; that one was also big and broad and took a couple of weeks to fully roll out. Earlier still was the January 2020 core update, which we analyzed at the time, and before that the September 2019 core update. That update felt weaker to many SEOs and webmasters, as many said it didn’t have as big an impact as previous core updates. Google also released an update in November 2019, but that one was specific to local rankings. You can read more about past Google updates in our earlier coverage.

What to do if you are hit.

Google has previously given advice on what to consider if you are negatively impacted by a core update. There aren’t specific actions to take to recover; in fact, a negative ranking impact may not signal anything is wrong with your pages. However, Google has offered a list of questions to consider if your site is hit by a core update. Google did say you can see a bit of recovery between core updates, but the biggest change would come after another core update.

Why we care.

Whenever Google updates its search ranking algorithms, your site can do better or worse in the search results. Knowing when Google makes these updates gives us something to point to when deciding whether a change was something you did on your website or something Google changed in its ranking algorithm. Today, we know Google is releasing a core ranking update, so keep an eye on your analytics and rankings over the next couple of weeks.

Tuesday, August 11, 2020

Google Update 10th August 2020 – what we know so far.

From what we can tell, Google has begun rolling out an enormous Google Search ranking algorithm update from the 10th of August. While it is yet to be officially confirmed by Google’s Search Liaison, the chatter amongst the SEO and search community is loud and clear: so far, it does not look good.

From what it looks like, the Google update began to roll out at around 2 pm ET on the 10th of August. The most significant changes seem to be in rankings, with no clear algorithmic pattern. With many speculating that the update looks more like a bug or a bad algorithm test, page-one rankings for many authoritative and successful websites seem to have tanked.

10 August Google Update


In the case of SEO in Australia, we’ve seen all of our competitors and the big players in our industry suffer on page one. A couple of our clients are being outranked by forums, dynamically generated Amazon listings, random Facebook posts and even job listings. What is going on here?

We’re still not sure what’s happening. From what we can tell, the algorithm shift has adjusted search engine rankings so that poor-quality pages are gaining the top ten spots on search results. Our early research into the algorithm shifts has also uncovered that many local search results are being completely outranked by eBay, Amazon, directory listings and cloaking websites that are not only irrelevant but incredibly spammy for these types of searches.

From what we have analysed, the results that were ranking on page one yesterday now all seem to be sitting on pages six, seven and eight of Google. Page one is mysteriously cluttered with spammy, cloaking, phishing websites. Ecommerce websites seem to have slipped in favour of forum, directory and social media listings.



There is plenty of speculation happening on Twitter, Webmaster World, and SEO forums. While there are a couple of people mentioning that their sites have benefitted from the SERP changes, the majority are reporting on a brutal shift in their traffic and rankings. Worldwide, many have reported that their sites are being de-ranked in favour of spammy websites and directory listings. There has also been a lot of talk about drastic changes within short spaces of time – with results being updated every 30 minutes or so for some.

It makes little sense for Google to fill the first pages of search results with unrelated forums, cloaking websites, social media sites and directory listings. Google’s edge over competing search engines is that its algorithm provides the most logical and pleasing user experience. Logically, this update would seem out of line with providing high-quality and useful organic search results.



Again, it is far too early to conclude anything, but from regular checks on SERP trends it’s pretty clear that something huge is happening. Google is yet to confirm any changes. There’s every chance we’re witnessing an enormous glitch or bug. But there’s also a chance that this may be part of a new Google search ranking algorithm update.

Continue to monitor your rankings and watch for any changes. If we’ve learnt anything from previous Google algorithm updates, it’s that it’s important to wait until the update has fully rolled out or Google has confirmed the suspicions. Don’t do anything drastic, and if your website has suddenly tanked in the search engine results, you’re not alone.

Monday, July 27, 2020

How to Prevent Search Engines from Indexing WordPress Sites?

Prevent Search Engine

Site owners will do anything to get their websites indexed. However, you might not want search engines to crawl through your website if it’s still in development. In a case like this, it’s recommended to discourage search engines from indexing your site. Stick with us if you want to learn more about this topic!

  1. Discouraging Search Engines From Indexing WordPress Sites
     • Using the WordPress Built-In Feature
     • Editing the robots.txt File Manually
  2. Password Protecting Your WordPress Website
     • Using the Hosting Control Panel
     • Using WordPress Plugins
  3. Removing Indexed Pages From Google

Why Would You Want To Stop Search Engines From Indexing Your Site?

There are some cases where people want to discourage search engines from indexing their sites:

  • Unfinished websites — at this stage of trial and error, it’s best not to have your website visible to the public.
  • Restricted websites — if you plan to run an invite-only website, you do not want it listed on SERPs.
  • Test accounts — web owners often create a duplicate site for testing and trial purposes. Since these sites are not designed for the public, don’t let them get indexed by search engines.

So how do you block search engines from indexing your site? Take a look at the options below and try them yourself.

1. Discouraging Search Engines From Indexing WordPress Sites

The simplest way to stop search engines from indexing your website is by preventing them from crawling it. To do it, you need to edit your website directory’s robots.txt file. Here are a few ways to achieve that:

Using the WordPress Built-In Feature

Editing WordPress robots.txt is quite easy as you only need to use a WordPress built-in feature. Here’s how:

  1. Log in to the WordPress admin area and go to Settings -> Reading.
  2. Scroll down and locate the Search Engine Visibility option.
  3. Check the option that says Discourage search engines from indexing this site.
  4. Save Changes, and that’s it! WordPress will automatically edit its robots.txt file for you.

Editing robots.txt File Manually

If you prefer the manual option, you can use File Manager or an FTP client to edit the robots.txt file.

In this article, we’ll show you how to do it through hPanel’s File Manager:

  1. Log in to hPanel and locate File Manager under the Files area.

  2. Go to your WordPress root directory folder (in most cases, it’s public_html) and find the robots.txt file. If you can’t find it, create a new blank file.
  3. Right-click on the file and select Edit.

  4. Enter the following syntax:

    User-agent: *
    Disallow: /

The code above will prevent search engines from crawling your whole site. If you want to apply the disallow rule to a specific page, write the page’s subdirectory and slug. For example: Disallow: /blog/food-review-2019.

The directives in robots.txt files are case-sensitive, so be careful when editing.
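You can sanity-check a robots.txt rule before uploading it with Python's built-in robotparser. A quick sketch using the site-wide disallow from the example above (example.com is a placeholder domain):

```python
import urllib.robotparser

# The directives from the example above, blocking all crawlers site-wide
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("Googlebot", "https://example.com/blog/food-review-2019"))  # False
```

If the parser says a URL may still be fetched, your disallow rule is not doing what you think.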

2. Password Protecting Your WordPress Website

Search engines and web crawlers don’t have access to password-protected files. Here are a few methods to password protect your WordPress site:

Using Hosting Control Panel

If you are a Hostinger client, you can password protect your website using hPanel’s Password Protect Directories tool:

  1. Access hPanel and navigate to Password Protect Directories.
  2. Enter your root directory into the first field.
  3. Once the directory is selected, enter your username and password and click Protect.

If your root directory is public_html, leave the directory column blank.

The process in cPanel is also quite similar:

  1. Log in to your cPanel account and head to Directory Privacy.

  2. Select your root directory. In our case, it’s public_html.
  3. Check the Password protect this directory option, and name the protected directory. Press Save.
  4. Create a new user to log in to the protected website, and that’s it!

Using WordPress Plugins

There are tons of plugins that can help to password protect your site. However, the Password Protected plugin might just be the best one out there. It’s been tested with the new WordPress update, and it’s pretty straightforward to use.

After installing and activating the plugin, head to Settings -> Password Protected and configure the settings to match your needs.

3. Removing Indexed Page From Google

Don’t worry if Google has indexed your site. You can remove it from SERPs by following these steps:

  1. Set up Google Search Console for your website.
  2. Open Google Search Console for your newly added website and scroll down to Legacy tools and reports -> Removals.
  3. Click the Temporarily hide button and enter the URL you want to remove from Google.
  4. On a new window, choose Clear URL from cache and temporarily remove from search, then Submit Request.

And that’s it! Google will temporarily remove your site from search results. Make sure to apply the previous methods to prevent Google from indexing your site again.

Conclusion

There you have it! Quick and easy ways to discourage search engines from indexing your sites. Here’s a quick recap of the methods we’ve learned today:

  • Edit the robots.txt file, which can be performed automatically or manually.
  • Password protect your website by using a plugin or your hosting control panel.
  • Remove indexed pages from Google via Google Search Console.

If you have any other methods, or if you have any questions, please do let us know in the comments. Good luck!

Monday, February 10, 2020

16 On-Page SEO Factors You Must Update On Your Blog At All Times

On-page SEO Checklist

Why is on-page SEO so important?

Of course your classic on-page SEO tweaks still work. Even better than before, actually. And I’m not the only one who says that. Take it from Google too.

Behind those fancy AI-based algorithm updates lie your usual keyword optimization hacks. Without keyword input and relevant related words, Google’s bots simply wouldn’t be able to understand your content and place it where relevant.

Other studies like this one from Backlinko also justify the use of on-page SEO methods. Just run any search for a competitive keyword and you’ll notice most websites try to keep their on-page factors clean and relevant.

When done the right way, optimizing your pages for optimal ranking can also:

  • Boost, even double, your website traffic
  • Bring in more leads
  • Improve your click-through-rates
  • Increase time on page
  • Reduce bounce rates
  • Match reader intent
  • Position you as a thought leader in your industry
  • And so much more!

On-page SEO factors to optimize right away

But you have so many factors to optimize, where do you start?

On-Page SEO

Below are all the on-page SEO factors that are worth your time:

1. SEO-friendly URL

Use a short URL that includes the keyword.

As an example:

www.domain.com/blog/on-page-seo

is better than a default URL

www.domain.com/blog/sfeogytytuyjyj.html

or a long one

www.domain.com/blog/on-page-seo-factor-to-optimize-this-year.html

or the like.

Make sure you think this through before you publish the article. Changing the URL afterwards will make you lose your backlinks unless you add a redirect.

Another issue to pay attention to: make sure you won’t need the same keyword in the URL of another, more profitable page.

For instance, if you’re an SEO agency you might want a page like:

www.domain.com/on-page-seo

But if you later decide to also put together a guide for the same keyword, you won’t be able to use the same URL so you’ll have to publish it on your blog as www.domain.com/blog/on-page-seo or change the URL.
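Generating a short, keyword-focused slug can be automated at publish time. A minimal Python sketch; the four-word cap is an arbitrary choice for illustration, not a Google rule:

```python
import re

def slugify(title, max_words=4):
    """Build a short, keyword-focused URL slug from a post title."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words[:max_words])

print(slugify("On-Page SEO Factors To Optimize This Year"))  # on-page-seo-factors
```

You would still review the result by hand so the slug actually matches the keyword you target.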

2. Title Tag

Your main keyword should ideally be placed at the beginning of your title, especially in the case of the SEO title. You can set this one separately with the Yoast WordPress plugin.

Here are 4 examples for the “blogging mistakes” keyword where the 1st result is optimal:

Title Tags

Unlike the URL, the SEO title is not final: you can change it at any time after publishing the post.

If you’ve got a bit more time, do some A/B testing on your SEO title. Change it every 3-4 months to see which one works best for your CTR.

3. Headings! Headings!

Nailing the keyword optimization of your headings is so important, yet so many writers seem to skip this part.

You’ve got multiple options here:

  1. Take your main keyword and create natural headings around it. This means your keyword will appear in 2-3 headings.
  2. Place your main keyword in the 2-3 headings mentioned at point 1 and optimize the rest of your headings for secondary keywords.

Above all, remember to include at least H2s and H3s in your text [like this article, btw]. Ideally, you’d have separate designs for these so they are easily distinguishable by readers.

4. The first 100 words

Another ignored on-page SEO factor is including your keyword in the first 100 words of your article. I don’t always do this because sometimes it doesn’t seem natural to shove a keyword in the first few words since you might want to set the scene first.

But if you can manage to add it in the first sentence, way to go! Google will automatically consider the topic to be of top importance to the article and crawl it accordingly.

5. Frequent keywords, but no stuffing!

Stuffing is actually quite hard to do these days without readers reporting your content.

Keyword stuffing looks something like:

These are our blog blogging blogger tips for bloggers who blog on blogs for bloggers…

Not cool. I know.

Instead, natural keyword frequency looks more like:

We’ve put together this list of tips for first-time bloggers who are looking to improve the quality of their blog posts…

And then just use your keywords sparingly and in a natural way throughout the entire article.

6. Outbound links

These are the links you add to relevant content on other websites. The general rule (or best practice, if you will) is to only link to material that will be of value to your readers or support your claims.

You can try my trick and create a strategy to always follow for this. For instance, I only link to reports or studies and occasionally to external tools readers might want to test.

Don’t add too many, though. Google hasn’t disclosed a number of outbound links that’s okay to use, but most blog guidelines [and my own experience] will accept a maximum of 3 links.

Also, try not to link to content that targets the same keyword you want to rank for. Google will assume that even you consider that content better, so it will be much more difficult to outrank that competitor.

7. Internal links

We’ve got two situations here.

The first case is when you add links to your other blog posts or web pages in this article you’re currently putting together. By all possible means, make sure the links are relevant to your topic.

The second instance happens after you publish your article. Try to find 2-4 of your other posts that are relevant to your new post and place a link to this new article on relevant keywords only.

Disclaimer: Avoid link stuffing. This means you shouldn’t use your top-performing article to link to every possible post of yours.

For all external and internal links, check them regularly to make sure they are not broken and that the content they point to hasn’t fundamentally changed and no longer matches your needs.
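Checking links at scale is easy to script. A minimal Python sketch using only the standard library; it issues HEAD requests, so run it gently against sites you don't own:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def link_status(url, timeout=10):
    """HEAD-request a link; return its HTTP status, or None if unreachable."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code   # the server answered, but with an error status
    except URLError:
        return None       # DNS failure, timeout, refused connection...

def is_broken(status):
    """Treat unreachable links and 4xx/5xx responses as broken."""
    return status is None or status >= 400
```

Run `is_broken(link_status(url))` over the URLs you extract from a post, and fix or remove whatever it flags.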

8. Page speed

Use smaller images, enable file compression, reduce redirects, minify CSS, JavaScript, and HTML, improve your server’s response time, and do anything else PageSpeed Insights tells you to change.

9. Responsive design

Google has been seriously penalizing websites that are not responsive.

Mobile traffic is still growing, so even if you don’t believe Google will have a say, your readers will when they can’t click your call-to-action button.

10. Meta description

This is the small snippet of content that users will see under your SEO title in the search results.

Two secrets here:

  1. Include your keyword for Google.
  2. Include a CTA, an enticing fact, or an incentive to make people want to click on your post. This will also boost your click-through rate.

Yes, the meta description too can be changed even years after first publishing the article.

Go way back in your blog’s history and check all meta descriptions. You’ll be surprised to discover missing ones too.
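Putting the two secrets together, a meta description might look like this (the wording is a hypothetical example, kept to roughly 150 characters):

```html
<!-- Keyword for Google plus an enticing CTA for readers -->
<meta name="description"
      content="Avoid these common blogging mistakes for beginners. See the exact checklist we use to fix them and grow traffic.">
```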

11. Review the readers’ intent

So you have this post that ranked well for 2 years but then it died. Do SERP research again to see if the readers’ intent has changed or if your competitors have managed to answer their needs better.

This is also a perfect time to review the entire structure of the article and run new keyword research to check for new potential secondary keywords to target. Keyword volumes and difficulty can change often, even weekly. So keeping an eye on the evolution of the keywords that are most valuable for your business is vital to maintaining your position.

12. Remove duplicate content in all its forms

Canonical links will be your best friend here, especially for e-commerce websites, which commonly have duplicate content on their category pages.

But even if you’re not an e-commerce website, I recommend making sure you have the canonical link set for every page of yours. Yes, that includes articles.

A much-too-common mistake beginner marketers make is adding their blog posts to multiple categories or tags on their blog. This inevitably creates duplicate content, so resist the temptation and stick to one category.
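A canonical link is a single line in the page’s `<head>` pointing at the preferred URL (the URL below is a hypothetical example):

```html
<!-- Tells search engines which URL is the authoritative version
     of this content, collapsing duplicates into one -->
<link rel="canonical" href="https://www.example.com/blog/blogging-mistakes/">
```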

13. ALT tags and file names

You’re probably already aware that the keyword you want to rank for should also be part of the ALT text of at least one image. [Still, try to add ALT tags to all images and include secondary keywords in them.]

Disclaimer: Don’t do keyword stuffing here either. A good ALT tag is “blogging mistakes for beginners”. A bad ALT tag looks like this: “blogging mistakes bloggers blogs beginner mistakes”.

What many writers are still not aware of is the importance of having keywords in the file name of your images as well. You know, blogging-mistakes.png instead of screenshot56.png.
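Both pieces of advice combine into a single image tag like this (file name and ALT text are illustrative):

```html
<!-- Descriptive, keyword-bearing file name plus a natural,
     non-stuffed ALT tag -->
<img src="/images/blogging-mistakes.png"
     alt="blogging mistakes for beginners">
```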

14. Data markup

This only applies to specific websites where you post news, reviews, recipes, and the like.

With data markup, your results will appear as rich snippets (ratings, images, and other details) instead of plain listings:

[Image: rich result with data markup vs. plain result without]

There are many options here that can be added and tested at any time. Head to Schema.org for all the details and see if there’s anything right for your blog type.
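As an illustration, a review page might embed a Schema.org snippet as JSON-LD in the page’s `<head>` (all names and values below are placeholders, not real data):

```html
<!-- Hypothetical Schema.org Review markup, embedded as JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Example Product" },
  "reviewRating": { "@type": "Rating", "ratingValue": "4.5", "bestRating": "5" },
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```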

15. Got social media?

If you don’t yet have social media sharing buttons on your posts, go right now and get a plug-in. Many tools let you customize the text readers will share (or at least the suggestion) so they can bring in more views via their own networks.

16. No more black hat techniques!
Finally, make sure your website is free of any black hat SEO techniques. These include spammy links, cloaking, doorway pages, hidden text and links, spam comments, duplicate content, link farms, and even paid links.

Surprisingly or not, Google is starting to pick up on paid links. That’s why many large websites strictly prohibit selling links on their website. Unfortunately, you’ll still occasionally receive emails from writers who are willing to provide such links. Just say no. It’s cheaper, more valuable, and easier to become an author on that website yourself anyway.

Where to take your on-page SEO efforts next?

Bookmark this article or create your own checklist of everything you need to change. If possible, make sure you analyze all of these on-page SEO factors and how they perform on your own pages.

I won’t lie to you and tell you the process is easy or quick. It can take months if you have a year’s worth of content or more.

But it’s worth it!

Got any extra tips on optimizing the on-page SEO factors for your website? What has worked for you and where are you still experimenting? Let us know!

Read more at: business2community.com

Thursday, December 19, 2019

BERT Explained: What You Need to Know About Google’s New Algorithm

Google announced that it has been rolling out a new update called BERT.

Google BERT Update

To give you an idea of how big of an update this is, it’s the biggest update since Google released RankBrain.

In other words, there is a really good chance that this impacts your site. And if it doesn’t, as your traffic grows, it will eventually affect your site.

But before we go into how this update affects SEOs and what you need to adjust (I will go into that later in this post), let’s first get into what this update is all about.

What is Bert?

BERT stands for Bidirectional Encoder Representations from Transformers.

You are probably wondering, what the heck does that mean, right?

Google, in essence, has adjusted its algorithm to better understand natural language processing.
Just think of it this way: you could put a flight number into Google and it typically shows you the flight status. Or a calculator may come up when you type in a math equation. Or if you put in a stock symbol, you’ll get a stock chart.

Or an even simpler example: you can start typing into Google and its autocomplete feature can figure out what you are searching for before you even finish typing it.

But Google had already figured all of that out before BERT. So let’s look at some examples of BERT in action.

The changes this algorithm update brings make results much more relevant for searchers, and that creates a better experience for you and me and everyone else who uses Google.

But how does it affect SEOs?

You need to change your SEO strategy

There are three types of queries people usually make when performing a search:
  1. Informational
  2. Navigational
  3. Transactional
An informational query comes from someone like a person looking to lose weight. They aren’t sure how, so they may search for “how to lose weight”.

And once they perform the search, they may find a solution such as different diets. From there they may search for a solution, using a navigational query such as “Atkins diet”.

Once someone figures out the exact solution, they then may perform a transactional search query, such as “the Atkins diet cookbook”.

From what we are seeing on our end, BERT is mainly impacting top-of-the-funnel keywords, which are informational keywords.

BERT


Now, if you want to not only maintain your rankings but also gobble up some of your competition’s rankings, a simple solution is to get very specific with your content.

Typically, when you create content (the easiest way to rank for informational keywords), SEOs tell you to create super-long content.

Yes, a lot of longer-form content does rank well on Google, but the algorithm doesn’t focus on word count; it focuses on quality.

Source: Neil's Blog

Wednesday, September 11, 2019

Evolving “nofollow” – New Ways to Identify The Nature of Links

Google Nofollow Update
Nearly 15 years ago, the nofollow attribute was introduced as a means to help fight comment spam. It also quickly became one of Google’s recommended methods for flagging advertising-related or sponsored links. The web has evolved since nofollow was introduced in 2005 and it’s time for nofollow to evolve as well.

Today, we’re announcing two new link attributes that provide webmasters with additional ways to identify to Google Search the nature of particular links. These, along with nofollow, are summarized below:

rel="sponsored": Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.

rel="ugc": UGC stands for User Generated Content, and the ugc attribute value is recommended for links within user generated content, such as comments and forum posts.

rel="nofollow": Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.
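As an illustration (the URLs are hypothetical), the three attributes look like this in markup:

```html
<!-- Paid or sponsored placement -->
<a href="https://advertiser.example.com" rel="sponsored">Advertiser</a>

<!-- Link left in a comment or forum post -->
<a href="https://example.com/some-page" rel="ugc">commenter's link</a>

<!-- Linked without implying endorsement -->
<a href="https://example.com/other-page" rel="nofollow">related page</a>
```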

When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes -- sponsored, UGC and nofollow -- are treated as hints about which links to consider or exclude within Search. We’ll use these hints -- along with other signals -- as a way to better understand how to appropriately analyze and use links within our systems.

Why not completely ignore such links, as had been the case with nofollow?

Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.

We know these new attributes will generate questions, so here’s a FAQ that we hope covers most of those.

Do I need to change my existing nofollows?

No. If you use nofollow now as a way to block sponsored links, or to signify that you don’t vouch for a page you link to, that will continue to be supported. There’s absolutely no need to change any nofollow links that you already have.

Can I use more than one rel value on a link?


Yes, you can use more than one rel value on a link. For example, rel="ugc sponsored" is a perfectly valid attribute which hints that the link came from user-generated content and is sponsored. It’s also valid to use nofollow with the new attributes -- such as rel="nofollow ugc" -- if you wish to be backwards-compatible with services that don’t support the new attributes.

If I use nofollow for ads or sponsored links, do I need to change those?

No. You can keep using nofollow as a method for flagging such links to avoid possible link scheme penalties. You don't need to change any existing markup. If you have systems that append this to new links, they can continue to do so. However, we recommend switching over to rel=”sponsored” if or when it is convenient.

Do I still need to flag ad or sponsored links?


Yes. If you want to avoid a possible link scheme action, use rel=“sponsored” or rel=“nofollow” to flag these links. We prefer the use of “sponsored,” but either is fine and will be treated the same, for this purpose.

What happens if I use the wrong attribute on a link?

There’s no wrong attribute except in the case of sponsored links. If you flag a UGC link or a non-ad link as “sponsored,” we’ll see that hint, but the impact -- if any at all -- would be at most that we might not count the link as a credit for another page. In this regard, it’s no different than the status quo of many UGC and non-ad links already marked as nofollow.

It is an issue going the opposite way. Any link that is clearly an ad or sponsored should use “sponsored” or “nofollow,” as described above. Using “sponsored” is preferred, but “nofollow” is acceptable.

Why should I bother using any of these new attributes?

Using the new attributes allows us to better process links for analysis of the web. That can include your own content, if people who link to you make use of these attributes.

Won’t changing to a “hint” approach encourage link spam in comments and UGC content?


Many sites that allow third-parties to contribute to content already deter link spam in a variety of ways, including moderation tools that can be integrated into many blogging platforms and human review. The link attributes of “ugc” and “nofollow” will continue to be a further deterrent. In most cases, the move to a hint model won’t change the nature of how we treat such links.

We’ll generally treat them as we did with nofollow before and not consider them for ranking purposes. We will still continue to carefully assess how to use links within Search, just as we always have and as we’ve had to do for situations where no attributions were provided.

When do these attributes and changes go into effect?


All the link attributes, sponsored, ugc and nofollow, now work today as hints for us to incorporate for ranking purposes. For crawling and indexing purposes, nofollow will become a hint as of March 1, 2020. Those depending on nofollow solely to block a page from being indexed (which was never recommended) should use one of the much more robust mechanisms listed on our Learn how to block URLs from Google help page.

Source: Webmaster Central Blog

Sunday, June 16, 2019

Google Core Update Finished Rolling out on 8 June 2019

June 2019 Core Update Roll Out


Google Core Algorithm Update (June-2019)
As warned, the June 2019 core update is slowly being rolled out from Google’s data centers that are located in different countries. The announcement about the roll out was made from the same Google SearchLiaison twitter account that made the pre-announcement.
The June 2019 Core Update is now live and rolling out to our various data centers over the coming days. — Google SearchLiaison (@searchliaison) June 3, 2019
The algorithm trackers have started detecting a spike in their graphs. This indicates that the latest broad core algorithm update, officially named the June 2019 Core Update, is starting to affect SERP rankings.

Since Google has updated its Quality Rater Guidelines a few days back with much more emphasis on ranking quality websites on the search, the latest update may be a quality patch for the search results page.

We will give you a detailed stat of the impact of the algorithm update on SERP as soon as we get the data from the algorithm trackers. Also, our detailed analysis of the websites hit by the update and the possible way to recover will follow.

June 2019 Core Update Pre-announcement


It has been officially announced that the search engine giant will roll out an important Algorithm Update on June 3rd. The latest update, which will be a Broad Core Algorithm Update like the one released in March, will officially be called the June 2019 Core Update.

It is the first time that Google is pre-announcing the launch of an Algorithm update. Here is the official Twitter announcement:
Tomorrow, we are releasing a broad core algorithm update, as we do several times per year. It is called the June 2019 Core Update. Our guidance about such updates remains as we’ve covered before. — Google SearchLiaison (@searchliaison) June 2, 2019

Unofficial Google Update of March 27th 2019


Yes, you heard it right. Google has made some significant changes to the algorithm during the final few days of the month of March.

We have seen Google making tweaks after the roll-out of Broad Core Algorithm updates, but the one we are witnessing now is huge, and some algorithm sensors have detected more significant ranking fluctuation than the one that happened on March 12th when Google launched its confirmed March 2019 Core Update.

The fluctuations that started on March 27th are yet to stabilize, and more and more webmasters are taking to forums after their website traffic got hit.

The latest tweak has come as a double blow for a few websites, as they lost traffic and organic rankings twice in the same month.

Source: Google SearchLiaison

Monday, April 22, 2019

4 Important Factor about Keyword Difficulty.

keyword difficulty
What is the Keyword SEO Difficulty?

What is the Keyword Difficulty in SEO?

Keyword difficulty (also known as keyword competition) is one of the most important metrics you should consider when doing keyword research. The higher the keyword difficulty, the harder it is to rank on the 1st SERP due to the high competition among the ranking websites.

It’s a critical metric alongside exact monthly search volumes and SERP analysis. It informs the selection of keywords that will help you improve SEO, bid on keywords in PPC campaigns, and much more.

How is the Keyword Difficulty calculated?


The calculation is based on the selected metrics by Moz, Majestic and our know-how, namely:
  • Domain Authority
  • Page Authority
  • Citation Flow
  • Trust Flow
The calculation consists of the following steps:
  1. Calculate the overall Link Profile Strength (LPS) for every website that ranks on the 1st Google SERP based on the selected Moz and Majestic metrics.
  2. Each metric has a different weight to make sure the results estimate how the real rankings evolve as much as possible.
  3. Take into account both high and low LPS values to calculate the overall Keyword SEO Difficulty.
  4. The final value estimates how hard it is to start ranking on the 1st SERP, so it takes websites with low LPS into consideration more than ever.
  5. It’s absolutely fine for a low-authority website to outrank high-authority websites, and that’s exactly what Keyword Difficulty focuses on.

What is a good value of Keyword Difficulty in SEO?


keyword difficulty
Keyword SEO Difficulty

The Keyword Difficulty is indicated on a scale from 0 to 100. The lower the value, the easier it is to rank for the keyword on the 1st SERP.
Keep in mind that the “real” SEO difficulty may vary. It depends on your on-page and off-page SEO skills.

Monday, October 01, 2018

Google's Medic Update 2018 - The Core Search Update

The big Google algorithm update, nicknamed the Medic Update, here is everything we know about it, including official information from Google and non-official insights from across the industry.


Google's Medic update and how to deal with it


The Google search algorithm update from August 1 is now fully rolled out, and here is what we know about the update, who we think was impacted and some of the analysis of what, if any, actions you may want to consider taking if you were negatively impacted.

In summary, Google is calling this a broad, global, core update, but based on much of the analysis done thus far, there seems to be a focus on health and medical sites and YMYL (Your Money Your Life) sites. But many sites besides those were impacted by the update. Google is telling us that there is nothing you can do to fix your site, so you should just focus on making a great experience, offering better content and a more useful website. This update has taken on the name the Medic Update because of its focus on the medical and health space. This specific focus is something Google will not confirm.

Why is it called the Medic update?


It’s called the Medic update because Barry Schwartz, one of the most prolific writers in the search industry, called it that. It doesn’t mean this update only affected medical sites.

Google has said that this update was a "broad core algorithm update" and that it does these updates "several times per year."

Google references its advice from the previous core updates, saying there’s "no ‘fix’ for pages that may perform less well, other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages." Google also said, "As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded."

Who was impacted by this update


As we explained above, Google said this is a “global” update, which implies every niche and every type of site could have been impacted. But based on the data that I’ve been seeing from surveys, multiple data companies and SEO consultants, there seems to be a focus on medical and health niches, as well as “Your Money Your Life” types of sites, with effects creeping into the entertainment and gaming niches as well. I’ve shown Google this data and a Google spokesperson responded by referencing the statements made above.

Source: Google Blog

Tuesday, September 18, 2018

What is SEO Linking?

link-building

Link building, simply put, is the process of getting other websites to link back to your website. All marketers and business owners should be interested in building links to drive referral traffic and increase their site's authority.

Basics of Quality Link Building for SEO


Why build links? Google's algorithms are complex and always evolving, but backlinks remain an important factor in how every search engine determines which sites rank for which keywords. Building links is one of the many tactics used in search engine optimization (SEO) because links are a signal to Google that your site is a quality resource worthy of citation. Therefore, sites with more backlinks tend to earn higher rankings.

There's a right way and a wrong way, however, to build links to your site. If you care about the long-term viability of your site and business, you should only engage in natural link building, meaning the process of earning links rather than buying them or otherwise acquiring them through manipulative tactics (sometimes known as black-hat SEO, a practice that can get your site essentially banned from the search results).

That said, natural, organic link building is a difficult, time-consuming process. Not all links are created equal: A link from an authoritative website like the Wall Street Journal will have a greater impact on your rankings on the SERP than a link from a small or newly built website, but high-quality links are harder to come by.

This guide will teach you how to build quality links that improve your organic rankings without violating Google guidelines.

Remember, link building is imperative in achieving high organic search rankings.

Why Link Building Is Important for SEO


Link building is important because it is a major factor in how Google ranks web pages. Google notes that:

"In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages."

Imagine that we own a site promoting wind turbine equipment that we sell. We're competing with another wind turbine equipment manufacturer. One of the ranking factors Google will look at in determining how to rank our respective pages is link popularity.

While the above example provides a general visual understanding of why link building is important, it's very basic. It omits key factors such as:
  • The trust and authority of the linking pages.
  • The SEO and content optimization of the respective sites.
  • The anchor text of the incoming links.

For a more in-depth explanation of how PageRank is calculated, read through these resources:
  • The original Google PageRank paper
  • An in-depth discussion of the formula behind PageRank
  • The Wikipedia page on the subject

The most important concept to understand is that, as Google says, you're more likely to have your content rank higher for keywords you're targeting if you can get external websites to link to your pages.

Friday, July 06, 2018

Goodbye, AdWords. Hello Google Ads

Google has made the executive decision to consolidate its overwhelming abundance of ad products under three brand new umbrellas: Google Ads, Google Marketing Platform, and Google Ad Manager. Behold, logos!

google ads

“AdWords” is no more.


In a surprise rebranding of its 18-year-old PPC advertising platform, Google has renamed AdWords to the broader Google Ads, simplified its advertising products and introduced several new features.

google ads

Huge news? You bet. Cause for concern? Not at all!


That’s because we’ve broken down all the changes coming to Google advertising to give you the essential information you need.

So, what is changing?


According to Google, Google Ads users will start seeing the new brand name and logo reflected on the platform, website, billing and help center.

Instead of logging into AdWords from adwords.google.com, you will log into Google Ads from ads.google.com.

Changes to the Google Ads branding will not impact your campaign performance, navigation, or reporting.
The new Google Ads experience is going to be heavily focused on the ease of multi-channel advertising, connecting everything from search to display to video for a more seamless experience.

These new brands will help advertisers and publishers of all sizes choose the right solutions for their businesses, making it even easier for them to deliver valuable, trustworthy ads and the right experiences for consumers across devices and channels.
So, basically, Google Ads is a platform where businesses will find all they loved about AdWords and even more tools to make the process of reaching customers easier.

Monday, May 28, 2018

Why is SEO good for a website?


What is a SEO Friendly Website and Why do you Need One

Many times you hear the term “SEO friendly” or “SEO friendly website”, but what does this really mean and why do you need one? How can an SEO-friendly website help your business grow?

These are the questions I will try to answer in this post, keeping in mind that beginners to SEO may be reading it, so I will try to avoid technical terms and advanced SEO practices and theories.

What do we mean by a Search Engine Friendly Website?

An SEO-friendly website has the configuration and features that make it easy for search engines to crawl (read) and understand what the particular website is all about. The most important characteristics of an SEO-friendly website are:
  1. Unique titles and Descriptions for all pages:
    Each page of the website (including the home page) has a unique title and description. Titles are between 60-65 characters and descriptions are approximately 150 characters. Titles and descriptions describe accurately what the page is about without being keyword-stuffed. Example of a good title and description.
  2. Well-formatted URLs:
    Permanent links (that’s the url of a webpage) are descriptive, all lower case and separated by dashes. Example of a well formatted URL.
  3. Fast loading web pages:
    Neither people nor search engines want websites that are slow to load. On the contrary, fast-loading websites are SEO-friendly (meaning they have an advantage in ranking algorithms over websites that are slower) and generate more user interactions (sales, newsletter signups, contact form submissions, etc.).
  4. It has unique content:
    Content on the website is not found anywhere else on the web, all pages have unique and useful content. This means a website cannot be SEO friendly if it has content copied from other web sites.
  5. Includes images that are optimized for search engines:
    It’s true that search engines prefer text, but you also need images on your pages because people like them; they make your content more interesting, easier to read, and more shareable. When you add them, make sure the images are optimized for size (tools like “smushit” can help you reduce image file size without losing quality) and that you set a meaningful image filename and ALT text.
  6. Pages have a meaningful structure:
    A web page usually has the following elements:
    • Header
    • Breadcrumbs Menu
    • Page Title (that’s the H1 tag – there is only one per page)
    • Well formatted text – text is separated into a number of short paragraphs with subheadings
    • Author information
    • Footer
There are of course many other characteristics that make a website SEO Friendly, you can read them in our previous post, the ultimate SEO checklist but the above 6 elements are currently among the most important.
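Characteristics 1 and 2 above can be sketched in markup as follows (the title, description, and URL are hypothetical examples):

```html
<head>
  <!-- Unique title, roughly 60-65 characters -->
  <title>SEO Friendly Websites: What They Are and Why You Need One</title>
  <!-- Unique description, roughly 150 characters -->
  <meta name="description"
        content="Learn what makes a website SEO friendly and why it matters for your rankings, traffic, and business growth.">
</head>
<!-- Served at a descriptive, lowercase, dash-separated URL such as:
     https://www.example.com/seo-friendly-website -->
```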

Why do you need an SEO-friendly website?

Something that most CEOs, small business owners and new webmasters don’t easily understand is why you need an SEO-friendly website and why you should make the effort to make your website friendlier to search engines.

Well, there are many reasons, but those you need to know are:

  1. It will get you more organic traffic (that is, traffic from search engines)
    As expected, an SEO-friendly website will get you more traffic from search engines, as it is likely to rank higher in the SERPS (search engine results pages). If you take into account that the majority of people who use the search box tend to select one of the first 5 results, you can understand the importance of SEO.
  2. It will make your website user-friendly
    SEO is not only for search engines; good SEO is for users as well. Applying the principles of SEO to your website will make it easier to use, and this will enhance the user experience.
  3. It gives you brand credibility
    Users are more likely to trust websites (businesses) that are found in the first pages of Google, Bing or Yahoo. This is good for both brand awareness and brand credibility.
  4. It is cost-effective
    An SEO-friendly website will drive targeted traffic 24/7 without the need to spend money on PPC or other forms of online advertising. While there is a cost to reach that point, the long-term benefits are bigger.
  5. It helps you understand what your most important customers want
    SEO drives quality traffic, and analyzing the behavior of those users (how they enter your website, what they click, how they leave, what they like, etc.) is a great way to understand what your customers want and adjust your website or products to match their needs.
  6. SEO is even more important on mobile
    A website that is mobile-friendly and has good rankings on mobile search can get more customers and traffic than websites that are not mobile SEO friendly. More and more users are using their mobiles to search for information or products while on the go, so it is important to be at the top of the search results; otherwise you are losing customers to the competition, especially those searching for local products or services.

Conclusion

An SEO-friendly website has certain features and characteristics that help search engines understand what the website is all about, and this increases the chances of achieving better rankings in the SERPS.
The most important advantage of having an SEO-friendly website is that you will get more targeted organic traffic from search engines.
Source: Quora

Wednesday, April 18, 2018

Rolling Out Mobile-First Indexing


Mobile-First Index Roll-out — March 26, 2018


Google announced that the mobile-first index was finally "rolling out." Since the index has been in testing for many months, and Google has suggested they are migrating sites gradually, it's unclear how much impact this specific roll-out had on the overall index. Webmasters should begin to see notifications within Google Search Console.

Source: Google Webmaster Central Blog

Friday, January 19, 2018

What Is Google Fred?

Google Fred is an algorithm update that targets black-hat tactics tied to aggressive monetization. This includes an overload of ads, low-value content, and little added user benefit. This does not mean all sites hit by the Google Fred update are dummy sites created for ad revenue, but (as Barry Schwartz noted in his observations of Google Fred) the majority of websites affected were content sites that carry a large amount of ads and seem to have been created for the purpose of generating revenue rather than solving a user’s problem.


Which Websites were Affected by FRED?

The majority of the websites affected had one (or more) of the following:
  • An extremely large presence of ads
  • Content (usually in blog form) on all sorts of topics created for ranking purposes
  • Content has ads or affiliate links spread throughout, and the quality of content is far below industry-specific sites
  • Deceptive ads (looks like a download or play button to trick someone into clicking)
  • Thin content
  • UX barriers
  • Mobile problems
  • Aggressive affiliate setups
  • Aggressive monetization

How to Tell If Your Site Was Affected by the Google Fred Algorithm Update

If you saw a large drop in rankings and organic traffic around the middle of March and are guilty of one of the above, your site was probably impacted.

Google Fred Recovery

The Google Fred algorithm is focused on limiting black-hat SEO tactics tied to aggressive monetization, so the biggest fix is to scale back your ads and increase the quality of your content.

For a full Google Fred recovery, we recommend:
  • Scale back the amount of ads on your site
  • Review the Google Search Quality Rater Guidelines (QRG) and follow them as closely as you possibly can
  • Review the placement of ads on your site: do they contribute to a poor user experience?
  • Review the user experience of your site, and make a schedule to do this periodically; keep upping the ante of your content
  • Review your content to be sure it serves a purpose, and that this purpose is reflected in its metadata and tags

The number one thing you can do is to manually browse through your site. Is it user-friendly? Are you greeted by ads everywhere you go? Is your content scraped or extremely thin? Think about your users. If it’s not something you would enjoy seeing on other websites, you need to take it off of yours.
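To supplement a manual browse, a short script can pre-screen pages for the two most common Fred symptoms: thin content and heavy ad placement. The sketch below uses only the Python standard library; the thresholds and the "class/id contains 'ad'" heuristic are assumptions for illustration, not rules Google has published, so tune them to your own site's markup.

```python
from html.parser import HTMLParser

# Hypothetical thresholds -- adjust for your own site and niche.
MIN_WORDS = 300     # pages below this word count are flagged as possibly thin
MAX_AD_RATIO = 0.3  # flag pages where ad blocks exceed 30% of paragraph count


class PageAudit(HTMLParser):
    """Counts visible words and elements that look like ad containers."""

    def __init__(self):
        super().__init__()
        self.words = 0
        self.ad_blocks = 0
        self.paragraphs = 0
        self._skip = 0  # depth inside <script>/<style>, whose text is not content

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        if tag == "p":
            self.paragraphs += 1
        # Crude heuristic: a class or id containing "ad" often marks an ad slot.
        # It will also match words like "header", so treat results as hints only.
        for name, value in attrs:
            if name in ("class", "id") and value and "ad" in value.lower():
                self.ad_blocks += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())


def audit_page(html: str) -> dict:
    """Return word count, ad-block count, and thin/ad-heavy flags for one page."""
    parser = PageAudit()
    parser.feed(html)
    thin = parser.words < MIN_WORDS
    ad_heavy = (parser.paragraphs > 0
                and parser.ad_blocks / parser.paragraphs > MAX_AD_RATIO)
    return {"words": parser.words, "ad_blocks": parser.ad_blocks,
            "thin": thin, "ad_heavy": ad_heavy}


sample = "<div class='ad-banner'></div><p>Short text.</p>"
print(audit_page(sample))
# → {'words': 2, 'ad_blocks': 1, 'thin': True, 'ad_heavy': True}
```

Run `audit_page` over saved HTML for each URL in your sitemap and review the flagged pages by hand; the script narrows the list, but the final judgment about user value stays with you.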

What are the Best Google Fred Update SEO Tactics?

If you’re looking for Fred update SEO tactics, we recommend you study the Google Search Quality Rater Guidelines and make sure every piece of content on your site complies with their best practices. These are the factors Google considers extremely important when it comes to quality:
  • Clear indication of who the website belongs to
  • Clear indication of what the page is about
  • A well-maintained and updated page, which means it loads quickly and is free of technical errors
  • Excellent website reputation (quality of backlinks, industry awards, positive user reviews, and expert testimonials all contribute to excellent reputation)
  • Content that demands at least one of the following: time, effort, expertise, and talent/skill
Source: Bluecorona

Thursday, January 18, 2018

Major Google SEO Updates & Algorithm Changes from 2009 to 2017

Google has a long history of famous algorithm updates, search index changes and refreshes.

2017 Updates
  • Snippet Length Increase — November 30, 2017
  • Featured Snippet Drop — October 27, 2017
  • Chrome HTTPS Warnings — October 17, 2017
  • Google Tops 50% HTTPS — April 16, 2017
  • "Fred" (Unconfirmed) — March 8, 2017
  • Intrusive Interstitial Penalty — January 10, 2017

2016 Updates
  • Penguin 4.0, Phase 2 — October 6, 2016
  • Penguin 4.0, Phase 1 — September 27, 2016
  • Penguin 4.0 Announcement — September 23, 2016
  • Image/Universal Drop — September 13, 2016
  • "Possum" — September 1, 2016
  • Mobile-friendly 2 — May 12, 2016
  • AdWords Shake-up — February 23, 2016

2015 Updates
  • RankBrain* — October 26, 2015
  • Panda 4.2 (#28) — July 17, 2015
  • The Quality Update — May 3, 2015
  • Mobile Update AKA "Mobilegeddon" — April 22, 2015

2014 Updates
  • Pigeon Expands (UK, CA, AU) — December 22, 2014
  • Penguin Everflux — December 10, 2014
  • Pirate 2.0 — October 21, 2014
  • Penguin 3.0 — October 17, 2014
  • "In The News" Box — October 1, 2014
  • Panda 4.1 (#27) — September 23, 2014
  • Authorship Removed — August 28, 2014
  • HTTPS/SSL Update — August 6, 2014
  • Pigeon — July 24, 2014
  • Authorship Photo Drop — June 28, 2014
  • Payday Loan 3.0 — June 12, 2014
  • Panda 4.0 (#26) — May 19, 2014
  • Payday Loan 2.0 — May 16, 2014
  • Page Layout #3 — February 6, 2014

2013 Updates
  • Authorship Shake-up — December 19, 2013
  • Penguin 2.1 (#5) — October 4, 2013
  • Hummingbird — August 20, 2013
  • In-depth Articles — August 6, 2013
  • Knowledge Graph Expansion — July 19, 2013
  • Panda Recovery — July 18, 2013
  • "Payday Loan" Update — June 11, 2013
  • Panda Dance — June 11, 2013
  • Penguin 2.0 (#4) — May 22, 2013
  • Domain Crowding — May 21, 2013
  • "Phantom" — May 9, 2013
  • Panda #25 — March 14, 2013
  • Panda #24 — January 22, 2013

2012 Updates
  • Panda #23 — December 21, 2012
  • Knowledge Graph Expansion — December 4, 2012
  • Panda #22 — November 21, 2012
  • Panda #21 — November 5, 2012
  • Page Layout #2 — October 9, 2012
  • Penguin #3 — October 5, 2012
  • Panda #20 — September 27, 2012
  • Exact-Match Domain (EMD) Update — September 27, 2012
  • Panda 3.9.2 (#19) — September 18, 2012
  • Panda 3.9.1 (#18) — August 20, 2012
  • 7-Result SERPs — August 14, 2012
  • DMCA Penalty ("Pirate") — August 10, 2012
  • Panda 3.9 (#17) — July 24, 2012
  • Link Warnings — July 19, 2012
  • Panda 3.8 (#16) — June 25, 2012
  • Panda 3.7 (#15) — June 8, 2012
  • Penguin 1.1 (#2) — May 25, 2012
  • Knowledge Graph — May 16, 2012
  • Panda 3.6 (#14) — April 27, 2012
  • Penguin — April 24, 2012
  • Panda 3.5 (#13) — April 19, 2012
  • Panda 3.4 (#12) — March 23, 2012
  • Search Quality Video — March 12, 2012
  • Panda 3.3 (#11) — February 27, 2012
  • Venice — February 27, 2012
  • Ads Above The Fold — January 19, 2012
  • Panda 3.2 (#10) — January 18, 2012

2011 Updates
  • Panda 3.1 (#9) — November 18, 2011
  • Query Encryption — October 18, 2011
  • Panda "Flux" (#8) — October 5, 2011
  • Panda 2.5 (#7) — September 28, 2011
  • Pagination Elements — September 15, 2011
  • Expanded Sitelinks — August 16, 2011
  • Panda 2.4 (#6) — August 12, 2011
  • Panda 2.3 (#5) — July 23, 2011
  • Google+ — June 28, 2011
  • Panda 2.2 (#4) — June 21, 2011
  • Schema.org — June 2, 2011
  • Panda 2.1 (#3) — May 9, 2011
  • Panda 2.0 (#2) — April 11, 2011
  • The +1 Button — March 30, 2011
  • Panda/Farmer — February 23, 2011
  • Attribution Update — January 28, 2011

2010 Updates
  • Negative Reviews — December 1, 2010
  • Social Signals — December 1, 2010
  • Instant Previews — November 1, 2010
  • Google Instant — September 1, 2010
  • Brand Update — August 1, 2010
  • Caffeine (Rollout) — June 1, 2010
  • Google Places — April 1, 2010

2009 Updates
  • Real-time Search — December 1, 2009
  • Caffeine (Preview) — August 1, 2009
  • Vince — February 1, 2009
  • Rel-canonical Tag — February 1, 2009