
Friday, October 06, 2023

What are Zombie pages in SEO: chase them away from your site!


Zombie pages are pages on a website that generate little or no traffic and are rarely, if ever, reached through search engine results.

In this article, we will give you our advice on how to detect these pages and how to treat them so that they do not affect the visibility of your entire site.

Summary:

  1. Why do we have to deal with zombie pages?
  2. The different types of zombie pages
  3. How to locate these pages?
  4. How to deal with zombie pages?

Why do we have to deal with Zombie Pages?

The detection and treatment of zombie pages allows you to:

  • Improve the user experience of visitors. Removing or fixing a site's zombie pages provides a better user experience and improves its bounce and conversion rates.
  • Improve the Quality Score awarded by Google. The search engine judges a site as a whole, so eliminating the negative effects of zombie pages raises its overall score and therefore improves its positioning.
  • Optimize the crawl budget. Removing zombie pages or blocking them from indexing lets the crawl time allocated to a site be spent on its most significant pages.

The different types of Zombie Pages.

1 – Unindexed Pages.

These pages usually have technical problems, such as loading times that are too long or scripts that fail to execute. Google divides the time its crawlers spend on a site according to the number of pages it contains, so it will choose not to index pages that slow down its task and which would, in any case, have a high chance of being abandoned by visitors.

These pages are absent from Google's index; they are not visited, or at least receive no direct traffic from the search engine.

2 – “Non Responsive” pages.

Pages that are not optimized or that take too long to navigate on mobile phones are also at a disadvantage. Google will punish them because it considers them to offer a degraded user experience.

These pages are present in Google’s results but their ranking is penalized.

3 – Pages with obsolete or low-quality content.

There are two types:

  • Published pages that have not been updated for several years. Google may downgrade the rating of such pages by considering that they are no longer current.
  • Pages with thin content (fewer than 300 words) or without real interest are also penalized.

These pages are gradually downgraded in the results.

4 – Pages not (or insufficiently) optimized for SEO.

These pages can be quite useful and interesting for visitors, but they do not follow SEO best practices (missing alt, h1, h2, or h3 tags, a poor or overly long title, no keywords…).

These pages are downgraded in search engine results.

5 – Ancillary pages.

These are often pages accessible from the site's footer: contact, legal notices, GTC, GDPR… Even if they are of little interest to visitors, they contain legal information, and their presence on a site is an SEO requirement.

The absence of these pages negatively affects a site's SEO.

6 – Orphan pages.

These pages are simply not found by crawlers: no internal link connects them to other pages of the site, and they are not accessible through the site's menu. They float in a kind of parallel universe with almost no chance of being visited.

Several techniques exist to identify a site's orphan pages. One of the simplest (well, unless your site has thousands of pages) is to compare your XML sitemap with the Google index of your site (which you can obtain with a search such as "site:mywebsite.com" in Google). Comparing the two lists reveals the pages that are present in your sitemap but absent from the Google index. You then just have to connect those orphan pages to the rest of your site through internal linking.
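If you are comfortable with a few lines of code, the comparison can be automated. Below is a minimal sketch in Python (standard library only) that lists the URLs present in your sitemap but absent from a list of indexed URLs gathered by hand from a "site:mywebsite.com" search or a Search Console export; the domain and file names are placeholders.

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://mywebsite.com/sitemap.xml"  # placeholder sitemap location
INDEXED_FILE = "indexed_urls.txt"                  # one indexed URL per line, gathered manually

def sitemap_urls(url):
    # Collect every <loc> entry from the XML sitemap.
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    loc_tag = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"
    return {loc.text.strip() for loc in tree.getroot().iter(loc_tag)}

def indexed_urls(path):
    # One URL per line, as copied from the "site:" search or Search Console.
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

missing = sitemap_urls(SITEMAP_URL) - indexed_urls(INDEXED_FILE)
print(f"{len(missing)} pages are in the sitemap but not indexed:")
for url in sorted(missing):
    print("  ", url)

Any URL this prints is a candidate orphan (or at least unindexed) page to reconnect through internal links.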

How to locate the Zombie Pages?

If you want to make the diagnosis yourself without going through an agency, we advise you to use Google Search Console. There you will find tools that allow you to detect pages with low or declining performance.

The “Performance” tab (+ New > Page), very easy to use (especially if your site only has a few pages), lets you compare how the traffic to each of your pages evolves over time and thus spot the pages experiencing a sharp drop in traffic.
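For larger sites, the same comparison can be scripted against the Search Analytics API instead of clicked through in the interface. The sketch below is hedged: it assumes you have installed google-api-python-client and google-auth, created a service account with read access to the verified property, and saved its key as service-account.json (all placeholders). It compares clicks per page across two date ranges and flags sharp drops.

from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://mywebsite.com/"  # placeholder: your verified property
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # placeholder key file
service = build("searchconsole", "v1", credentials=credentials)

def clicks_by_page(start, end):
    # Total clicks per page URL for the given date range.
    body = {"startDate": start, "endDate": end, "dimensions": ["page"], "rowLimit": 5000}
    rows = service.searchanalytics().query(siteUrl=SITE, body=body).execute().get("rows", [])
    return {row["keys"][0]: row["clicks"] for row in rows}

previous = clicks_by_page("2023-04-01", "2023-06-30")
current = clicks_by_page("2023-07-01", "2023-09-30")

# Flag pages that lost more than half of their clicks between the two periods.
for page, before in previous.items():
    after = current.get(page, 0)
    if before >= 10 and after < before / 2:
        print(f"{page}: {before} -> {after} clicks")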

The “Excluded” tab (Coverage > Excluded) will allow you to analyze two types of zombie pages:

  • The “Crawled - currently not indexed” pages

These are pages that Google decided not to index during its last crawl because it considered their content too thin, duplicated, or already present on many other sites. It is therefore advisable to first complete and/or rewrite the content of these pages and wait for Google's robots to come and crawl them again.

  • The “Discovered - currently not indexed” pages.

These are pages that Google has discovered but not yet crawled, generally because of technical problems (e.g. when the server response time is too long).

How to deal with the Zombie Pages?

Some pages just need to be updated or optimized, while others really need to be removed and redirected.

Improve these pages.

Since zombie pages often suffer from slow loading times, weak internal linking, or unsuitable content, you need to rehabilitate them in the eyes of Google as well as in the eyes of your visitors. A quick check of each page's HTTP status and load time, sketched after this checklist, can help you decide where to start.

  • Update and enrich the content of these pages;
  • Check that they contain the right keywords and that the semantic richness of the text is adapted to the subject matter;
  • Improve UX and Loading Time;
  • Add outbound links from these pages to other related pages on the site;
  • Add internal inbound links from other pages on your site;
  • Share them on your social networks;
  • Do not change their URL.
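To prioritize this work, a first pass over the suspect URLs can be scripted. The sketch below (standard library only, placeholder URLs) simply reports each page's HTTP status code and load time, so the slowest and broken pages surface first.

import time
import urllib.error
import urllib.request

PAGES = [
    "https://mywebsite.com/old-post/",
    "https://mywebsite.com/another-page/",
]  # placeholder URLs of suspected zombie pages

for url in PAGES:
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.status
            resp.read()  # read the body so the timing covers the full download
    except urllib.error.HTTPError as err:
        status = err.code
    except urllib.error.URLError as err:
        print(f"{url}: unreachable ({err.reason})")
        continue
    elapsed = time.perf_counter() - start
    print(f"{url}: HTTP {status}, {elapsed:.2f}s")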

Delete them!

Don’t launch into this delicate operation without checking the pages you are going to delete on a case-by-case basis.

If there are zombie pages on your site that have outdated content and do not generate any conversions, then it is possible to delete them.

On the other hand, pages that interest only a few visitors but which have a very “profitable” conversion rate should be kept.

Of course, some zombie pages are essential, such as the legal notices, the General Conditions of Sale, and the GDPR policy… They generate little or no traffic but must be kept.

Once your zombie pages have been deleted, don’t forget to redirect (301) the URLs of these pages to the pillar pages of the appropriate category or to other pages that deal with a similar theme.
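Once the redirects are in place, it is worth confirming that each deleted URL really answers with a 301 and points where you intended. Here is a small verification sketch (standard library only, placeholder URLs):

import urllib.error
import urllib.request

REDIRECTS = {
    "https://mywebsite.com/deleted-page/": "https://mywebsite.com/pillar-page/",
}  # placeholder mapping: old URL -> expected target

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Stop urllib from following redirects so the first response can be inspected.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

for old, expected in REDIRECTS.items():
    try:
        resp = opener.open(old, timeout=10)
        print(f"{old}: HTTP {resp.status} (expected a 301 redirect)")
    except urllib.error.HTTPError as err:
        target = err.headers.get("Location", "")
        verdict = "OK" if err.code == 301 and target == expected else "CHECK"
        print(f"{old}: HTTP {err.code} -> {target} [{verdict}]")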

Friday, July 21, 2023

What is a robots.txt file, and how do you replace the robots.txt file created by Yoast SEO with a new one?


User-agent: *
Disallow: /

User-agent: Googlebot-Image
Allow: /images/

This robots.txt file tells all search engine crawlers not to crawl any pages on the website. However, because Googlebot-Image is given its own group of rules, the wildcard rules no longer apply to it, and the Allow directive explicitly permits it to crawl the /images/ directory. (A quick way to test how crawlers interpret these rules is sketched after the breakdown below.)

Here is a breakdown of the directives in this robots.txt file:

  • User-agent: This directive specifies which search engine crawler the directive applies to. In this case, the directive applies to all search engine crawlers, as the asterisk (*) is used as a wildcard.
  • Disallow: This directive tells the search engine crawler not to crawl the specified path. In this case, the directive tells the crawler not to crawl any pages on the website.
  • Allow: This directive tells the search engine crawler to crawl the specified path. In this case, the directive tells the crawler to crawl the images directory.
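If you want to confirm how these rules are interpreted before deploying them, Python's built-in urllib.robotparser gives a quick answer. A minimal sketch (the example URLs are placeholders):

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /

User-agent: Googlebot-Image
Allow: /images/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The wildcard group blocks ordinary crawlers from everything.
print(rp.can_fetch("*", "https://example.com/blog/post/"))                      # False
# Googlebot-Image has its own group, which explicitly allows /images/.
print(rp.can_fetch("Googlebot-Image", "https://example.com/images/photo.jpg"))  # True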

Here are some other directives that you can use in your robots.txt file:

  • Host: A non-standard directive (historically used by Yandex) that specifies the preferred host or domain for a site; Google ignores it.
  • Crawl-delay: A non-standard directive, honored by some crawlers but ignored by Google, that specifies how long the crawler should wait between requests to your website.
  • Sitemap: This directive specifies the location of your website's sitemap.

For more information on robots.txt files, you can refer to the following resources:

  • Robots Exclusion Standard: https://en.wikipedia.org/wiki/Robots_exclusion_standard
  • Google Search Central robots.txt documentation: https://developers.google.com/search/docs/crawling-indexing/robots/intro
  • Moz Robots.txt documentation: https://moz.com/learn/seo/robotstxt

Here are the steps to replace the robots.txt file created by Yoast SEO with a new one:

  1. Go to your WordPress dashboard and click on Yoast SEO > Tools > File editor.
  2. In the File editor section, click on the Robots.txt tab.
  3. Click on the Delete button to delete the existing robots.txt file.
  4. Click on the Create new file button to create a new robots.txt file.
  5. In the new robots.txt file, enter the directives that you want to use to control how search engines crawl your website.
  6. Save the new robots.txt file.

Once you have saved the new robots.txt file, Yoast's file editor writes it to your site's root for you (provided the file is writable). Alternatively, you can upload a robots.txt file directly to your website's root directory using a file transfer protocol (FTP) client or your web hosting provider's file manager.
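A quick way to confirm that the live file is the one you expect is simply to fetch it and print it (the domain is a placeholder):

import urllib.request

with urllib.request.urlopen("https://mywebsite.com/robots.txt") as resp:
    print(resp.read().decode("utf-8"))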

Here are some additional things to keep in mind when creating a robots.txt file:

  • The robots.txt file must be named robots.txt and it must be saved in the root directory of your website.
  • The path values in the robots.txt file are case-sensitive (the directive names themselves are not).
  • You can use the Allow and Disallow directives to control how search engines crawl your website.
  • You can use the User-agent directive to specify which search engines the directives apply to.


Monday, April 24, 2023

What is a PBN?

 


PBN stands for Private Blog Network. It is a network of multiple websites that are owned and operated by a single person or organization, with the goal of manipulating search engine rankings to increase the visibility and authority of a specific website or group of websites.

PBNs are typically created by purchasing expired or deleted domains with existing backlinks and redirecting them to the primary website to pass on link juice and improve its search engine ranking. However, PBNs are considered a black hat SEO technique and can result in penalties from search engines if discovered. As such, it is not recommended to use PBNs as a part of your SEO strategy.

Is PBN good for SEO?

No, PBNs are not good for SEO in the long term. While they may provide short-term gains in search engine rankings, using a Private Blog Network (PBN) is considered a black hat SEO technique that violates Google's guidelines. If Google detects the use of PBNs, it can result in severe penalties, including a drop in rankings or even complete removal from the search index.

Instead of relying on PBNs, it is recommended to focus on creating high-quality content that provides value to your audience and attracts natural backlinks. Building relationships with other websites in your industry and earning backlinks through guest posting, content partnerships, and other legitimate means can also help improve your website's search engine rankings over time.

Monday, March 13, 2023

What is trust flow and citation flow in SEO?


In SEO, TF and CF refer to two different metrics used to evaluate the quality and authority of a website or webpage.

TF stands for "Trust Flow," which is a metric developed by Majestic SEO. It measures the quality of the links pointing to a website or webpage, taking into account the authority of the linking domains. The more high-quality and trustworthy links a website or webpage has pointing to it, the higher its Trust Flow score will be.

CF stands for "Citation Flow," which is another metric developed by Majestic SEO. It measures the quantity of the links pointing to a website or webpage, regardless of their quality or authority. The more links a website or webpage has pointing to it, the higher its Citation Flow score will be.

In general, a high TF and CF score indicates that a website or webpage is likely to be authoritative and trustworthy. However, it's important to note that these metrics are just one aspect of SEO, and should not be relied upon solely to evaluate the quality of a website or webpage. Other factors, such as content quality, user experience, and technical SEO, also play a critical role in SEO success.

Tuesday, March 07, 2023

The Difference Between External and Internal Links


External links are hyperlinks that point to pages on other websites. When you click on an external link, you leave the current website and are directed to a different website. For example, if you are reading an article on one website and you click on a link to a related article on a different website, that link is an external link.

Internal links, on the other hand, are hyperlinks that point to pages within the same website. When you click on an internal link, you stay within the same website but are directed to a different page on that website. For example, if you are reading an article on a website and you click on a link to a related article on the same website, that link is an internal link.

Internal links are important for website navigation and can help users find relevant content on the same website. External links can provide additional context or resources for users and can also help improve the website's search engine rankings by indicating to search engines that the website is connected to other relevant and trustworthy websites.
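The distinction is easy to illustrate in code: a link is internal when it resolves to the same host as the page it sits on, and external otherwise. A small sketch with placeholder URLs:

from urllib.parse import urljoin, urlparse

page_url = "https://mywebsite.com/blog/article/"   # placeholder page
hrefs = [
    "/blog/related-article/",                      # relative link on the same site
    "https://mywebsite.com/contact/",              # absolute link to the same host
    "https://en.wikipedia.org/wiki/Search_engine_optimization",  # different host
]

site_host = urlparse(page_url).netloc
for href in hrefs:
    absolute = urljoin(page_url, href)  # resolve relative links against the page URL
    kind = "internal" if urlparse(absolute).netloc == site_host else "external"
    print(f"{kind}: {absolute}")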

Friday, March 03, 2023

What are backlinks and backlink types?


Backlinks are links from one website to another website. They are also known as inbound links, incoming links, or simply links. Backlinks are important for search engine optimization (SEO) because they signal to search engines that other websites consider your content to be valuable and worth linking to.

There are several types of backlinks, including:

Natural backlinks: These are links that other websites give you voluntarily because they think your content is valuable or useful.

Manual backlinks: These are links that you create yourself, such as by commenting on blog posts or forum threads, submitting your website to directories, or exchanging links with other websites.

Editorial backlinks: These are links that you earn because another website has found your content to be valuable and has linked to it without any request or incentive from you.

Contextual backlinks: These are links that are embedded within the content of a webpage, rather than in a separate section like a sidebar or footer.

Do-follow backlinks: These are links that pass on link juice, which is the value or authority that search engines associate with a website. These links are considered valuable for SEO.

No-follow backlinks: These are links that do not pass on link juice, and are typically used for links that are paid or are considered to be of lower quality.
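Whether a given link is do-follow or no-follow is decided by its rel attribute, so it can be checked directly in the HTML. Below is an illustrative sketch using only the standard library; the HTML snippet is made up.

from html.parser import HTMLParser

class LinkRelParser(HTMLParser):
    # Collect (href, rel) pairs for every anchor tag in the document.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            self.links.append((attrs.get("href", ""), attrs.get("rel") or ""))

html = """
<p>Read our <a href="https://partner-site.com/">partner article</a> and this
<a href="https://ad-site.com/" rel="nofollow sponsored">sponsored link</a>.</p>
"""

parser = LinkRelParser()
parser.feed(html)
for href, rel in parser.links:
    kind = "no-follow" if "nofollow" in rel.split() else "do-follow"
    print(f"{kind}: {href}")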

It's important to note that while backlinks can be valuable for SEO, not all backlinks are created equal. High-quality backlinks from authoritative websites are typically more valuable than low-quality backlinks from spammy or irrelevant websites.

Additionally, it's important to focus on creating valuable content that naturally attracts backlinks, rather than relying solely on manual or paid backlink strategies.

Monday, February 20, 2023

What are HTTP response codes and why are they important in SEO?


HTTP response codes are standardized status codes that web servers return to indicate the status of a client's request. There are many different response codes, but some common ones include:
  • 200 OK: The request was successful, and the server is returning the requested data.
  • 404 Not Found: The requested resource could not be found on the server.
  • 301 Moved Permanently: The requested resource has been permanently moved to a new URL.

HTTP response codes are important in SEO because they can affect how search engines crawl and index your website. For example, if a page returns a 404 error, search engines may remove it from their index, which can hurt your SEO.

On the other hand, if you use 301 redirects to redirect old URLs to new ones, you can preserve your SEO by ensuring that search engines and users can still find your content.
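A routine check of your own URLs can be scripted in a few lines. The sketch below (standard library only, placeholder URLs) reports the status code of each URL and where any redirect chain ends up:

import urllib.error
import urllib.request

URLS = [
    "https://mywebsite.com/",
    "https://mywebsite.com/old-page/",
]  # placeholder URLs to check

for url in URLS:
    try:
        # urlopen follows redirects, so a 301 chain ends at the final page.
        with urllib.request.urlopen(url, timeout=10) as resp:
            final = resp.geturl()
            note = " (redirected)" if final != url else ""
            print(f"{url}: HTTP {resp.status}{note} -> {final}")
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses, e.g. 404 Not Found, end up here.
        print(f"{url}: HTTP {err.code}")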

What is the ideal length for a meta tag for SEO?

There isn't necessarily an ideal length for a meta tag, as the appropriate length can vary depending on the specific type of meta tag and its purpose.


For example, the ideal length for a meta title tag is generally considered to be between 50-60 characters, as this is the typical maximum number of characters that search engines will display in search results. However, the length of a meta description tag can vary and can be up to around 155-160 characters, as this is the typical maximum number of characters that search engines will display in the description snippet in search results.

It's important to note that while there are general guidelines for meta tag lengths, it's more important to focus on creating accurate and relevant meta tags that provide clear and concise information about the content on the page.
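Checking your own pages against these guidelines only takes a few lines. The sketch below (standard library only, made-up HTML) parses a page's <title> and meta description and reports their character counts:

from html.parser import HTMLParser

class MetaLengthParser(HTMLParser):
    # Capture the <title> text and the content of the meta description tag.
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = """<html><head>
<title>Zombie pages in SEO: how to find and fix them</title>
<meta name="description" content="Learn how to detect zombie pages and decide whether to improve, delete, or redirect them.">
</head><body></body></html>"""

parser = MetaLengthParser()
parser.feed(html)
print(f"Title: {len(parser.title)} characters (roughly 50-60 recommended)")
print(f"Description: {len(parser.description)} characters (roughly 155-160 recommended)")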

How to submit website to the Google Search Console correctly?


Submitting your new website to Google Search Console is an important step in getting your website indexed by Google. Here are the steps to submit your new website to Google Search Console:

  1. First, go to the Google Search Console website (https://search.google.com/search-console/welcome) and sign in with your Google account.

  2. Click on the "Add a Property" button and enter your website URL.

  3. You will be asked to verify that you own the website. There are several methods of verification, such as adding an HTML tag to your website, uploading an HTML file, or adding a DNS record. Follow the instructions for your preferred verification method.

  4. Once your website is verified, you can access the Google Search Console dashboard. Here, you can see information about your website's performance, index status, and more.

  5. To submit your website to Google, click on the "URL Inspection" option in the left-hand menu. Enter the URL of the page you want indexed in the inspection bar and press Enter.

  6. If the page has not already been indexed by Google, you will see an option to "Request Indexing." Click on this button to submit your page to Google.

  7. Google will review your page and add it to the index if it meets its guidelines. This process can take a few hours or several days, depending on various factors such as the size of your website and how frequently it is updated.

That's it! Your website should now be submitted to Google Search Console and ready for indexing. It's a good idea to regularly check the Search Console dashboard to monitor your website's performance and make any necessary adjustments to improve its visibility in Google search results.
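If you manage many sites or want to automate this step, the Search Console API can also submit a sitemap for a property you have already verified. This is a hedged sketch using google-api-python-client and google-auth; the service-account key, property URL, and sitemap path are placeholders.

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # placeholder key file

service = build("searchconsole", "v1", credentials=credentials)
site = "https://mywebsite.com/"                # placeholder verified property
sitemap = "https://mywebsite.com/sitemap.xml"  # placeholder sitemap URL

# Submit (or resubmit) the sitemap for the verified property.
service.sitemaps().submit(siteUrl=site, feedpath=sitemap).execute()
print("Sitemap submitted:", sitemap)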