
Tuesday, August 17, 2021

What are meta tags in Blogger, and how do you create them?


If you run a Blogger blog for your business or organization, you may occasionally need to alter the HTML code for it. In general, Blogger automates the coding aspects of your blog, saving you significant amounts of time creating HTML and CSS code. However, if you want to include meta tags within your blog, you do need to edit its HTML code. Adding meta tags to your blog allows you to include information that may affect how well it performs in search engines.

1. Create a blank text file in a text editor to build your meta tags. Rather than writing the code directly into Blogger, it is easier to prepare it in advance. The most common meta tags are for the site keywords and description. You can also include meta elements to indicate the author and revised dates for page content. To create a keywords meta element, use the following outline:
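A sketch of what that keywords element can look like (the keyword list here is only a placeholder to replace with your own; Blogger templates are XML, so the tag is written as self-closing):

    <meta content='printing, design, business cards, flyers, brochures' name='keywords'/>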

This example could be for a print and design business, with the keywords indicating the content of the site pages. Alter the keywords to suit the content on your own blog. For the description meta element, use the following syntax:
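For example (again only a sketch, with placeholder text to swap for your own):

    <meta content='Affordable printing and design services, from business cards to brochures and banners.' name='description'/>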

The description includes readable sentences concisely explaining the content and purpose of the site. Again, alter this to suit your own blog.

2. Access the HTML content for your blog. Log in to Blogger and find your blog in the Dashboard. If you are using the newer Blogger interface, select "Template" from the drop-down list for your blog. For the older style, click "Design" for the blog you are working on. Click "Edit HTML" from the list along the top of the Design section. A large text field appears with your blog code in it. Check the "Expand Widget Templates" check box above the text field to show all of the code for the template your blog is using.

3. Find the head section of your blog. You need to place your meta tags in the head section of your blog template. The easiest way to do this is to locate the closing </head> tag. You can use your browser's "find" tool by pressing "Ctrl-F" or choosing "Edit" and then "Find" from the browser toolbar menu. Enter "</head>" (without quotes) and press "Enter" to find it in the HTML code. Place your cursor before the closing </head> tag.

4. Enter your meta tags. Copy your meta tags from the text file you created by selecting the code and pressing "Ctrl-C" or "Edit" and then "Copy" from the menu in your text editor. Go to the point you placed your cursor at in the Blogger HTML code and paste the meta tags by pressing "Ctrl-V" or choosing "Edit" and then "Paste" from your browser menu. Click the "Save Template" button under the HTML text field for your blog. Your blog template will be updated to include the meta tags.


Monday, July 27, 2020

How to Prevent Search Engines from Indexing WordPress Sites?


Site owners will do anything to get their websites indexed. However, you might not want search engines to crawl through your website if it’s still in development. In a case like this, it’s recommended to discourage search engines from indexing your site. Stick with us if you want to learn more about this topic!

  1. Discouraging Search Engines From Indexing WordPress Sites
    • Using the WordPress Built-In Feature
    • Editing robots.txt File Manually
  2. Password Protecting Your WordPress Website
    • Using Hosting Control Panel
    • Using WordPress Plugins
  3. Removing Indexed Page From Google

Why Would You Want To Stop Search Engines From Indexing Your Site?

There are some cases where people want to discourage search engines from indexing their sites:

  • Unfinished websites — at this stage of trial and error, it’s best to keep your website away from the public eye.
  • Restricted websites — if you plan to run an invite-only website, you do not want it listed on SERPs.
  • Test accounts — site owners often create a duplicate of a site for testing and trial purposes. Since these copies are not designed for the public, don’t let them get indexed by search engines.

So how do you block search engines from indexing your site? Take a look at the options below and try them yourself.

1. Discouraging Search Engines From Indexing WordPress Sites

The simplest way to stop search engines from indexing your website is by preventing them from crawling it. To do it, you need to edit your website directory’s robots.txt file. Here are a few ways to achieve that:

Using the WordPress Built-In Feature

Editing WordPress robots.txt is quite easy as you only need to use a WordPress built-in feature. Here’s how:

  1. Log in to the WordPress admin area and go to Settings -> Reading.
  2. Scroll down and locate the Search Engine Visibility option.
  3. Check the option that says Discourage search engines from indexing this site.
  4. Save Changes, and that’s it! WordPress will automatically edit its robots.txt file for you.

Editing robots.txt File Manually

If you prefer the manual option, you can use File Manager or an FTP client to edit the robots.txt file.

In this article, we’ll show you how to do it through the hPanel’s File Manager:

  1. Log in to hPanel and locate File Manager under the Files area.

  2. Go to your WordPress root directory folder (in most cases, it’s public_html) and find the robots.txt file. If you can’t find it, create a new blank file.
  3. Right-click on the file and select Edit.

  4. Enter the following syntax:

    User-agent: *
    Disallow: /

The code above will prevent search engines from indexing your whole site. If you only want to apply the disallow rule to a specific page, write that page’s subdirectory and slug. For example: Disallow: /blog/food-review-2019.

The directives in robots.txt files are case-sensitive, so be careful when editing.

2. Password Protecting Your WordPress Website

Search engines and web crawlers don’t have access to password-protected files. Here are a few methods to password protect your WordPress site:

Using Hosting Control Panel

If you are a Hostinger client, you can password protect your website using hPanel’s Password Protect Directories tool:

  1. Access hPanel and navigate to Password Protect Directories.
  2. Enter your root directory into the first field.
  3. Once the directory is selected, enter your username and password and click Protect.

If your root directory is public_html, leave the directory field blank.

The process in cPanel is also quite similar:

  1. Log in to your cPanel account and head to Directory Privacy.

  2. Select your root directory. In our case, it’s public_html.
  3. Check the Password protect this directory option, and name the protected directory. Press Save.
  4. Create a new user to log in to the protected website, and that’s it!

Using WordPress Plugins

There are tons of plugins that can help password protect your site, but the Password Protected plugin might just be the best one out there. It has been tested with recent WordPress releases, and it’s pretty straightforward to use.

After installing and activating the plugin, head to Settings -> Password Protected and configure the settings to match your needs.

3. Removing Indexed Page From Google

Don’t worry if Google has indexed your site. You can remove it from SERPs by following these steps:

  1. Set up Google Search Console for your website.
  2. Open Google Search Console for your newly added website and scroll down to Legacy tools and reports -> Removals.
  3. Click the Temporarily hide button and enter the URL you want to remove from Google.
  4. On a new window, choose Clear URL from cache and temporarily remove from search, then Submit Request.

And that’s it! Google will temporarily remove your site from search results. Make sure to apply the previous methods to prevent Google from indexing your site again.

Conclusion

There you have it! Quick and easy ways to discourage search engines from indexing your sites. Here’s a quick recap of the methods we’ve learned today:

  • Edit the robots.txt file, either automatically (through WordPress) or manually.
  • Password protect your website using a plugin or your hosting control panel.
  • Remove indexed pages from Google via Google Search Console.

If you have any other methods, or if you have any questions, please do let us know in the comments. Good luck!

Monday, February 10, 2020

16 On-Page SEO Factors You Must Update On Your Blog At All Times


Why is on-page SEO so important?

Of course your classic on-page SEO tweaks still work. Even better than before actually. And I’m not the only one who says that. Take it from Google too.

Behind those fancy AI-based algorithm updates lie your usual keyword optimization hacks. Without keywords and relevant related words, Google’s bots simply wouldn’t be able to understand your content and rank it where relevant.

Other studies like this one from Backlinko also justify the use of on-page SEO methods. Just run any search for a competitive keyword and you’ll notice most websites try to keep their on-page factors clean and relevant.

When done the right way, optimizing your pages for optimal ranking can also:

  • Boost, even double, your website traffic
  • Bring in more leads
  • Improve your click-through-rates
  • Increase time on page
  • Reduce bounce rates
  • Match reader intent
  • Position you as a thought leader in your industry
  • And so much more!

On-page SEO factors to optimize right away

But you have so many factors to optimize, where do you start?


Below are all the on-page SEO factors that are worth your time:

1. SEO-friendly URL

Use a short URL that includes the keyword.

As an example:

www.domain.com/blog/on-page-seo

is better than a default URL

www.domain.com/blog/sfeogytytuyjyj.html

or a long one

www.domain.com/blog/on-page-seo-factor-to-optimize-this-year.html

or the like.

Make sure you think this through before you publish the article. Changing the URL afterwards will cost you any links pointing to the page unless you add a redirect.

Another thing to pay attention to: make sure you aren’t already using the same keyword in the URL of another, more profitable page.

For instance, if you’re an SEO agency you might want a page like:

www.domain.com/on-page-seo

But if you later decide to also put together a guide for the same keyword, you won’t be able to use the same URL so you’ll have to publish it on your blog as www.domain.com/blog/on-page-seo or change the URL.

2. Title Tag

Your main keyword should ideally be placed at the beginning of your title, especially in the case of the SEO title. You can set the SEO title separately with the Yoast WordPress plugin.

Here are 4 examples for the “blogging mistakes” keyword, where the 1st result is optimal:

[Image: four title tag examples for the “blogging mistakes” keyword]

Unlike the URL, the SEO title is not final, so you can change it at any time after publishing the post.

If you’ve got a bit more time, do some A/B testing on your SEO title. Change it every 3-4 months to see which one works best for your CTR.

3. Headings! Headings!

Nailing the keyword optimization of your headings is so important, yet so many writers seem to skip this part.

You’ve got multiple options here:

  1. Take your main keyword and create natural headings around it. This means your keyword will appear in 2-3 headings.
  2. Place your main keyword in the 2-3 headings mentioned at point 1 and optimize the rest of your headings for secondary keywords.

Above all, remember to include at least H2s and H3s in your text. [like this article, btw] Ideally, you’d have separate designs for these so they are easily distinguishable by readers.
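As a quick sketch, a heading hierarchy optimized this way could look like the following (the keyword and the heading texts are invented for the example):

    <h1>On-Page SEO: The Complete Checklist</h1>
    <h2>Why on-page SEO still matters</h2>
    <h2>How to nail on-page SEO in your headings</h2>
    <h3>Optimizing the remaining headings for secondary keywords</h3>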

4. The first 100 words

Another ignored on-page SEO factor is including your keyword in the first 100 words of your article. I don’t always do this because sometimes it doesn’t feel natural to shove a keyword into the first few words when you want to set the scene first.

But if you can manage to add it in the first sentence, way to go! Google will consider the topic to be of top importance to the article and crawl it accordingly.

5. Frequent keywords, but no stuffing!

Stuffing is actually quite hard to do these days without readers reporting your content.

Keyword stuffing looks something like:

These are our blog blogging blogger tips for bloggers who blog on blogs for bloggers…

Not cool. I know.

Instead, natural keyword frequency looks more like:

We’ve put together this list of tips for first-time bloggers who are looking to improve the quality of their blog posts…

And then just use your keywords sparingly and in a natural way throughout the entire article.

6. Outbound links

These are the links you add to relevant content on other websites. The general rule (or best practice, if you prefer) is to only link to material that will be of value to your readers or that supports your claims.

You can try my trick and create a strategy to always follow for this. For instance, I only link to reports or studies and occasionally to external tools readers might want to test.

Don’t add too many, though. Google hasn’t disclosed how many outbound links are okay to use, but most blog guidelines [and my own experience] accept a maximum of 3 links.

Also, try not to link to content that targets the same keyword you are aiming for. Google will assume that even you consider that content better, which will make it much more difficult to outrank that competitor.

7. Internal links

We’ve got two situations here.

The first case is when you add links to your other blog posts or web pages in the article you’re currently putting together. By all means, make sure the links are relevant to your topic.

The second instance happens after you publish your article. Try to find 2-4 of your other posts that are relevant to your new post and place a link to the new article on relevant keywords only.

Disclaimer: Avoid link stuffing. This means you shouldn’t use your top-performing article to link to every possible post of yours.

For all external and internal links, check them regularly to make sure they are not broken and that the content behind them hasn’t fundamentally changed to the point it no longer matches your needs.

8. Page speed

Use smaller images, enable file compression, reduce redirects, minify CSS, JavaScript, and HTML, improve your server’s response time, and change anything else PageSpeed Insights tells you to.

9. Responsive design

Google has been seriously penalizing websites that are not responsive.

Mobile traffic is still growing, so even if you don’t believe Google will have a say, your readers will when they can’t click your call-to-action button.

10. Meta description

This is the small snippet of content that users will see under your SEO title in the search results.

Two secrets here:

  • Include your keyword for Google.
  • Include a CTA or an enticing fact or incentive to make people want to click on your post. This will also boost your click-through rate.

Yes, the meta description too can be changed even years after first publishing the article.
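For example, a description following both rules might look like this sketch (the text is invented for illustration):

    <meta name="description" content="Struggling with on-page SEO? Run through these 16 factors to boost your rankings, traffic, and click-through rates.">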

Go way back in your blog’s history and check all meta descriptions. You’ll be surprised to discover missing ones too.

11. Review the readers’ intent

So you have this post that ranked well for 2 years, but then it died. Do the SERP research again to see whether the readers’ intent has changed or whether your competitors managed to answer their needs better.

This is also a perfect time to review the entire structure of the article and run new keyword research to check for potential secondary keywords to target. Keyword volumes and difficulty can change often, every week even, so keeping an eye on the evolution of the keywords that are highly valuable for your business is vital to maintaining your position.

12. Remove duplicate content in all its forms

Canonical links will be your best friend here, especially for e-commerce websites, which commonly have duplicate content on their category pages.

But even if you’re not an e-commerce website, I recommend making sure you have the canonical link set for every page of yours. Yes, that includes articles.
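Setting it is a one-liner in the page’s head section; for example (reusing the placeholder URL from earlier in this post):

    <link rel="canonical" href="https://www.domain.com/blog/on-page-seo">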

A much-too-common mistake beginner marketers make is adding their blog posts to multiple categories or tags on their blog. This inevitably creates duplicate content, so resist the temptation and stick to one category.

13. ALT tags and file names

You’re probably already aware that the keyword you want to rank for should also be part of your ALT text in at least one image. [Still, try to add ALT tags to all images and include secondary keywords in them.]

Disclaimer: Don’t do keyword stuffing here either. A good ALT tag is “blogging mistakes for beginners”. A bad ALT tag looks like this: “blogging mistakes bloggers blogs beginner mistakes”.

What many writers are still not aware of is the importance of having keywords in the file name of your images as well. You know, blogging-mistakes.png instead of screenshot56.png.
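Put together, the two look like this in HTML (using the file name and ALT text from the examples above):

    <img src="blogging-mistakes.png" alt="blogging mistakes for beginners">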

14. Data markup

This only applies to specific websites where you post news, reviews, recipes, and the like.

Your results will appear like:

[Image: search result enhanced with structured data]

Instead of:

[Image: plain search result without markup]

There are many options here that can be added and tested at any time. Head to Schema.org for all the details and to see if there’s anything right for your blog type.
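As a minimal sketch, review markup embedded as JSON-LD might look like this (every name and value is invented for the example):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Review",
      "itemReviewed": { "@type": "Book", "name": "Example Book" },
      "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
      "author": { "@type": "Person", "name": "Jane Doe" }
    }
    </script>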

15. Got social media?

If you don’t yet have social media sharing buttons on your posts, go right now and get a plug-in. Many tools let you customize the text readers will share (or at least the suggestion) so they can bring in more views via their own networks.

16. No more black hat techniques!
Finally, make sure your website is free of any black hat SEO techniques. These include spammy links, cloaking, doorway pages, hidden text and links, spam comments, duplicate content, link farms, and even paid links.

Surprisingly or not, Google is getting better at picking up on paid links. That’s why many large websites strictly prohibit selling links on their pages. Unfortunately, you’ll still occasionally receive emails from writers willing to provide such links. Just say no. It’s cheaper, more valuable, and easier to become an author on that website yourself anyway.

Where to take your on-page SEO efforts next?

Bookmark this article or create your own checklist of everything you need to change. If possible, make sure you analyze all of these on-page SEO factors and how they perform on your own pages.

I won’t lie to you and tell you the process is easy or quick. It can take months if you have a year’s worth of content or more.

But it’s worth it!

Got any extra tips on optimizing the on-page SEO factors for your website? What has worked for you and where are you still experimenting? Let us know!

Read more at: business2community.com

Monday, April 22, 2019

4 Important Factors About Keyword Difficulty.


What is the Keyword Difficulty in SEO?

Keyword difficulty (also known as keyword competition) is one of the most important metrics to consider when doing keyword research. The higher the keyword difficulty, the harder it is to rank on the 1st SERP due to the high competition among the ranking websites.

It’s a critical metric alongside exact monthly search volumes and SERP analysis. It informs the selection of keywords that will help you improve SEO, bid on keywords in PPC campaigns, and much more.

How is the Keyword Difficulty calculated?


The calculation is based on selected metrics from Moz and Majestic plus our own know-how, namely:
  • Domain Authority
  • Page Authority
  • Citation Flow
  • Trust Flow
The calculation consists of the following steps:
  1. Calculate the overall Link Profile Strength (LPS) of every website that ranks on the 1st Google SERP, based on the selected Moz and Majestic metrics.
  2. Give each metric a different weight so the results estimate the real rankings as closely as possible.
  3. Take both high and low LPS values into account to calculate the overall Keyword SEO Difficulty.
  4. The final value estimates how hard it is to start ranking on the 1st SERP, so it gives particular weight to websites with low LPS.
  5. It’s absolutely fine for a low-authority website to outrank high-authority websites, and that’s exactly the situation Keyword Difficulty focuses on.

What is a good value of Keyword Difficulty in SEO?



The Keyword Difficulty is indicated on a scale from 0 to 100. The lower the value, the easier it is to rank for the keyword on the 1st SERP.
Keep in mind that the “real” SEO difficulty may vary. It depends on your on-page and off-page SEO skills.

Friday, April 19, 2019

White Hat SEO and Black Hat SEO


What is the Difference Between White Hat SEO and Black Hat SEO?


The difference between black hat SEO and white hat SEO has to do with the techniques used when trying to improve a website’s search engine ranking.

Black hat SEO refers to techniques and strategies used to get higher search rankings by breaking search engine rules. Black hat SEO focuses only on search engines and not so much on a human audience, and it is typically used by those looking for a quick return on their site rather than a long-term investment. Some techniques used in black hat SEO include keyword stuffing, link farming, hidden text and links, and blog content spamming. The consequences of black hat SEO can include your site being banned from a search engine and de-indexed as a penalty for using unethical techniques.

White hat SEO refers to the use of techniques and strategies that target a human audience as opposed to a search engine. Techniques typically used in white hat SEO include keyword research and analysis, rewriting meta tags to make them more relevant, backlinking and link building, and writing content for human readers. Those who use white hat SEO expect to make a long-term investment in their website, as the results last a long time.

Does Black Hat SEO work?


Everybody has their own definition of “black hat SEO”. Put simply, black hat SEO includes any techniques that are against Google's guidelines. Some people view them as a fast track to achieve higher rankings. In fact, many SEO practitioners believe black hat SEO tactics are useful and they encourage others to use them.

Source: Google Blog

Tuesday, September 18, 2018

What is SEO Linking?


Link building, simply put, is the process of getting other websites to link back to your website. All marketers and business owners should be interested in building links to drive referral traffic and increase their site's authority.

Basics of Quality Link Building for SEO


Why build links? Google's algorithms are complex and always evolving, but backlinks remain an important factor in how every search engine determines which sites rank for which keywords. Building links is one of the many tactics used in search engine optimization (SEO) because links are a signal to Google that your site is a quality resource worthy of citation. Therefore, sites with more backlinks tend to earn higher rankings.

There’s a right way and a wrong way, however, to build links to your site. If you care about the long-term viability of your site and business, you should only engage in natural link building, meaning the process of earning links rather than buying them or otherwise acquiring them through manipulative tactics (sometimes known as black-hat SEO, a practice that can get your site essentially banned from the search results).

That said, natural, organic link building is a difficult, time-consuming process. Not all links are created equal: A link from an authoritative website like the Wall Street Journal will have a greater impact on your rankings on the SERP than a link from a small or newly built website, but high-quality links are harder to come by.

This guide will teach you how to build quality links that improve your organic rankings without violating Google guidelines.

Remember, link building is imperative in achieving high organic search rankings.

Why Link Building Is Important for SEO


Link building is important because it is a major factor in how Google ranks web pages. Google notes that:

"In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages."

Imagine that we own a site promoting wind turbine equipment that we sell. We're competing with another wind turbine equipment manufacturer. One of the ranking factors Google will look at in determining how to rank our respective pages is link popularity.

While the above example provides a general visual understanding of why link building is important, it's very basic. It omits key factors such as:
  • The trust and authority of the linking pages.
  • The SEO and content optimization of the respective sites.
  • The anchor text of the incoming links.

For a more in-depth explanation of how PageRank is calculated, read through these resources:
  • The original Google PageRank paper
  • An in-depth discussion of the formula behind PageRank
  • The Wikipedia page on the subject

The most important concept to understand is that, as Google says, you're more likely to have your content rank higher for keywords you're targeting if you can get external websites to link to your pages.

Monday, June 18, 2018

Bing Announces Bing AMP Viewer & JSON-LD Support in Bing Webmaster Tools


Microsoft unveiled two new features within Bing during principal program manager Fabrice Canel’s appearance at SMX Advanced this morning. First, he announced support within Bing Webmaster Tools to debug and view your JSON-LD markup. Second, he announced support for a Bing AMP viewer.

Bing AMP Viewer


Bing AMP viewer will be rolled out this summer and will make AMP-enabled web pages work directly from Bing’s mobile search results. This will work similarly to the way Google returns AMP pages within its mobile search results.

Bing Webmaster Tools Supports JSON-LD


Bing began supporting JSON-LD markup in March, but now, Bing Webmaster Tools will also support debugging such JSON-LD in the tool.


Monday, May 28, 2018

Why is SEO good for a website?


What is an SEO-Friendly Website and Why Do You Need One

You often hear the terms “SEO friendly” or “SEO-friendly website”, but what do they really mean, and why do you need one? How can an SEO-friendly website help your business grow?

These are the questions I will try to answer in this post, always keeping in mind that beginners to SEO may be reading it, so I will try to avoid technical terms and advanced SEO practices and theories.

What do we mean by a Search Engine Friendly Website?

An SEO-friendly website has the configurations and features that make it easy for search engines to crawl (read) it and understand what the particular website is all about. The most important characteristics of an SEO-friendly website are:
  1. Unique titles and Descriptions for all pages:
    Each page of the website (including the home page) has a unique title and description. Titles are between 60-65 characters and descriptions approximately 150 characters. Titles and descriptions accurately describe what the page is about without being keyword-stuffed.
  2. Well-formatted URLs:
    Permanent links (that’s the URL of a webpage) are descriptive, all lower case, and separated by dashes.
  3. Fast loading web pages:
    Neither people nor search engines want websites that are slow to load. On the contrary, fast-loading websites are SEO friendly (meaning they have an advantage in ranking algorithms over slower websites) and generate more user interactions (sales, newsletter signups, contact form submissions, etc.).
  4. It has unique content:
    Content on the website is not found anywhere else on the web; all pages have unique and useful content. This means a website cannot be SEO friendly if its content is copied from other websites.
  5. Includes images that are optimized for search engines:
    Search engines prefer text, that’s the truth, but you also need images on your pages because people like them: they make your content more interesting, easier to read, more shareable, etc. When you add images, make sure they are optimized for size (tools like "smushit" can help you reduce image file size without losing quality) and that you set a meaningful image filename and ALT text.
  6. Pages have a meaningful structure:
    A web page usually has the following elements:
    • Header
    • Breadcrumbs Menu
    • Page Title (that’s the H1 tag – there is only one per page)
    • Well formatted text – text is separated into a number of short paragraphs with subheadings
    • Author information
    • Footer
There are of course many other characteristics that make a website SEO friendly; you can read about them in our previous post, the ultimate SEO checklist, but the above 6 elements are currently among the most important. A minimal sketch of a page that follows these rules is shown below.
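Here is that sketch. It is purely illustrative: every title, URL, and piece of text in it is an invented placeholder.

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <title>Classic Cars Buying Guide | Example Brand</title>
      <meta name="description" content="A beginner's guide to buying and restoring classic cars."/>
      <link rel="canonical" href="https://www.example.com/blog/classic-cars-buying-guide"/>
    </head>
    <body>
      <header>Site header</header>
      <nav>Home &gt; Blog &gt; Classic Cars Buying Guide</nav> <!-- breadcrumbs menu -->
      <h1>Classic Cars Buying Guide</h1> <!-- the single H1 page title -->
      <h2>Where to find collectors' auctions</h2>
      <p>Short, well-formatted paragraphs under subheadings.</p>
      <footer>Author information and footer</footer>
    </body>
    </html>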

Why do you need a SEO friendly website?

Something that most CEOs, small business owners, and new webmasters don’t easily understand is why you need an SEO-friendly website and why you should make the effort to make your website friendlier to search engines.

Well, there are a lot of reasons, but the ones you need to know are:

  1. It will get you more organic traffic (that is traffic from search engines)
    As expected, an SEO-friendly website will get you more traffic from search engines, as it is likely to rank higher in the SERPS (search engine results pages). If you take into account that the majority of people who use the search box tend to select one of the first 5 results, you can understand the importance of SEO.
  2. It will make your website user friendly
    SEO is not only for search engines; good SEO is for users as well. Applying the principles of SEO to your website will make it easier to use, and this will enhance the user experience.
  3. It gives you brand credibility
    Users are more likely to trust websites (businesses) that are found in the first pages of Google, Bing or Yahoo. This is good for both brand awareness and brand credibility.
  4. It is cost effective
    An SEO-friendly website will drive targeted traffic 24/7 without the need to spend money on PPC or other forms of online advertising. While there is a cost to reach that point, the long-term benefits are bigger.
  5. It helps you understand what your most important customers want
    SEO drives quality traffic, and analyzing the behavior of those users (how they enter your website, what they click, how they leave, what they like, etc.) is a great way to understand what your customers want and to adjust your website or products to match their needs.
  6. SEO is even more important on mobile
    A website that is mobile friendly and ranks well in mobile search can get more customers and traffic than websites that are not mobile-SEO friendly. More and more users search for information or products from their mobiles while on the go, so it is important to be at the top of the search results; otherwise you are losing customers to the competition, especially those searching for local products or services.

Conclusion

An SEO-friendly website has certain features and characteristics that help search engines understand what the website is all about, which increases the chances of achieving better rankings in the SERPS.
The most important advantage of having an SEO-friendly website is that you will get more targeted organic traffic from search engines.
Source: Quora

Wednesday, April 18, 2018

Rolling Out Mobile-First Indexing


Mobile-First Index Roll-out — March 26, 2018


Google announced that the mobile-first index was finally "rolling out." Since the index has been in testing for many months, and Google has suggested they are migrating sites gradually, it's unclear how much impact this specific roll-out had on the overall index. Webmasters should begin to see notifications within Google Search Console.

Source: Google Webmaster Central Blog

Friday, January 19, 2018

What Is Google Fred?

Google Fred is an algorithm update that targets black-hat tactics tied to aggressive monetization. This includes an overload of ads, low-value content, and little added user benefit. This does not mean all sites hit by the Google Fred update are dummy sites created for ad revenue, but (as Barry Schwartz noted in his observations of Google Fred) the majority of websites affected were content sites with a large amount of ads that seem to have been created for the purpose of generating revenue over solving a user’s problem.


Which Websites Were Affected by Fred?

The majority of the websites affected had one (or more) of the following:
  • An extremely large presence of ads
  • Content (usually in blog form) on all sorts of topics created for ranking purposes
  • Content has ads or affiliate links spread throughout, and the quality of content is far below industry-specific sites
  • Deceptive ads (looks like a download or play button to trick someone into clicking)
  • Thin content
  • UX barriers
  • Mobile problems
  • Aggressive affiliate setups
  • Aggressive monetization

How to Tell If Your Site Was Affected by the Google Fred Algorithm Update

If you saw a large drop in rankings and organic traffic around the middle of March and are guilty of one of the above, your site was probably impacted.

Google Fred Recovery

The Google Fred algorithm is focused on limiting black-hat SEO tactics used for aggressive monetization, so the biggest fix is to scale down your ads and increase the quality of your content.

For a full Google Fred recovery, we recommend:
  • Scale back the number of ads on your site
  • Review the Google Search Quality Rater Guidelines (QRG) and follow them as closely as you possibly can
  • Review the placement of ads on your site. Do they contribute to a poor user experience?
  • Review the user experience of your site, and make a schedule to do this periodically. Keep upping the ante of your content
  • Review your content to be sure it serves a purpose, and that this purpose is outlined in the form of metadata and tags

The number one thing you can do is to manually browse through your site. Is it user-friendly? Are you greeted by ads everywhere you go? Is your content scraped or extremely thin? Think about your users. If it’s not something you would enjoy seeing on other websites, you need to take it off of yours.

What are the Best Google Fred Update SEO Tactics?

If you’re looking for Fred update SEO tactics, we recommend you memorize the Google Search Quality Rater Guidelines and be sure every piece of content on your site is compliant with best practices. These are the factors Google considers extremely important when it comes to quality:
  • Clear indication of who the website belongs to
  • Clear indication of what the page is about
  • A well-maintained and updated page, which means it’s error-free, loads quickly, and has few technical errors
  • Excellent website reputation (quality of backlinks, industry awards, positive user reviews, and expert testimonials all contribute to excellent reputation)
  • Content that demands at least one of the following: time, effort, expertise, and talent/skill
Source: Bluecorona

Thursday, January 18, 2018

Major Google SEO Updates & Algorithm Changes from 2009 to 2017

Google has a long history of famous algorithm updates, search index changes and refreshes.

2017 Updates
  • Snippet Length Increase — November 30, 2017
  • Featured Snippet Drop — October 27, 2017
  • Chrome HTTPS Warnings — October 17, 2017
  • Google Tops 50% HTTPS — April 16, 2017
  • "Fred" (Unconfirmed) — March 8, 2017
  • Intrusive Interstitial Penalty — January 10, 2017

2016 Updates
  • Penguin 4.0, Phase 2 — October 6, 2016
  • Penguin 4.0, Phase 1 — September 27, 2016
  • Penguin 4.0 Announcement — September 23, 2016
  • Image/Universal Drop — September 13, 2016
  • "Possum" — September 1, 2016
  • Mobile-friendly 2 — May 12, 2016
  • AdWords Shake-up — February 23, 2016

2015 Updates
  • RankBrain* — October 26, 2015
  • Panda 4.2 (#28) — July 17, 2015
  • The Quality Update — May 3, 2015
  • Mobile Update AKA "Mobilegeddon" — April 22, 2015

2014 Updates
  • Pigeon Expands (UK, CA, AU) — December 22, 2014
  • Penguin Everflux — December 10, 2014
  • Pirate 2.0 — October 21, 2014
  • Penguin 3.0 — October 17, 2014
  • "In The News" Box — October 1, 2014
  • Panda 4.1 (#27) — September 23, 2014
  • Authorship Removed — August 28, 2014
  • HTTPS/SSL Update — August 6, 2014
  • Pigeon — July 24, 2014
  • Authorship Photo Drop — June 28, 2014
  • Payday Loan 3.0 — June 12, 2014
  • Panda 4.0 (#26) — May 19, 2014
  • Payday Loan 2.0 — May 16, 2014
  • Page Layout #3 — February 6, 2014

2013 Updates
  • Authorship Shake-up  —  December 19, 2013
  • Penguin 2.1 (#5)  —  October 4, 2013
  • Hummingbird  —  August 20, 2013
  • In-depth Articles  —  August 6, 2013
  • Knowledge Graph Expansion  —  July 19, 2013
  • Panda Recovery  —  July 18, 2013
  • "Payday Loan" Update  —  June 11, 2013
  • Panda Dance  —  June 11, 2013
  • Penguin 2.0 (#4)  —  May 22, 2013
  • Domain Crowding  —  May 21, 2013
  • "Phantom"  —  May 9, 2013
  • Panda #25  —  March 14, 2013
  • Panda #24  —  January 22, 2013

2012 Updates
  • Panda #23  —  December 21, 2012
  • Knowledge Graph Expansion  —  December 4, 2012
  • Panda #22  —  November 21, 2012
  • Panda #21  —  November 5, 2012
  • Page Layout #2  —  October 9, 2012
  • Penguin #3  —  October 5, 2012
  • Panda #20  —  September 27, 2012
  • Exact-Match Domain (EMD) Update  —  September 27, 2012
  • Panda 3.9.2 (#19)  —  September 18, 2012
  • Panda 3.9.1 (#18)  —  August 20, 2012
  • 7-Result SERPs  —  August 14, 2012
  • DMCA Penalty ("Pirate")  —  August 10, 2012
  • Panda 3.9 (#17)  —  July 24, 2012
  • Link Warnings  —  July 19, 2012
  • Panda 3.8 (#16)  —  June 25, 2012
  • Panda 3.7 (#15)  —  June 8, 2012
  • Penguin 1.1 (#2)  —  May 25, 2012
  • Knowledge Graph  —  May 16, 2012
  • Panda 3.6 (#14)  —  April 27, 2012
  • Penguin  —  April 24, 2012
  • Panda 3.5 (#13)  —  April 19, 2012
  • Panda 3.4 (#12)  —  March 23, 2012
  • Search Quality Video  —  March 12, 2012
  • Panda 3.3 (#11)  —  February 27, 2012
  • Venice  —  February 27, 2012
  • Ads Above The Fold  —  January 19, 2012
  • Panda 3.2 (#10)  —  January 18, 2012

2011 Updates
  • Panda 3.1 (#9)  —  November 18, 2011
  • Query Encryption  —  October 18, 2011
  • Panda "Flux" (#8)  —  October 5, 2011
  • Panda 2.5 (#7)  —  September 28, 2011
  • Pagination Elements  —  September 15, 2011
  • Expanded Sitelinks  —  August 16, 2011
  • Panda 2.4 (#6)  —  August 12, 2011
  • Panda 2.3 (#5)  —  July 23, 2011
  • Google+  —  June 28, 2011
  • Panda 2.2 (#4)  —  June 21, 2011
  • Schema.org  —  June 2, 2011
  • Panda 2.1 (#3)  —  May 9, 2011
  • Panda 2.0 (#2)  —  April 11, 2011
  • The +1 Button  —  March 30, 2011
  • Panda/Farmer  —  February 23, 2011
  • Attribution Update  —  January 28, 2011

2010 Updates
  • Negative Reviews  —  December 1, 2010
  • Social Signals  —  December 1, 2010
  • Instant Previews  —  November 1, 2010
  • Google Instant  —  September 1, 2010
  • Brand Update  —  August 1, 2010
  • Caffeine (Rollout)  —  June 1, 2010
  • Google Places  —  April 1, 2010

2009 Updates
  • Real-time Search  —  December 1, 2009
  • Caffeine (Preview)  —  August 1, 2009
  • Vince  —  February 1, 2009
  • Rel-canonical Tag  —  February 1, 2009

Tuesday, November 28, 2017

‘Hawk’ Google Local Algorithm Update

Have you noticed a recent shift in Google's local search results?

August 22, 2017: The day the ‘Hawk’ Google local algorithm update swooped in
The update, which I have dubbed “Hawk,” was a change to the way the local filter works. To get some history here, Google actively filters out listings from the local results that are similar to other listings that rank already. Basically, Google picks the most relevant listing of the bunch and filters the rest. It’s very similar to what they do organically with duplicate content. (Note: Google is typically loath to confirm algorithm updates, usually only saying that it rolls out several updates every day, so these observations are based on an analysis of how local results have changed rather than on any official announcement or acknowledgment.)

The filter has existed for a long time to help ensure that multiple listings for the same company don’t monopolize the search results. In September 2016, the Possum algorithm update made a significant change to the way the filter works. Instead of just filtering out listings that shared the same phone number or website, Google started filtering out listings that were physically located near each other.

This was very problematic for businesses. It meant that if another business in your industry was in the same building as you — or even down the street from you — that could cause you to get filtered out of local search results. Yep, that means your competitors could (inadvertently) bump your listing!

On August 22, 2017, Google refined the proximity filter to make it stricter. It still appears to be filtering out businesses in the same building, but it is not filtering out as many businesses that are close by.

Who is still filtered?


Naturally, this update didn’t help everyone. Although it tightened the distance needed to filter a similar listing, it didn’t remove it completely. I’m still seeing listings that share an address or building being filtered out of local search results. I also see the filtering problem persisting for a business that is in a different building that’s around 50 feet away from a competitor.

Why ‘Hawk’?


The local search community settled on the name “Hawk” for this algorithm update, because hawks eat possums. This is one of the few times where I don’t see any negative outcomes as a result of this update and just wish Google hadn’t taken a year to realize the proximity filter was way too broad.

Source: Search Engine Land

Thursday, November 09, 2017

7 Major Factors to Improve Page Speed Score.

Page speed is a measurement of how fast your site, or the content on your page, loads in the browser. Let me first explain what page speed is and how it matters for SEO and for ranking a website.

What is Page Speed?


Page speed is often confused with "site speed," which is actually the page speed for a sample of page views on a site. Page speed can be described in either "page load time" (the time it takes to fully display the content on a specific page) or "time to first byte" (how long it takes for your browser to receive the first byte of information from the web server).

No matter how you measure it, faster page speed is better. Many people have found that faster pages both rank and convert better.


SEO Best Practices


Google has indicated that site speed (and as a result, page speed) is one of the signals used by its algorithm to rank pages. And research has shown that Google might be specifically measuring time to first byte when it considers page speed. In addition, a slow page speed means that search engines can crawl fewer pages using their allocated crawl budget, which could negatively affect your indexation.

Page speed is also important to user experience. Pages with a longer load time tend to have higher bounce rates and lower average time on page. Longer load times have also been shown to negatively affect conversions.

Here are some of the many ways to increase your page speed:

1. Enable compression


Use Gzip, a software application for file compression, to reduce the size of your CSS, HTML, and JavaScript files that are larger than 150 bytes.

Do not use gzip on image files. Instead, compress these in a program like Photoshop where you can retain control over the quality of the image. See "Optimize images" below.

2. Minify CSS, JavaScript, and HTML


By optimizing your code (including removing spaces, commas, and other unnecessary characters), you can dramatically increase your page speed. Also remove code comments, formatting, and unused code. Google recommends using YUI Compressor for both CSS and JavaScript.

3. Reduce Redirects


Each time a page redirects to another page, your visitor faces additional time waiting for the HTTP request-response cycle to complete. For example, if your mobile redirect pattern looks like this: "example.com -> www.example.com -> m.example.com -> m.example.com/home," each of those two additional redirects makes your page load slower.

4. Leverage browser caching


Browsers cache a lot of information (stylesheets, images, JavaScript files, and more) so that when a visitor comes back to your site, the browser doesn't have to reload the entire page. Use a tool like YSlow to see if you already have an expiration date set for your cache. Then set your "expires" header for how long you want that information to be cached. In many cases, unless your site design changes frequently, a year is a reasonable time period. Google has more information about leveraging caching here.

5. Improve server response time


Your server response time is affected by the amount of traffic you receive, the resources each page uses, the software your server uses, and the hosting solution you use. To improve your server response time, look for performance bottlenecks like slow database queries, slow routing, or a lack of adequate memory and fix them. The optimal server response time is under 200ms. Learn more about optimizing your time to first byte.

6. Use a content distribution network


Content distribution networks (CDNs), also called content delivery networks, are networks of servers that are used to distribute the load of delivering content. Essentially, copies of your site are stored at multiple, geographically diverse data centers so that users have faster and more reliable access to your site.

7. Optimize images


Be sure that your images are no larger than they need to be, that they are in the right file format (PNGs are generally better for graphics with fewer than 16 colors while JPEGs are generally better for photographs) and that they are compressed for the web.

Use CSS sprites to create a template for images that you use frequently on your site like buttons and icons. CSS sprites combine your images into one large image that loads all at once (which means fewer HTTP requests) and then display only the sections that you want to show. This means that you are saving load time by not making users wait for multiple images to load.

Source: SEOMoz

Thursday, August 10, 2017

Latent Semantic Indexing

Latent Semantic Indexing (LSI) is a mathematical method used to determine the relationship between terms and concepts in content. The contents of a web-page are crawled by a search engine and the most common words and phrases are collated and identified as the keywords for the page.

LSI looks for synonyms related to the title of your page. For example, if the title of your page was “Classic Cars”, the search engine would expect to find words relating to that subject in the content of the page as well, i.e. "collectors", "automobile", "Bentley", "Austin" and "car auctions".


When you search an LSI-indexed database, the search engine looks at similarity values it has calculated for every content word, and returns the documents that it thinks best fit the query. Because two documents may be semantically very close even if they do not share a particular keyword, LSI does not require an exact match to return useful results. Where a plain keyword search will fail if there is no exact match, LSI will often return relevant documents that don't contain the keyword at all.

What Is Latent Semantic Indexing, and How Will It Boost Your Overall SEO Strategy?


Latent Semantic Indexing is not rocket science, it is simple common sense. Here are some simple guidelines:
  1. If your page title is Learn to Play Tennis, make sure your article is about tennis.
  2. Do not overuse your keywords in the content. It could look like keyword stuffing and the search engines may red flag you.
  3. Never use Article Spinning Software – it spits out unreadable garble.
  4. If you outsource your content, choose a quality source.
  5. Check Google Webmaster Tools and see what keywords your pages are ranking for.
Latent Semantic Indexing is not a trick. You should bear it in mind when adding content to a web page, but do not get paranoid about it. The chances are that if you provide quality, relevant content you will never have to worry about falling foul of any LSI checks.

Source: Search Engine Journal

Tuesday, May 30, 2017

3 Main Factors in On-Page SEO.

On-Page SEO Checklist

Here is a checklist you can use to make sure you are doing everything possible to rank higher in search engines.
  1. Keyword placement:
    • Keyword in the title.
    • Keyword in the permalink.
    • Keyword in the first paragraph.
    • Keyword in the image alt tag.
    • Use LSI keywords in the body (use SEOPressor plugin to find related keywords).
    • Use LSI keyword in H2 or H3.
    • Shoot for around a 1.5% keyword density.
  2. Other things:
    • Remove all stop words from permalink.
    • Add multimedia (video, slides, infographics).
    • Minimum 500 words.
    • Optimize images before uploading (compress and resize).
    • Optimize page load speed.
    • Create a meta title - should be less than 65 characters.
    • Create a meta description of approximately 120-150 characters.
    • Internal links to related articles.
    • Outbound links to relevant high-quality sites.
  3. Other things not mentioned here:
    • Make sure to add an image for Facebook, Twitter, etc.
    • Make sure to have social sharing buttons either at the end or floating on the site of your post.
    • Have related posts after each post to lower the bounce rate.


3 Secret Techniques to Rank at the Top of the Google SERP.

When it comes to optimizing a website or a blog post, there are three main factors at play:
  1. Onsite SEO - Onsite SEO refers to the optimization of the entire website with things like site-mapping and setting permalink structures.
  2. On-page optimization - On-page SEO optimizes content for a target keyword within a single blog post. This includes using proper headings, proper keyword placement, ensuring content quality, and paying attention to many other factors, such as:
    • Post Title
    • Permalink Structure
    • Heading Tags
    • Keyword Density
    • Meta Tags
    • Images & Alt Tag
    • Word Count
    • Internal Linking
    • External Linking
    • Engaging Content
  3. Off-page optimization - The process of link building via social bookmarking, forum posting, directory submissions, etc.

Basic On-Page SEO Techniques


There are two main factors in search engine optimization: on-page optimization and off-page optimization. Both are connected with each other when it comes to a website’s search engine ranking, and it is very important to work on both for better ranking. On-page optimization helps make a website SEO and search-engine friendly, while off-page optimization helps build the quality backlinks needed for great rankings.
Here are some on-page SEO techniques and their outputs:

Meta Title Tag: The meta title tag describes the web page’s content. It is the most important part of on-page SEO, and a title tag containing the keyword makes keyword ranking easier. The title appears in three places: on external websites, on search engine results pages, and in browsers.

Character Limit: Google prefers titles of fewer than 70 characters including spaces; within this limit, Google displays the full title in search results.

Title Structure: Primary Keyword | Secondary Keyword | Brand Name
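For instance, in HTML (the keywords and brand name are invented placeholders):

    <title>Wedding Cakes | Custom Birthday Cakes | Jane's Bakery</title>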

Meta Description Tag: The meta description is the HTML text that summarizes the web page’s content. It increases the likelihood of users clicking, meaning it raises the click-through rate of your web pages. The most favorable character limit for the meta description tag is 155-160 characters including spaces.
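A sketch in HTML (the description text is an invented placeholder of roughly the right length):

    <meta name="description" content="Jane's Bakery creates custom wedding and birthday cakes, baked fresh daily and delivered anywhere in town. Order online or visit us on Main Street.">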

Meta Keywords: The meta keywords tag is another element used in on-page SEO. It can help boost keyword rankings in other search engines such as Yahoo, Bing, MSN, and Ask. So the presence of meta keywords may increase the possibility of better rankings in those search engines.

Heading Tags: These tags mark out the different sections of web page content. Heading tags help search engine spiders judge the relevance between the headings and the page content, and they also give users an idea of what the page is about.

XML Sitemap: Many webmasters think an XML sitemap is useless for SEO. In fact, it is an important part of a website: it helps with indexing and makes the website accessible to users and search engine crawlers.

Robots.txt File: A robots.txt file restricts access to your site by search engine robots that crawl the web. These bots are automated, and before they access pages of a site, they check to see if a robots.txt file exists that prevents them from accessing certain pages.

URL Canonicalization: Canonicalization is the process of picking the best URL when there are several choices, and it usually refers to home pages. For example, most people would consider these the same URLs:
  • www.example.com
  • example.com
  • www.example.com/index.html
  • example.com/home.asp
But technically all of these URLs are different. A web server could return completely different content for each of the URLs above. When Google “canonicalizes” a URL, it tries to pick the URL that seems like the best representative of the set.

Image ALT Tag: Along with all the other on-page elements, the image ALT tag plays a very important role in website ranking. Search engine crawlers cannot read an image without this tag, so the ALT text helps crawlers understand what the image is about. It also helps you rank keywords through images.