
Showing posts with label Google Website Optimization. Show all posts

Tuesday, August 17, 2021

What are meta tags in Blogger, and how do you create them?

blogger meta tags

If you run a Blogger blog for your business or organization, you may occasionally need to alter the HTML code for it. In general, Blogger automates the coding aspects of your blog, saving you significant amounts of time creating HTML and CSS code. However, if you want to include meta tags within your blog, you do need to edit its HTML code. Adding meta tags to your blog allows you to include information that may affect how well it performs in search engines.

1. Create a blank text file in a text editor to build your meta tags. Rather than writing the code directly into Blogger, it is easier to prepare it in advance. The most common meta tags are for the site keywords and description. You can also include meta elements to indicate the author and revised dates for page content. To create a keywords meta element, use the following outline:
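
A keywords element of this kind looks like the following sketch (the keyword list is made up for the print and design business the article describes; Blogger templates are XML, so the tag is written self-closing):

```html
<meta name="keywords" content="printing, graphic design, business cards, flyers, brochures, letterheads"/>
```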

This example could be for a print and design business, with the keywords indicating the content of the site pages. Alter the keywords to suit the content on your own blog. For the description meta element, use the following syntax:
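
A description element follows the same pattern; the wording below is illustrative only:

```html
<meta name="description" content="Affordable printing and graphic design services for small businesses, from business cards to large-format banners."/>
```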

The description includes readable sentences concisely explaining the content and purpose of the site. Again, alter this to suit your own blog.

2. Access the HTML content for your blog. Log in to Blogger and find your blog in the Dashboard. If you are using the newer Blogger interface, select "Template" from the drop-down list for your blog. For the older style, click "Design" for the blog you are working on. Click "Edit HTML" from the list along the top of the Design section. A large text field appears with your blog code in it. Check the "Expand Widget Templates" check box above the text field to show all of the code for the template your blog is using.

3. Find the head section of your blog. You need to place your meta tags in the head section of your blog template. The easiest way to do this is to locate the closing </head> tag. You can use your browser's "find" tool by pressing "Ctrl-F" or choosing "Edit" and then "Find" from the browser toolbar menu. Enter "</head>" (without quotes) and press "Enter" to find it in the HTML code. Place your cursor before the closing head tag.

4. Enter your meta tags. Copy your meta tags from the text file you created by selecting the code and pressing "Ctrl-C" or "Edit" and then "Copy" from the menu in your text editor. Go to the point you placed your cursor at in the Blogger HTML code and paste the meta tags by pressing "Ctrl-V" or choosing "Edit" and then "Paste" from your browser menu. Click the "Save Template" button under the HTML text field for your blog. Your blog template will be updated to include the meta tags.


Monday, July 27, 2020

How to Prevent Search Engines from Indexing WordPress Sites?

Prevent Search Engine

Site owners will do anything to get their websites indexed. However, you might not want search engines to crawl through your website if it’s still in development. In a case like this, it’s recommended to discourage search engines from indexing your site. Stick with us if you want to learn more about this topic!

  1. Discouraging Search Engines From Indexing WordPress Sites
    • Using the WordPress Built-In Feature
    • Editing robots.txt File Manually
  2. Password Protecting Your WordPress Website
    • Using Hosting Control Panel
    • Using WordPress Plugins
  3. Removing Indexed Page From Google

Why Would You Want To Stop Search Engines From Indexing Your Site?

There are some cases where people want to discourage search engines from indexing their sites:

  • Unfinished websites — at this stage of trial and error, it’s best not to have your website available to the public eye.
  • Restricted websites — if you plan to have an invite-only website, you do not want it to get listed on SERPs.
  • Test accounts — web owners often create a duplicate of a site for testing and trial purposes. Since these copies are not meant for the public, don’t let them get indexed by search engines.

So how do you block search engines from indexing your site? Take a look at the options below and try them yourself.

1. Discouraging Search Engines From Indexing WordPress Sites

The simplest way to stop search engines from indexing your website is by preventing them from crawling it. To do it, you need to edit your website directory’s robots.txt file. Here are a few ways to achieve that:

Using the WordPress Built-In Feature

Editing WordPress robots.txt is quite easy as you only need to use a WordPress built-in feature. Here’s how:

  1. Log in to the WordPress admin area and go to Settings -> Reading.
  2. Scroll down and locate the Search Engine Visibility option.
  3. Check the option that says Discourage search engines from indexing this site.
  4. Save Changes, and that’s it! WordPress will automatically edit its robots.txt file for you.

Editing robots.txt File Manually

If you prefer the manual option, you can use File Manager or an FTP client to edit the robots.txt file.

In this article, we’ll show you how to do it through the hPanel’s File Manager:

  1. Log in to hPanel and locate File Manager under the Files area.

  2. Go to your WordPress root directory folder (in most cases, it’s public_html) and find the robots.txt file. If you can’t find it, create a new blank file.
  3. Right-click on the file and select Edit.

  4. Enter the following syntax:

    User-agent: *
    Disallow: /

The code above will prevent search engines from indexing your whole site. If you want to apply the disallow rule to a specific page only, write the page’s subdirectory and slug. For example: Disallow: /blog/food-review-2019.

The directives in robots.txt files are case-sensitive, so be careful when editing.
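
If you want to confirm what a given set of rules actually blocks, Python’s standard library can parse robots.txt for you. The rules below mirror the article’s example; the URL is hypothetical:

```python
# Check robots.txt rules locally with Python's standard library.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# "Disallow: /" blocks every URL on the site for all crawlers.
print(parser.can_fetch("*", "https://example.com/blog/food-review-2019"))
```

Swapping the rule for `Disallow: /blog/food-review-2019` would block only that page while leaving the rest of the site crawlable.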

2. Password Protecting Your WordPress Website

Search engines and web crawlers don’t have access to password-protected files. Here are a few methods to password protect your WordPress site:

Using Hosting Control Panel

If you are a Hostinger client, you can password protect your website using hPanel’s Password Protect Directories tool:

  1. Access hPanel and navigate to Password Protect Directories.
  2. Enter your root directory into the first field.
  3. Once the directory is selected, enter your username and password and click Protect.

If your root directory is public_html, leave the directory field blank.

The process in cPanel is also quite similar:

  1. Log in to your cPanel account and head to Directory Privacy.

  2. Select your root directory. In our case, it’s public_html.
  3. Check the Password protect this directory option, and name the protected directory. Press Save.
  4. Create a new user to log in to the protected website, and that’s it!
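
If your hosting panel lacks such a tool, the same protection can be set up by hand on Apache hosts. This is a generic sketch, not specific to any one host; the AuthUserFile path is a placeholder pointing at an .htpasswd file you would create with the htpasswd utility:

```apache
# Hypothetical .htaccess placed in the site root directory.
AuthType Basic
AuthName "Restricted Site"
AuthUserFile /home/user/.htpasswd
Require valid-user
```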

Using WordPress Plugins

There are tons of plugins that can help to password protect your site. However, the Password Protected plugin might just be the best one out there. It’s been tested with the new WordPress update, and it’s pretty straightforward to use.

After installing and activating the plugin, head to Settings -> Password Protected and configure the settings to match your needs.

3. Removing Indexed Page From Google

Don’t worry if Google has indexed your site. You can remove it from SERPs by following these steps:

  1. Set up Google Search Console for your website.
  2. Open Google Search Console for your newly added website and scroll down to Legacy tools and reports -> Removals.
  3. Click the Temporarily hide button and enter the URL you want to remove from Google.
  4. On a new window, choose Clear URL from cache and temporarily remove from search, then Submit Request.

And that’s it! Google will temporarily remove your site from search results. Make sure to apply the previous methods to prevent Google from indexing your site again.

Conclusion

There you have it! Quick and easy ways to discourage search engines from indexing your sites. Here’s a quick recap of the methods we’ve learned today:

  • Edit the robots.txt file, which can be performed automatically or manually.
  • Password protect your website by using a plugin or your hosting control panel.
  • Remove indexed pages from Google via Google Search Console.

If you have any other methods, or if you have any questions, please do let us know in the comments. Good luck!

Friday, April 19, 2019

White Hat SEO and Black Hat SEO


What is the Difference Between White Hat SEO and Black Hat SEO?


The difference between black hat SEO and white hat SEO has to do with the techniques used when trying to improve a website’s search engine ranking.

Black hat SEO refers to techniques and strategies that chase higher search rankings by breaking search engine rules. It focuses only on search engines, not on a human audience, and is typically used by those looking for a quick return on their site rather than a long-term investment in it. Techniques used in black hat SEO include keyword stuffing, link farming, hidden text and links, and blog content spamming. Using black hat SEO can result in your site being banned from a search engine and de-indexed as a penalty for using unethical techniques.

White hat SEO refers to techniques and strategies that target a human audience as opposed to a search engine. Typical white hat techniques include keyword research and analysis, rewriting meta tags to make them more relevant, earning backlinks, link building, and writing content for human readers. Those who use white hat SEO expect to make a long-term investment in their website, as the results last a long time.

Does Black Hat SEO work?


Everybody has their own definition of “black hat SEO”. Put simply, black hat SEO includes any techniques that are against Google's guidelines. Some people view them as a fast track to achieve higher rankings. In fact, many SEO practitioners believe black hat SEO tactics are useful and they encourage others to use them.

Source: Google Blog

Monday, March 18, 2019

Google Algorithm Update 2019

A rare Google confirmation came this week (12 March 2019) regarding a Google search algorithm update. Google restated its previous advice that there is no fix if your site was negatively impacted.

Google released a broad core search algorithm update on March 12 - AKA Florida 2


google search update 2019

Why it matters:

Google does several core ranking updates per year and confirms very few updates throughout the year. Specific to broad core updates, Google has said numerous times that you cannot do anything specific to fix your rankings. Google’s previous advice is, "there’s no ‘fix’ for pages that may perform less well other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages."
If your rankings did change recently, it may have been related to this broad core ranking update and not necessarily related to a technical change you made on your website.

What changed?

Right now it is very early, and it is hard to say what has changed. Based on the SEO chatter around this update prior to Google confirming it, some are saying it again targeted the health/medical space. But Google has said the August 1st update did not specifically target medical or health sites.
It is hard to know which types of sites were impacted the most right now. We will continue to monitor the situation and keep you updated on any insights we see related to this update.

Google’s previous advice.

Google has previously shared this advice around broad core algorithm updates:
"Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year.

As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded.

There’s no "fix" for pages that may perform less well other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages."
To see more advice from Google around its updates, see this Twitter thread.

Monday, October 01, 2018

Google's Medic Update 2018 - The Core Search Update

Here is everything we know about the big Google algorithm update nicknamed the Medic Update, including official information from Google and non-official insights from across the industry.


Google's Medic update and how to deal with it


The Google search algorithm update from August 1 is now fully rolled out, and here is what we know about the update, who we think was impacted and some of the analysis of what, if any, actions you may want to consider taking if you were negatively impacted.

In summary, Google is calling this a broad, global, core update, but based on much of the analysis done thus far, there seems to be a focus on health and medical sites and YMYL (Your Money Your Life) sites. Many sites besides those were impacted, though. Google is telling us there is nothing you can do to fix your site; instead, focus on making a great experience, offering better content, and building a more useful website. This update has taken on the name the Medic Update because of its apparent focus on the medical and health space, a focus Google will not confirm.

Why is it called the Medic update?


It’s called the Medic update because Barry Schwartz, one of the most prolific writers in the search industry, called it that. It doesn’t mean this update only affected medical sites.

Google has said that this update was a "broad core algorithm update" and that it does these updates "several times per year."

Google references its advice from the previous core updates, saying there’s "no ‘fix’ for pages that may perform less well, other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages." Google also said, "As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded."

Who was impacted by this update


As we explained above, Google said this is a “global” update, which implies every niche and every type of site could have been impacted. But based on the data that I’ve been seeing from surveys, multiple data companies and SEO consultants, there seems to be a focus on medical and health niches, as well as “Your Money Your Life” types of sites, with the impact creeping into the entertainment and gaming niches as well. I’ve shown Google this data, and a Google spokesperson responded by referencing the statements made above.

Source: Google Blog

Tuesday, September 18, 2018

What is SEO Linking?

link-building

Link building, simply put, is the process of getting other websites to link back to your website. All marketers and business owners should be interested in building links to drive referral traffic and increase their site's authority.

Basics of Quality Link Building for SEO


Why build links? Google's algorithms are complex and always evolving, but backlinks remain an important factor in how every search engine determines which sites rank for which keywords. Building links is one of the many tactics used in search engine optimization (SEO) because links are a signal to Google that your site is a quality resource worthy of citation. Therefore, sites with more backlinks tend to earn higher rankings.

There's a right way and a wrong way, however, to build links to your site. If you care about the long-term viability of your site and business, you should only engage in natural link building, meaning the process of earning links rather than buying them or otherwise acquiring them through manipulative tactics (sometimes known as black hat SEO, a practice that can get your site essentially banned from the search results).

That said, natural, organic link building is a difficult, time-consuming process. Not all links are created equal: A link from an authoritative website like the Wall Street Journal will have a greater impact on your rankings on the SERP than a link from a small or newly built website, but high-quality links are harder to come by.

This guide will teach you how to build quality links that improve your organic rankings without violating Google guidelines.

Remember, link building is imperative in achieving high organic search rankings.

Why Link Building Is Important for SEO


Link building is important because it is a major factor in how Google ranks web pages. Google notes that:

"In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages."

Imagine that we own a site promoting wind turbine equipment that we sell. We're competing with another wind turbine equipment manufacturer. One of the ranking factors Google will look at in determining how to rank our respective pages is link popularity.

While the above example provides a general visual understanding of why link building is important, it's very basic. It omits key factors such as:
  • The trust and authority of the linking pages.
  • The SEO and content optimization of the respective sites.
  • The anchor text of the incoming links.

For a more in-depth explanation of how PageRank is calculated, read through these resources:
  • The original Google PageRank paper
  • An in-depth discussion of the formula behind PageRank
  • The Wikipedia page on the subject

The most important concept to understand is that, as Google says, you're more likely to have your content rank higher for keywords you're targeting if you can get external websites to link to your pages.

Monday, May 28, 2018

Why is SEO good for a website?


What is an SEO Friendly Website and Why Do You Need One

You often hear the terms “SEO friendly” or “SEO friendly website,” but what do they really mean, and why do you need one? How can an SEO friendly website help your business grow?

These are the questions I will try to answer in this post, always keeping in mind that beginners to SEO may be reading it, so I will avoid technical terms and advanced SEO practices and theories.

What do we mean by a Search Engine Friendly Website?

An SEO friendly website is configured so that it is easy for search engines to crawl (read) it and understand what it is about. The most important characteristics of an SEO friendly website are:
  1. Unique titles and Descriptions for all pages:
    Each page of the website (including the home page) has a unique title and description. Titles are between 60-65 characters and descriptions are approximately 150 characters. Titles and descriptions accurately describe what the page is about without being keyword stuffed.
  2. Well formatted URLs:
    Permanent links (that’s the URL of a webpage) are descriptive, all lower case, and separated by dashes, for example /blog/seo-checklist.
  3. Fast loading web pages:
    Neither people nor search engines want websites that are slow to load. On the contrary, fast-loading websites are SEO friendly (meaning they have an advantage in ranking algorithms over slower websites) and generate more user interactions (sales, newsletter signups, contact form submissions, etc.).
  4. It has unique content:
    Content on the website is not found anywhere else on the web, all pages have unique and useful content. This means a website cannot be SEO friendly if it has content copied from other web sites.
  5. Includes images that are optimized for search engines:
    Search engines prefer text, that’s the truth, but you also need images on your pages because people like them: they make your content more interesting, easier to read, and more shareable. When you add images, make sure they are optimized for size (tools like "smushit" can help you reduce image file size without losing quality) and that you set a meaningful image filename and ALT text.
  6. Pages have a meaningful structure:
    A web page usually has the following elements:
    • Header
    • Breadcrumbs Menu
    • Page Title (that’s the H1 tag – there is only one per page)
    • Well formatted text – text is separated into a number of short paragraphs with subheadings
    • Author information
    • Footer
There are of course many other characteristics that make a website SEO friendly (you can read about them in our previous post, the ultimate SEO checklist), but the above six elements are currently among the most important.
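
To tie points 1 and 2 together, here is a sketch of what a well-formed page head and URL might look like; the business, title, description, and URL are all invented for illustration:

```html
<!-- Page served at: https://example.com/services/business-card-printing -->
<head>
  <title>Business Card Printing | Example Print Co.</title>
  <meta name="description" content="Full-colour business cards printed and shipped within 48 hours. Free design templates and bulk discounts for small businesses."/>
</head>
```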

Why do you need a SEO friendly website?

Something that most CEOs, small business owners and new webmasters don’t easily understand is why you need an SEO friendly website and why you should make the effort to make your website friendlier to search engines.

Well, there are many reasons, but the ones you need to know are:

  1. It will get you more organic traffic (that is, traffic from search engines)
    As expected, an SEO friendly website will get you more traffic from search engines, as it is likely to rank higher in the SERPs (search engine results pages). If you take into account that the majority of people who use the search box tend to select one of the first five results, you can understand the importance of SEO.
  2. It will make your website user friendly
    Good SEO is not only for search engines but for users as well. Applying the principles of SEO to your website will make it easier to use, enhancing the user experience.
  3. It gives you brand credibility
    Users are more likely to trust websites (businesses) that are found in the first pages of Google, Bing or Yahoo. This is good for both brand awareness and brand credibility.
  4. It is cost effective
    An SEO friendly website will drive targeted traffic 24/7 without the need to spend money on PPC or other forms of online advertising. While there is a cost to reach that point, the long-term benefits are bigger.
  5. It helps you understand what your most important customers want
    SEO drives quality traffic, and analyzing the behavior of those users (how they enter your website, what they click, how they leave, what they like, etc.) is a great way to understand what your customers want and adjust your website or products to match their needs.
  6. SEO is even more important on mobile
    A website that is mobile friendly and ranks well in mobile search can get more customers and traffic than websites that are not mobile SEO friendly. More and more users search for information or products from their mobiles while on the go, so it is important to be at the top of the search results; otherwise you are losing customers to the competition, especially those searching for local products or services.

Conclusion

An SEO friendly website has certain features and characteristics that help search engines understand what the website is about, and this increases the chances of achieving better rankings in the SERPs.
The most important advantage of having an SEO friendly website is that you will get more targeted organic traffic from search engines.
Source: Quora

Thursday, January 18, 2018

Major Google SEO Updates & Algorithm Changes from 2009 to 2017

Google has a long history of famous algorithm updates, search index changes and refreshes.

2017 Updates
  • Snippet Length Increase — November 30, 2017
  • Featured Snippet Drop — October 27, 2017
  • Chrome HTTPS Warnings — October 17, 2017
  • Google Tops 50% HTTPS — April 16, 2017
  • "Fred" (Unconfirmed) — March 8, 2017
  • Intrusive Interstitial Penalty — January 10, 2017

2016 Updates
  • Penguin 4.0, Phase 2 — October 6, 2016
  • Penguin 4.0, Phase 1 — September 27, 2016
  • Penguin 4.0 Announcement — September 23, 2016
  • Image/Universal Drop — September 13, 2016
  • "Possum" — September 1, 2016
  • Mobile-friendly 2 — May 12, 2016
  • AdWords Shake-up — February 23, 2016

2015 Updates
  • RankBrain* — October 26, 2015
  • Panda 4.2 (#28) — July 17, 2015
  • The Quality Update — May 3, 2015
  • Mobile Update AKA "Mobilegeddon" — April 22, 2015

2014 Updates
  • Pigeon Expands (UK, CA, AU) — December 22, 2014
  • Penguin Everflux — December 10, 2014
  • Pirate 2.0 — October 21, 2014
  • Penguin 3.0 — October 17, 2014
  • "In The News" Box — October 1, 2014
  • Panda 4.1 (#27) — September 23, 2014
  • Authorship Removed — August 28, 2014
  • HTTPS/SSL Update — August 6, 2014
  • Pigeon — July 24, 2014
  • Authorship Photo Drop — June 28, 2014
  • Payday Loan 3.0 — June 12, 2014
  • Panda 4.0 (#26) — May 19, 2014
  • Payday Loan 2.0 — May 16, 2014
  • Page Layout #3 — February 6, 2014

2013 Updates
  • Authorship Shake-up  —  December 19, 2013
  • Penguin 2.1 (#5)  —  October 4, 2013
  • Hummingbird  —  August 20, 2013
  • In-depth Articles  —  August 6, 2013
  • Knowledge Graph Expansion  —  July 19, 2013
  • Panda Recovery  —  July 18, 2013
  • "Payday Loan" Update  —  June 11, 2013
  • Panda Dance  —  June 11, 2013
  • Penguin 2.0 (#4)  —  May 22, 2013
  • Domain Crowding  —  May 21, 2013
  • "Phantom"  —  May 9, 2013
  • Panda #25  —  March 14, 2013
  • Panda #24  —  January 22, 2013

2012 Updates
  • Panda #23  —  December 21, 2012
  • Knowledge Graph Expansion  —  December 4, 2012
  • Panda #22  —  November 21, 2012
  • Panda #21  —  November 5, 2012
  • Page Layout #2  —  October 9, 2012
  • Penguin #3  —  October 5, 2012
  • Panda #20  —  September 27, 2012
  • Exact-Match Domain (EMD) Update  —  September 27, 2012
  • Panda 3.9.2 (#19)  —  September 18, 2012
  • Panda 3.9.1 (#18)  —  August 20, 2012
  • 7-Result SERPs  —  August 14, 2012
  • DMCA Penalty ("Pirate")  —  August 10, 2012
  • Panda 3.9 (#17)  —  July 24, 2012
  • Link Warnings  —  July 19, 2012
  • Panda 3.8 (#16)  —  June 25, 2012
  • Panda 3.7 (#15)  —  June 8, 2012
  • Penguin 1.1 (#2)  —  May 25, 2012
  • Knowledge Graph  —  May 16, 2012
  • Panda 3.6 (#14)  —  April 27, 2012
  • Penguin  —  April 24, 2012
  • Panda 3.5 (#13)  —  April 19, 2012
  • Panda 3.4 (#12)  —  March 23, 2012
  • Search Quality Video  —  March 12, 2012
  • Panda 3.3 (#11)  —  February 27, 2012
  • Venice  —  February 27, 2012
  • Ads Above The Fold  —  January 19, 2012
  • Panda 3.2 (#10)  —  January 18, 2012

2011 Updates
  • Panda 3.1 (#9)  —  November 18, 2011
  • Query Encryption  —  October 18, 2011
  • Panda "Flux" (#8)  —  October 5, 2011
  • Panda 2.5 (#7)  —  September 28, 2011
  • Pagination Elements  —  September 15, 2011
  • Expanded Sitelinks  —  August 16, 2011
  • Panda 2.4 (#6)  —  August 12, 2011
  • Panda 2.3 (#5)  —  July 23, 2011
  • Google+  —  June 28, 2011
  • Panda 2.2 (#4)  —  June 21, 2011
  • Schema.org  —  June 2, 2011
  • Panda 2.1 (#3)  —  May 9, 2011
  • Panda 2.0 (#2)  —  April 11, 2011
  • The +1 Button  —  March 30, 2011
  • Panda/Farmer  —  February 23, 2011
  • Attribution Update  —  January 28, 2011

2010 Updates
  • Negative Reviews  —  December 1, 2010
  • Social Signals  —  December 1, 2010
  • Instant Previews  —  November 1, 2010
  • Google Instant  —  September 1, 2010
  • Brand Update  —  August 1, 2010
  • Caffeine (Rollout)  —  June 1, 2010
  • Google Places  —  April 1, 2010

2009 Updates
  • Real-time Search  —  December 1, 2009
  • Caffeine (Preview)  —  August 1, 2009
  • Vince  —  February 1, 2009
  • Rel-canonical Tag  —  February 1, 2009

Tuesday, November 28, 2017

‘Hawk’ Google Local Algorithm Update

Have you noticed a recent shift in Google's local search results?

August 22, 2017: The day the ‘Hawk’ Google local algorithm update swooped in
The update, which I have dubbed “Hawk,” was a change to the way the local filter works. To get some history here, Google actively filters out listings from the local results that are similar to other listings that rank already. Basically, Google picks the most relevant listing of the bunch and filters the rest. It’s very similar to what they do organically with duplicate content. (Note: Google is typically loath to confirm algorithm updates, usually only saying that it rolls out several updates every day, so these observations are based on an analysis of how local results have changed rather than on any official announcement or acknowledgment.)

The filter has existed for a long time to help ensure that multiple listings for the same company don’t monopolize the search results. In September 2016, the Possum algorithm update made a significant change to the way the filter works. Instead of just filtering out listings that shared the same phone number or website, Google started filtering out listings that were physically located near each other.

This was very problematic for businesses. It meant that if another business in your industry was in the same building as you — or even down the street from you — that could cause you to get filtered out of local search results. Yep, that means your competitors could (inadvertently) bump your listing!

On August 22, 2017, Google refined the proximity filter to make it stricter. It still appears to be filtering out businesses in the same building, but it is not filtering out as many businesses that are close by.

Who is still filtered?


Naturally, this update didn’t help everyone. Although it tightened the distance needed to filter a similar listing, it didn’t remove it completely. I’m still seeing listings that share an address or building being filtered out of local search results. I also see the filtering problem persisting for a business that is in a different building that’s around 50 feet away from a competitor.

Why ‘Hawk’?


The local search community settled on the name “Hawk” for this algorithm update, because hawks eat possums. This is one of the few times where I don’t see any negative outcomes as a result of this update and just wish Google hadn’t taken a year to realize the proximity filter was way too broad.

Source: Search Engine Land

Thursday, November 09, 2017

7 Major Factors to Improve Page Speed Score.

Page speed is a measurement of how fast the content on your page loads in the browser. Let me first explain what page speed is and how it matters for SEO and ranking a website in 2019.

What is Page Speed?


Page speed is often confused with "site speed," which is actually the page speed for a sample of page views on a site. Page speed can be described in either "page load time" (the time it takes to fully display the content on a specific page) or "time to first byte" (how long it takes for your browser to receive the first byte of information from the web server).

No matter how you measure it, faster page speed is better. Many people have found that faster pages both rank and convert better.

page Speed

SEO Best Practices


Google has indicated that site speed (and as a result, page speed) is one of the signals used by its algorithm to rank pages. And research has shown that Google may be specifically measuring time to first byte when it considers page speed. In addition, a slow page speed means that search engines can crawl fewer pages using their allocated crawl budget, and this could negatively affect your indexation.

Page speed is also important to user experience. Pages with a longer load time tend to have higher bounce rates and lower average time on page. Longer load times have also been shown to negatively affect conversions.

Here are some of the many ways to increase your page speed:

1. Enable compression


Use Gzip, a software application for file compression, to reduce the size of your CSS, HTML, and JavaScript files that are larger than 150 bytes.

Do not use gzip on image files. Instead, compress these in a program like Photoshop where you can retain control over the quality of the image. See "Optimize images" below.
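
To see why compression is worth enabling, you can measure the effect yourself. The snippet below uses Python's standard gzip module on a made-up, repetitive stylesheet; real-world savings vary, but text assets typically shrink dramatically:

```python
# Measure how much gzip shrinks a text asset.
import gzip

# A hypothetical, repetitive stylesheet (text compresses very well).
css = ".btn { color: #fff; background: #0066cc; padding: 8px 16px; }\n" * 50

original_size = len(css.encode("utf-8"))
compressed_size = len(gzip.compress(css.encode("utf-8")))

print(f"original: {original_size} bytes, gzipped: {compressed_size} bytes")
```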

2. Minify CSS, JavaScript, and HTML


By optimizing your code (including removing spaces, commas, and other unnecessary characters), you can dramatically increase your page speed. Also remove code comments, formatting, and unused code. Google recommends using YUI Compressor for both CSS and JavaScript.
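
As a rough illustration of what minification removes, here is a deliberately naive CSS minifier in Python. It is a sketch only; production minifiers such as the YUI Compressor mentioned above handle many edge cases this one ignores:

```python
# A naive CSS minifier: strips comments, collapses whitespace,
# and tightens spacing around punctuation.
import re

def naive_minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # strip comments
    css = re.sub(r"\s+", " ", css)                        # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)          # tighten punctuation
    return css.strip()

print(naive_minify_css("""
/* button styles */
.btn {
    color: #fff;
    padding: 8px 16px;
}
"""))
```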

3. Reduce Redirects


Each time a page redirects to another page, your visitor faces additional time waiting for the HTTP request-response cycle to complete. For example, if your mobile redirect pattern looks like this: "example.com -> www.example.com -> m.example.com -> m.example.com/home," each of those two additional redirects makes your page load slower.
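
One common remedy is to collapse the chain so each visitor is sent to the final URL in a single hop. On Apache hosts this might look like the following .htaccess sketch; the hostnames mirror the example above, and the user-agent test is a simplification (real mobile detection is more involved):

```apache
# Send mobile visitors on the bare domain straight to the mobile
# host in one 301 redirect instead of chaining through www.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteCond %{HTTP_USER_AGENT} Mobile [NC]
RewriteRule ^(.*)$ https://m.example.com/$1 [R=301,L]
```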

4. Leverage browser caching


Browsers cache a lot of information (stylesheets, images, JavaScript files, and more) so that when a visitor comes back to your site, the browser doesn't have to reload the entire page. Use a tool like YSlow to see if you already have an expiration date set for your cache. Then set your "expires" header for how long you want that information to be cached. In many cases, unless your site design changes frequently, a year is a reasonable time period. Google has more information about leveraging caching here.
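
On Apache hosts, one common way to set that "expires" header is with mod_expires in an .htaccess file. This is a generic sketch; adjust the lifetimes to match how often your assets actually change:

```apache
# Far-future cache lifetimes for static assets that change rarely.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```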

5. Improve server response time


Your server response time is affected by the amount of traffic you receive, the resources each page uses, the software your server uses, and the hosting solution you use. To improve your server response time, look for performance bottlenecks like slow database queries, slow routing, or a lack of adequate memory and fix them. The optimal server response time is under 200ms. Learn more about optimizing your time to first byte.

6. Use a content distribution network


Content distribution networks (CDNs), also called content delivery networks, are networks of servers that are used to distribute the load of delivering content. Essentially, copies of your site are stored at multiple, geographically diverse data centers so that users have faster and more reliable access to your site.

7. Optimize images


Be sure that your images are no larger than they need to be, that they are in the right file format (PNGs are generally better for graphics with fewer than 16 colors while JPEGs are generally better for photographs) and that they are compressed for the web.

Use CSS sprites to create a template for images that you use frequently on your site like buttons and icons. CSS sprites combine your images into one large image that loads all at once (which means fewer HTTP requests) and then display only the sections that you want to show. This means that you are saving load time by not making users wait for multiple images to load.
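
A minimal sprite looks like this in CSS; sprite.png is a hypothetical combined image holding two 32x32 icons stacked vertically:

```css
/* All icons share one background image (a single HTTP request). */
.icon {
    width: 32px;
    height: 32px;
    background-image: url("sprite.png");
    background-repeat: no-repeat;
}
.icon-search { background-position: 0 0; }      /* top icon */
.icon-cart   { background-position: 0 -32px; }  /* icon below it */
```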

Source: SEOMoz