The In-Depth Guide to Website On-Page Optimization by Tina Zennand, Michael Evans

Part 2: On-page Optimization – Advanced

Hopefully, you now have a proper understanding of the basic on-page SEO techniques. It's time to go deeper into how you can optimize the pages of your website and explore more tactics that can help it rank better.

Cleaning Up Broken Links, Redirects & Crawl Errors

You can easily identify the URLs that are generating crawl errors by logging in to Google Webmaster Tools. Go to Crawl > Crawl Errors. There you will find a list of URLs that are generating errors, along with the server response code for each.

Click on a URL and you will see the errors that page is generating and where it is linked from. There are two ways of fixing these errors, depending on the response code.

  • 301/302 Response Code

If a URL is showing 301 [permanent redirect] or 302 [temporary redirect], you need to find the pages it is linked from and update those internal links with the final landing page URL. For example, if the page http://www.example.com/301redirect.php is redirected to http://www.example.com/webdesign.php, and the first URL is linked from the home page (http://www.example.com), you need to replace the link reference to 301redirect.php with its final landing page URL, i.e. webdesign.php.
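
Here is a minimal before/after sketch of that fix, using the example URLs above (the anchor text is hypothetical):

<!-- Before: the home page links through a URL that 301-redirects -->
<a href="http://www.example.com/301redirect.php">Web Design</a>

<!-- After: the link points straight at the final landing page, skipping the redirect -->
<a href="http://www.example.com/webdesign.php">Web Design</a>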

  • 404 Server Response Code

These are basically defunct or broken links. You need to either remove their link references from the website or replace them with the URL of the nearest relevant landing page.
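
A minimal sketch of the second option (both URLs here are illustrative):

<!-- Before: an internal link that returns a 404 -->
<a href="http://www.example.com/deleted-page.php">Old Page</a>

<!-- After: re-pointed at the closest relevant live page -->
<a href="http://www.example.com/webdesign.php">Web Design</a>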

Duplicate Content

Duplicate content is content that appears in more than one location on the web. Search engines can treat duplication as an attempt to influence rankings and attract a higher influx of visitor traffic.

So, what generally happens to a website with duplicate content?

  • The page with duplicate content won't rank well;
  • The weight of the page will be negligible;
  • The website gets a mark against it and may no longer be considered a reliable source of quality content.

To avoid being penalized for duplicate content, you always need to keep your content unique.

Duplicate content comes in two types:

External Duplicate Content

If you find that the content of your website closely resembles content from an external source, you'd better replace it with fresh and unique content.

Internal Duplicate Content

On most eCommerce websites it often happens that a single product is available via two different URLs. This is known as internal duplicate content, and it is detrimental to the visibility of the website. Here are the ways of fixing the issue:

1) Specify rel="canonical" for the preferred version of the URL. Use canonical tags only for pages that are essentially similar; just slight differences between them are allowed.

Example – say a product page – Sony Camera - is available via these different URLs:

http://www.example.com/camera/sony/Sony-Camera

http://www.example.com/sony/camera/Sony-Camera

http://www.example.com/index.php?product=21

Let's choose this URL - http://www.example.com/camera/sony/Sony-Camera - as the canonical one. We need to add the following tag within the <head> section of all the aforementioned URLs:

<link rel="canonical" href="http://www.example.com/camera/sony/Sony-Camera" />

This will fix the issue of internal duplicate content. Google will still crawl all of these pages, but only the canonical version will get preferential treatment when it comes to ranking.

2) Use “noindex, follow” on the duplicate product pages.
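
A minimal sketch: place a robots meta tag like this in the <head> of each duplicate page, so search engines drop the page from their index but still follow its links:

<meta name="robots" content="noindex, follow" />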

Authorship

Google stopped showing rel=author markup in non-personalized search results, but Google+ connections may still carry authorship-like markup, and the rel=publisher option can display a profile/brand image and details alongside results.
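
For brands, a hedged sketch of the rel=publisher option: add a tag like this to the <head> of your site, pointing at your brand's Google+ page (the page ID is a placeholder you must replace):

<link rel="publisher" href="https://plus.google.com/[your Google+ page ID]" />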

Authorship is a good way to tell search engines that you are the author of an article. Articles written by verified authors are trusted by search engines. So, here is how it is done:

  • Create a Google Plus profile. It is very simple: just log in to your Gmail account and open Google Plus.
  • Fill out the information.
  • When the author account is ready, click on Edit Profile > Contributor To > Add Custom Link.
  • Enter the website name and URL.
  • Once done, paste the following code into the articles where you want to activate authorship (a full byline sketch follows this list): <a href="[replace with your profile URL]?rel=author">Google</a>
  • Now test the web page with the Google Rich Snippet Testing Tool.
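
In context, the authorship link usually sits in the article byline; here is a hypothetical sketch (replace the profile ID placeholder with your own):

<p>Written by <a href="https://plus.google.com/[your profile ID]?rel=author">Google</a></p>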

Trust Signals

Pages such as Privacy Policy, Terms of Use, Contact Us and About Us are signals of trust for both search engines and users. If you don't have them, we highly recommend you create such pages and always keep them up to date.
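
These pages are typically linked from every page of the site, most often in the footer; a minimal sketch (the paths are placeholders for your own URLs):

<footer>
<a href="/about-us">About Us</a> |
<a href="/contact-us">Contact Us</a> |
<a href="/privacy-policy">Privacy Policy</a> |
<a href="/terms-of-use">Terms of Use</a>
</footer>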

XML Sitemap

An XML sitemap is a great way to help search engines locate and index web pages that they might not be able to discover otherwise. Generating a sitemap is easy; http://www.xml-sitemaps.com/ is a great website to generate one for free, unless your website is a pretty big one (the free version caps the number of pages it will crawl).
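
For reference, a minimal sitemap file looks like this (the URL and values are illustrative; see http://www.sitemaps.org/ for the full protocol):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.example.com/webdesign.php</loc>
<lastmod>2015-06-01</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
</url>
</urlset>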

Once you have created an XML sitemap, you need to submit it to the search engines. To submit it to Google, log in to Google Webmaster Tools and go to Crawl > Sitemaps. Enter the sitemap URL and submit it.

Server Side Optimization

This is one of the best ways to boost the performance of a website. The purpose is to make the website load faster and help it render without glitches.

Choose Decent Web Hosting

Choosing a web hosting company that promises close to 100% uptime, little or no server outage, and has some credibility online is another way to bolster a website's web presence. Just make sure that the website is up and running round the clock.

A good web hosting company offers the following features:

  • Disk Space – almost all web hosting companies promise unlimited disk space, but there is a catch: it is ‘unlimited’ only until your website gets big, really big, and starts driving a huge amount of traffic. At that point you will be asked to move to a dedicated server, which will of course cost you more money.
  • Uptime – as the name suggests, uptime simply refers to the amount of time the hosting service has been up. If your website is frequently down, that is not a good advertisement for your company, so make sure the hosting company offers good uptime.
  • Bandwidth – there is a cap on the amount of data your website can transmit to visitors in a given period of time, and this is known as bandwidth. Some web hosting providers offer limited bandwidth, which can cause trouble if your website crosses the limit; it therefore makes sense to settle for a hosting service that offers unlimited bandwidth.

We have checked the service of different hosting companies and found that Bluehost meets all the requirements of a reliable hosting. It offers high level of security to ensure safety of the data, and a good level of support as well.

Minimize Redirects

Each redirect triggers an additional server request, and each extra round trip has a direct bearing on page loading time, so it makes sense to use as few redirects as possible. Also, temporary (302) redirects are not only unnecessary in most cases, but using them an excessive number of times may hurt your website.

Schema.org [Microdata]

Microdata is a way to offer search engines more information about what a page is about. Google, Microsoft and Yahoo collaborated to develop the Schema.org markup, and its tags are now recognized by almost all major search engines.

The good thing about Schema.org markup is that it is not that complicated to integrate with your HTML, and doing so successfully can help your pages appear more prominently in search results.

Here is a sample of how to integrate Schema.org markup into an HTML file.

This is the plain text:

Adobe Systems Incorporated

Contact Details:

Main address: 345 Park Ave, San Jose, CA 95110

Tel: (408) 536-6000

URL: <a href="http://www.adobe.com">www.adobe.com</a>

 

This is the same content marked up with Schema.org microdata:

<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Adobe Systems Incorporated</span>
  Contact Details:
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    Main address:
    <span itemprop="streetAddress">345 Park Ave</span><br>
    <span itemprop="addressLocality">San Jose</span>,
    <span itemprop="addressRegion">CA</span>
    <span itemprop="postalCode">95110</span>
  </div>
  Tel: <span itemprop="telephone">(408) 536-6000</span>,
  URL: <a itemprop="url" href="http://www.adobe.com">www.adobe.com</a>
</div>

This will help search engines connect the dots and figure out the relationships between the different elements of a web page.

Services and Tools for Analysis

In order to rank high in SERPs, your website must deserve to be there. Below is a list of tools that will help you test how well your website is optimized; each of them will be of great use to you.

  • Xenu's Link Sleuth

This is the best tool for locating error pages within a website, and the best thing about it is that it is free. All you have to do is enter your website URL; the tool will crawl the website and generate a report automatically.

  • Siteliner

This is a really useful tool when you need to find duplicate content and broken links on your website. When it comes to on-page optimization, few issues are more important than duplicate content.

  • Screaming Frog

Screaming Frog is one of the most powerful SEO tools we have ever seen. It crawls the entire website and generates a complete report that includes almost everything – URLs, metadata, response codes, inlinks, robots directives, etc.

  • Structured Data Testing Tool

This tool, launched by Google, lets you figure out how your web page may appear in the SERPs. If you have included structured data, it will also give you valuable input on whether or not the structured data is working and what to do about it.

  • Copyscape

Copyscape is probably the best and most trusted tool out there for figuring out whether or not your website content is unique. It is a paid tool, but totally worth every penny spent on it.

  • Moz

Moz comes with a suite of pro SEO tools, including Open Site Explorer [a great tool to analyze backlinks and to spy on your competitors' backlinks], Moz Analytics [a powerful tool to analyze and understand your website's visitors, interaction and engagement], Fresh Web Explorer [it locates new link opportunities by finding mentions of your brand], Followerwonk [it lets you analyze Twitter accounts and helps you bolster your brand's presence on that platform], the Keyword Difficulty Tool [it figures out how tough certain keywords are to rank for in present conditions] and more.

  • Ahrefs

According to the company, the index of this backlink-analysis tool is refreshed every 15 minutes. The tool has huge data behind it, and its easy, intuitive interface has made it a huge hit among web marketers. You can sort backlinks by IP address, anchor text, top pages, dofollow/nofollow links and more.

  • SemRush

It is much more than a simple backlink-analysis tool: it gives you all the information you need to make your website search engine friendly. You will be able to track the organic rankings of your website and your competitors' websites, analyze backlinks, do keyword research, run website audits, analyze keyword density and more. What more could you want from an SEO tool?

It's time to show you the promised solution for website creation, which (in addition to a stunning complete design, responsiveness and brand new functionality) will give you a cool set of features for easy website optimization. So, let's get to Part 3 of our guide…