On-site SEO Checklist: 9 Essential Tips to Implement in 2019 for Better Ranking (Technical SEO)


Ranking in the top 10 Google search results takes significant effort.

While you spend considerable time in writing content and promoting it, make sure you are also giving due importance to essential on-site SEO parameters.

Some of these parameters are critical and may be the reason you are not gaining significant search rankings, despite all your hard work.

On-site SEO vs On-page SEO
On-page SEO refers to how well you have optimized the content and HTML on a specific page so that it can rank for a specific keyword.

In contrast, on-site SEO covers factors, largely technical in nature, that are not tied to a specific page, piece of content, or keyword but apply to your entire website.

On-site or technical SEO parameters are hygiene factors in the present day. Their presence may not help you much, but their absence will definitely hurt your search rankings.

Below is a 9-point technical SEO checklist to get you started.

9 Essential Technical SEO Parameters to Optimize

#1. Implement ‘https’ without any delay

Google views the security of its users with extreme seriousness, therefore it considers ‘https’ as an important ranking signal.

Https protocol ensures the data exchange between your web server and the user’s browser happens through a safe channel and is a step towards making the internet more secure.

Back in 2014, Google announced its inclination towards using https as a ranking signal and clearly pointed out its growing importance in the future. Here’s one video from Matt Cutts.
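Once your certificate is installed, make sure plain-HTTP requests are redirected to the HTTPS version, so users and crawlers land on a single secure address. A minimal sketch for nginx (example.com is a placeholder for your own domain):

```nginx
# Redirect all plain-HTTP traffic to HTTPS with a permanent (301) redirect.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}
```

A 301 (rather than 302) redirect tells search engines the move is permanent, so ranking signals consolidate on the https URL.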

#2. Have an XML sitemap in place

An XML sitemap is like a map of your website that helps the Google crawler discover important parts of your website, thereby resulting in an SEO boost.

Generally, Google tends to look out for linked pages in the already crawled pages to explore new additions to the website.

But if you have a sitemap in place, Google can discover new pages even though they may not be linked from any other page.

Sitemaps also carry a ‘last updated’ timestamp against each page/image/media asset. That tells the Google bot about the freshness of your website, resulting in faster indexing.

You can check the sitemap of your website by typing yourdomain/sitemap.xml (or, on Yoast-powered WordPress sites, yourdomain/sitemap_index.xml) in the web browser.

If you are running a WordPress based website you can use Yoast plugin to automatically generate and update your sitemap.
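For reference, a minimal sitemap with a single entry looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's canonical address -->
    <loc>https://example.com/blog/my-post/</loc>
    <!-- The 'last updated' timestamp that signals freshness to crawlers -->
    <lastmod>2019-08-15</lastmod>
  </url>
</urlset>
```

Plugins like Yoast generate and refresh entries like these automatically every time you publish or edit a page.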

#3. Create a robots.txt file

Always ensure that a robots.txt file is available in the top-level directory of your website. You can find the robots file for your website by typing in yourdomain/robots.txt (case sensitive) in the web browser.

This file sets the rules for each search engine bot and directs their site-wide crawling and indexing behavior.

Using this file, you can help the spider bot skip specific non-important sections of the website (like terms, privacy policy, etc.) and instruct it to crawl and index more important sections (like blogs, products, etc.).

You can also use this file to keep sections of your website out of search results or to stop your duplicate content from being crawled.

In its most basic form, a robots.txt file looks like this:
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
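Filled in, a typical robots.txt might look like the following (the paths are illustrative, not a recommendation for any specific site):

```text
# Rules for all crawlers
User-agent: *
Disallow: /privacy-policy/
Disallow: /terms/
Allow: /blog/

# Point crawlers at the sitemap as well
Sitemap: https://example.com/sitemap.xml
```

Note that Disallow only blocks crawling; rules are matched per user-agent, and the Sitemap line is a convenient place to advertise your sitemap from point #2.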

#4. Improve page load speed

Page load speed (defined as how fast the content on a web page loads) is one of the most important ranking signals used by Google search bot.

While Google has stayed quiet on exactly how it measures page load speed, it’s an open secret that it considers factors like time to first byte and full-page rendering time.

You can use free tools like GTmetrix or Pingdom to measure the page load speed of any web URL.

(Screenshot: GTmetrix page load report)

It’s important to understand why Google gives so much weight to fast-loading pages. Google wants to answer the user’s query in the shortest possible time, and a slow-loading page disrupts the user experience.

For you as a web admin, there is another disadvantage of slow loading pages.

Slow-loading web pages reduce the crawlability of your website, since the Google bot allots a fixed crawl budget to every website.

Slow-loading pages mean fewer pages get indexed with every Google crawl, so your pages take longer to start ranking.

You can take a number of measures to improve page load speed: compressing images, reducing redirects, using a CDN, getting a faster server, minifying CSS/JS/HTML, etc.
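Of these, text compression is one of the easiest wins to reason about, because HTML is highly repetitive. A quick sketch of the savings gzip gives on a repetitive page (the sample markup is made up):

```python
import gzip

# A crude stand-in for a real page: HTML markup repeats the same
# tags and attributes over and over, so it compresses very well.
html = "<div class='row'><p>Lorem ipsum dolor sit amet.</p></div>" * 200
raw = html.encode("utf-8")

compressed = gzip.compress(raw)

# The compressed payload is a small fraction of the original size,
# which is roughly the bandwidth a gzip-enabled server saves per request.
print(len(raw), len(compressed))
```

In practice you don't compress by hand; you enable gzip (or Brotli) in the web server configuration so responses are compressed on the fly.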

#5. Clean permalink structure

Permalink is nothing but the complete URL of a particular web page.

Opt for a user-friendly permalink structure that both search engines and the user can remember and relate to, but without any compromise on your business goals.

For example, a permalink structure like ‘yourdomain/this-is-a-test-post’ is preferred by many websites, but if you run a news website you may want a date-based structure like ‘yourdomain/2019/08/15/this-is-a-test-post’.
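If you generate URLs yourself, slugs like the one above are typically derived from the post title. A minimal sketch (this is illustrative, not how any particular CMS implements it):

```python
import re

def slugify(title: str) -> str:
    """Lowercase a title and reduce it to hyphen-separated words."""
    slug = title.lower()
    # Collapse every run of non-alphanumeric characters into one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    # Drop any hyphens left dangling at the ends.
    return slug.strip("-")

print(slugify("This Is a Test Post!"))  # this-is-a-test-post
```

Keeping slugs lowercase and hyphen-separated avoids duplicate-URL variants that differ only in case or punctuation, which ties into point #8 below.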

#6. Ensure mobile responsiveness

Today, almost 60% of users worldwide access the internet through their smartphones.

With such a large audience to serve, Google views the mobile-friendliness of a web page as an important criterion as part of its ranking algorithm.

After all, a poorly optimized website will disrupt the user experience on a small screen. This is something Google doesn’t want, so it pushes down websites with poor or no mobile responsiveness.
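The baseline for a responsive page is the viewport meta tag plus CSS media queries. A minimal sketch (the class name and breakpoint are illustrative):

```html
<!-- Let mobile browsers render at device width instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .sidebar { width: 300px; }

  /* On narrow screens, let the sidebar take the full width */
  @media (max-width: 600px) {
    .sidebar { width: 100%; }
  }
</style>
```

Without the viewport tag, media queries never fire the way you expect, because the browser pretends the screen is desktop-sized.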

#7. Opt for Accelerated Mobile Pages (AMP)

This one is an extension of point #4 and #6 above.

Accelerated Mobile Pages load fast on mobile devices thanks to a stripped-down version of the page that keeps only the essential elements.

As a result, pages with AMP version are pushed higher in mobile search rankings by Google.

However, AMP may not be required for all the websites. Before opting for AMP, you should do a thorough cost-benefit analysis to understand the impact it will have on your website.

Sometimes it makes sense to improve the UI/UX of the existing pages rather than switching to AMP.
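For orientation, an AMP page is ordinary HTML marked with the ⚡ attribute, loading the AMP runtime and pointing back at the regular page. A skeleton (URLs are placeholders, and the required inline amp-boilerplate CSS is omitted here for brevity):

```html
<!doctype html>
<html ⚡ lang="en">  <!-- the attribute can also be written as "amp" -->
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Point back to the canonical (non-AMP) version of this page -->
  <link rel="canonical" href="https://example.com/my-post/">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- Required amp-boilerplate <style> tags omitted for brevity -->
</head>
<body>
  <h1>My post</h1>
</body>
</html>
```

The regular page, in turn, declares its AMP counterpart with a `<link rel="amphtml" href="...">` tag, so crawlers can pair the two versions.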

#8. Check for duplicate content issues

Duplicate content refers to the content that is accessible on the internet through multiple URLs.

Most of the time, duplicate content gets created by mistake (e.g. issuing a session ID to every user visiting a web page, which generates a separate URL per visit, all serving the same content).

The problem arises because the Google bot gets confused about which URL to rank, so the original content doesn’t get the search value it deserves.

As a result, it becomes extremely important for webmasters to handle duplicate content issues (through canonicalization, 301 redirects, or by deleting the duplicate pages).

Here’s Matt Cutts again, explaining how Google handles the duplicate content issue.
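Canonicalization is the lightest-touch of those fixes: every duplicate variant declares which URL should receive the ranking signals. The tag goes in the `<head>` of each duplicate page (the URL is a placeholder):

```html
<!-- Consolidate ranking signals from this duplicate onto the preferred URL -->
<link rel="canonical" href="https://example.com/original-post/">
```

Unlike a 301 redirect, the duplicate page stays accessible to users; only search engines are told to treat it as a copy.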

#9. Look out for broken links

Web links in a page add context to the information and allow the user (and the crawler) to discover more information.

However, a non-working link (termed a broken link) results in a bad user experience and is therefore disliked by Google. Such links can be either internal or external.

Websites with too many broken links indicate poor quality and can be penalized by Google.
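Broken-link checkers work by extracting every href on a page and then requesting each one to check its status code. The extraction step can be sketched with Python’s standard library (actually fetching each URL and flagging 404s is left out here to keep the example offline):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A tiny made-up page with one internal and one external link.
page = '<p><a href="/about">About</a> and <a href="https://example.com">home</a></p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', 'https://example.com']
```

In practice you would feed each collected URL to an HTTP client and report anything returning a 4xx or 5xx status; free tools and crawlers automate exactly this loop.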


We have covered the most critical on-site SEO parameters (also called technical parameters) in our checklist.

These factors are within the control of a web developer and should be given their due importance if one wants to rank high in search results.

While On-Page factors are largely connected to the content, technical parameters are related to the way you have built your website.
