

Applying an SEO strategy is vital for your visibility on search engines, but you also need to track your performance. You’ve put a lot of effort into optimizing your web pages; now let’s see whether your hard work has paid off. This is the first part of our series on Tracking Your SEO Performance – learning which metrics you should follow and what they can tell you about the direction of your business.

Domain Rating and Domain Authority

These are the first metrics you should know and keep an eye on. Ahrefs and Moz are two major companies providing SEO tools for tracking your website’s metrics, researching competitors, and improving your organic visibility. Each has devised its own measure of a site’s authority on the web.

Domain Rating (DR) is a term devised by Ahrefs to describe the authority of a domain name on a scale from 0 to 100. Ahrefs defines it as “…the strength of a target website’s backlink profile on a 100-point scale.”

On the other hand, Domain Authority (DA) was established by Moz, and it “predicts how well a website will rank” on search engines, also on a 100-point scale, so the prediction has to utilize more than backlinks.

Depending on which tool you are using, you need to track the corresponding metric. Either way, both reflect the quality of your content: the higher the number, the more trustworthy the domain name. Domain Rating is based on the quality of the links and mentions your company has. Having a lot of backlinks is a good thing only if they come from websites with a high Domain Rating; there is no point in having thousands or millions of backlinks from sites with low authority, as this can only hurt your own DR. Instead, your SEO strategy should focus on targeting websites that are relevant to your business and have a high DR.

Domain Rating is measured on a logarithmic scale, meaning that going from DR 10 to 20 is easier than going from DR 70 to 80. If your DR is going up, your SEO strategy is working. If it is going down, you need to run a link audit and see whether any spammy domains are referring to you, or whether you have violated any of the rules that search engines impose on organic traffic.

Organic Traffic 

Examining your organic traffic over time will also show you the effectiveness of your SEO efforts. This is one of the most accurate metrics you can check: other metrics indicate the trends and direction of your strategy, but organic traffic is quantifiable proof of all your work. It is the traffic you get from appearing in the search engine results pages (SERPs) without paying for ads. It shows you how many people are visiting your site, and the metric is usually reported per month, so check the time frame when you are looking at your organic traffic numbers.

You can track organic traffic by landing page or by location. Tracking by landing page shows you which pages get the most attention; find what you are doing right and apply it to all your pages. Tracking by location shows you where your organic traffic comes from. If you are positioned in the UK but getting a lot of traffic from the USA, it might be time to reach overseas and expand your business. And vice versa: if you are also targeting Canada and spending time and budget on that country without getting results, it is probably time to stop investing there.
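To make the idea concrete, here is a minimal Python sketch of that breakdown (the session records, page paths, and countries are hypothetical; in practice your analytics tool exports this data for you):

```python
from collections import Counter

def sessions_by(dimension, sessions):
    """Aggregate session counts along one dimension (landing page, country, ...)."""
    return Counter(s[dimension] for s in sessions)

# Hypothetical analytics export: each organic session records where it landed and from where.
sessions = [
    {"landing_page": "/blog/seo-tips", "country": "UK"},
    {"landing_page": "/blog/seo-tips", "country": "US"},
    {"landing_page": "/pricing", "country": "UK"},
]

print(sessions_by("landing_page", sessions).most_common(1))  # [('/blog/seo-tips', 2)]
print(sessions_by("country", sessions))
```

The same one-liner works for any dimension your analytics export includes, which is why tools present both views from a single session table.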

Keyword Rankings

Keyword ranking is where a keyword is positioned in the major search engines (Google, Bing). The closer to number 1, the better the ranking. You want to rank as high as possible for high-volume keywords (the ones that are searched for the most). This metric gives you a general sense of the direction of your SEO strategy, since ranking higher for a keyword reflects improved rankings overall. It can also indicate whether your keyword selection is appropriate: if your other SEO metrics improve (DR, organic traffic) but your keyword rankings do not, your keyword selection is probably poor. In that case, choose less competitive keywords and try ranking for them first.
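As an illustration, here is a small Python sketch that compares two hypothetical monthly rank snapshots (all keywords and positions are invented; a negative delta means the keyword moved up):

```python
def rank_changes(before, after):
    """Compare two rank snapshots {keyword: position}; negative delta = improvement."""
    return {kw: after[kw] - before[kw] for kw in before if kw in after}

# Hypothetical positions from two monthly rank checks (1 = top of the SERP).
june = {"christmas lights": 42, "led garden lights": 9}
july = {"christmas lights": 35, "led garden lights": 11}

print(rank_changes(june, july))  # {'christmas lights': -7, 'led garden lights': 2}
```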

Keyword research is one of the first things you need to do when you start working on your SEO strategy, as it will help you generate more traffic, leads, and sales. Having said that, keyword volumes can fluctuate, as Google often updates its algorithm. Also, in some months people search for specific keywords more than they do during the rest of the year. For example, before Christmas, in November and December, the search volume for “Christmas lights” is much higher than after Christmas. So tracking keyword rankings and keyword volumes should be a regular part of your SEO efforts.

You can read more about how to use keywords for SEO in one of our previous blog articles. 

Backlinks and Referring Domains

SEO performance is largely determined by the number and quality of backlinks. Given two websites with similar on-page metrics, such as bounce rate and time spent on site, the one with more quality backlinks will rank higher on the search engines. There is also a strong correlation between rankings and the number of referring domains. Backlinks are the links pointing to your website, and referring domains are the domains those backlinks come from; one domain can have multiple backlinks to your website.
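The difference between the two counts can be sketched in a few lines of Python (the backlink URLs are hypothetical; a real tool like Ahrefs computes this from its own index):

```python
from urllib.parse import urlparse

def link_profile(backlinks):
    """Return (total backlinks, number of unique referring domains)."""
    domains = {urlparse(url).netloc for url in backlinks}
    return len(backlinks), len(domains)

# Hypothetical backlink export: three links, but only two referring domains.
backlinks = [
    "https://news.example.com/review",
    "https://news.example.com/roundup",
    "https://blog.example.org/tools",
]

total, referring = link_profile(backlinks)
print(total, referring)  # 3 2
```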

Tracking your backlinks and referring domains will show you how successful and cost-effective your link building strategy is. If you spent $1,000 to receive one link and it didn’t affect your rankings, you might need to alter your strategy. Make sure you target websites with high domain authority; otherwise, your own rankings might suffer.

Local Rankings 

Local rankings are important for companies running local SEO campaigns. They show you whether your traffic is coming from your specific target audience. Some of the local SEO metrics you can track are Google Maps rankings, Google My Business Insights, Google 3-Pack results, and session location. Setting up a Google My Business account is vital if you want local people to find your business, and session location can show you the cities where your website sessions are happening. Together, these local metrics help you evaluate how much local traffic you get.

Mobile Traffic

It’s important to know whether your traffic comes from desktop or mobile searches. Optimizing for mobile devices is crucial since Google announced it is switching to a mobile-first index. It might be surprising that approximately 60% of all searches are made from mobile devices. Mobile traffic can indicate how mobile-friendly your website is: if it stays flat while your overall traffic increases, you might have problems that need resolving. On the other hand, if your mobile traffic increases, it indicates shifting usage patterns in your target market, and it’s probably time to invest more in mobile development. Lastly, searching on mobile differs from searching on desktop: around 20% of mobile searches are voice-only, and users generally use fewer keywords when searching on their phones. All of these trends can help you adjust your SEO strategy.

Engagement Metrics

Engagement metrics display how visitors are engaging with your website. There are a few metrics you should know and follow:

  • Bounce rate is the percentage of visitors who left your website without clicking on anything or performing any action. It’s calculated by dividing the number of unengaged sessions by the total number of sessions. Bounce rates generally vary between 40% and 60%, though these numbers differ by industry.
  • Time on site is the amount of time visitors spend on your website on average.
  • Pages per visit is the number of pages visitors open on average before leaving your website.
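A minimal Python sketch of how these three numbers are derived from per-session records (the sessions are invented; analytics tools compute this for you):

```python
def engagement(sessions):
    """Compute bounce rate, average time on site, and pages per visit
    from per-session records (a sketch of what analytics tools report)."""
    total = len(sessions)
    bounced = sum(1 for s in sessions if s["pages"] <= 1)
    return {
        "bounce_rate_pct": round(100 * bounced / total, 1),
        "avg_time_on_site_s": round(sum(s["seconds"] for s in sessions) / total, 1),
        "pages_per_visit": round(sum(s["pages"] for s in sessions) / total, 1),
    }

# Hypothetical session log: two of the four sessions bounced.
sessions = [
    {"pages": 1, "seconds": 10},   # a bounce
    {"pages": 4, "seconds": 180},
    {"pages": 2, "seconds": 130},
    {"pages": 1, "seconds": 0},    # a bounce
]
print(engagement(sessions))  # bounce rate 50.0%, 80.0 s on site, 2.0 pages per visit
```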

Bounce rate has proved to be very important for your rankings. If many people bounce off your website, your content is not what users are looking for, and Google’s algorithm will rank you lower on the search engine results pages. So if you see that your bounce rate is higher than normal, it may be time to start updating your landing pages.

Crawl Errors 

If you want Google to rank your website, the search engine has to be able to read it, or crawl it, first. Having a lot of broken links and missing pages will make it difficult for Google to crawl your website, and if Google can’t find your pages, it won’t rank them. Therefore, check for crawl errors and fix them immediately; this will make your site easier for search engines to read.

One type of crawl error is a server error, meaning that Google cannot communicate with the DNS server, the request timed out, or your website is down. Another type is a URL error, such as a non-existent page or a redirect chain that is too long. You can check for crawl errors with the URL Inspection tool (formerly “Fetch as Google”) in Google Search Console.
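The two error types can be sketched as a simple classification of HTTP status codes (the crawl log below is hypothetical; real data comes from your server logs or Search Console):

```python
def classify_crawl_error(status_code):
    """Map an HTTP status code to the crawl-error buckets described above (a sketch)."""
    if status_code >= 500:
        return "server error"       # e.g. the server timed out or is down
    if status_code in (404, 410):
        return "URL error"          # the page does not exist (anymore)
    if status_code in (301, 302, 307, 308):
        return "redirect"           # fine, unless the chain gets too long
    return "ok"

# Hypothetical crawl log: status codes a crawler received for your URLs.
crawl_log = {"/": 200, "/old-page": 404, "/api": 503, "/moved": 301}
errors = {url: classify_crawl_error(c) for url, c in crawl_log.items()
          if classify_crawl_error(c) in ("server error", "URL error")}
print(errors)  # {'/old-page': 'URL error', '/api': 'server error'}
```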

Page Speed

If your website takes too long to load, most visitors will simply press the back button and look for the information they need elsewhere. Even those who decide to stick around and convert will have had a frustrating first interaction with you. According to Google’s research, as page load time goes from one second to five seconds, the probability of a bounce increases by 90%. So make sure your website is not crowded with unnecessary or heavy files. It is easier to compress and optimize your images, code, and text beforehand; that way you won’t have to waste time and effort redesigning and fixing errors after you’ve discovered a page speed problem.
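As a back-of-the-envelope illustration of why heavy files matter, here is a hedged Python sketch that estimates transfer time from total page weight (the asset names, sizes, and bandwidth figure are assumptions; real load time also depends on latency, parallel downloads, and caching):

```python
def estimated_load_s(asset_bytes, bandwidth_bytes_per_s=1_500_000):
    """Rough transfer-time estimate: total page weight divided by bandwidth.
    Ignores latency, parallelism, compression, and caching -- a sketch only."""
    return sum(asset_bytes.values()) / bandwidth_bytes_per_s

# Hypothetical page: one uncompressed hero image dominates the total weight.
page = {"index.html": 40_000, "style.css": 60_000, "hero.jpg": 2_400_000, "app.js": 500_000}
print(round(estimated_load_s(page), 2))  # 2.0 (seconds)
```

Even in this crude model, compressing the hero image is clearly the single biggest win, which is the point of optimizing heavy files before they become a problem.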

Organic Conversions

Organic conversions are the metric you want to improve the most. They are the leads you’ve generated through organic traffic; a lead is a contact with a potential client whom you will hopefully convert into a sale. This metric is the closest one to your main business objective: growing your customer base and increasing sales. Organic conversions show you the quality of your traffic. You can follow the points of contact by looking at email signups, phone calls, requests for directions, form submissions, and actual purchases.


For information on how to track all those metrics, follow our blog in the future for part 2 of tracking your SEO performance.


Technical SEO: What It Is and Why You Need It
By improving different technical elements of your site you can help search engines better find, crawl, and index your site.

How Search Engines Read the Code Behind Your Page

Search engines cannot interpret the content of web pages the same way we do. Take this page, for example: to you, it is a page containing different visual elements like logos, photos, and text; to an engine, it is a stream of raw markup.

When search engines look at a web page, they look at its HTML (Hypertext Markup Language) code. This same code tells your browser where to download the images from, how things are laid out, what fonts and colors to display, and so on. The HTML also refers to and loads the style sheets (CSS), which are extra instructions for the visual presentation of the content.

Keeping your HTML clean and error-free helps browsers display your pages correctly and helps search engines understand the structure of your page.
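To illustrate what “structure” an engine can pull out of clean markup, here is a small sketch using Python’s standard html.parser module (the sample page is invented; real crawlers are far more sophisticated):

```python
from html.parser import HTMLParser

class OutlineParser(HTMLParser):
    """Pull out the title and headings -- the kind of structure engines read from markup."""
    def __init__(self):
        super().__init__()
        self.outline, self._tag = [], None
    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2", "h3"):
            self._tag = tag
    def handle_endtag(self, tag):
        if tag == self._tag:
            self._tag = None
    def handle_data(self, data):
        if self._tag:
            self.outline.append((self._tag, data.strip()))

p = OutlineParser()
p.feed("<html><head><title>SEO Guide</title></head>"
       "<body><h1>Technical SEO</h1><p>Intro text.</p><h2>Sitemaps</h2></body></html>")
print(p.outline)  # [('title', 'SEO Guide'), ('h1', 'Technical SEO'), ('h2', 'Sitemaps')]
```

If the markup is broken, this kind of extraction degrades, which is one concrete reason clean HTML matters.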

Why You Should Use an XML Sitemap

Much like humans, search engines follow links to find new content. Unlike humans, though, engines follow every link they can find. The easiest way to ensure that your content is indexed is to link to it.

Another way to help engines find your content is through XML sitemaps. A sitemap is a listing of your pages in a format that engines can read through. Once you create your sitemap, you submit it directly to the search engines through their webmaster tools.
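A minimal sitemap can be generated with a few lines of Python; this sketch follows the basic urlset/url/loc layout of the sitemap protocol (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string (see sitemaps.org for the full protocol)."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```

In practice a CMS plugin or generator tool builds this file for you; the point is only that the format is simple, machine-readable XML.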

Why You Should Use Meta Robots

Linking and using sitemaps is a great way to tell search engines about the location of your content and changes to it. However, when there are pages you do not want indexed, you can use meta robots. Think of members-only pages or test pages that you want to keep out of the index and off the SERPs: you can add a meta robots noindex tag to those pages, or set crawling rules for search engines in a robots.txt file added to the root folder of your site. Creating a robots.txt file can be a little technical, but the search engines’ own documentation covers it in detail.
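To see how robots.txt rules are applied, here is a sketch using Python’s standard urllib.robotparser (the rules and URLs are hypothetical). Note that robots.txt controls crawling; to keep an already-crawlable page out of the index, a meta robots noindex tag on the page itself is the more direct tool:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: keep the members-only area out of crawlers' reach.
rules = [
    "User-agent: *",
    "Disallow: /members/",
]
rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/members/profile"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))        # True
```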

Dealing With Duplicate Content

In theory, every page on the internet should have its own unique URL; in practice, however, pages can often be reached through slight variations of their URL. This results in duplicate URLs (and content) in the search engine index. Having duplicate content on your site is dangerous because search engines, like humans, prefer unique content. Duplicate content can cause engines to penalize your site, which can really hurt your organic traffic!

Using Canonical URLs

One way to resolve duplicate content issues is to use the rel="canonical" link tag. You add this tag to the code of your duplicate pages to tell search engines which is the primary URL for accessing your content. You can also clarify to search engines how you use URL parameters directly in webmaster tools.
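A quick sketch of how the canonical declaration can be read from a page’s code, using Python’s standard html.parser (the page snippet and URL are hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Find the rel="canonical" link tag, which names the primary URL for the page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

f = CanonicalFinder()
f.feed('<head><link rel="canonical" href="https://example.com/product"></head>')
print(f.canonical)  # https://example.com/product
```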

Using Redirects

Duplicate content issues may also arise when you move content around your site. For example, you may move content from URL A to URL B, yet both URL A and URL B can exist in a search engine index. This may cause search engines to send traffic to the wrong page (URL A). To avoid this, it is important to use redirects. Here are the two most important redirects and when to use them:

  • 302 Temporary Redirect – use it only for a short-term content move. It tells search engines that the page is not available now but will be back shortly.
  • 301 Permanent Redirect – use it when you are moving your content permanently. It tells search engines to take everything they knew about the old URL and apply it to the new one. To rank high on SERPs you must ensure that engines see unique URLs and know the exact location of your content.
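The cost of long redirect chains can be illustrated with a tiny Python sketch that follows a hypothetical 301 map to its final URL (the paths are invented):

```python
def resolve(url, redirects, max_hops=5):
    """Follow a redirect map until a final URL is reached.
    Counting the hops matters: long chains slow crawlers down (a sketch)."""
    hops = 0
    while url in redirects and hops < max_hops:
        url, hops = redirects[url], hops + 1
    return url, hops

# Hypothetical 301 map after restructuring the site: /old -> /interim -> /new.
redirects = {"/old": "/interim", "/interim": "/new"}
print(resolve("/old", redirects))  # ('/new', 2)
```

A chain like this is better collapsed into a single redirect from /old straight to /new, so crawlers and users pay one hop instead of two.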

Use of JavaScript, Flash, and AJAX

You can use technologies like JavaScript, Flash, and AJAX to enhance the user experience on your site. Unfortunately, these technologies can cause problems for search engines. Crawlers are becoming more advanced all the time, but they have traditionally struggled with these technologies, so it is advisable to use them carefully. You can still have such elements on your site, but as a general rule, avoid using JavaScript, Flash, and AJAX for your important page elements and content. Site navigation is one example: even the best site architecture in the world can be undermined by navigational elements that are inaccessible to search engines.


Flash is great for making websites look pretty, but it is a bad choice from an SEO point of view. When reading Flash content, search engines have a very difficult time associating specific parts of the content with the correct spot in the file. Flash can really add to a website’s user experience, but it should be used sparingly, and you should never build a site entirely (or in any significant portion) in Flash. With the advent and increasing support of HTML5, there are ever fewer reasons to use Flash, as HTML5 can accomplish almost anything you would want to do with Flash.


AJAX is an acronym for Asynchronous JavaScript and XML. It is a great technology from a user’s point of view, as it reduces load time and increases site speed by requesting content only when users ask for it. However, AJAX may cause issues because it requires the search engine to “click”, or request, the content before it can be read. Refer to Google’s guide to AJAX crawling for webmasters and developers for a detailed list of things you can do to help Google and other search engines crawl and index your content.

Server Side Factors You Must Consider

The hosting and management of your web server can have a tremendous effect on your SERP rankings. Engines want to provide a good experience to their users, so they want to send them to sites that are reliable and load quickly. Make sure you are serving pages fast and have minimized your server downtime.

The location of your server can also affect speed: if most of your users are located half a world away from your server, they may find that your pages load slowly. For this reason, try to host your site in the geographical region where the majority of your visitors are. If your business is global and you expect traffic from around the world, consider a hosting solution, such as a content delivery network (CDN), that can distribute requests for your pages across a global network of servers.

Google Search Console and Bing Webmaster Tool

Google Search Console and Bing Webmaster Tools let you see what the search engines know about your site, whether they ran into any problems while indexing your pages, and other important information. These tools are very useful because they let you give the engines directions, such as submitting sitemaps and setting geo-targeting for your sites. Here are some specific benefits of creating and actively managing webmaster accounts:

  • See potential issues that search engines have detected on your site
  • See metrics of your organic traffic
  • Make optimizations and provide the engines with information to better understand your site

Google Search Console

Go to Google Search Console and sign in with (or create) a Google account. When you log in, you will be able to add the websites you manage. You will need to verify to Google that you are the webmaster of the site you want to add.

Once you verify your site, you will have access to the dashboard with all the areas of webmaster tools. Detailed tutorials on how to use Google Search Console are available on the Google Search Console YouTube Channel.

Bing Webmaster Tools

Go to Bing Webmaster Tools. Much like Google Search Console, it lets you find out what information Bing has about your site and provide instructions. Visit the Bing Webmaster Tools Help & How-To Center for training and resources.

Keeping your code clean and your content visible to the engines must be a fundamental part of your SEO efforts. Hopefully, this guide has given you a technical overview of what to consider when building and optimizing your websites.