Effective SEO involves keeping on top of a great many details, which is why it can be helpful to use an SEO checklist to make sure nothing is overlooked. This guide is written for SEO in the second half of 2015, so you can be confident that these tips remain relevant for the present day.
The guide will be grouped together into smaller subsets, beginning with tips for On-Page Optimisation.
Order of headers
With respect to best practice for the order of head tags, you should use the following:
Title > Description > Keywords
The information that you place in these tags provides search engines with your title and description, and this is what searchers will see when they find your company. Write these tags to inform both readers and search engines.
Title tags should include your keywords and should be no more than 55 characters. This usually works out at around nine words, give or take three, though of course words vary in length. There are online tools that will count the number of characters in your text, allowing you to create title tags within the correct parameters.
You should look to make your description tag as compelling as possible and you should look to use a keyword term in your description tag. Always bear in mind that your description tag should be no more than 156 characters and of course, if you are struggling to calculate this, there are online tools that will help you to calculate the length of your tags effectively.
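Putting the points above together, a typical head section might look like the sketch below. The site name, keywords and wording are placeholders, not recommendations for any particular site:

```html
<head>
  <!-- Title: keywords near the front, 55 characters or fewer -->
  <title>SEO Checklist 2015 | Example Agency</title>
  <!-- Description: compelling, includes a keyword term, 156 characters or fewer -->
  <meta name="description" content="A practical SEO checklist for 2015 covering on-page optimisation, keywords, linking and site configuration.">
  <!-- Keywords: follows the Title > Description > Keywords order described above -->
  <meta name="keywords" content="SEO checklist, on-page optimisation">
</head>
```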
For your first heading in the body of your content, the tag should be <H1> and then all subsequent heading tags should follow on in the order of <H2>, <H3>, <H4> etc and this will be utilised as the table of contents for your page.
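The heading order described above can be sketched as follows; the heading text is purely illustrative, and the indentation simply shows how the tags act as a table of contents:

```html
<h1>SEO Checklist</h1>
  <h2>On-Page Optimisation</h2>
    <h3>Order of headers</h3>
    <h3>Alt attributes</h3>
  <h2>Site Configuration</h2>
```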
It can be difficult to strike the balance between quality and quantity, but when it comes to word count you need to have both of these elements in mind. However lengthy your page is, the content should be good quality and focused. You don’t want a page that contains fewer than 250 words, and at the moment a page length of at least 450-500 words is of merit. However, this doesn’t mean that you should stuff words (and keywords) in, or that you should waffle or repeat yourself. If in doubt, err on the side of quality and brevity, but ideally you should be looking to include around 500 words of good quality.
Alt attributes should contain keywords
When it comes to providing a good experience for all of your potential customers, there is a lot to be said for properly using the alt text feature for images. This is because this will provide information about any images you use to visually impaired users. However, there is also a lot to be said for the fact that images with alt text send information to search engines, which gives them further information on how best to categorise your site.
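A simple example of the alt text described above; the file name and wording are placeholders:

```html
<!-- The alt text describes the image for visually impaired users and for search engines -->
<img src="blue-widget.jpg" alt="Blue widget with stainless steel fittings">
```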
URLs: Dashes v Underscores
It can be difficult to know what to do for the best when creating a URL. You should be aware of the fact that underscores are alpha characters but they do not separate words. If you use a dash (or a hyphen) you will separate words but this can look spammy if you do this too often. There is no right or wrong way but try to look at the URL as a regular visitor and think about what you would think about or expect from a site that had a URL of this nature.
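To illustrate the difference with a hypothetical URL:

```text
www.example.com/seo_checklist_2015   <- underscores are alpha characters and do not separate words
www.example.com/seo-checklist-2015   <- hyphens separate the words cleanly
```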
Fully qualify links
If your links are fully qualified, search engines and browsers know exactly where to locate files, which is what you are looking for. You should use fully qualified links rather than relative URLs, and your site map should contain fully qualified URLs. A fully qualified URL includes the http:// (or https://) element. An example of a fully qualified link is http://www.yoursite.com, while www.yoursite.com (with the protocol missing) is not a fully qualified link.
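In markup, the contrast looks like this (the page names are placeholders):

```html
<!-- Fully qualified: protocol and host included -->
<a href="http://www.yoursite.com/services.html">Our services</a>

<!-- Relative: resolved against the current page, not fully qualified -->
<a href="services.html">Our services</a>
```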
It makes sense to ensure that the most vital code is the code that search engines crawl first. It is possible to place this code ahead of body text, so that you can be sure that crawlers come across it.
You should look to have a HTML sitemap on your site and every page on your site should be linking to the sitemap (ideally in the footer of the page). It is also advisable to have an XML Sitemap, which you should submit to search engines.
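A minimal XML sitemap, following the standard sitemap protocol, might look like the sketch below; the URL and date are placeholders, and note that the locations are fully qualified:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2015-07-01</lastmod>
  </url>
  <url>
    <loc>http://www.yoursite.com/services.html</loc>
  </url>
</urlset>
```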
You want to make sure that you have text navigation on your site, at the very least on the bottom of your page.
Create Robots.txt File
Even if this file is empty, it is important that you create this file. This file will inform search engine spiders what they should not be indexing. You also want to ensure that this file doesn’t contain important pages, files, directories or even your own site. Sometimes accidents can occur and an accident of this nature could have a hugely negative impact on how you are ranked on search engines.
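A short robots.txt sketch, with placeholder directory names; note that a stray `Disallow: /` would block your entire site, which is exactly the kind of accident described above:

```text
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Sitemap: http://www.yoursite.com/sitemap.xml
```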
Develop a keyword strategy
Keyword strategy is vital but it is often a step that is overlooked. Even if you think that you know what people are looking for when it comes to what you offer or that keywords are intuitive for your site, do some research. Undertaking research on what people search on to find out what you or your rivals offer will provide you with the ideal starting point when developing your site and content.
The only way that you can determine if you are succeeding or failing is through some form of analysis. You need to know if your keyword terms are working, that your site is optimised and that you are reaching out to people. The only way that you can do this is through analysing your website, so make sure that you familiarise yourself with web analytics and that you look for tools that can help you to review your sites.
Develop a linking strategy
You need to have inbound and outbound links that have been created organically. Having no inbound and outbound links is bad, but having poor quality or purchased inbound links is probably worse. This is an area that needs in-depth study, and if in doubt about your linking strategy, never take shortcuts. Quantity and quality are major aspects when it comes to the impact that links will have on your site, so make sure that you trust the links that point towards your site.
Configure your server
It is important to regularly examine your server, and you should be on the lookout for 301 redirects, 404 errors and any other errors that may occur on your site.
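If your site runs on Apache, a permanent (301) redirect can be set up in an .htaccess file along these lines; the page names here are placeholders:

```apache
# Permanently redirect an old page to its replacement
Redirect 301 /old-page.html http://www.yoursite.com/new-page.html
```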
Recent research and studies indicate that a “privacy statement” is becoming a major factor that the leading search engines are looking for. You should consider a privacy statement to be best practice for your site. Given that a privacy statement can provide confidence to your users and vital information to search engines, it makes sense to provide this data.
If you have duplicate content, or you find that URLs with at least two query string parameters are not indexing, you should think about converting them into static pages. This can be a lot of work, especially depending on the size of your site, but it is of benefit when it comes to optimisation. Using the canonical tag is another way to take care of this problem.
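The canonical tag goes in the head of each duplicate or parameterised URL and points to the preferred version of the page; the URL here is a placeholder:

```html
<!-- Tells search engines which version of the page to index -->
<link rel="canonical" href="http://www.yoursite.com/products/widgets">
```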
Static Index Pages
If the content on your homepage changes regularly, you may find that you dilute the expected theme of the site and this can have a negative impact on rankings for keyword phrases or terms. You should look to maintain consistent elements of text on your home page.
Text has a place
It is important to note that visual content is increasingly important on modern sites and that this content can be tagged to provide search engines with information. However, you should also bear in mind that text informs spiders, and if you use the right text in the right places, you can guide crawlers, search engines and people around your site effectively.
Don’t spam or use underhanded tactics
It shouldn’t have to be said these days, but anyone indulging in underhanded tactics is placing themselves and their site at great risk. The leading search engines have a far better understanding of these tactics than they once did, and the penalties imposed on people who use them can devastate a business or website.
Make sure to check for duplicate content
You don’t want to use content that is available elsewhere on the internet and you want to make sure that other sites are not taking your content and using it for their own purposes. There are various online tools that can be used to ensure that you are not plagiarising content or that your content isn’t being plagiarised.
Set up Webmaster Tools accounts with Google and Bing
In order to be found on the leading search engines, it makes sense to know what the leading search engines are looking for and what you can do to please them. Google and Bing offer Webmaster Tools accounts that provide site owners the chance to see how search engines look at their site. This is a step that every site owner should take and devoting time to learn the various tools and functions will be of benefit.
Check for any reported crawl errors
If there is a crawl error for a page on your site, it means that a search engine has not been able to access the page. You can find out more by looking at the Crawl Errors Report which is found in the Crawl section of your Webmaster Tools account.
Make sure your site is set for mobile users
In the modern era, with more and more people accessing the internet via mobile devices, you need to make sure that you provide these users with a great experience. On Google, you can use the Mobile Usability Report, which is found in the Search Traffic section of Webmaster Tools. The report highlights issues such as font size, Flash usage, touch elements that are too close together and content that has not been sized correctly for the viewport.
It is also possible to use the Fetch as Google tool to see your site as different mobile devices retrieve it. You can also run URLs through the Google mobile-friendly test to make sure that your site can be viewed effectively by every user.
Check for any manual penalties
If you have suffered a manual penalty, Google will report this within their Webmaster Tools. You should look in Search Traffic and then the Manual Actions Report. With Bing, you can examine the Index Summary Chart within the Webmaster Tools, and if you find that the number of pages listed for the site you are reviewing is set at zero, you have been issued with a penalty.
Analyse against the Google Algorithm updates
If you have access to Google Analytics, there are tools you can use which will provide a report on the levels of your traffic with respect to the various algorithm updates provided by Google.
Consider the speed and performance of your site
The length of time it takes your site to load is an important factor. If it takes too long (more than 5 seconds), people will leave and Google will penalise you. Google offers PageSpeed Insights to allow you to check your speed, but there are also commercial apps which analyse the speed and performance of your site.