Ten On-Site SEO Techniques for Your Website
SEO techniques are broadly divided into on-site and off-site categories: off-site SEO deals with the techniques and strategies used away from your site, such as building links from other websites to yours. On-site SEO covers anything which has to do with optimizing your website itself to promote the site in search engines, and as such, it is firmly within your control. After all, you own the site and have full access and discretion over what is on the site and how the back-end is set up.
Rankpay gives you ten on-site SEO tips for optimizing your website:
Content is King!
Content alone stands out as the number one factor in determining your search engine rankings!
There is no substitute for fresh, relevant and well-presented content, both for attracting and retaining visitors and potential customers and for persuading the search engines to give you a higher ranking.
The ultimate objective of a search engine, especially Google, is to deliver results relevant to searchers’ queries – anything on your website that demonstrates your relevance is sure to score you more highly in the ranking results.
By providing great content you not only deliver the information users are looking for, but you will also be rewarded with natural backlinks from people who truly enjoy what you write and provide.
Use an XML Sitemap
A sitemap is exactly what it says – a map of your site.
There are two kinds – a static sitemap and an XML sitemap. The static sitemap is aimed at human visitors and is a chart of your website, showing where pages sit in relation to one another and how they link to each other.
An XML sitemap is used by Google’s web crawlers and is invisible to human users. Using an XML sitemap will ensure that web crawlers access all of the web pages effectively and that none get missed.
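As a sketch, a minimal XML sitemap following the sitemaps.org protocol looks like this (the anysite.com URLs and the /seo-services path are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.anysite.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.anysite.com/seo-services</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Save it as sitemap.xml in the root of your site so crawlers can find it, and submit it to the search engines through their webmaster tools.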
Use Keyword-Friendly URLs
A keyword-format URL is simple and easy to read and looks something like this:
http://www.anysite.com/seo-services
This is easier to use and navigate compared to the old-style URL format, which may look something like this:
http://www.anysite.com/index.php?page=42&cat=7
Keyword-friendly URLs not only help the search engines determine the content and importance of a page, but also help users identify the topic of a page simply by looking at the URL.
Meta Tags and Title Tags
Meta tags tell the search engine’s web crawler all about your web page. They help crawlers index the page correctly and enhance its visibility to search engines, which provides an advantage in the rankings over pages without meta tags.
The title tag gives the page a title that the search engine displays in its ranking results, and this is frequently the first thing a user actually sees in connection with your site when deciding which result to click. Be sure to include your main keyword in your title as well.
You need to optimize the meta tags and title tags with your keywords and phrases while writing them so that they also make sense to human readers. Do not “keyword stuff” the meta descriptions, because this will get you penalized.
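As an illustration, a head section with an optimized title tag and meta description might look like this (the wording and keywords are hypothetical examples, not Rankpay’s actual tags):

```html
<head>
  <!-- Title tag: main keyword near the front, readable, not stuffed -->
  <title>SEO Services | Rankpay</title>
  <!-- Meta description: a natural sentence, not a list of keywords -->
  <meta name="description"
        content="Learn how Rankpay's SEO services can help your website rank higher in the search engines.">
</head>
```

Search engines often show the meta description under the title in their results, so write it as the one-sentence pitch you want searchers to read.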
Use Image Alt Tags for Graphics
Search engine web crawlers are unable to index images because they cannot read them. One way to impart the information to crawlers is to use the Image Alt Tag – include a short description which is optimized with your keyword and write it in a way so it makes sense to a human reader too. Be sure that it is an accurate description of the image, as this will help gain extra traffic to your site by those using the Image searches found in most of the search engines.
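For example, the difference between an unreadable image and one a crawler can understand might look like this (file names and wording are illustrative):

```html
<!-- Bad: the crawler learns nothing about this image -->
<img src="chart1.png">

<!-- Better: a keyword-optimized but accurate, human-readable description -->
<img src="seo-ranking-chart.png"
     alt="Chart showing SEO ranking improvements over six months">
```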
Avoid Using Graphical Text
As we’ve already seen, web crawlers cannot read graphical images, so any text rendered as part of a graphic cannot be read either. This situation is commonly encountered when text is baked into a graphic to describe it, and it is a further reason to use the Image Alt Tag instead.
Use a Robots.txt File
Web crawlers will always look for a robots.txt file on your website because it tells them which web pages they may index and which should be ignored. It may seem strange at first that you would not want every page on your site indexed, but there are occasions when this is beneficial, such as when you are looking to save bandwidth or want to stop web crawlers from indexing irrelevant pages such as feedback forms or your internal search engine tool.
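A simple robots.txt, placed in the root of your site, might look like this (the disallowed paths are hypothetical examples):

```
# Rules for all crawlers
User-agent: *
Disallow: /feedback/
Disallow: /search/

# You can also point crawlers at your XML sitemap here
Sitemap: http://www.anysite.com/sitemap.xml
```

Note that robots.txt is a request, not an access control: well-behaved crawlers honor it, but it does not hide or protect the pages themselves.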
Avoid Duplicating Domains: www and non-www
Many clients maintain websites which are duplicated on www and non-www, for instance:
http://www.anysite.com the www site; and
http://anysite.com the non-www site
Effectively you are duplicating content, which is usually a very bad thing to do. In this instance you will not be penalized, but you are spreading the effect of your content and SEO work across two addresses serving the same purpose. In other words, use one version or the other but not both – this will ensure all your SEO strategies are focused on gaining high rankings for a single address rather than being split between two. You can fix this problem in your .htaccess file by forcing all URLs to either the www or the non-www version.
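Assuming an Apache server with mod_rewrite enabled, a .htaccess rule that forces the www version (using the anysite.com example above) could look like this:

```apache
RewriteEngine On
# If the request came in on the bare domain...
RewriteCond %{HTTP_HOST} ^anysite\.com$ [NC]
# ...permanently redirect it to the www version, keeping the path
RewriteRule ^(.*)$ http://www.anysite.com/$1 [R=301,L]
```

The 301 (permanent) redirect matters here: it tells search engines to consolidate ranking signals onto the www version rather than treating the two as separate sites.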
Build a Good Internal Linking Structure
These are the links through which visitors navigate between the pages of your website; they may be embedded within the content, or presented in a navigation bar offering a menu of the site’s sections and pages.
The anchor text used for links within the content should be relevant to the page it links to – this is logical and common sense, but many developers will do something like:
For more information on Rankpay’s SEO services check out our page here
The link would be better placed like this:
You can learn more about Rankpay’s SEO services by following the link.
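In HTML, the difference is simply which words carry the link (the /seo-services path is a hypothetical example):

```html
<!-- Weak: the anchor text "here" tells crawlers nothing about the target page -->
For more information on Rankpay's SEO services check out our page
<a href="/seo-services">here</a>.

<!-- Better: the anchor text itself describes the target page -->
You can learn more about <a href="/seo-services">Rankpay's SEO services</a>
by following the link.
```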
Internal links serve web crawlers as doorways to the rest of the website – if you do not have a good internal linking structure, the risk is that some of your pages will not be indexed by the search engines, and you will lose out on ranking points.