Tuesday 18 March 2014

Search Engine Optimisation

The basics of SEO

Understand how the search engines work

Googlebot is Google's web-crawling bot. Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google Index. (Googlebot reads HTML, begins with a list of URLs, follows links, and identifies itself with a user agent.)

The crawler will go to a website, follow the links it finds there, and move on to other websites.
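To make that concrete, here is a minimal, illustrative sketch in Python of how a crawler of this kind works (this is not Google's actual implementation; the user agent string and seed URL are placeholders): start with a seed list of URLs, fetch each page's HTML, collect the links on it, and queue any new ones.

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import Request, urlopen

    class LinkCollector(HTMLParser):
        """Collects the href of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed_urls, max_pages=10):
        queue = list(seed_urls)  # URLs waiting to be fetched
        seen = set(queue)        # every URL discovered so far
        fetched = 0
        while queue and fetched < max_pages:
            url = queue.pop(0)
            req = Request(url, headers={"User-Agent": "example-crawler/0.1"})
            try:
                html = urlopen(req, timeout=5).read().decode("utf-8", "ignore")
            except Exception:
                continue  # broken link or server error: skip the page
            fetched += 1
            parser = LinkCollector()
            parser.feed(html)
            for link in parser.links:
                absolute = urljoin(url, link)  # resolve relative links
                if absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)
            print(f"Crawled {url} ({len(parser.links)} links found)")

    crawl(["https://example.com/"])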

Links became the most important factor for SEO: if one website links to another, it is saying it thinks that site is useful. These days, though, people have cottoned on to this idea and begun spamming links.

If you're running your own website it's important to keep an eye on who is linking to it: Google Webmaster Tools will give you a long list of places that link to your site, and sometimes what you find is negative SEO (spammy links pointed at your site deliberately to hurt its rankings).

Once the crawler fetches a page, it gets logged in the Google Index - this is essentially Google's filing cabinet.

Tags - title and meta tags need to be kept concise and simple.

Three types of SEO
Technical - letting the search engine get to the site and move around it
On page - making it clear what the page is about through the content you write
Off page - what Google knows about you and whether you are trustworthy: good authors, social media etc.


Why should I care about all this?
Google shows people things they like: results are tailored to the user, localised, and biased towards fresh content. This matters because it means you should make sure people like you, connect with them directly (authorship and social), write for your audience, and stay up to date.

Search engines - the most popular are Google (roughly 80% of searches) and Bing (roughly 20%).

Responsive - a responsive site adapts its layout to the size of the screen, so the whole page is still usable when viewed on different devices, e.g. mobile phones and tablets. The theme the site uses needs to be responsive; otherwise it takes extra code to achieve.
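As a minimal sketch (the 600px breakpoint and class name are illustrative, not a standard), a responsive page combines a viewport meta tag, which tells mobile browsers to use the device width, with a CSS media query that adapts the layout on small screens:

    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      /* Sidebar sits beside the content on wide screens... */
      .sidebar { float: right; width: 30%; }
      /* ...and stacks below it on screens narrower than 600px */
      @media (max-width: 600px) {
        .sidebar { float: none; width: 100%; }
      }
    </style>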


Technical

XML Sitemap - a long list of the pages on your website which can be submitted to a search engine so that it can discover pages on your site it may not otherwise have seen.
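A sitemap follows the sitemaps.org format; a minimal example (the URLs here are placeholders) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2014-03-18</lastmod>
      </url>
      <url>
        <loc>https://example.com/about/</loc>
      </url>
    </urlset>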

Robots.txt - used to guide crawlers/robots around parts of your website. Can be used to stop robots accessing unnecessary parts of your website.
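For example, a robots.txt that keeps all crawlers out of an admin area (the paths are placeholders) and points them at the sitemap:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    Sitemap: https://example.com/sitemap.xml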

HTTP Status Code
200 OK - page works
404 Not Found - Page not found (broken link)
301 Moved Permanently - URL has permanently moved to another location
302 Found - URL moved temporarily to an alternative
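To see one of these in practice: when a URL has moved permanently, the server's raw response carries the 301 code and a Location header pointing at the new address (the URL is a placeholder), which the crawler then follows:

    HTTP/1.1 301 Moved Permanently
    Location: https://example.com/new-page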

Schema Markup - Schema.org vocabulary added to your HTML to label what a page's content is (an article, a product, a review, and so on) so search engines can understand it.
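As a minimal sketch using Schema.org microdata (the headline, author name and date are placeholders):

    <article itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="headline">The basics of SEO</h1>
      By <span itemprop="author">Jane Smith</span>
      <time itemprop="datePublished" datetime="2014-03-18">18 March 2014</time>
    </article>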

On page

The page needs to be accessible and offer a good user experience, e.g. a responsive design.

Target keywords - keyword research means understanding what people are searching for; using those keywords on the page increases the chances of getting into the Google Index filing cabinet.
Include the keyword in the page title.
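For example, a page title tag targeting a keyword (the keyword and site name are placeholders) is simply:

    <title>Handmade Leather Bags | Example Shop</title>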

Elements of optimised page
  • Website is easy to use
  • Content is created to be shared and social sharing is a feature of the page
  • The page targets one clear 'theme'
  • Content can be viewed on various devices
  • Context added by including authorship information / Schema.org markup etc
  • Page provides genuinely unique and valuable content
Google Plus authorship lets you tie your pages to your profile and the history of articles you have published.
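At the time of writing, authorship is set up by linking a page to a Google+ profile; a sketch (the profile ID is a placeholder):

    <link rel="author" href="https://plus.google.com/YOUR_PROFILE_ID">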

Off page

Once you have a site that is well optimised, you need to build links to it, so that Google can begin to rank you on the spectrum from trusted to fake.
Link building - links from trustworthy, related websites; factors include the number of links, the quality of links, and the anchor text. (Negative factors include paid links and spam.)
Trust - verifying who is behind a website and where it is based; factors include how old the site/domain is and whether the site has been flagged for spam.
Social - social media can become the primary source of traffic to your website; factors include reputation, shares and authors.


