SEO has quickly become such an important part of website design and marketing that many useful tools have been developed to aid in the process. These range from tools that track user behaviour to those that analyse market competition and keywords. Here are just some of the tools we at ZumGuy have found most useful in the development of SEO-friendly websites:
robots.txt is a file placed in the root of a domain. It informs search engines how the publisher wishes the site to be accessed, i.e. which files and directories crawlers may visit and index.
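As an illustration, a minimal robots.txt might block a couple of directories while allowing everything else, and point crawlers at the sitemap (the paths here are hypothetical, not taken from any particular site):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: http://www.example.com/sitemap.xml
```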
A meta tag in the <head> section can allow or prevent indexing of a page, and follow or nofollow the links it contains:
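The tag itself takes the standard robots directives; for example, to keep a page out of the index and stop link equity flowing from it:

```html
<meta name="robots" content="noindex, nofollow">
```

Replacing the content value with "index, follow" (the default behaviour) explicitly permits both.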
This technique is useful where there are canonicalization and duplicate content issues. It can also balance the site's content so that less valuable pages (such as blog drivel) are not surfaced in SERPs.
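For duplicate content specifically, a rel="canonical" link in the <head> of each duplicate page points search engines to the preferred version (the URL below is illustrative only):

```html
<link rel="canonical" href="http://www.example.com/computing/SEO/site-analysis">
```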
This file is usually placed in the root, and is a guide for both search engines and human users to how the hierarchy of the site is organised. Using it, an SEO practitioner may gain some idea of a site's structural problems: for example, categories that are not referred to by comprehensible names, but are generated from a dynamic numbering system which cannot be read easily from the URL. There are techniques for rewriting dynamically generated URLs (e.g. sciencelibrary.info/entry.php?subject=9&topic=88&id=183) to readable URLs (sciencelibrary.info/computing/SEO/entry.php?title=Site_analysis), which are not only more convenient for users, but also supply search engines with more meaningful data to work with.
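Rewriting of this kind is commonly done with Apache's mod_rewrite. A minimal sketch in the spirit of the example above (the rule pattern and parameter names are assumptions for illustration, not the actual sciencelibrary.info configuration):

```apacheconf
# .htaccess - map readable URLs onto the dynamic script
RewriteEngine On
# /computing/SEO/Site_analysis -> /entry.php?subject=computing&topic=SEO&title=Site_analysis
RewriteRule ^([^/]+)/([^/]+)/([^/]+)$ entry.php?subject=$1&topic=$2&title=$3 [L,QSA]
```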
Too often design concepts are great pieces of art. However, if a design is not based on the needs of the content which will occupy its spaces, it quickly comes into conflict with the true purpose of the site.
To avoid this conflict, and indeed to ensure that opportunities presented by innovative and quality content are not lost, a preliminary step to design is to make a content inventory. This is a (fairly) detailed breakdown of how information is to be communicated on the pages of the site, and how these pages are interconnected. A hierarchical diagram, containing page levels, linking structures, titles, in-text keyword foci, etc., can be developed. This will serve as a blueprint for all involved: artistic conceptualisers, graphic artists, programmers, text content generators, and SEO consultants.
The browser of choice for web design professionals has long been Mozilla Firefox. With this free browser comes a (wonderful) set of plug-ins which provide many developer tools and document analysis insights at a glance.
Firefox is a good way to find errors and check for styling problems, and its layout allows for rapid and accurate viewing of source code. The source code can be used to check the meta data, identify formatting issues, and break the global navigation down into its component parts.
For SEO purposes, the source code is a useful tool for checking whether keywords are in the places they were planned for, particularly with automatic keyword placement systems. For example, if the head section of the HTML page is an included file, the all-important page title is often generated dynamically from information passed by individual pages. If for some reason this string is not placing targeted keywords in the first part of the title, vital keyword targeting value is being lost. The rendered source code reveals the actual text, rather than the variable name in the underlying PHP code.
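A minimal sketch of such a dynamically generated title, assuming a shared header include and a $page_title variable set by each page before the include (both names are hypothetical, not taken from any actual site):

```php
<?php
// header.php - included at the top of every page.
// Each individual page sets $page_title before including this file.
$site_name = 'ScienceLibrary';

// Lead with the targeted keywords; fall back to the site name alone
// if the page forgot to set a title.
$title = isset($page_title) ? $page_title . ' | ' . $site_name : $site_name;
?>
<head>
    <title><?php echo htmlspecialchars($title); ?></title>
</head>
```

Viewing the rendered source then confirms whether the <title> element contains the intended keyword string or only the fallback value.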
The set of tools provided by Google is probably the most comprehensive and useful on the web; the tools are mostly free, and are used by amateurs and professionals alike.
This tool provides a great deal of useful information about a specific site. Problems which Google has with the site can be identified, suggestions for traffic improvement obtained, and settings for the site edited. SEO practitioners can see how Google sees their site, check for issues such as penalization or problematic structure, and access messages from Google to the site owner.
A very powerful keyword comparison tool. It can be used to cross-check the target keyword list against actual numbers of queries using those words, or against alternative synonyms and forms.
Website owners can sign up and register their sites. A tracking code is placed in the head section of each page (or once, if the head is an include file); whenever a page is accessed from anywhere on the web, data on overall traffic and on individual visitors (geographic location, device and browser, time spent on site, number of pages, entry and exit points, etc.) is sent to Google's servers and collated in a centralised report page. It is also very useful for tracking which keywords were used to find the site, and how click-through rates vary over time.
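The tracking code in question is a short JavaScript snippet; the Universal Analytics (analytics.js) form current at the time of writing looks like this, where UA-XXXXX-Y is a placeholder for the site's own property ID:

```html
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
</script>
```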
Returns metrics about a website or page, including the number of keywords, and flags any problems related to them.
Information about targeted keywords on a page from an analysis of a number of search engine factors.
Provided by Microsoft adCenter, this tool generates keyword suggestions.
Keyword research, including listing popular terms for potential target keywords. This tool also offers insight into the frequency of use of certain terms in the meta tags of sites.
Statistically based information about trends and uses of keywords.
Competitive web statistics. Useful for long-tail search data, as well as paid and organic search stats.
Content © Renewable-Media.com. All rights reserved. Created: September 11, 2014
Website © renewable-media.com | Designed by: Andrew Bone