Science Library - free educational site

Glossary of SEO Terms

  • Algorithm
  • A set of rules by which a search engine carries out numerical analysis of data from a wide range of sources. Each search engine (Google, Bing, etc.) has its own algorithms, by which it initially assesses, and then monitors on a regular basis, how likely a site is to satisfy a search query. These algorithms are adapted continuously to the changing Internet, to the needs of searchers, and to counter illegitimate practices, such as spamming. See also SEO Basics main article

  • Authority
  • A major search engine algorithm parameter. The relative authority of a site is determined by how well the site is respected and referenced by the quality sites within the online community in which it resides. In measuring authority, the perceived authority and relevance of the linking sites is more important than the total number of links. See also Link Building main article

  • Backlink
  • A link from an external site to a site or page. Backlinks are important for assessing the popularity and authority of a site for specific themes and keywords. The context of the internet community from which the links come, as well as the quality (authority and popularity) of the external sites, is also an important indicator. It is better to have a few links from high-quality sites than many from irrelevant, low-quality sites. See also Link Building main article

  • Bounce Rate
  • A parameter used to judge how satisfied a user is with the website the search engine has directed them to in response to a query. A 100% bounce rate means the visitor left the site without opening a second page. This is interpreted as the site being irrelevant for the query. Too many results of this type will cause the search engine to begin to downgrade a site for that and similar search queries. See also SEO Tools main article

  • Brainstorming
  • A formal procedure in which a cross-disciplinary team meets in an intensive session to find creative solutions to problems. Brainstorming lays emphasis on lateral thinking techniques and the non-exclusion of any ideas during the initial phases. In SEO consultation, brainstorming is a standard tool to utilise the in-house knowledge of all departments of a company, and to expose bias in marketing and development strategies. It also serves as an opportunity to communicate that SEO is the responsibility of everyone in an enterprise. See also Content First Design main article

  • Canonicalization
  • The handling of multiple versions of the same site or its parts. Clearly identifying the 'official' version of content that may appear in more than one location is standard good SEO practice. For example, each domain has a URL with and without the 'www.' prefix. A 301 redirect should be used from one to the other to ensure that link juice is not divided between what the search engine sees as two distinct domains. See also SEO Best Practices main article
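
As an illustration, on an Apache server with mod_rewrite enabled, a 301 redirect from the bare domain to the www version could be sketched in an .htaccess file like this (example.com is a placeholder; the exact rules depend on the server setup):

```apacheconf
# Illustrative .htaccess rules (Apache, mod_rewrite assumed)
RewriteEngine On
# If the host requested is the bare domain...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...permanently (301) redirect to the www version, preserving the path
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With these rules in place, a request for http://example.com/page.php is answered with a permanent redirect to http://www.example.com/page.php, so all link juice accrues to a single canonical domain.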

  • Cloaking
  • The practice of deceiving users as to the content of a page, while presenting search engines with a different profile. A simple example would be mislabeling elements which are visible only to humans, such as images, AV and Flash content. Google sees a title and alt text, and assumes the content of the video or image is true to them. Cloaking takes advantage of that assumption to present users with different content, such as an ad or worse. See also SEO Best Practices main article

  • Consultant
  • An SEO professional who advises clients on such matters as keyword analysis, site and content optimization, targeted audience behaviour, monitoring of site performance, and backlink generation. See also SEO Basics main article

  • Content Design
  • A branch of SEO concerned with the quality of content within a website. Its primary aim is to provide a site with material of sufficient quality, abundance, language options and authority that other sites within its online community voluntarily link to it and use it as a prime reference. See also Content First Design main article

  • Content Engineering
  • A branch of web design concerned with the structuring of information within a formal architecture. SEO is concerned that this information architecture (IA) provides the desired level of access to search engines, so that they can build an accurate and fair index, and that human visitors may navigate easily through it. The hardest part of content engineering is ensuring relevant cross references between materials at the same hierarchical level. See also Content First Design main article

  • Content Inventory
  • A formal phase of SEO consultation, in which existing online content is analysed for suitability, placement and performance in terms of the SEO parameters of keyword targeting, link patterns, and semantic mapping. Material that is not online may also be integrated into the inventory if deemed suitable for online publication. See also Content First Design main article

  • Crawling
  • The process by which a programme, often called a 'spider', passes through websites via sitemaps and hyperlink paths, generating an index of the World Wide Web. Spiders pass through a domain on average every 6-12 weeks, and index a maximum of 1000 pages per domain. Their behaviour on the site can be controlled to some extent by the domain publisher. See also Search Engines main article
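
One common way a publisher steers spiders is a robots.txt file at the domain root. A minimal sketch (the directory name and sitemap URL are hypothetical):

```text
# robots.txt, served from the root of the domain
User-agent: *        # these rules apply to all spiders
Disallow: /admin/    # keep crawlers out of this directory
Sitemap: http://www.example.com/sitemap.xml
```

Note that robots.txt is advisory: well-behaved spiders honour it, but it is not an access control mechanism.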

  • De-personalization
  • The removal of Google personalization settings which distort search returns. SEO practitioners should view the internet as much as possible as anyone else would. See also Site Analysis main article

  • Folksonomy
  • A system of collaborative tagging of items in a catalogue, so that on a large scale a reliable naming and classification of the items will emerge. See also Information Architecture main article

  • Geolocation
  • The information and techniques employed to create a strong association between a site and a region or locality. Among the techniques are choosing a country-code TLD (.ch) in preference to a global TLD (.com), adding the address to every page, employing GoogleMaps locators, and associating the site with local telephone directory entries and other localised businesses. See also Site Analysis main article

  • Global Navigation
  • A whole-site system for the orientation of the user. Beyond ensuring that every page has an evident return-to-home button (a logo is an effective convention for this), a 'breadcrumb' system is useful. This works like the stub system on wikipedia.org, where the path from the homepage, through the various levels of category and sub-category pages, to the current page can be seen as a logical sequence. See also Site Analysis main article
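
A breadcrumb trail of this kind can be sketched as a simple block of links near the top of the page (the section names and paths here are illustrative only):

```html
<!-- Breadcrumb: path from homepage, through category pages, to the current page -->
<nav>
  <a href="/">Home</a> &raquo;
  <a href="/computing/">Computing</a> &raquo;
  <a href="/computing/seo/">SEO</a> &raquo;
  Glossary of SEO Terms
</nav>
```

The current page appears as plain text rather than a link, so the user always sees where they are in the hierarchy and can step back to any level.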

  • Indexing
  • The records kept about sites on the World Wide Web, formed from data collected by spiders. Indexes are the results of machine learning algorithms which interpret and evaluate many different parameters, such as link density, semantic analysis of content, and keyword placement. See also Search Engines main article

  • Keyword
  • A word or phrase which is expected to be used regularly by searchers as their input string for a specific topic. Keyword analysis and targeting are important techniques in SEO. See also Keyword Analysis main article

  • Keyword Analysis
  • A set of tools and techniques to acquire an accurate understanding of the relative importance of keywords for a specific site or page. Keyword analysis is done as an initial step in the SEO consultation process, and involves brainstorming, with the participation of all stakeholders. See also Keyword Analysis main article

  • Keyword Cannibalization
  • The negative effect of conflicts in keyword targeting, confusing human searchers and search engines as to which page is the best match for specific keywords in searches. Cannibalization is countered by carefully structured hierarchies in the information architecture. See also Keyword Analysis main article

  • Keyword Placement
  • The strategic positioning of keywords within a page. The various elements of a page do not have the same weighting in terms of contribution to overall ranking. SEO is concerned with developing a strategy for placement of keywords to target individual pages for specific search queries, while avoiding keyword cannibalization. See also Keyword Analysis main article

  • Keyword Targeting
  • The practical application of the knowledge gained in keyword analysis. Keyword targeting is a page by page application of keywords which specifically channel search query results to the most relevant pages on a site. A global plan for keywords is integrated into the information architecture of a site to prevent keyword cannibalization. See also Keyword Analysis main article

  • Link Farming
  • Abusive attempts to gain authority through excessive backlinks with low relevance. This is one of the deadly sins that Google is on the lookout for. See also Link Building main article

  • Link Juice
  • The enhancement of a site's authority on a subject for a search engine through relevant and judicious links and backlinks. SEO consultants and site designers seek ways to increase and internally distribute a site's link juice. See also Link Building main article

  • Long tail
  • The 70% of all search queries using terms which are not popular enough to be considered keywords. These may be highly specific queries (e.g. 'Brother DCP-365CN' instead of 'printer'), foreign terms in primarily English queries, less-popular synonyms and morphemes, misspellings, colloquialisms, and the plain weird. Named after the shape of the graph of frequency of term use in queries, which has 30% of queries based on a few known keywords, but 70% forming a long, rapidly diminishing frequency 'tail'. The long tail is a much more difficult area to exploit than obvious keywords, but can bring considerable benefits if incorporated well into a keyword targeting strategy. See also Keyword Analysis main article

  • Meta data
  • The information (description, author, keywords) stored in meta tags in the head section of an HTML page. Only the title is used by search engines for page ranking purposes. The description is an important advertisement to human visitors in the SERP list, and the keywords tag has no value at all to machine or beast. See also Keyword Analysis main article
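
A head section of the kind described might look like this (the content values are illustrative, not a recommendation):

```html
<head>
  <!-- Title: the one element here that search engines weight for ranking -->
  <title>Glossary of SEO Terms | Science Library</title>
  <!-- Description: shown to human visitors in the SERP listing -->
  <meta name="description" content="Definitions of common terms used in search engine optimization.">
  <!-- Keywords: ignored by modern search engines -->
  <meta name="keywords" content="SEO, glossary, search engine">
</head>
```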

  • Organic results
  • Non-paid returns to queries in the SERP (Search Engine Results Page). Sites are indexed and evaluated by algorithms to assess their suitability as matches for specific keywords in the query. Long-term good ranking from organic searches is the aim of SEO. See also SEO Basics main article

  • Paid results
  • Sites may be returned for specific keyword searches, irrespective of their evaluation by the search engine algorithms. The results are displayed prominently at the top of the SERP, and must be paid for. Competitive bidding for important keywords makes paid results an ever more expensive way to advertise. SEO utilises data available from paid results tools, such as keyword analytics, but proposes that organic results are a better return-on-investment strategy. See also SEO Basics main article

  • Popularity
  • A major search engine algorithm parameter. The popularity of a site is essentially how much traffic it receives, and the number of backlinks it has. However, the algorithms do weight these parameters by traffic type and backlinking site quality and relevance. See also Link Building main article

  • Ranking
  • The non-paid (organic) position of a site or other resource on a SERP in response to a particular search query. The aim of SEO is to improve rankings by optimizing a site for the key parameters used by search engines to determine ranking. See also Search Engines main article

  • Relevance
  • A major search engine algorithm parameter. The evaluation of a site for applicability as an SERP (Search Engine Results Page) return option for particular query keywords. Relevance is strongly influenced by semantic analysis of site and page content, and perceived site authority. Backlinks and keyword placement are the focus of SEO for improving a site's relevance for specifically targeted themes. Key parameters in measuring relevance include user response to the SERP listing (click-through and bounce rates). Do searchers click through to the site for specific keywords, and if they do, do they continue on the site, or return immediately to the SERP (bounce)? See also Search Engines main article

  • Search Engine Optimization
  • The techniques applied, by fair means, to the design and content of a website to enhance its search engine results page (SERP) performance. In the Web 2.0 Internet of today, SEO is the primary factor in the success or failure of an online business. SEO is very dependent on changes in the search engine algorithms which control the way organic search queries assess the relevance and value of a website to satisfy the probable expectations of the user. See also SEO Basics main article

  • Semantic Analysis
  • A method by which search engines develop a map of the content of a page, and in particular how well the content is true to the theme profile presented through keywords in titles and headings. Semantic analysis is a tool to counter spamming, and to establish the relative value of a page for specific search queries. See also Content First Design main article

  • Semantic Connectivity
  • A determination of how well content links within a page and to other pages of a site, or other sites, are relevant to each other. See also Keyword Analysis main article

  • Semantic Map
  • A description of a page's semantic connectivity and structure resulting from document analysis, based on data collected by a spider. It is used in conjunction with algorithms which categorise by parameters of authority, relevance, importance and popularity, to develop a profile of a page's suitability as a response to specific keywords in queries. See also Keyword Analysis main article

  • SERP
  • Search Engine Results Page. The page of possible websites and other internet resources, such as videos, images, maps, which may satisfy a searcher's query. SERPs return websites ranked according to parameters such as relevance and authority. Most results are organic, chosen from the search engine indexes, but some are paid by the site owners to obtain prominent rankings. See also SEO Basics main article

  • Spamming
  • Attempts to raise the rankings of a low-quality site through techniques which are unfair and misleading to the user, and which search engines have declared to be illegitimate. Spamming includes cloaking, keyword stuffing, content theft, page scraping, content duplication, mislabelled links, and link farming. See also SEO Best Practices main article

  • Spider
  • A programme used by search engines to collect information about a site. The information collected includes, for example, the regularity of use of certain keywords, and the relative value and relevancy of key elements, such as titles and internal site structure. Links into and out of the site are also assessed. See also Search Engines main article

  • Structure
  • Also known as information architecture, structure is concerned with the accessibility of information and the clarity of its presentation and thematic fidelity, in logical, readily understood patterns. In a CMS, structure must ensure self-replicability at any scale, without loss of accessibility and clarity. See also Information Architecture main article

  • SWOT
  • Strengths, Weaknesses, Opportunities, Threats. A technique in marketing which facilitates analysis of a company with respect to competitors. See also Marketing Objectives of SEO main article

  • Title
  • The page title, found in the head section of an HTML document, is the highest-value keyword placement opportunity. Contrary to popular belief, the title is the only piece of meta data which is considered by search engines in their page ranking systems. See also Keyword Analysis main article

  • URL
  • Uniform Resource Locator. The URL is the address of a page, which appears in the address bar at the top of the browser, e.g. http://www.sciencelibrary.info/library/computing/seo/Glossary_of_SEO_terms.php. The URL is a major indicator to search engines and human navigators of the nature of the site. It should be clearly legible, should not contain opaque codes and flags, and the suffix should be local (e.g. country code .ch, .co.uk) for geolocation of the site to a region. Generic top-level domains (.com, .info, .net) should only be used for genuinely international sites, since they lose ranking importance for local service queries. See also SEO Best Practices main article

  • 301 redirect
  • A permanent redirect. A better option than the temporary 302 redirect, since 301 passes link juice, while 302 does not. See also Article: Hyperlinks and Redirecting

Content © Renewable-Media.com. All rights reserved. Created : October 14, 2014
