
Site Design for SEO

The first step in an SEO practitioner's service is to make a general assessment of the website in question, within the context of the internet environment of its competitors and related suppliers.

Google adapts its search results to match what its machines believe they have learnt about your online habits. To understand SEO, it is necessary to turn this personalisation off, so that searches can be run as they will appear on anyone else's screen. Well, almost: the geolocation and language bias remains in place.

To de-personalize the Firefox search function: open the Bookmarks folder, right-click a folder and create a New Bookmark.

Google depersonalization

Enter 'Google de-personalized search' in the Name field, 'http://www.google.com/search?q=%s&pws=0' in the Location field, and 'dp' (or any short keyword you will remember) in the Keyword field.

To run a de-personalized search, enter 'dp ' (dp space) before the search string in the browser's address bar (not the Google.com search field) at the top of the screen, where the address of the site you are on appears.
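For example, assuming the keyword 'dp' was chosen as above, typing 'dp black holes' in the address bar runs http://www.google.com/search?q=black%20holes&pws=0, where the pws=0 parameter suppresses the personalised results.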

Initial Survey

When assessing a site, start with a very general search, using keywords which roughly match the core business of the site.

Identify the general quality of the sites that are returned. Are these sites well-reputed, or do they look suspicious and spammy (likely to contain ads and irrelevancies)?

Make a list of major competitors for respective search keywords, for future reference.

Next, the site itself is examined. First, the usual non-SEO-compliant design problems can be quickly identified and remedied. In particular, searches can reveal cases of problematic duplication of material.

Canonicalization

It is important to ensure that content is unique, and does not appear anywhere else, either on the same site, or elsewhere on the Web. Search engines frown on non-original material, in the same sense that a newspaper would not be happy to see an article it published appear in a competitor's publication.

Every domain has two URL versions: with and without the www. prefix. Good practice is to select one version and use it as the source for all the material under that domain name. http://www.sciencelibrary.info and http://sciencelibrary.info are considered distinct hosts in the eyes of search engines.

A 301 redirect should be used to point one version at the other (whichever you think has the most value, which probably means the one with the longer history of link building). Use a tool like http://www.seomoz.org/linkscape/ to measure the links and authority accruing to a site's homepage.

301 versus 302 redirects

Always use 301 redirects whenever possible. A 301 (permanent) redirect passes most of the SEO value (link juice) of the redirected URL, whereas the alternative 302 (temporary) redirect passes little or none. The effect for the visitor is the same in both cases, so use the 301.

To make this redirect, create a .htaccess file containing the following code:

# Permanent 301 redirect of URLs without 'www' to the full URL
RewriteEngine On
RewriteCond %{HTTP_HOST} ^sciencelibrary\.info
RewriteRule (.*) http://www.sciencelibrary.info/$1 [R=301,L]

and place it in the root of the site.

To do the opposite - redirect www.-prefixed URLs to the form without the www. - use this code instead:

# Permanent 301 redirect of URLs with 'www' to the URL without
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.sciencelibrary\.info
RewriteRule (.*) http://sciencelibrary.info/$1 [R=301,L]

Site Accessibility

A general principle is that as much link juice (the value gained from backlinks to the site, reflected in page ranking in SERPs) as possible should be passed on to pages contained within the site.

Website owners tend to link to the homepage of another site. This is because the URL address is shorter and more reliable, since individual page addresses are much more likely to change at some point than the domain name. Consequently, a visitor following the recommended link usually lands on the homepage and must then navigate to the relevant content, with the risk of being dissuaded from pursuing the path. This can occur for a number of reasons:

Compliant Styling

If the site is garishly different or clashing in style, the visitor may be disoriented, and return to the safety and reassurance of the original, more familiar site. This is where an understanding of the 'Internet Community' in which a website resides is important. No site is an island unto itself. To blend in and be accepted as a natural and trustworthy continuation of an online experience, a site should take into account how the linking sites look and feel.

Clear Navigation

If navigation is non-global and unclear, visitors may easily get lost down a blind alley. Global navigation means ensuring that from any page on the site, the user may easily access any other part, and retain a sense of orientation and location within the site structure. Users tend not to follow the URL addresses as a guide (unless they are SEO buffs), but trust the signposts placed along the top or down the left-hand side of the page.

In a large, complex site, navigation tends to be divided into two bars: one for global navigation, and one for specific area/topic navigation. For example, university sites almost always have a top bar which takes the visitor back to the university lobby and the reception desk, and a secondary navigation system which takes her deeper into the faculty she is currently in. Highlighting the current section in the global bar is a good way to retain orientation. These conventions of location and functional grouping are already well established, so it is best to comply with them. Being too original in this regard may only lead to visitor frustration.

Content Quality

In the age of anonymous, third-party click-ads, site owners are plastering their pages with flashy, distracting tins of spam whose sole purpose is to distract the visitor from the core content of the site, and to lead them off somewhere else. This contradiction of purpose leads to the type of site which SEO practitioners label with the highly technical term 'spammy'. The spam sites do not have a primary purpose of rewarding the visitor for their time and trust with quality-controlled content. Instead, they are more often than not full of non-original material and/or rely on erratic UGC (user generated content), such as blogs, for which there is no quality assurance possible.

Hierarchy

Website hierarchy example: a good website hierarchy is shallow, and all items of a similar type are at the same level.

In a poorly designed site, paths to sought content can become too long and winding. The ideal site structure is a tree with as few layers (levels of depth) as possible. Even though Sciencelibrary.info is one of the most comprehensive science education sites on the internet, every core content page is exactly three clicks away from the main catalogue, and all resources related to each core page are only one click away from their mother page. The many thousands of items on the site are indexed by pages with an optimal 10-15 references each, yet the site never needs to exceed its four-layer depth.
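As an illustration of such a shallow structure (the labels here simply follow the description above):

Main catalogue
    Subject index (e.g. Mathematics)
        Topic index (e.g. Vectors and Trigonometry)
            Core content page (e.g. Trigonometry), with its related resources linked directly from it

Three clicks take the visitor from the catalogue to any core page, and the tree never exceeds four layers.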

Internal Cohesion

Internal cohesion means cross-referencing between pages. Users who arrive in the bowels of a site should be made aware of related material available elsewhere on the site. Cross-referencing risks becoming convoluted and counter-productive, so careful planning of the site structure is necessary to ensure it grows with repeating patterns that are sustainable. The best way to ensure this is to place material of equal function (e.g. an index page or core content) on the same hierarchical level.

Global Navigation

This refers to the whole-site orientation of the user. Beyond ensuring that every page has an evident return-to-home button (a logo is an effective convention for this), a 'breadcrumb' system is useful. This works like the stub-and-grow system on wikipedia.org, where the path from the homepage, through the various levels of category and sub-category pages, to the current page can be seen as a sequence.

The URL also acts as a type of breadcrumb trail if set up properly. Ideally, the user does not need to use the back button to navigate, since this defeats the purpose of link juice maximisation. Users should be able to follow a forward trail in a logical, clear way, rather than backtrack.
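For example (the page and path shown are hypothetical), a breadcrumb trail and a matching URL might read:

Home > Mathematics > Vectors and Trigonometry > Trigonometry
http://www.sciencelibrary.info/mathematics/vectors-trigonometry/trigonometry

Each element of the trail links back to the corresponding level, so the visitor can move around the hierarchy without resorting to the back button.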

Link Building

One of the two essential aspects of SEO success is link building. In combination with great content that is well organised and easily indexed by the search engines, a well-developed set of links to the site from quality sources will ensure strong rankings in the SERPs.

Unlinked sites are like desert islands.

Backlinks are given a high priority in a search engine's algorithm for measuring the authority and popularity of a site. Without links, a site will not be found easily, and is like an island off the main shipping routes.

Good Link Development Strategies

Planning a site should include SEO in its early stages. Part of the SEO planning is how to create a structure and content which will encourage continuous development of links. It should be borne in mind that a few high-quality links are worth more than hundreds of low-quality links from spurious sources, which dilute the overall impression and value of the site. For example, links that point to missing pages detract from the SEO value, and sudden, unsustained growth in links from blogs may suggest a sensationalist and therefore unreliable resource.

Nothing drives a link campaign forward better than great resources on the site which people like and want to pass on to their community. This content must offer something different from longer-established competitors: text, graphics, audio-visual material, or even a software tool which people cannot obtain easily elsewhere.

Gaining Authority

Authority is granted by external sites making references to the site. The more these referring sites are considered authoritative in the same or related field, the more authority value they pass. The search engine algorithm works like an academic citation system. A paper is given more importance if the most authoritative journals cite it. Academics will care little whether the paper also has many references from non-academic sources (they don't read them anyway).

In the same vein, if a site wishes to be taken seriously, it must submit to the rigours of peer review to gain acceptance from those whose opinions matter.

Deciding the Authority Profile for a site

In the initial cross-disciplinary brainstorming phase of SEO consultation, the marketing objectives of the site are established. The SEO consultant then designs a site to satisfy the demands from marketing for traffic which meets their profile. To do this, the site is planned in detail so that keyword distribution across pages matches expected organic search criteria, while the authority of the site is established through careful acquisition of good and relevant linking sites.

Site elements which support authority

As part of the analysis of a page's relevancy, the links to and from the page are not just counted, but analysed for their adherence to the page's thematic profile, which the search engine has developed from its semantic analysis.

A fundamental aid for search engines in this task is careful use of attributes such as anchor, title, and alt texts. Putting 'Click here' is not very informative compared to 'Visit Andrew Bone's science news site', provided the hyperlink is related to the text around it.

Proximal text is therefore used to determine whether the link is really part of the page's discourse, or some unrelated (paid) link that will be an annoyance to the user rather than added value.

It is not a successful policy in the long term to attempt to cheat the search engines. They are continuously adapting their algorithms to defeat the newest schemes to raise SERP rankings unfairly. It has become accepted wisdom that a site can only hope to be ranked well for its core themes if it provides what visitors are seeking: quality returns for time invested in visiting.

Bad Link Generation Strategies

The quick path leads to a short journey

Received wisdom among SEO practitioners is that sites should endeavour to earn respect and authority within the internet community in which they reside, through original, quality content. The strategies adopted by poor-quality sites which lead to downgrading for authority and relevance include spamming and link farming:

Spamming

Spam is the opposite of good content. Broadly speaking, spam includes anything which annoys or distracts a visitor from his or her purpose for coming to the site. Unscrupulous spam tactics can even go as far as cloaked content, false download buttons, misleading links, lies, and illegal tricks such as phishing for personal security details. Search engine crawlers look for spam, and if it is found, the site will be downgraded or even blocked in the SERPs (search engine results pages).

Needless to say, the more spammy a site is, the less likely authoritative, quality sites are to provide links to it.

Unable to build legitimate link profiles, these sites turn to strategies such as link farming:

Link Farming

Link farming is the practice of attempting to pass on link authority (link juice) in exchange for payment or mutual benefits between consenting sites. It is frowned upon by search engines because it breaks their tenet that sites should be endeavouring to provide genuine enhanced value to visitors. Link farming tends to generate links en masse to poor or irrelevant sites, misleading the visitor.

'Free' template sites are often full of hidden external links, sapping the link juice from the site.

Erratic link acquisition rates

Link acquisition rates are also taken into consideration in the way search engines rank a site. A baseline assumption is that a good site does not have an erratic link acquisition history. Rapid link acquisition followed by much slower rates may indicate a site that is not developing in step with its community. SEO practitioners should avoid quick-fix campaigns to gain backlinks illegitimately, such as hidden links (e.g. in page counters and other 'free' items) and paid links (link farming).

The Google PageRank system

Google uses an algorithm of scalable relevance: its intent is to remain focused on the desires of the user as a guide to tuning its PageRank programme. There are numerous factors that Google has learned to incorporate in its algorithm, some in reaction to 'spamming', or attempts by web designers to manipulate search results unfairly. These hundreds of factors, with a little generous imagination, can ultimately be boiled down to two broad groups: relevancy and popularity. Both of these need to be attended to for a site to succeed in the SERP derby.
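As a rough illustration of the popularity half only, the sketch below implements the original PageRank idea (power iteration over a link graph) in Python, using an invented set of pages; it is a minimal teaching example, not Google's production algorithm:

# Minimal PageRank sketch (power iteration). Toy example only.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages  # a page with no outlinks shares its rank with all pages
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical link graph between four pages of a small site
toy_web = {
    "home": ["mathematics", "trigonometry"],
    "mathematics": ["trigonometry", "home"],
    "physics": ["home", "trigonometry"],
    "trigonometry": ["home"],
}
for page, score in sorted(pagerank(toy_web).items(), key=lambda item: -item[1]):
    print(f"{page:15s} {score:.3f}")

Pages which attract links from other well-linked pages accumulate higher scores, which is the intuition behind the backlink counting described above.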

Popularity

This may refer equally to the number of visitors to a site and to the number of links to its pages from other domains. For the SEO practitioner, the number of relevant hyperlinks should be the more dominant factor in his or her thinking, since these imply a more serious acknowledgement of the importance of the site by the linker. As with YouTube videos, the raw number of views is no true indicator of the inherent satisfaction of the viewer.

A distinction needs to be made between 'page popularity' and 'domain popularity'. Some services are offered from a single page (e.g. a search engine, or news and meteorological services), or consist of frequent updates to an application (such as video players and the Adobe PDF Reader). These applications are updated regularly from a single page by people who do not normally extend their visit to the rest of the domain in question. As a result, there is high page popularity but relatively low domain popularity.

Other domains, such as wikipedia.org, are heavily visited, but not for specific pages, so have high domain popularity, but low page popularity (except perhaps the homepage).

The best ranking will be achieved by a domain which can achieve the optimum combination of these two popularity metrics for its purposes. There is no easy solution, since this dynamic is dependent on many and diverse market and service specific factors.

Relevancy

Relevancy is the degree to which a recommended SERP page satisfies the user's intended query. One of the most frightening things about SEO is that it is a double-edged sword: if visitors using certain keywords tend to bounce, or leave a domain quickly, Google will develop a profile that downgrades that domain for the keywords involved in the query.

Relevancy is also the relationship between a page and a specific theme. In this debate, the authority granted the site by the specialised internet community (i.e. established and authoritative sites which specialise in the same field as the domain in question) is a factor, as is the semantic analysis of the page and its content.

When a search engine crawls a page, it collects a large range of data concerning the page's content, structure and other factors, such as how parts of the content relate to other parts. The primary tool in this operation is the long-established science of information retrieval (IR). The two chief criteria used in SEO IR are relevance and importance. In library systems, citation analysis was used to determine whether an academic community considered a document to be important; in SEO, the equivalent is link analysis.
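As a toy illustration of the IR notion of relevance (a crude term-frequency count over invented page texts; real search engines use far more sophisticated scoring):

# Score invented page texts against a query by counting query-term occurrences,
# normalised by page length - a crude stand-in for IR relevance scoring.
from collections import Counter

def relevance(query, text):
    terms = Counter(text.lower().split())
    total = sum(terms.values())
    return sum(terms[w] for w in query.lower().split()) / total if total else 0.0

pages = {
    "trigonometry": "sine cosine tangent angle triangle ratio angle",
    "link farming": "links exchanged for payment between consenting sites",
}
for name, text in pages.items():
    print(name, round(relevance("triangle angle", text), 3))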

