
Types of Links

  • One Way Links
A "one-way text link" means that a Web site links to your Web site but you don’t link back to that Web site. One-way links, also known as non-reciprocal links, are much more difficult to obtain than traditional reciprocal links. In the eyes of search engines, one-way text links have a more natural flow and appear less self-serving. As a result, they are very valuable to your business because the more one-way links you have, the better your chances of an improved search engine ranking, page rank and link popularity.

  • Reciprocal Links / Two Way Links
A reciprocal link is where website [A] links to website [B], which in turn links to website [A].
Links between two sites, often based on an agreement by the site owners to exchange links.

Reciprocal links are based on an agreement by two sites to link to each other. Reciprocal linking is often used by small/midsize sites as an inexpensive way to increase Web site traffic and link popularity.

  • Three way Links
A three-way link exchange is similar to a two-way exchange, except that instead of placing the return link on your own site, you place it on another site you own.
Three-way link exchange is a cultivated term, not a generic one. It is used when more than two pages are involved in the linking chain: Page A links to Page B, and Page B links to Page C, but Page B does not link back to Page A and Page C does not link back to Page B.

This is usually done when you own two websites: one already has many backlinks and PageRank, while the other has few backlinks and little or no PageRank. You place a link to your partner's site on the established website, and in return your link-exchange partner places a link to your site that does not yet have any PageRank.

Google Toolbar

Download the Google Toolbar from

http://toolbar.google.com

Important Points
  • PageRank is not the same for all the pages of a site.
  • Google assigns a PageRank to each and every page in its index.
  • PageRank can be different for http://www.site.com and http://site.com
  • PageRank is measured on a scale of 0 to 10.
  • Google PageRank has no direct relationship with a page's position in the Google SERP (Search Engine Results Page).

Google Page Rank

PageRank is Google's measure of the quality and importance of a web page, on a scale of 0 to 10. Each link from another page acts, in effect, as a "vote": PageRank is a vote by all the other pages on the Internet, and pages that attract more high-quality links receive a higher PageRank.

Off page Optimization

Avoid things that search engines hate

Dealing with Frames

A framed site is one in which the browser window is broken into two or more parts, each of which holds a Web page. Frames cause a number of problems, as follows:

  • Some search engines have trouble getting through the frame-definition or frameset page to the actual web pages.
  • If the search engine gets through, it indexes individual pages, not frame sets. Each page is indexed separately.
  • You cannot point to a particular page in your site. This may be a problem in the following situations:

Linking campaigns. Other sites can link only to the front of your site; they can't link to specific pages during a link campaign.

Pay-per-click campaigns. If you are running a pay-per-click (PPC) campaign, you cannot link directly to a page related to a particular product.

Placing your products in the shopping directories. In this case, you need to link to a particular product page.

Search Engine Spamming Techniques that should be avoided

Use of Invisible text:
Using invisible text is an extremely common search spamming practice in which a spammer uses the same (or a very similar) color for the text as for the background. Invisible text is used to stuff pages with keywords that are visible to search engines but invisible to viewers. All major search engines today can identify this kind of spamming easily and can penalize the site.


Stuffing keywords:
Keyword stuffing can be done in many ways. The most common is using invisible text. Other methods involve using keywords in very small fonts at the bottom of pages, using keywords in hidden tags (such as the noframes tag, alt attributes, hidden input values, and option tags), or stuffing the main content with repetitive keywords. Keyword stuffing is a trick that most search engines today are able to sniff out.

Link Spamming:
Link spamming is the process of spamming search engines by obtaining thousands of inbound links from link farms, forums, blogs, free-for-all pages, and unrelated websites, or even by registering hundreds of domains and interlinking them into a link empire. Put simply, link spamming is the practice of getting inbound links through unethical means solely for the purpose of ranking higher in search engines.


Most search engines give a high level of importance to inbound links and consider them an indication that a site is credible. Participating in free-for-all pages and link farms can get any site thousands of inbound links, which can make it look important in the eyes of the search engines when it actually isn't. Most search engines today have strict measures to deal with this kind of spamming and can even ban an involved website completely from their search listings.


Cloaking:
Simply put, cloaking is any process that involves presenting search engines with one set of information and visitors with another. The copy presented to the search engines is highly optimized, so the search engine may rank the page higher than the version shown to visitors would merit.

Creating Doorway Pages:
Doorway pages are pages that are highly optimized for search engines. At their worst they are junk pages containing nothing but keywords and irrelevant content. When a visitor lands on such a page, he is either automatically redirected or asked to click a JavaScript link.

Not all doorway pages have to look this way. Good, related information can be provided, making the doorway page informative instead of a page that contains nothing but junk. It is only when one takes shortcuts that doorway pages degenerate into junk instead of offering informational content.

Page redirects:
Often people create spam filled Web pages intended for the eyes of search engines only. When someone visits those pages, they are redirected to the real page by META refresh tags, CGI, Java, JavaScript, or server side techniques. There are legitimate reasons for cloaking and similar techniques, but don't use them unless you know exactly what you are doing.

Duplicate Content:
It's possible that content duplication is the single biggest SEO problem faced by websites today. Whether the duplication is intentional or not, presenting the same content to search engines under multiple URLs can cause a site to be ranked poorly or penalized. In some cases, it prevents indexing entirely.



Google Sitemap.xml

What is Google Sitemap.xml?

The Sitemap Protocol allows you to inform search engines about URLs on your websites that are available for crawling. In its simplest form, a Sitemap that uses the Sitemap Protocol is an XML file that lists URLs for a site. The protocol was written to be highly scalable so it can accommodate sites of any size. It also enables webmasters to include additional information about each URL (when it was last updated; how often it changes; how important it is in relation to other URLs in the site) so that search engines can more intelligently crawl the site.
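As an illustration of the protocol, here is a minimal Sitemap file with a single URL entry. The URL and the values for the optional elements are placeholders; only the <loc> element is required:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL (required) -->
    <loc>http://www.example.com/</loc>
    <!-- Optional hints for crawlers -->
    <lastmod>2007-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The <lastmod>, <changefreq>, and <priority> elements carry the extra information described above: when the URL was last updated, how often it changes, and how important it is relative to the site's other URLs.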



Sitemaps are particularly beneficial when users can't reach all areas of a website through a browseable interface. (Generally, this is when users are unable to reach certain pages or regions of a site by following links). For example, any site where certain pages are only accessible via a search form would benefit from creating a Sitemap and submitting it to search engines.



This document describes the formats for Sitemap files and also explains where you should post your Sitemap files so that search engines can retrieve them.



Please note that the Sitemap Protocol supplements, but does not replace, the crawl-based mechanisms that search engines already use to discover URLs. By submitting a Sitemap (or Sitemaps) to a search engine, you will help that engine's crawlers to do a better job of crawling your site.


Using this protocol does not guarantee that your webpages will be included in search indexes. (Note that using this protocol will not influence the way your pages are ranked by Google.)

Robots.txt

Spiders and Robots Exclusion

Web Robots are programs that automatically traverse the Web's hypertext structure by retrieving a document, and recursively retrieving all documents that are referenced. This page explains how you can control what these robots do when visiting your site.

What is Robots.txt?

The Robots Exclusion Protocol is a method that allows Web site administrators to indicate to visiting robots which parts of their site should not be visited by the robot. When a Robot visits a web site, it first checks for the file robots.txt in the root directory; e.g. http://Stars.com/robots.txt. If it can find this file, it will analyze its contents to see if it may retrieve further documents (files). You can customize the robots.txt file to apply only to specific robots, and to disallow access to specific directories or files. Here is a sample robots.txt file that prevents all robots from visiting the entire site:-

# Tells Scanning Robots Where They Are and Are Not Welcome
# User-agent:           can also specify by name; "*" is for everyone
# Disallow:               if this matches first part of requested path,
# forget it
User-agent: *    # applies to all robots
Disallow: / # disallow indexing of all pages

The record starts with one or more User-agent lines, specifying which robots the record applies to, followed by "Disallow" and "Allow" instructions to that robot. To evaluate if access to a URL is allowed, a robot must attempt to match the paths in Allow and Disallow lines against the URL, in the order they occur in the record. The first match found is used. If no match is found, the default assumption is that the URL is allowed.

For example:

User-agent: webcrawler
User-agent: infoseek
Allow:    /tmp/ok.html
Disallow: /tmp

WebCrawler and InfoSeek are not allowed access to the /tmp/ directory, except to access ok.html. All other robots are allowed unrestricted access.
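These matching rules can be checked with Python's standard library. The sketch below (the module and method names are real; the URLs are placeholders) feeds the record above to urllib.robotparser and queries it:

```python
from urllib import robotparser

# The example record from above, as it would appear in robots.txt
rules = """\
User-agent: webcrawler
User-agent: infoseek
Allow:    /tmp/ok.html
Disallow: /tmp
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The Allow line matches first, so ok.html may be fetched
print(rp.can_fetch("webcrawler", "http://www.example.com/tmp/ok.html"))   # True
# Everything else under /tmp is disallowed for these two robots
print(rp.can_fetch("infoseek", "http://www.example.com/tmp/index.html"))  # False
# Robots not named in the record fall through to the default: allowed
print(rp.can_fetch("otherbot", "http://www.example.com/tmp/index.html"))  # True
```

In real code you would call rp.set_url() and rp.read() to fetch the live robots.txt instead of parsing a string.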

Sometimes robots get stuck in CGI programs, trying to invoke all possible outputs. The following record keeps robots out of the cgi-bin directory and disallows execution of a specific CGI program in the /Ads/ directory:-

User-agent: *
Disallow: /cgi-bin/
Disallow: /Ads/banner.cgi
Robots.txt is a simple text file that's uploaded to the root directory of a website. Spiders request this file first, and process it before they crawl the site. The simplest robots.txt file possible is this:
 
User-agent: *
Disallow:
That's it. The first line identifies the user agent; an asterisk means that the following lines apply to all agents. The blank after Disallow: means that no part of the site is off limits.
 
This robots.txt file doesn't do anything: all user agents are able to see everything on the site. It's still worth putting a robots.txt file on every website, even if it doesn't restrict the content that spiders may access. Doing so prevents the server from returning (and logging) a 404 Not Found error every time a spider requests robots.txt. Although a missing robots.txt file does no real harm from an SEO perspective, the 404 errors can be annoying to webmasters who are examining log files in an attempt to identify real problems.

H1 Tag

The title/heading of the page should be placed in an H1 tag, and its length should be limited to 60-80 characters. A common convention is to use the H1 tag up to three times on the home page and once on each inner page.
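As a small sketch (the heading text and keyword phrase are placeholders), the markup looks like this:

```html
<!-- Main page heading carries the primary keyword phrase -->
<h1>Student Loan Consolidation</h1>

<!-- Subheadings use lower-level header tags -->
<h2>How consolidation works</h2>
<p>Body text that repeats the important keyword phrases in natural language.</p>
```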

Internal Links And Sitemap page Optimization

  • Creating Sitemap Page:
A sitemap is a web page that lists all the pages on a website, typically organized in a hierarchical way. A sitemap helps visitors and search engine bots find pages on the site quickly. Depending on the size of your website, it may actually link to all of your pages.
A truncated example version (for EDFED) would sit in the top left corner of the home page.

  • Creating navigation structure

Creating navigation structure, that search engine can read:

Your navigation structure needs to be visible to the search engines. A navigation structure created with JavaScript will not work for the search engines.

  1. Add a site map page and link to it from your main navigation. It provides another way for search engines to find out all pages.
  2. Whenever possible, provide keywords in text links as part of the navigation structure.
  3. Even if your navigation structure is visible to search engines, you may want to have these bottom-of-page text links as well. They are convenient for site visitors and give the search engines another chance to find your other pages.
Creating a good, clean directory structure is also very important. Keep your pages as close to the root domain as possible rather than using a complicated multilevel directory structure. Create a directory for each navigation tab and keep all the related files in that directory.
e.g. http://www.edfed.com/student-loan-consolidation/student-loan-consolidation.php

  • Naming Files
Give proper keyword- or product-related names to all images, pages, and other files, and keep page names in lowercase.
e.g.
http://www.edfed.com/student-loan-consolidation.php

  • Avoiding Flash Animation
You should avoid using Flash Animations in the site because search engines cannot read the Flash Animations. However, if it is necessary to use Flash Animations in the site then there are a few things you can do to help with indexing:
  1. Make sure you use your TITLE tags and DESCRIPTION and KEYWORDS Meta tag.
  2. Put descriptive text between the tags. You can separate keywords with either a comma and no space or a space and no comma; there is no need to use both together.
  3. Make sure that most of the keywords in the tag are also in the body text.
  4. Don’t use a lot of repetition.
  5. Don’t use the same KEYWORD tag in all your pages.
  • Avoiding embedded text in images
Avoid embedding text in images: use text-based content in place of images that merely display text. Large images should also be avoided on the home page.
  • Adding body text
You need text on the page, ideally in the 100-250 word range. (If you are putting an article on the page and the article runs to 1,000 words, that is fine too.) That amount of content allows you to really define what the page is about and helps the search engine understand it.
  1. The home page must contain short information about all the facilities and products.
  2. Other pages should contain information relevant to that page, along with the important keywords related to it.
  3. Keywords should be used and repeated in the content of all pages.
  • Creating Headers
HTML has several tags that define headers: H1, H2, H3 and so on. These headers are useful in search engine optimization because when you put keywords into a heading, you are telling the search engine, "These keywords are so important that they appear in heading text," and the search engine pays more attention to them, weighing them more heavily than keywords in body text.
  • Formatting text
You can also tell the search engines that a particular word might be significant in several other ways. If text is displayed differently from the rest of the page, the search engine may assume it has been set off for a reason. Following are a few ways to set your keywords apart from the other words on the page:
  1. Make the text bold.
  2. Make the text italic.
  3. Uppercase the first letter in each word, and lowercase the other letters in the word.
  • Creating links
Links in your pages serve several purposes:
  1. They help searchbots find other pages in your site.
  2. Keywords in links tell search engines about the pages that the links are pointing at.
  3. Keywords in links also tell the search engines about the page containing the links.
  4. All the links must be text based hyperlinks.
  • URL Rewriting
URL rewriting takes the variables out of a dynamic site URL and remaps them into a shorter version. For example:
  1. Old URL: http://www.example.com/index.php?this=1234&that=5678
  2. Shorter, rewritten URL: http://www.example.com/1234/5678
When a user requests a URL such as http://www.example.com/1234/5678, that URL is rewritten by the server behind the scenes so that the script receives the variables this=1234 and that=5678. The scripts that drive the site must be written so that they create the shorter URLs for links. The server translates requested URLs internally without showing the translated version to visitors.
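On an Apache server this kind of remapping is typically done with mod_rewrite. The sketch below is one possible rule for the example URLs above, assuming a .htaccess file at the site root; the script name and parameter names are taken from the example, not from any real site:

```apache
RewriteEngine On
# Internally rewrite /1234/5678 to /index.php?this=1234&that=5678
# [L] stops rule processing; the visitor still sees the short URL
RewriteRule ^([0-9]+)/([0-9]+)$ /index.php?this=$1&that=$2 [L]
```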

Anchor text


Anchor text is the hyperlinked words on a web page – the words you click on when you click a link. Anchor text usually gives your visitors useful information about the content of the page you're linking to. It tells search engines what the page is about. Used wisely, it boosts your rankings in search engines, especially in Google.

You can use anchor text in:

* External links – links from other sites
* Internal links – links on your pages
* Navigation maps
* Links on your main page. A very important spot.
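In HTML, the anchor text is simply the content of the A element. A keyword-rich link tells search engines far more about the target page than a generic one (the URL here is a placeholder):

```html
<!-- Poor: says nothing about the target page -->
<a href="http://www.example.com/loans.php">click here</a>

<!-- Better: the anchor text describes the linked-to page -->
<a href="http://www.example.com/loans.php">student loan consolidation</a>
```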

On page Factors (Title Tags, Meta Tags, Alt Tags)

  • What are Meta Tags?
An element of HTML coding on a website that is used by search engines to index a website. Most meta-tags are included within the 'header' code of a website and the most important tags are the title, description and keyword tags. Rules used by different search engines govern how such tags are used, how many characters they should contain, and how they should be formatted.

  • What is TITLE Tag?

The TITLE tag is one of the most important components as far as search engines are concerned, because search engines use the site's TITLE tag as the link and main title of the site's listing on the search engine results page. Following are some rules about TITLE tags:

1. Place your TITLE tags immediately below the HEAD tag.

2. Place 40 to 65 characters between the TITLE tags, including spaces.

3. Put the keyword phrase you want to focus on for this page at the very beginning of the TITLE.

  • What is Description Meta Tag:

An HTML tag that gives a general description of the contents of the page. This description is not displayed on the page itself, but is largely intended to help the search engines index the page correctly.

The DESCRIPTION Meta tag describes the web page to the search engines.

1. Search engines read and index the text in the tag.
2. Some search engines take the text from the DESCRIPTION tag and place it under the text from the TITLE tag, so searchers can read your description.
3. Sometimes a search engine may not use the DESCRIPTION you provide; instead it may build a description from keywords in the body text of the page, or use the description from a standard web directory.
4. Some smaller search engines use the DESCRIPTION tag in their results.

Following are some rules about the DESCRIPTION tags

1. The DESCRIPTION Meta tag is very important so you should use it in your site.
2. Place your DESCRIPTION tag immediately below the TITLE tags.
3. Create a nice keyworded description of up to 250 characters, including spaces.
4. Duplicate your important keywords once in the description but not more than that.

  • What is Keyword Meta Tag:

This Meta tag was important years ago, but it is not so important these days. Some search engines may use it, but many don't.

Following are some rules about the KEYWORDS tag

1. Limit the tag to 300 characters, including spaces.
2. Separate keywords with either a comma and no space or a space and no comma; there is no need to use both together.
3. Make sure that most of the keywords in the tag are also in the body text.
4. Don’t use a lot of repetition.
5. Don’t use the same KEYWORDS tag in all your pages.
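Putting the rules for the TITLE, DESCRIPTION, and KEYWORDS tags together, a page's HEAD section might look like the following sketch. The text, keywords, and business are placeholders invented for illustration:

```html
<head>
  <!-- TITLE first, 40-65 characters, keyword phrase at the start -->
  <title>Student Loan Consolidation - Lower Your Monthly Payments</title>
  <!-- DESCRIPTION next, up to 250 characters -->
  <meta name="description"
        content="Consolidate your federal student loans into one low monthly payment. Free student loan consolidation quotes.">
  <!-- KEYWORDS: comma with no space between phrases -->
  <meta name="keywords"
        content="student loan consolidation,student loans,loan consolidation">
</head>
```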

  • Using Other Meta tags

There are many other Meta tags. Not all of them are important, but some are useful, as follows:


Including image ALT text

You use the image (IMG) tag to insert images on web pages. This tag can include the ALT= attribute, which supplies alternative text. Search engines read ALT text as well.
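For example (the file name and text are placeholders), a keyword-named image with descriptive ALT text looks like this:

```html
<img src="student-loan-consolidation.gif"
     alt="Student loan consolidation application form">
```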

On page Optimization


The following areas come under on-page optimization.

Where you must use keyword phrases

  • Title tag - The title is particularly important and should include your primary keyword phrase and if possible your secondary phrase
  • Description tag - If your description contains the search term people enter, and it is the first text that Googlebot comes across, then you’ve a good chance that Google will display your description in the results
  • Headings and subheadings - The tags that are used throughout an article should contain keywords. So a heading of “Our new product range” is poor; “Our new range of vegetarian dog food” is better
  • Body copy - The writer should understand what the keyword phrases are and use them in natural language on the page
  • Links on the page - The words used in live links tell the search engine what “this page” is about and also what the “linked-to page” is about
  • Alt text - For every image, write an alt attribute tag, good for both accessibility and optimization

Additional areas/places
  • References at the bottom of the article
  • Add a takeaway box
  • Link title
  • Testimonials from satisfied customers
  • Signature box at the end of the article
Reference
http://www.wordtracker.com/keyword-research-guide.pdf

Tips for Effective Keyword Research

  • Start with a goal in mind

As with all marketing efforts, you should have a goal for what you are trying to achieve. Whether it is to increase sales, gain newsletter subscribers, or get new prospect leads, this goal will help set priorities for keyword phrase selection and will shape the overall success of your Search Engine Optimization efforts. Take the time to discuss and establish goals and priorities at the start.

  • Think like a Prospective Customer

When it comes to search sites, it is the prospective customer who initiates the dialog (query), so it is important that you select keyword phrases that reflect what a prospective customer will enter at the search site. Users often enter phrases that are not the legal or proper terms for a product or service, and sometimes the phrases they enter may be totally outside industry jargon. In either case, be sure to select keywords that are naturally intuitive to the customer.

  • Brainstorm
Brainstorming is a great way to start generating a list of possible keyword phrases. The brainstorming group should include those with direct customer access (sales reps, customer support representatives, product/content experts, and marketing – don’t forget to review your own log files for additional data). Try to generate as many words as possible and then narrow your list quantitatively using industry tools and qualitatively using your industry knowledge. When you brainstorm, remember to step out of yourself and into the mindset of the prospective customer.

  • Check your Content for Keywords
If the Keywords you optimize for are not used within the copy of your site, you will not be able to get rankings for those terms. Once you have established a list of keywords, review your site. Be prepared to add new content to your site that can be optimized to support the new mix of Keywords.

  • Goal is Conversions not Traffic
The goal of Search Engine Optimization, and of your keyword research, is not just to get traffic, but to get valuable, 'qualified' traffic that can lead to actual conversions. You will find that different keyword phrases have large differences in conversion rates and value for your site. Don't forget that increased revenue will be the ultimate measure of your success.

  • Avoid short, common phrases
Short phrases (1 or 2 words) that are also common phrases can be too general and are not the best to optimize for. Common phrases can lead to a large portion of the traffic that is searching on similar terms but not looking for your product or service. The amount of time and effort needed to gain valuable rankings and traffic for those phrases will also be much greater than the time required for more targeted keyword phrases. Targeted keyword phrases may generate slightly less overall traffic, but with targeted keywords you will achieve ‘qualified’ traffic – leading to increased revenues.

  • Keyword Research is an ongoing process
Keyword research is a continuous process. Whatever you do, don't stop. As your SEO program progresses, examine which words perform best, enhance your keyword list based on what you learn, go back and review old keyword ideas against current data and usage trends, and continue to review your site content for synergy with your keyword list. Ongoing keyword research will uncover new keyword phrases while keeping you abreast of your competition and of shifts in the search site industry.

Keyword Research Tools

http://inventory.overture.com/d/searchinventory/suggestion/

http://www.wordtracker.com/

http://tools.seobook.com/general/keyword/


The above are the 3 main Keyword Research tools.

Advantages of Keyword Research

  • Drive traffic to your site by using the words people use when they’re searching
  • Write great website copy by incorporating terms that people immediately identify with
  • Plan profitable pay-per-click campaigns by building up a broad range of keyword phrases that will capture your market
  • Develop great content ideas that directly address your customers’ needs
  • Understand your customers’ behavior and concerns by analyzing the words that they use
  • Measure the size of a potential online market by the number of searches conducted, and
  • Develop new revenue streams by using popular keywords to inspire new product and service ideas

What is Keyword Research?

Keyword Research can be defined as choosing the words that describe your product or service as seen from the viewpoint of your target market; it is the most important step in the optimization process.

Keyword Research is the process of defining the right mix of keywords: those most likely to be used by potential customers at search engines and directories.

Keyword research draws on a variety of inputs, both qualitative and quantitative, both creative and analytical. It plays a crucial role in ensuring that your prospective customers will find you when they are looking for your specific products and services.

Why do we need Keywords

Choosing keywords is one of the starting points for good search engine optimization (SEO).

Selecting SEO keywords is one of the very first things you will do when you begin to optimize your web site for high natural search engine rankings. Although it is foundational, meaning that you practically have to do it, keyword selection is only a small portion of the work that needs to be done for good SEO, and on its own it is one of the least impactful components.

Keyword Research & Analysis