
Learning SEO Tips

Learning SEO is important for any webmaster. The SEO tips presented here are intended as a guide for webmasters who have only recently become aware of search engine optimization (SEO). You need not be a technical wizard to use them. These tips also include many search engine optimization techniques for more experienced web site designers as well as new SEO webmasters.

You can do your own basic search engine optimization, using my SEO tips as your guide. I have filled the pages of this blog with easy-to-follow search engine optimization (SEO) tips, and after you have read them all you should have a feel for how to do SEO for your own web site. Everyone loves good SEO tips.

The following are some quick, effective SEO tips.

  • Content is king: publish plenty of original, fresh content.
  • Quality backlinks: build a network of quality backlinks.
  • PageRank matters, but don't be obsessed with it.
  • Don't design your web site without considering SEO.
  • When doing keyword analysis, do not focus on single keywords; focus on key phrases.
  • On every page of your site, use a unique, keyword-focused title tag.
  • A site map will help visitors and crawlers find what you have to offer.
  • When optimizing blog posts, make each post's title tag independent of your blog's title.

The above SEO tips are easy to follow.

By following these search engine optimization (SEO) tips, you can do your own SEO for your site.

For the latest SEO tips and tricks, feel free to visit Good SEO Tips.

SEO - A Basic Introduction to Search Engine Optimization (SEO)


What is "SEO"?

Search engine optimization (SEO, also search optimization) is the process of editing and organizing the content on a web page, or across a website, to increase its potential relevance to specific keywords on specific search engines.

Alternatively, SEO can be defined as follows:

Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results.

This is done with the aim of achieving a higher organic search listing and thus increasing the volume of traffic from search engines.




The main use of SEO

SEO helps to ensure that a site is accessible to a search engine and improves the chances that the site will be found by the search engine.

So the main purpose of SEO is to earn a website good rankings in the search engines.

Search engines are one of the primary ways that Internet users find web sites, which is why a web site with good search engine listings may see a dramatic increase in traffic. That is what all the SEO work is for.

Everyone wants those good listings. Unfortunately, many web sites appear poorly in search engine rankings, or may not be listed at all, because they fail to consider how search engines work. If you understand SEO and apply it to your site, your site can earn good search engine listings far more easily.

SEO Learning Tips for keyword placement in your website or blog

Keyword research and identification is a major part of any optimization strategy. But if you don't know where to place keywords on your website, all of that work is wasted. So you should know where keywords belong on a page.

The following are some SEO learning tips for placing keywords on your website.

Keywords in Title Tag:

The title tag is the best place to add your targeted keywords. Keep the title tag to no more than 5 or 6 words, and to roughly 70 to 75 characters at most. Your targeted keyword should appear at the beginning of the title tag.
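As a rough illustration of that advice, here is a minimal Python sketch. The 70-character limit and the sample title are only assumptions for the example, not rules from any search engine:

# Minimal sketch: check a proposed title tag against the length and
# keyword-placement advice above. The limits here are illustrative only.
def check_title(title, keyword, max_chars=70):
    """Return a list of warnings for a proposed <title> value."""
    warnings = []
    if len(title) > max_chars:
        warnings.append(f"Title is {len(title)} characters; keep it under {max_chars}.")
    if len(title.split()) > 6:
        warnings.append("Title has more than 6 words.")
    if not title.lower().startswith(keyword.lower()):
        warnings.append("Targeted keyword does not appear at the start of the title.")
    return warnings

print(check_title("SEO Tips for Beginners | MySite", "SEO tips"))   # [] (no warnings)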

Keywords in Heading Tag:

Adding keywords to heading tags is another advantage for a website or blog. Make sure your targeted keyword appears in your heading tags for better results.

Keywords in Meta Tags:

Meta tags carry little weight with Google, but Yahoo, MSN and other search engines still use them, so it is a good idea to place some keywords in your meta tags as well.

Keywords in Alt Tags:

Search engines don't read images, so use some targeted keywords in your alt attributes. But don't stuff the alt text with too many keywords; search engines treat that as spam.

Keywords in your Content:

Content is the king of any website. Try to write as much good, fresh content as possible, and make sure your targeted keywords appear in the content without being repeated too often. A keyword density of around 3 to 5% is preferable.

The above are some of the best seo learning tips for keyword placement in your website or blog.

Benefits of Search Engine Optimization (SEO)

Search engines are reported to generate nearly 90% of Internet traffic and to be responsible for 55% of e-commerce transactions. Search engine promotion is said to deliver the highest ROI of any type of marketing, online or offline. Search engines bring motivated buyers to you and so contribute to increased sales conversions.

Search engine optimization (SEO) offers an affordable entry point for marketing your website and an effective way to promote your business online. SEO is also a long-term solution: it gives you access to sustained free traffic and helps build your brand name and company reputation.

Benefits of SEO
  1. Improves ranking among unpaid search engine listings.
  2. Helps your site rank high in the search engines.
  3. Helps potential customers or clients find the website.
  4. Drives more traffic to the site.
  5. Gains top positions on search engines for various keywords.
  6. Improves the website's performance in organic (natural) search results.

Keyword Proximity and Keyword weight

Keyword Proximity


Keyword proximity refers to the closeness between two or more keywords. In general, the closer the keywords are, the better.

For example:

How Keyword Density Affects Search Engine Rankings

How Keyword Density Affects Rankings In Search Engine

Using the example above, if someone searched for "search engine rankings," a web page containing the first sentence is more likely to rank higher than one containing the second.

The reason is that the keywords are placed closer together. This is assuming that everything else is equal, of course.
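To make "closeness" concrete, here is a small illustrative Python sketch that measures proximity as the number of word positions separating two keywords. The function and its scoring are my own simplification, not any search engine's actual algorithm:

# Illustrative sketch: keyword proximity as the smallest number of word
# positions separating two keywords in a piece of text (smaller is closer).
def keyword_proximity(text, word_a, word_b):
    words = [w.strip('.,:;!?"').lower() for w in text.split()]
    positions_a = [i for i, w in enumerate(words) if w == word_a.lower()]
    positions_b = [i for i, w in enumerate(words) if w == word_b.lower()]
    if not positions_a or not positions_b:
        return None
    return min(abs(a - b) for a in positions_a for b in positions_b)

print(keyword_proximity("How Keyword Density Affects Search Engine Rankings",
                        "engine", "rankings"))    # 1 (adjacent)
print(keyword_proximity("How Keyword Density Affects Rankings In Search Engine",
                        "engine", "rankings"))    # 3 (further apart)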

Keyword weight

Keyword weight is the percentage or concentration of keywords on your page in relation to all other words on the page. A "keyword" can be either a single word, or a short phrase.

Keyword weight refers to the number of keywords appearing in the page area divided by the total number of words appearing in that area. Weight also depends on whether the keyword is a single word or a multi-word phrase.

Keyword Frequency and Keyword Prominence

Keyword Frequency


Keyword frequency refers to the number of times a keyword or keyword phrase appears within a web page.

The theory is that the more times a keyword or keyword phrase appears within a web page, the more relevance a search engine is likely to give the page for a search with those keywords.

In general, I recommend that you ensure the most important keyword or keyword phrase is the most frequently used one on the page.

But be careful not to abuse the system by repeating the same keyword or keyword phrases over and over again.

Keyword Prominence

Keyword prominence refers to how prominent keywords are within a web page.

The general recommendation is to place important keywords at, or near, the start of a web page, sentence, TITLE or META tag.

SEO vs SEO 2.0 Comparison

SEO: Un-natural linking. Gain backlinks by submitting to directories, buying links, and manually adding links on request. Optimized for links.
SEO 2.0: Natural linking. Gain links through socializing, blogs, forums, and automatic linking within an SEO 2.0 search-exchange community. Optimized for traffic and sales conversions.

SEO: Quantity, through keyword stuffing and repetitive titles and descriptions. Designed with the search engine spiders in mind. Optimized for keywords.
SEO 2.0: Quality, by making things completely unique using an LSI content structure with no keyword stuffing. Designed with not only the search engines but also the human viewer in mind. Optimized for tags.

SEO: Competition. Webmasters fight each other trying to gain the best advantages and the top 10 positions.
SEO 2.0: Cooperation. Webmasters help each other by linking to one another and building a strong community so they each get better rankings.

SEO: Introverted. "We're not doing SEO; no, we can't show you our client list, so don't ask": secretive SEO companies.
SEO 2.0: Extraverted. "Welcome to our newest client, Company X; we are glad they decided to join our family."

SEO: Optimization for clicks, page views and visits.
SEO 2.0: Innovation for sales conversions, ROI and company branding.

SEO: Link structure. Inbound links to the home page only; links from the home page to the interior pages.
SEO 2.0: Link infrastructure. Inbound links to all pages; links from interior pages out to the home page.

SEO: Non-authoritative. Building your site from the top down, putting all your emphasis on your home page.
SEO 2.0: Authoritative. Building your site from the bottom up, making each page just as important as the home page.

SEO: On-page optimization. Cleaning up code, adding keywords, and writing content.
SEO 2.0: Off-page optimization. Gaining links, joining networks, social bookmarking, and exchanging links.

SEO vs SEO 2.0: Top 15 Differences

Below are some of the differences between SEO and SEO 2.0.

  1. SEO: Link building; manually adding links, submitting static websites to directories, link exchange, paying for links. SEO 2.0: Getting links and enhancing them through actions like blogging, writing pillar content, creating link bait and socializing.
  2. SEO: On-site optimization for spiders, for example repetitive page titles concentrating (solely) on keywords. SEO 2.0: On-site optimization for users, for example kick-ass post titles.
  3. SEO: Competition; you compete with others to be on the first page / in the Google top 10 for keywords. SEO 2.0: Cooperation; you cooperate with each other by submitting fellow bloggers to social media, voting for them and linking to them.
  4. SEO: Barter; "you give me a link and only then will I give you one." SEO 2.0: Giving; "I link to you regardless of whether you link back," though in most cases you will, more than once.
  5. SEO: Hiding; "we're not doing SEO, we can't show our client list publicly," the impersonal SEO company. SEO 2.0: Being open; "welcome to our new client XYZ, we are proud to work together with them," in the style of Rand Fishkin and his team.
  6. SEO: Keywords. SEO 2.0: Tags.
  7. SEO: Optimization for links. SEO 2.0: Optimization for traffic.
  8. SEO: Clicks, page views, visits. SEO 2.0: Conversions, ROI, branding.
  9. SEO: DMOZ. SEO 2.0: del.icio.us.
  10. SEO: Main traffic sources are Google, Yahoo and MSN. SEO 2.0: Main traffic sources are StumbleUpon, niche social news sites and blogs.
  11. SEO: One-way communication. SEO 2.0: Dialog, conversation.
  12. SEO: Top down. SEO 2.0: Bottom up.
  13. SEO: Undemocratic; whoever pays most is on top. SEO 2.0: Democratic; whoever responds to popular demand is on top.
  14. SEO: 50% automated. SEO 2.0: 10% automated.
  15. SEO: Technocratic. SEO 2.0: Emotional.

Keyword Density

Keyword density is the density of a particular keyword, or set of keywords, on an individual web page. It measures how many times a keyword is repeated relative to the overall content of the page. If a keyword appeared 5 times in a word count of 50, the keyword density would be 10%. If it is repeated too often, though, the page could be penalized for spamming.

Alternatively, keyword density can be defined as the ratio (percentage) of keywords contained within the total number of indexable words on a web page.

The preferred keyword density ratio varies from search engine to search engine. In general, I recommend using a keyword density ratio in the range of 2-8%.
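Here is a minimal Python sketch of that calculation for a single-word keyword; it simply reproduces the worked example above (5 occurrences in 50 words gives 10%), and the sample text is invented:

# Minimal sketch of keyword density: occurrences of the keyword divided by
# the total word count, expressed as a percentage. Single-word keywords only.
def keyword_density(text, keyword):
    words = [w.strip('.,:;!?"').lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Worked example from the text: 5 occurrences in 50 words -> 10%.
sample = ("seo " + "filler " * 9) * 5         # 50 words, 5 of them "seo"
print(keyword_density(sample, "seo"))          # 10.0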

Some of the Keyword Density Checker tools are as follows.

http://www.keyworddensity.com/

http://www.seochat.com/seo-tools/keyword-density/

http://tools.seobook.com/general/keyword-density/

http://www.iwebtool.com/keyword_density

http://www.webconfs.com/keyword-density-checker.php

Keyword Effectiveness Index (KEI)

The Keyword Effectiveness Index (KEI) is a measure of how effective a keyword is for your web site. The derivation of the formula for KEI is based on three axioms:

  1. The KEI for a keyword should increase as its popularity increases. Popularity is defined as the number in the "Count" column of WordTracker. This axiom is self-explanatory.
  2. The KEI for a keyword should decrease as the keyword becomes more competitive. Competitiveness is defined as the number of sites that AltaVista displays when you search for that keyword using an exact-match search (i.e., with quotes around the keyword). This axiom is also self-explanatory.
  3. If a keyword becomes more popular and more competitive at the same time, such that the ratio between its popularity and competitiveness remains the same, its KEI should increase. The rationale behind this axiom requires a more detailed explanation. The best way to do this is to take an example:
Suppose the popularity of a keyword is 4 and AltaVista displays 100 sites for that keyword. Then the ratio between popularity and competitiveness for that keyword is 4/100 = 0.04.

Suppose that both the popularity and the competitiveness of the keyword increase. Assume that the popularity increases to 40 and AltaVista now displays 1000 sites for that keyword. Then the ratio between popularity and competitiveness for that keyword is 40/1000 = 0.04.

Hence, the keyword has the same ratio between popularity and competitiveness as before. However, as is obvious, the keyword would be far more attractive in the second case. If the popularity is only 4, there's hardly any point in spending time trying to optimize your site for it even though you have a bigger chance of ending up in the top 30 since there are only 100 sites which are competing for a top 30 position. Each hit is no doubt important, but from a cost-benefit angle, the keyword is hardly a good choice. However, when the popularity increases to 40, the keyword becomes more attractive even though its competitiveness increases. Although it is now that much more difficult to get a top 30 ranking, spending time in trying to do so is worthwhile from the cost benefit viewpoint.

A good Keyword Effectiveness Index (KEI) must satisfy all three axioms. Let P denote the popularity of the keyword and C its competitiveness.
The formula I have chosen is KEI = P^2/C, i.e. KEI is the square of the keyword's popularity divided by its competitiveness. This formula satisfies all three axioms:

  • If P increases, P^2 increases and hence KEI increases. Hence, Axiom 1 is satisfied.
  • If C increases, KEI decreases and hence, Axiom 2 is satisfied.
  • If P and C both increase such that P/C is the same as before, KEI increases since KEI can be written as KEI = P^2/C = P/C * P. Since P/C remains the same, and P increases, KEI must increase. Hence, Axiom 3 is satisfied.
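The worked example above can be checked with a few lines of Python. This is only a sketch of the formula; the numbers are the ones used in the example, not real WordTracker or AltaVista data:

# KEI = P^2 / C, where P is popularity (search count) and C is
# competitiveness (number of competing pages).
def kei(popularity, competitiveness):
    if competitiveness <= 0:
        raise ValueError("competitiveness must be positive")
    return popularity ** 2 / competitiveness

print(kei(4, 100))     # 0.16
print(kei(40, 1000))   # 1.6  -> higher, as Axiom 3 requires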

Five Steps for Search Engine Optimization Success

SEO can be daunting. It's an ongoing process that may not yield fast results, and there is no way to guarantee a specific ranking for your site. Here are five steps that will simplify the process:

  • Review objectives and goals: Why do you want more search engine visibility? Are you trying to increase sales, gain leads, increase readership of your newsletter, encourage visitors to use a store locator to find your physical locations, or boost site traffic to increase advertising revenue?
  • Monitor the competition: Search engine marketing uniquely allows you to check out your competitors. If you have a gardening website, you can type "gardening" (and related terms such as "gardens," "flowers" and "lawn care") into the major search engines (Google, Yahoo, and MSN) and see your competition, which may surprise you. If you're selling gardening products, you might assume your competitors are other retailers, but in search engines you are also competing with news sites, personal websites from gardening enthusiasts, and reference guides. Any site that uses your keywords now counts as your competition.
  • Research keywords: Studying the keywords your consumers use to find you and your competitors tells you how your customers think. Consider the words consumers use to find you and the words you want to be associated with.
  • Measure results: You can monitor your site traffic by keyword. Establishing current benchmarks allows you to see changes over time. Google offers a free analytics tool, or you can invest in a more sophisticated system such as Omniture.com, Coremetrics.com, or WebTrends.com.
  • Launch the program: Depending on the number of pages on your site, the goals you want to accomplish, and how well your site is already optimized, this can involve a few quick fixes or a lengthy, labor-intensive process. There are three main components to SEO:

=> Technical changes: e.g., update page titles to reflect the keywords you selected and design pages so search engines can easily read them.
=> Gain more links to your site: such as from affiliates, partners, blogs, and other sources.
=> Update the content on your site to reflect how you want search engines and consumers to see you: e.g., adding richer product descriptions, posting articles and reference materials, and encouraging consumers to submit their own content.

Uses of Blogs

Some of the uses of blogs are as follows:

  • Blogs build regular readership traffic.
  • Blogs are designed to publish and update content easily once you have them set up, configured and running.
  • You can make money with your niche blogs in many ways, such as:

=> Publishing third-party ads in your blogs (e.g., Google AdSense)
=> Recommending affiliate products and services in your blogs (e.g., Amazon)

  • Unlike static websites, blogs are interactive. Visitors, or blog readers, are usually allowed to post comments on a blog post (or article) for the blog owner.
  • Since blogs are usually updated regularly via blog posts and pages, blogs tend to rank higher in search engines than static websites.

Some of the Best SEO Learning Tips

Know your geographic market and ensure your domain has the correct TLD.


If your primary market is the UK, it is important to use a .uk TLD and ideally to have your domain hosted on UK-based servers.

Include a keyword in your domain name

Try to include a keyword in your domain name; this can also produce better results.

Create as much content about your subject as you can

Create as much good, keyword-rich content about your subject as you can, then divide it into three or four sections.

Find the Best Keywords

You should invest some energy in finding the best keywords. There are several SEO tools available on the Internet to help you find them. Tip: don't be deceived by organizations that require you to register first. The two most popular resources are WordTracker and KeywordDiscovery.com.

Use your keywords as anchor text when linking internally

Anchor text helps tell spiders what the linked-to page is about. Links that just say "click here" do nothing for your search engine visibility.

Your Website Title must be relevant

The title of your website must be relevant to the content on the site. Since Google only displays the first 66 or so characters, keep the title length under 66 characters.

Optimize Your META Tags

META tags are hidden code read only by search engine webcrawlers (also called spiders). They live within the HEAD section of a web page.

The META tags you need to be the most concerned about are:

1. Description

2. Keywords

The description meta tag is usually kept to around 250 characters or fewer.

The keywords meta tag can hold more (up to roughly 1,250 characters), but staying within about 250 characters is preferable there as well.
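As a rough sketch, the following Python snippet builds these two META tags and warns when they exceed the approximate lengths suggested above. The thresholds are only the figures quoted in this section, not official limits, and the sample values are invented:

# Hedged sketch: build description and keywords META tags and warn when they
# exceed the approximate lengths suggested above.
from html import escape

def build_meta_tags(description, keywords):
    if len(description) > 250:
        print("Warning: description is longer than about 250 characters.")
    keyword_value = ", ".join(keywords)
    if len(keyword_value) > 1250:
        print("Warning: keywords value is longer than about 1250 characters.")
    return (f'<meta name="description" content="{escape(description, quote=True)}">\n'
            f'<meta name="keywords" content="{escape(keyword_value, quote=True)}">')

print(build_meta_tags("Easy-to-follow SEO tips for new webmasters.",
                      ["seo tips", "search engine optimization"]))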

Some of the Uses of SEO Learning Tips

When you follow these SEO learning tips, your site can gain the following benefits:

  • They bring traffic to your online business website, which helps it gain popularity and visibility.
  • They are important and valuable tools that can make your online business popular and help it earn plenty of money.
  • They can bring many visitors to your website, who in time can become your customers.
  • If you make use of search engine optimization, your website can gain traffic, reach the top ranks of the major search engines and pull ahead of your competitors.
  • They draw on many methods, such as keyword-rich articles, that can keep your website continuously at the top of the search engine listings.

Page tagging

Concerns about the accuracy of logfile analysis in the presence of caching, and the desire to be able to perform web analytics as an outsourced service, led to the second data collection method, page tagging.

In the mid-1990s, web counters were commonly seen: images included in a web page that showed the number of times the image had been requested, which served as an estimate of the number of visits to that page. In the late 1990s this concept evolved to use a small invisible image instead of a visible one and, by using JavaScript, to pass certain information about the page and the visitor along with the image request. This information can then be processed remotely by a web analytics company, and extensive statistics generated.

The web analytics service also manages the process of assigning a cookie to the user, which can uniquely identify them during their visit and in subsequent visits.
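To illustrate the mechanism, the request generated by such a tag is just an image URL with query parameters, which the analytics server parses. The hostname and parameter names below are invented for the example:

# Illustrative sketch: the page tag requests a tiny image and appends details
# as query parameters; the analytics server parses them like this.
from urllib.parse import urlparse, parse_qs

beacon_request = ("https://analytics.example.com/pixel.gif"
                  "?page=/seo-tips&title=SEO%20Tips&screen=1280x800&visitor=abc123")

params = {k: v[0] for k, v in parse_qs(urlparse(beacon_request).query).items()}
print(params["page"], params["screen"], params["visitor"])
# /seo-tips 1280x800 abc123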

Advantages of page tagging

The main advantages of page tagging over logfile analysis are as follows.

  • The JavaScript is automatically run every time the page is loaded. Thus there are fewer worries about caching.
  • It is easier to add additional information to the JavaScript, which can then be collected by the remote server. For example, information about the visitors' screen sizes, or the price of the goods they purchased, can be added in this way. With logfile analysis, information not normally collected by the web server can only be recorded by modifying the URL.
  • The page tagging service manages the process of assigning cookies to visitors; with logfile analysis, the server has to be configured to do this.
  • Page tagging is available to companies who do not run their own web servers.

Logfile Analysis

Web servers have always recorded all their transactions in a logfile. It was soon realised that these logfiles could be read by a program to provide data on the popularity of the website. In the early 1990s, web site statistics consisted primarily of counting the number of client requests made to the web server. This was a reasonable method initially, since each web site often consisted of a single HTML file. However, with the introduction of images in HTML, and web sites that spanned multiple HTML files, this count became less useful.

Two units of measure were introduced in the mid 1990s to gauge more accurately the amount of human activity on web servers. These were page views and visits (or sessions). A page view was defined as a request made to the web server for a page, as opposed to a graphic, while a visit was defined as a sequence of requests from a uniquely identified client that expired after a certain amount of inactivity, usually 30 minutes. The page views and visits are still commonly displayed metrics, but are now considered rather unsophisticated measurements.


The emergence of search engine spiders and robots in the late 1990s, along with web proxies and dynamically assigned IP addresses for large companies and ISPs, made it more difficult to identify unique human visitors to a website. Log analyzers responded by tracking visits by cookies, and by ignoring requests from known spiders.
The extensive use of web caches also presented a problem for logfile analysis. If a person revisits a page, the second request will often be retrieved from the browser's cache, and so no request will be received by the web server. This means that the person's path through the site is lost. Caching can be defeated by configuring the web server, but this can result in degraded performance for the visitor to the website.
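The two measures defined above can be illustrated with a short Python sketch that reads lines in Common Log Format, counts page requests (ignoring graphics), and starts a new visit after 30 minutes of inactivity. Real log analyzers also handle cookies, spiders and caching; this only demonstrates the definitions, and the sample log lines are invented:

# Rough sketch: page views and visits from a Common Log Format access log.
import re
from datetime import datetime, timedelta

LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)[^"]*"')
TIMEOUT = timedelta(minutes=30)

def page_views_and_visits(lines):
    page_views, visits, last_seen = 0, 0, {}
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        host, timestamp, path = m.groups()
        when = datetime.strptime(timestamp.split()[0], "%d/%b/%Y:%H:%M:%S")
        if not path.lower().endswith((".gif", ".jpg", ".png", ".css", ".js")):
            page_views += 1                     # a page, not a graphic
        if host not in last_seen or when - last_seen[host] > TIMEOUT:
            visits += 1                         # new visit after 30 minutes idle
        last_seen[host] = when
    return page_views, visits

sample = [
    '1.2.3.4 - - [01/Jan/2009:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 512',
    '1.2.3.4 - - [01/Jan/2009:10:00:02 +0000] "GET /logo.gif HTTP/1.0" 200 100',
    '1.2.3.4 - - [01/Jan/2009:11:00:00 +0000] "GET /tips.html HTTP/1.0" 200 900',
]
print(page_views_and_visits(sample))            # (2, 2)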

Advantages of logfile analysis

The main advantages of logfile analysis over page tagging are as follows.
  • The web server normally already produces logfiles, so the raw data is already available. To collect data via page tagging requires changes to the website.
  • The web server reliably records every transaction it makes. Page tagging relies on the visitors' browsers co-operating, which a certain proportion may not do.
  • The data is on the company's own servers, and is in a standard, rather than a proprietary, format. This makes it easy for a company to switch programs later, use several different programs, and analyze historical data with a new program. Page tagging solutions involve vendor lock-in.
  • Logfiles contain information on visits from search engine spiders. Although these should not be reported as part of the human activity, it is important data for performing search engine optimization.
  • Logfiles contain information on failed requests; page tagging only records an event if the page is successfully viewed.

Website Analysis & Reporting

Web Analytics
Web analytics is the measurement of the behaviour of visitors to a website. In a commercial context, it especially refers to the measurement of which aspects of the website work towards the business objectives; for example, which landing pages encourage people to make a purchase. Many different vendors provide web analytics software and services.

Web analytics technologies

There are two main technological approaches to collecting web analytics data. The first method, logfile analysis, reads the logfiles in which the web server records all its transactions. The second method, page tagging, uses JavaScript on each page to notify a third-party server when a page is rendered by a web browser.

1. logfile analysis
2. page tagging

Reporting Tools & Format
  • Website Ranking
  • Page crawling status
  • Status of Web Pages
  • Traffic Graph
  • Analysis of old and new data metrics

Different Ways of Link Popularity

Directory Submissions

  • Free Directories

Directories that accept web site submissions for free.
Directory lists:

http://www.directorycritic.com/free-directory-list.html

http://www.submithelper.com/User/User_Directory.aspx
http://info.vilesilencer.com

  • Paid Directories

Directories that charge money for web site submissions.
1. Yahoo Directory
URL - http://dir.yahoo.com/
Page Rank - PR8
2. MSN Directory
3. Business Directory
Advantages of Paid Directories

  • Fewer outgoing links
  • Helps with Google PageRank
  • Fast approval
  • Search engine friendly links

Directory Submission Process

=> Search for the most relevant category that suits your site.
=> Look for "Add URL" / "Submit URL" / "Submit Link" / "Add Here" / "Submit Here".
=> Fill in the general fields you need to submit:
=> Title
=> URL
=> Description
=> Name
=> E-mail address
=> Check for approval by mail or manually.

Anchor Text:
This is the text that is hyperlinked to send people to your website. Anchor text MUST contain a few of your main keywords, because Google uses it to determine your website's relevance to a particular topic.

The website that links to you:
If the website that links to you is related to your site, that is a huge bonus for you. So look for websites that are related to your website's topic.


Article Submissions

  1. Article submissions help you market your website and your business.
  2. Submit your articles for distribution, publication and promotion.
  3. Article submissions help generate quality backlinks.
  4. An article should be at least 250 words.
  5. Use your different targeted keywords frequently in the article and take advantage of the anchor text.

Article Submissions List
http://www.directorycritic.com/article-directory-list.html
A few very famous sites:
=> http://www.ezinearticles.com
=> http://www.goarticles.com
=> http://www.articledashboard.com

* You need to sign up before submitting.

Press Releases / Newswire

A simple one-page press release can be a powerful marketing tool. Here are the top three reasons for using electronic news releases:
1. Media Relations
We can send your press releases via email to the journalists and websites that cover our industry. This will generate publicity in the form of news stories, product reviews and links to our web site, which helps increase our credibility and sales.
2. Public Relations
By having an electronic news release distributed through Internet news channels such as Google News, Yahoo News and Topix.net, our message reaches the members of our target market who read their news online. This generates more awareness of the company, more traffic to the website and more sales of the product or service.
3. Search Engine Optimization (SEO)
This advantage is the biggest reason we distribute keyword-optimized press releases (and articles).


Links

Why? Because a consistent publicity campaign can get you links from some of the best web sites on the Net. These links give your web site "link popularity" and the Google PageRank you need for high search engine rankings and traffic. Plus, the links themselves can bring a significant amount of qualified visitors to your site.

High Rankings

Many times your press release will also rank in the top 10 at Google within 1-2 weeks and provide targeted visitors to your web site for months to come. How's that for a compelling reason to use press releases?
Plus, you can (and should) post news releases in the "News Room" area of your own web site. In fact, adding these keyword-optimized press releases to your web site (in your News Room) will help with your natural rankings and traffic.

http://www.prweb.com :- The recognized leader in online news and press release distribution service for small and medium-sized businesses and corporate communications.
PRWeb pioneered Free Press Release Distribution and continues to set the standard for online news distribution.


Types of Links

  • One Way Links
A "one-way text link" means that a Web site links to your Web site but you don’t link back to that Web site. One-way links, also known as non-reciprocal links, are much more difficult to obtain than traditional reciprocal links. In the eyes of search engines, one-way text links have a more natural flow and appear less self-serving. As a result, they are very valuable to your business because the more one-way links you have, the better your chances of an improved search engine ranking, page rank and link popularity.

  • Reciprocal Links / Two Way Links
A reciprocal link is where website [A] links to website [B], which in turn links to website [A].
Links between two sites, often based on an agreement by the site owners to exchange links.

Reciprocal links are based on an agreement by two sites to link to each other. Reciprocal linking is often used by small/midsize sites as an inexpensive way to increase Web site traffic and link popularity.

  • Three way Links
A three-way link exchange is somewhat similar to a two-way link exchange, but instead of putting a link to the other site on your own site, you place the link on another site you own.
"Three-way link exchange" is a cultivated term rather than a generic one. It is used when more than two pages are involved in the linking: Page A links to Page B and Page B links to Page C, but Page B does not link back to Page A and Page C does not link back to Page B.

This is usually done when you own two websites: one already has many backlinks and PageRank, and the other does not. You put a link to your partner's site on the website that already has backlinks and PageRank, and in return your link-exchange partner puts a link to the site of yours that doesn't have any PageRank yet.

Google Toolbar

Download the Google Toolbar from

http://toolbar.google.com

Important points
  • PageRank is not the same for every page of a site.
  • Google assigns a PageRank to each and every page in Google's index.
  • PageRank can be different for http://www.site.com and http://site.com
  • PageRank is measured on a scale of 0 to 10.
  • Google PageRank has no direct relationship with a page's position in the Google SERPs (search engine results pages).

Google Page Rank

PageRank is a Google measure of the quality of a page on a scale of 0 to 10. The theory is that high-quality sites receive a higher PageRank: PageRank works as a "vote" cast by all the other pages on the internet that link to yours.

Off page Optimization

Avoid things that search engines hate

Dealing with Frames

A framed site is one in which the browser window is broken into two or more parts, each of which holds a web page. Frames cause a number of problems, such as the following:

  • Some search engines have trouble getting through the frame-definition or frameset page to the actual web pages.
  • If the search engine gets through, it indexes individual pages, not frame sets. Each page is indexed separately.
  • You cannot point to a particular page in your site. This can be a problem in the following situations:

Linking campaigns. Other sites can link only to the front of your site; they can't link to specific pages during a link campaign.

Pay-per-click campaigns. If you are running a pay-per-click (PPC) campaign, you cannot link directly to a page related to a particular product.

Placing your products in the shopping directories. In this case, you need to link to a particular product page.

Search Engine Spamming Techniques that should be avoided

Use of Invisible text:
Using invisible text is an extremely common search spamming practice in which a spammer uses the same or a similar color for the text and the background. Invisible text is used to stuff pages with keywords that are visible to search engines but invisible to viewers. All major search engines today can identify this kind of spamming easily and can penalize the site.


Stuffing keywords:
Keyword stuffing can be done in many ways. One most common way of doing this is by using invisible text. Other methods involve using keywords in very small fonts at the bottom of pages, using keywords in hidden tags (like no frames tag, alt tags, hidden value tags, option tags etc.) or stuffing the main content with repetitive keywords. Keyword stuffing is a trick that most search engines today are able to sniff out.

Link Spamming:
Link spamming is the process of spamming search engines by obtaining thousands of inbound links from link farms, forums, blogs, free-for-all pages or unrelated websites, or even by registering hundreds of domains and interlinking them to build a link empire. To put it simply, link spamming is obtaining inbound links through unethical practices solely for the purpose of ranking higher in search engines.


Most search engines give a high level of importance to inbound links and consider them an indication that a site is credible. Participating in free-for-all pages and link farms can get a site thousands of inbound links, which can make it look important in the eyes of the search engines when it actually isn't. Most search engines today have strict measures to deal with this kind of spamming and can even ban an involved website completely from their listings.


Cloaking:
Simply put, cloaking is any process that involves presenting search engines with one set of content and visitors with another. Because the copy presented to the search engines is highly optimized, the search engine may rank it higher than it otherwise would.

Creating Doorway Pages:
Doorway pages are pages that are highly optimized for search engines. They are similar to junk pages and contain little but keywords and irrelevant content. When a visitor lands on such a page, he is either automatically redirected or asked to click on a JavaScript link.

Not all doorway pages have to look this way. There are ways to provide good, related information that make a doorway page informative instead of a page that contains nothing but junk. Only when one tries to take shortcuts does one resort to creating junk pages instead of offering informational content.

Page redirects:
Often people create spam filled Web pages intended for the eyes of search engines only. When someone visits those pages, they are redirected to the real page by META refresh tags, CGI, Java, JavaScript, or server side techniques. There are legitimate reasons for cloaking and similar techniques, but don't use them unless you know exactly what you are doing.

Duplicate Content:
It is possible that content duplication is the single biggest SEO problem faced by websites today. Whether the duplication is intentional or not, presenting the same content to search engines under multiple URLs can cause a site to be ranked poorly or penalized. In some cases, it prevents indexing entirely.



Google Sitemap.xml

What is Google Sitemap.xml?

The Sitemap Protocol allows you to inform search engines about URLs on your websites that are available for crawling. In its simplest form, a Sitemap that uses the Sitemap Protocol is an XML file that lists URLs for a site. The protocol was written to be highly scalable so it can accommodate sites of any size. It also enables webmasters to include additional information about each URL (when it was last updated; how often it changes; how important it is in relation to other URLs in the site) so that search engines can more intelligently crawl the site.



Sitemaps are particularly beneficial when users can't reach all areas of a website through a browseable interface. (Generally, this is when users are unable to reach certain pages or regions of a site by following links). For example, any site where certain pages are only accessible via a search form would benefit from creating a Sitemap and submitting it to search engines.



This document describes the formats for Sitemap files and also explains where you should post your Sitemap files so that search engines can retrieve them.



Please note that the Sitemap Protocol supplements, but does not replace, the crawl-based mechanisms that search engines already use to discover URLs. By submitting a Sitemap (or Sitemaps) to a search engine, you will help that engine's crawlers to do a better job of crawling your site.


Using this protocol does not guarantee that your webpages will be included in search indexes. (Note that using this protocol will not influence the way your pages are ranked by Google.)
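For reference, a minimal Sitemap file can be produced with a few lines of Python. The URLs and field values below are invented; only the loc element is required by the protocol, while lastmod, changefreq and priority are optional hints:

# Minimal sketch: build a Sitemap-protocol XML file from a list of entries.
from xml.sax.saxutils import escape

def build_sitemap(entries):
    rows = []
    for entry in entries:
        fields = "".join(f"<{tag}>{escape(str(value))}</{tag}>"
                         for tag, value in entry.items())
        rows.append(f"  <url>{fields}</url>")
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(rows) + "\n</urlset>")

print(build_sitemap([
    {"loc": "http://www.example.com/", "lastmod": "2009-01-01",
     "changefreq": "weekly", "priority": "0.8"},
    {"loc": "http://www.example.com/seo-tips.html"},
]))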

Robots.txt

Spiders and Robots Exclusion

Web Robots are programs that automatically traverse the Web's hypertext structure by retrieving a document, and recursively retrieving all documents that are referenced. This page explains how you can control what these robots do when visiting your site.

What is Robots.txt?

The Robots Exclusion Protocol is a method that allows web site administrators to indicate to visiting robots which parts of their site should not be visited. When a robot visits a web site, it first checks for the file robots.txt in the root directory, e.g. http://Stars.com/robots.txt. If it can find this file, it will analyze its contents to see whether it may retrieve further documents (files). You can customize the robots.txt file to apply only to specific robots, and to disallow access to specific directories or files. Here is a sample robots.txt file that prevents all robots from visiting the entire site:

# Tells Scanning Robots Where They Are and Are Not Welcome
# User-agent:           can also specify by name; "*" is for everyone
# Disallow:               if this matches first part of requested path,
# forget it
User-agent: *    # applies to all robots
Disallow: / # disallow indexing of all pages

The record starts with one or more User-agent lines, specifying which robots the record applies to, followed by "Disallow" and "Allow" instructions to that robot. To evaluate if access to a URL is allowed, a robot must attempt to match the paths in Allow and Disallow lines against the URL, in the order they occur in the record. The first match found is used. If no match is found, the default assumption is that the URL is allowed.

For example:

User-agent: webcrawler
User-agent: infoseek
Allow:    /tmp/ok.html
Disallow: /tmp

WebCrawler and InfoSeek are not allowed access to the /tmp/ directory, except to access ok.html. All other robots are allowed unrestricted access.

Sometimes robots get stuck in CGI programs, trying to invoke all possible outputs. The following keeps robots out of the cgi-bin directory and disallows execution of a specific CGI program in the /Ads/ directory:

User-agent: *
Disallow: /cgi-bin/
Disallow: /Ads/banner.cgi
Robots.txt is a simple text file that's uploaded to the root directory of a website. Spiders request this file first, and process it before they crawl the site. The simplest robots.txt file possible is this:

User-agent: *
Disallow:

That's it. The first line identifies the user agent; an asterisk means that the following lines apply to all agents. The blank after Disallow: means that no part of the site is off limits.

This robots.txt file doesn't do anything: all user agents are able to see everything on the site. It is still worth putting a robots.txt file on every website, even if it doesn't restrict the content that spiders may access. Doing so prevents the server from returning (and logging) a 404 Not Found error every time a spider requests robots.txt. Although a missing robots.txt file does no real harm from an SEO perspective, the 404 errors can be annoying to webmasters who are examining log files in an attempt to identify real problems.
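Python's standard library can evaluate robots.txt rules the same way a well-behaved robot does, which is handy for testing a file before uploading it. This sketch reuses the cgi-bin example above; the example.com URLs are placeholders:

# Sketch: test robots.txt rules with the standard-library parser.
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /cgi-bin/",
    "Disallow: /Ads/banner.cgi",
]
rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://www.example.com/cgi-bin/search"))   # False
print(rp.can_fetch("*", "http://www.example.com/index.html"))       # True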

H1 Tag

The title or heading of the page should be placed in an H1 tag, and its length should be limited to 60-80 characters. The standard usage of the H1 tag is up to three times on the home page and once on the inner pages.

Internal Links And Sitemap page Optimization

  • Creating a sitemap page:
A sitemap is a web page that lists all the web pages on a website, typically organized in a hierarchical way. A sitemap helps visitors and search engine bots find pages on the site and gives search engine robots a quick way to discover them. Depending on the size of your website, it may actually link to all of your pages.
A truncated example version would typically sit in the top left corner of the home page.

  • Creating navigation structure

Creating a navigation structure that search engines can read:

Your navigation structure needs to be visible to the search engines. A navigation structure created with JavaScript will not work for the search engines.

  1. Add a sitemap page and link to it from your main navigation. It provides another way for search engines to find all of your pages.
  2. Whenever possible, use keywords in text links as part of the navigation structure.
  3. Even if your navigation structure is visible to search engines, you may want to have bottom-of-page text links as well. They are convenient for site visitors and give the search engines another chance to find your other pages.
Creating a good, proper directory structure is also very important. Keep your pages as close to the root domain as possible, rather than using a complicated multi-level directory structure. Create a directory for each navigation tab and keep all of that section's files in it.
e.g. http://www.edfed.com/student-loan-consolidation/student-loan-consolidation.php

  • Naming files
Give keyword- or product-related names to all your images, pages, etc., and keep page names in lowercase.
e.g.
http://www.edfed.com/student-loan-consolidation.php

  • Avoiding Flash animation
You should avoid using Flash animations on the site because search engines cannot read them. However, if you do need Flash animations, there are a few things you can do to help with indexing:
  1. Make sure you use your TITLE tag and your DESCRIPTION and KEYWORDS meta tags.
  2. Put descriptive text between the tags, and separate keywords consistently: use a comma and a space, a comma with no space, or a space with no comma, but do not mix these styles.
  3. Make sure that most of the keywords in the tag also appear in the body text.
  4. Don't use a lot of repetition.
  5. Don't use the same KEYWORDS tag on all your pages.
  • Avoiding embedded text in images
Avoid embedding text in images; use text-based information instead of images that carry the content. Also avoid using large images on the home page.
  • Adding body text
You need text on the page, ideally around 100 to 250 words. If you are putting an article on the page and the article is 1,000 words, that is fine, but the 100-250 word range is a good target. That amount of content lets you really define what the page is about and helps the search engine understand it.
  1. The home page must contain short information about all the facilities and products.
  2. Other pages should contain information relevant to that page, with the important keywords related to it.
  3. Keywords should be used, and repeated, in the content of all pages.
  • Creating headers
HTML has several tags that define headers: H1, H2, H3 and so on. These headers are useful in search engine optimization because when you put keywords into a heading, you are telling the search engine, "These keywords are so important that they appear in heading text," and the search engine pays more attention to them, weighing them more heavily than keywords in body text.
  • Formatting text
You can also tell the search engines that a particular word might be significant in several other ways. If text is displayed differently from the rest of the page, the search engine may assume it has been set off for a reason. The following are a few ways to set your keywords apart from the other words on the page:
  1. Make the text bold.
  2. Make the text italic.
  3. Uppercase the first letter in Each Word, and lowercase the other letters in the word.
  • Creating links
Links in your pages serve several purposes:
  1. They help searchbots find other pages in your site.
  2. Keywords in links tell search engines about the pages that the links are pointing at.
  3. Keywords in links also tell the search engines about the page containing the links.
  4. All the links must be text based hyperlinks.
  • URL Rewriting
URL rewriting takes the variables out of a dynamic site URL and remaps them into a shorter version. For example:
  1. Old URL: http://www.example.com/index.php?this=1234&that=5678
  2. Shorter, rewritten URL: http://www.example.com/1234/5678
When a user requests a URL such as http://www.example.com/1234/5678, the server rewrites it behind the scenes so that the script receives the variables this=1234 and that=5678. The scripts that drive the site must be written so that they generate the shorter URLs in links, and the server translates the requested URLs internally without showing the translated version to visitors.
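In practice this mapping usually lives in web server configuration (for example Apache's mod_rewrite), but the idea can be sketched in a few lines of Python. The URL pattern below is just the example above:

# Hedged sketch of the rewrite idea: map the short path back to the dynamic
# query-string form that the script actually receives.
import re

REWRITE = re.compile(r"^/(\d+)/(\d+)$")

def rewrite(path):
    match = REWRITE.match(path)
    if not match:
        return path
    return f"/index.php?this={match.group(1)}&that={match.group(2)}"

print(rewrite("/1234/5678"))    # /index.php?this=1234&that=5678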