
Web Design Guidelines

Designing a Web Site for Success

There are millions of web sites on the Internet today, with thousands more being added each day. The competition is fierce, and in order to be successful you must stay one step ahead of the game.

One of the most important aspects of your success is your web site. Your web site is a direct reflection of you and your business. The appearance of your web site is the most important factor in determining your site's value. If your site doesn't look professional or pleasing to the eye at first glance, its perceived value will be low. The perceived value of your web site will have a great impact on your chance of success or failure.

On the other hand, you may have a well-designed web site and a quality product or service, but if it takes too long to load, its value will still be perceived as low. Why? Because your potential customers will not wait, and that will ultimately cost you business.

According to two surveys, conducted by Forrester Research and Gartner Group, e-commerce sites are losing $1.1 to $1.3 billion in revenue each year due to customer click-away caused by slow-loading web sites.


The main page of your web site should load in eight seconds or less over a 56K modem (a rough timing sketch follows this checklist). To keep your load time down:
  • Avoid using large slow loading graphics
  • Avoid using large or too many animated graphics
  • Limit the number of banners to no more than two per page
  • If you must use Java, use it sparingly
  • If you're using Flash, provide your visitors with an alternative link to skip the intro
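
If you want a rough sense of how close you are to that eight-second mark, a small timing script can report the load time while you test. The sketch below is only an approximation and assumes you place it as early in the page as possible; it measures from the moment the script runs, not from the moment the visitor requested the page.

  // Rough load-time check -- a sketch for testing, not a precise measurement.
  // Place this script as early in the page as possible; it times from the
  // moment it runs until the window (including images) has finished loading.
  var pageStart = new Date().getTime();

  window.onload = function () {
    var seconds = (new Date().getTime() - pageStart) / 1000;
    if (seconds > 8) {
      alert("This page took about " + seconds + " seconds to finish loading.");
    }
  };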
Another consideration of great importance is the Search Engines. You must specifically design your web site to rank high in the Search Engines. If your site isn't listed within the top thirty search results, your potential customers won't be able to find you. When optimizing your pages for the Search Engines, it is essential to pay careful attention to elements such as your page titles and META tags, and to follow the design guidelines below.
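
As a quick self-check on those basics, a snippet along the following lines can be pasted into the browser's JavaScript console on your own page. It is purely illustrative; which tags actually matter varies by engine.

  // Quick self-check: report the page title and the description/keywords
  // META tags, which many engines read. Purely illustrative.
  var metas = document.getElementsByTagName("meta");
  var found = { description: null, keywords: null };

  for (var i = 0; i < metas.length; i++) {
    var name = (metas[i].getAttribute("name") || "").toLowerCase();
    if (name in found) {
      found[name] = metas[i].getAttribute("content");
    }
  }

  alert("Title: " + (document.title || "MISSING") + "\n" +
        "Description: " + (found.description || "MISSING") + "\n" +
        "Keywords: " + (found.keywords || "MISSING"));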

Web Site Design Guidelines:

  1. Your main page should specifically let your visitors know exactly what you're offering. If your potential customer can't find your product or service, they definitely won't waste a lot of time looking for it. They'll go on to the next site and probably never return. They're visiting your site for a specific purpose. They want something your site offers.
  2. Create a page to display your "Privacy Policy" regarding the personal information you collect from your visitors, such as email addresses, Internet Service Provider, etc. Explain your reasons for collecting the information and let them know how it will be used.
  3. Create a page about you and/or your company. Include your name, company name, photograph, biography, address, phone number and email contact information.
  4. Display your copyright information at the bottom of each page.
  5. Keep in mind, your visitors may enter your site from pages other than your main page, so make sure you include good navigational links on every page. Place your navigation links together at the top, bottom, left, or right side of the page. Use tables to neatly align your links and maintain a nicely organized and uniform appearance throughout. Try to keep the number of clicks required to get from your main page to any other page on your site down to four, and place your company logo on each page.
  6. Use caution when selecting your background and text colors. Busy backgrounds make text difficult to read and draw attention away from the text. In addition, always be consistent with your background theme on each page of your site. Keep in mind that colors affect your mood and will have an effect on your visitors as well. Bright colors such as yellow and orange cause you to become more cheerful or happy, while colors such as blue and purple have a calming effect. Dark colors such as brown and black have a depressing effect. A good rule of thumb is to choose your colors based upon the effect you're trying to achieve.
  7. ALWAYS check and double-check your site for spelling errors, and make sure your images and links are all working properly. Several errors will make your site appear unprofessional. If you are designing your site using an HTML editor, use its spell check. Proper grammar is also very important.
  8. If you must use frames, use them sparingly. Frames, if not used properly, can make your site look unprofessional. Avoid making your visitors have to scroll from side to side to view your content. This can be very irritating and cause your visitors to leave.
  9. If you must use Java on your site, use it sparingly. Java can be slow and has a tendency to crash browsers.
  10. If you're using pop-up windows to display special offers or ezine subscription information, try to use a JavaScript that utilizes cookies (see the sketch following this list). This way, the window will only be displayed to your visitors the first time they visit your web site.
  11. View your web site through different browsers and screen resolutions so you will see how your visitors will view your site. Visit:

    SiteOwner - Check your web pages for HTML validity and browser compatibility.
    http://www.siteinspector.com/

    NetMechanic - Provides a variety of free services for your web site, including browser compatibility testing, graphic file size reduction, link checking, HTML checking, load time checking, spell checking and more.
    http://www.netmechanic.com/
  12. Continually add new content to your site. Give your visitors a reason to keep coming back.
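
The cookie technique mentioned in point 10 can be sketched as follows. The cookie name, pop-up page, window size and one-year lifetime are all illustrative choices rather than requirements.

  // Minimal sketch of a first-visit-only pop-up using a cookie.
  // The cookie name "seenOfferPopup" and the pop-up URL are illustrative.
  function getCookie(name) {
    var parts = document.cookie.split("; ");
    for (var i = 0; i < parts.length; i++) {
      var pair = parts[i].split("=");
      if (pair[0] === name) return pair[1];
    }
    return null;
  }

  function setCookie(name, value, days) {
    var expires = new Date();
    expires.setDate(expires.getDate() + days);
    document.cookie = name + "=" + value +
      "; expires=" + expires.toUTCString() + "; path=/";
  }

  // Show the special-offer window only on the visitor's first visit.
  if (!getCookie("seenOfferPopup")) {
    window.open("special-offer.html", "offer", "width=400,height=300");
    setCookie("seenOfferPopup", "yes", 365);
  }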

Web Design Mistakes to Avoid:

    - Animated bullets
    - Broken links and graphics
    - Busy, distracting backgrounds
    - Confusing
    - Different backgrounds on each page
    - Large fonts
    - Large scrolling text across the page
    - Large slow loading graphics
    - Large Welcome banners
    - Multiple banners and buttons
    - Multiple colored text
    - Multiple use of animated graphics
    - Multiple use of different fonts
    - No contact information
    - No Meta tags
    - Overpowering music set to AutoPlay
    - Overuse of Java
    - Pages scrolling to oblivion
    - Poor browser compatibility
    - Poor content
    - Poor load time
    - Poor navigation
    - Poor organization
    - Poor overall appearance
    - Poor use of frames
    - Poor use of mouseover effects
    - Poor use of tables
    - Pop up messages
    - Scrolling text in the status bar
    - Spelling/Grammar mistakes
    - Text difficult to read
    - Too many graphic and/or line dividers
    - Too many graphics
    - Too much advertising
    - Under construction signs



If you've never designed a web page, it would be wise to become familiar with HTML (Hypertext Markup Language). A great place to start is the NCSA Beginner's Guide to HTML:

http://www.ncsa.uiuc.edu/General/Internet/WWW/HTMLPrimer.html

Take some time to research and plan your web site. Your success depends upon it. The simple, well-designed sites make the sales.

Web Analysis and Its Relation to SEO

Search Engine Optimization

The objective of Search Engine Optimization is to increase web visitor counts by ranking very high in the results of searches on the keywords that best describe the content of your site. This relative ranking is often viewed as a struggle to make the best use of a few keywords, when it is really a struggle to outdo your competition. If you search on your target keywords, you will see the leading site in the rankings. All you need to do is be better than that number one site. This page suggests ways to optimize and improve search engine results with ranking and placement advice, information, hints, tips, and clues to improve your search engine keywords relative to the existing leaders. After all, better keyword ranking is your real objective.
It is not enough to simply add META tags and submit your site to a million search engine indexes and directories. The first step in obtaining significant web visitor counts is to seek first-page search engine results. An early step is to build a great, content-rich site. One of the last steps is the proper submission of your site to the search engines and directories. In the middle is a step that is VITAL if you want to obtain front-page results. Most sites skip this step because it is forgotten or seems too complex, but without competent Search Engine Optimization you are destined to be search engine fodder. The following FREE tools and advice describe how to design your keywords with Search Engine Optimization and ranking in mind.

There are no Search Engine Optimization secrets -- just ranking and placement methodologies to follow in order to beat your competition in obtaining a high ranking for desired search keywords. SEO training, content, and link services are just one small part. This site targets improving search engine rankings by using a "follow the leader" approach to keyword selection and page wording. Once you know which keywords and search engine marketing services (not spam) worked for the "leaders," you can "beat the leader" and do even better. Proper Search Engine Optimization requires that you beat your competition, so knowing the keywords and criteria used by your competition is the most important first step. It will become obvious that good ranking excludes keyword spamming the search engine, and that with careful selection of your keywords you will fare well for a little effort. The Bruce Clay website offers help, hints, and tips for improving search engine results via a specific search engine keyword placement methodology.



Search Engine Optimization Overview

What is Search Engine Optimization? Search Engine Optimization (SEO) is the science of search as it relates to marketing on the web. It is mostly technical in nature, combining programming with business, persuasion, sales, and a love of competitive puzzle solving into a written form capable of maintaining desired revenue goals while achieving high rankings in the organic sections of search engine results pages. It is not just technical work, nor copywriting, nor links, nor just search engine submission, but an intricate blend of over a hundred variables woven into the fabric of a website. It is difficult to accomplish without a formal, proven methodology and strong proprietary tools. We offer you a tutorial on all of that and more on these pages...

Before you start, you should understand that top-10 rankings with every single major search engine and directory can be obtained, although very few sites get there and the effort is often beyond reason. Note: URL ranking results change week to week due to competition, so maintaining a top ranking requires constant keyword monitoring and information rework. Search Engine Optimization never rests, much like your competition.
"It is not the job of Search Engine Optimization to make a pig fly. It is the job of the SEO to genetically re-engineer the web site so that it becomes an eagle."
The key information on this page includes how to prepare both yourself and your site for the search engines, how to choose the right keywords, how to analyze your competition, what submission is and how it is best accomplished, when to monitor your ranking, and instructions for performing an analysis of your site's results, complete with tools and aids. This site covers all basic and advanced strategies and the common mistakes to avoid.

Competitor Analysis

Competitor analysis in marketing and strategic management is an assessment of the strengths and weaknesses of current and potential competitors. This analysis provides both an offensive and a defensive strategic context through which to identify opportunities and threats. Competitor profiling coalesces all of the relevant sources of competitor analysis into one framework in support of efficient and effective strategy formulation, implementation, monitoring and adjustment.

Given that competitor analysis is an essential component of corporate strategy, Porter (1980, 1998) argued that most firms do not conduct this type of analysis systematically enough. Instead, many enterprises operate on what he calls “informal impressions, conjectures, and intuition gained through the tidbits of information about competitors every manager continually receives.” As a result, traditional environmental scanning places many firms at risk of dangerous competitive blind spots due to a lack of robust competitor analysis.

Website Analysis & Competitor Analysis

  1. Competitor Analysis
  2. Web Analysis and Its Relation to SEO
  3. Web Design Guidelines

Why is Search Engine Optimization (SEO) Important?

The Internet has revolutionized how we live our daily lives. It has crawled into the different dimensions of human life: business, communication, information dissemination, and personal relationships. People have made a paradigm shift toward using the Internet to aid them in their daily activities.


With this context in mind, many people are continuously struggling to get noticed in the world of the Internet. Websites are growing like mushrooms everywhere, all the time. How can one's website get past the millions of other websites and eventually be noticed by its target audience?
Search engine optimization aims to get more visitors to a website by helping it earn higher rankings in the search engines. This simply means that search engine optimization's goal is to make a website appear on the first pages, if not the first page, of a search done through a search engine.

There are two ways to get noticed by search engines. One is through pay-per-click advertisements. A good example of a pay-per-click system employed by search engines is the Google AdWords system. It has created a great deal of hype and gives Google around 5 billion dollars in revenue per year. Webmasters place their bids to be shown when a keyword is searched by a surfer. The highest bidders get their sites to appear first when the search is done.

The second way of getting high rankings from search engines is through organic searches. Search engines evaluate websites by using what they call "spiders." These programs scan websites and collect information about them. They then collate the information and pass it on to the search engine. This is the main arena of search engine optimization, which uses a set of methods to get search engines to list a website at high ranks.

Traffic

The main purpose of search engine optimization is to increase the traffic generated by a website. Websites are built to be seen by Internet surfers, and search engines can help them achieve this goal. The power of the search engine should not be underestimated: it is one of the building blocks of the foundation of the Internet. A survey showed that 90% of all Internet users employ search engines to aid them in their Internet-related activities. Google, the dominant player in the search engine industry, generates 70% of all search-related Internet activity.

People and Search Engines are alike

Search engines behave like people. They like websites that contain substantive information about a certain topic. The best sites usually appear first in search engines because both people and the search engines like them.

Search engine optimization does not only generate traffic; it helps maintain that traffic. The behavior of the search engine is indicative of the behavior of the people who visit the website. Search engine optimization leads to the optimization of a webpage or a website, resulting in a site that is better organized and contains substantive information.

Using search engines to reach one's target audience is one of the most effective Internet marketing strategies. Unlike other on-line marketing techniques (such as email marketing), it involves far less leakage in reaching the right audience.
Search engines segment the market and connect the right people together. People search for topics they are interested in, and this is the main strength of search engines in connecting markets.

Cost Effectiveness

One can do search engine optimization oneself, on the assumption that one knows what one is doing. Search engine optimization is a full-time job, however, and has a very long learning curve. This is why most people resort to outsourcing the job to experts who are good at what they do. One should be cautious in hiring a search engine optimization company or consultant: factors such as pricing and service should be carefully assessed before signing a deal. If done properly, search engine optimization is a very cost-effective way of getting more people to know about one's products, or about a certain issue or event that a website is disseminating.

SEO is very important for websites, since it determines the position of a website in comparison with its competitors. It not only generates traffic from the targeted audience but is also a cost-effective way of marketing the website.

List of Major Search Engines

Google
http://www.google.com

Voted Most Outstanding Search Engine four times by search engine optimizers, Google has a well-deserved reputation as the top choice for those searching the web. The crawler-based service provides both comprehensive coverage of the web and great relevancy. It's highly recommended as a first stop in your hunt for whatever you are looking for.
Google provides the option to find more than web pages, however. Using the links above the search box on the Google home page, you can easily seek out images from across the web, discussions taking place on Usenet newsgroups, news information, or products. The More link provides access to human-compiled information from the Open Directory, catalog searching and other services.
Google is also known for the wide range of features it offers, such as cached links that let you "resurrect" dead pages or see older versions of recently changed ones. It offers excellent spell checking, easy access to dictionary definitions, integration of stock quotes, street maps, telephone numbers and more. See Google's help page for a full rundown of these features. The Google Toolbar has also won a popular following for the easy access it provides to Google and its features directly from the Internet Explorer browser.

In addition to Google's unpaid editorial results, the company also operates its own advertising programs. The cost-per-click AdWords program places ads on Google as well as on some of Google's partners. Google is also a provider of unpaid editorial results to some other search engines.

Google was originally a Stanford University project by students Larry Page and Sergey Brin called BackRub. By 1998, the name had been changed to Google, and the project jumped off campus and became the private company Google. It remains privately held today.

Yahoo
http://www.yahoo.com

Launched in 1994, Yahoo is the web's oldest "directory," a place where human editors organize web sites into categories. However, in October 2002, Yahoo made a giant shift to crawler-based listings for its main results. These came from Google until February 2004. Now, Yahoo uses its own search technology.

Overture was formerly called GoTo until late 2001. More about it can be found on the Paid Listings Search Engines page. Overture purchased AllTheWeb in March 2003 and acquired AltaVista in April 2003. Yahoo now owns both, gained as part of its purchase of Overture.
The technology of AltaVista and AllTheWeb was combined with that of Inktomi, a crawler-based search engine that grew out of UC Berkeley and then launched as its own company in 1996, to make the current Yahoo crawler. Yahoo purchased Inktomi in March 2003.

MSN Search
http://search.msn.com

Formerly one of Search Engine Watch's top choices, MSN Search is definitely one to watch. The service was previously powered by LookSmart results and gained top marks for having its own team of editors that monitored the most popular searches being performed to hand-pick sites believed to be the most relevant. The system worked well.

Ask Jeeves
http://www.askjeeves.com

Ask Jeeves initially gained fame in 1998 and 1999 as being the "natural language" search engine that let you search by asking questions and responded with what seemed to be the right answer to everything.

AllTheWeb
http://www.alltheweb.com

Powered by Yahoo, AllTheWeb offers a lighter, more customizable and pleasant "pure search" experience than you get at Yahoo itself. The focus is on web search, but news, picture, video, MP3 and FTP search are also offered.

AOL Search
http://aolsearch.aol.com (internal)
http://search.aol.com/ (external)

AOL Search provides users with editorial listings that come from Google's crawler-based index. Indeed, the same search on Google and AOL Search will come up with very similar matches. So, why would you use AOL Search? Primarily because you are an AOL user. The "internal" version of AOL Search provides links to content only available within the AOL online service. In this way, you can search AOL and the entire web at the same time. The "external" version lacks these links. Why wouldn't you use AOL Search? If you like Google, many of Google's features, such as "cached" pages, are not offered by AOL Search.

HotBot
http://www.hotbot.com

HotBot provides easy access to the web's three major crawler-based search engines: Yahoo, Google and Teoma. Unlike a meta search engine, it cannot blend the results from all of these crawlers together. Nevertheless, it's a fast, easy way to get different web search "opinions" in one place.
HotBot's "choose a search engine" interface was introduced in December 2002. However, HotBot has a long history as a search brand before this date.

Teoma
http://www.teoma.com

Teoma is a crawler-based search engine owned by Ask Jeeves. It has a smaller index of the web than its rival crawler-based competitors Google and Yahoo. However, being large doesn't make much of a difference when it comes to popular queries, and Teoma has won praise for its relevancy since it appeared in 2000. Some people also like its "Refine" feature, which offers suggested topics to explore after you do a search. The "Resources" section of its results is also unique, pointing users to pages that specifically serve as link resources about various topics. Teoma was purchased by Ask Jeeves in September 2001 and also provides some results to that web site.

AltaVista
http://www.altavista.com

AltaVista opened in December 1995 and for several years was the "Google" of its day, in terms of providing relevant results and having a loyal group of users that loved the service.
Sadly, an attempt to turn AltaVista into a portal site in 1998 saw the company lose track of the importance of search. Over time, relevancy dropped, as did the freshness of AltaVista's listings and the crawler's coverage of the web.

Gigablast
http://www.gigablast.com

Compared to Google, Yahoo or even Teoma, Gigablast has a tiny index of the web. However, the service is constantly gaining new and interesting features.

LookSmart
http://www.looksmart.com

LookSmart is primarily a human-compiled directory of web sites. It gathers its listings in two ways. Commercial sites pay to be listed in its commercial categories, making the service very much like an electronic "Yellow Pages." However, volunteer editors at the LookSmart-owned Zeal directory also catalog sites into non-commercial categories for free. Though Zeal is a separate web site, its listings are integrated into LookSmart's results.

Lycos
http://www.lycos.com

Lycos is one of the oldest search engines on the web, launched in 1994. It ceased crawling the web for its own listings in April 1999 and instead provides access to human-powered results from LookSmart for popular queries and crawler-based results from Yahoo for others.

How Search Engines Rank Web Pages


Search for anything using your favorite crawler-based search engine. Nearly instantly, the search engine will sort through the millions of pages it knows about and present you with ones that match your topic. The matches will even be ranked, so that the most relevant ones come first.

Of course, the search engines don't always get it right. Non-relevant pages make it through, and sometimes it may take a little more digging to find what you are looking for. But, by and large, search engines do an amazing job.

As WebCrawler founder Brian Pinkerton puts it, "Imagine walking up to a librarian and saying, 'travel.' They’re going to look at you with a blank face."

OK -- a librarian's not really going to stare at you with a vacant expression. Instead, they're going to ask you questions to better understand what you are looking for.

Unfortunately, search engines don't have the ability to ask a few questions to focus your search, as a librarian can. They also can't rely on judgment and past experience to rank web pages, in the way humans can.

So, how do crawler-based search engines go about determining relevancy, when confronted with hundreds of millions of web pages to sort through? They follow a set of rules, known as an algorithm. Exactly how a particular search engine's algorithm works is a closely-kept trade secret. However, all major search engines follow the general rules below.

Location, Location, Location...and Frequency

One of the main rules in a ranking algorithm involves the location and frequency of keywords on a web page. Call it the location/frequency method, for short.

Remember the librarian mentioned above? They need to find books to match your request of "travel," so it makes sense that they first look at books with travel in the title. Search engines operate the same way. Pages with the search terms appearing in the HTML title tag are often assumed to be more relevant than others to the topic.

Search engines will also check to see if the search keywords appear near the top of a web page, such as in the headline or in the first few paragraphs of text. They assume that any page relevant to the topic will mention those words right from the beginning.

Frequency is the other major factor in how search engines determine relevancy. A search engine will analyze how often keywords appear in relation to other words in a web page. Those with a higher frequency are often deemed more relevant than other web pages.
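
As a rough illustration of the location/frequency idea, the toy scoring function below rewards a keyword that appears in the title, near the top of the page, and frequently in the body. The weights and the 1,000-character "top of page" cutoff are invented for illustration and are not any engine's actual values.

  // Toy relevancy score based on keyword location and frequency.
  // Weights and the 1,000-character "top of page" cutoff are illustrative only.
  function relevancyScore(page, keyword) {
    var kw = keyword.toLowerCase();
    var title = page.title.toLowerCase();
    var body = page.body.toLowerCase();
    var score = 0;

    // Location: a keyword in the title tag counts most.
    if (title.indexOf(kw) !== -1) score += 10;

    // Location: a keyword near the top of the page (headline, first paragraphs).
    if (body.slice(0, 1000).indexOf(kw) !== -1) score += 5;

    // Frequency: how often the keyword appears relative to all words.
    var words = body.split(/\s+/);
    var hits = 0;
    for (var i = 0; i < words.length; i++) {
      if (words[i] === kw) hits++;
    }
    score += (hits / words.length) * 100;

    return score;
  }

  // Example: a page whose title and opening text mention "travel" scores higher
  // than one that only mentions it once further down.
  console.log(relevancyScore({ title: "Travel Tours", body: "travel tours and trips" }, "travel"));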

Spice in the Recipe

Now it's time to qualify the location/frequency method described above. All the major search engines follow it to some degree, in the same way cooks may follow a standard chili recipe. But cooks like to add their own secret ingredients. In the same way, search engines add spice to the location/frequency method. Nobody does it exactly the same, which is one reason why the same search on different search engines produces different results.

To begin with, some search engines index more web pages than others. Some search engines also index web pages more often than others. The result is that no search engine has the exact same collection of web pages to search through. That naturally produces differences, when comparing their results.

Search engines may also penalize pages or exclude them from the index, if they detect search engine "spamming." An example is when a word is repeated hundreds of times on a page, to increase the frequency and propel the page higher in the listings. Search engines watch for common spamming methods in a variety of ways, including following up on complaints from their users.

Off The Page Factors

Crawler-based search engines have plenty of experience now with webmasters who constantly rewrite their web pages in an attempt to gain better rankings. Some sophisticated webmasters may even go to great lengths to "reverse engineer" the location/frequency systems used by a particular search engine. Because of this, all major search engines now also make use of "off the page" ranking criteria.

Off the page factors are those that a webmaster cannot easily influence. Chief among these is link analysis.

By analyzing how pages link to each other, a search engine can both determine what a page is about and whether that page is deemed to be "important" and thus deserving of a ranking boost. In addition, sophisticated techniques are used to screen out attempts by webmasters to build "artificial" links designed to boost their rankings.
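
The "importance" part of link analysis can be sketched with a simplified PageRank-style calculation: pages that are linked to by other pages accumulate a higher score over repeated passes. The damping factor and iteration count below are conventional illustrative choices, not any engine's actual settings.

  // Simplified PageRank-style link analysis over a small link graph.
  // "links" maps each page to the pages it links to.
  function linkScores(links, iterations, damping) {
    var pages = Object.keys(links);
    var rank = {};
    pages.forEach(function (p) { rank[p] = 1 / pages.length; });

    for (var i = 0; i < iterations; i++) {
      var next = {};
      pages.forEach(function (p) { next[p] = (1 - damping) / pages.length; });

      pages.forEach(function (p) {
        var outlinks = links[p];
        if (outlinks.length === 0) return;
        outlinks.forEach(function (target) {
          if (next[target] !== undefined) {
            next[target] += damping * rank[p] / outlinks.length;
          }
        });
      });

      rank = next;
    }
    return rank;
  }

  // Page "a" is linked to by both "b" and "c", so it ends up the most "important".
  console.log(linkScores({ a: ["b"], b: ["a"], c: ["a"] }, 20, 0.85));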

Another off the page factor is clickthrough measurement. In short, this means that a search engine may watch which results someone selects for a particular search, then eventually drop high-ranking pages that aren't attracting clicks, while promoting lower-ranking pages that do pull in visitors. As with link analysis, systems are used to compensate for artificial clicks generated by eager webmasters.





How Search Engines Work


The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories. These two types of search engines gather their listings in radically different ways.


Crawler-Based Search Engines

Crawler-based search engines, such as Google, create their listings automatically. They "crawl" or "spider" the web, then people search through what they have found.
If you change your web pages, crawler-based search engines eventually find these changes, and that can affect how you are listed. Page titles, body copy and other elements all play a role.

Human-Powered Directories

A human-powered directory, such as the Open Directory, depends on humans for its listings. You submit a short description to the directory for your entire site, or editors write one for sites they review. A search looks for matches only in the descriptions submitted.

Changing your web pages has no effect on your listing. Things that are useful for improving a listing with a search engine have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed for free than a poor site.


Hybrid Search Engines Or Mixed Results


In the web's early days, a search engine either presented crawler-based results or human-powered listings. Today, it is extremely common for both types of results to be presented. Usually, a hybrid search engine will favor one type of listing over the other. For example, MSN Search is more likely to present human-powered listings from LookSmart. However, it also presents crawler-based results (as provided by Inktomi), especially for more obscure queries.


The Parts Of A Crawler-Based Search Engine


Crawler-based search engines have three major elements. First is the spider, also called the crawler. The spider visits a web page, reads it, and then follows links to other pages within the site. This is what it means when someone refers to a site being "spidered" or "crawled." The spider returns to the site on a regular basis, such as every month or two, to look for changes.

Everything the spider finds goes into the second part of the search engine, the index. The index, sometimes called the catalog, is like a giant book containing a copy of every web page that the spider finds. If a web page changes, then this book is updated with new information.

Sometimes it can take a while for new pages or changes that the spider finds to be added to the index. Thus, a web page may have been "spidered" but not yet "indexed." Until it is indexed -- added to the index -- it is not available to those searching with the search engine.

Search engine software is the third part of a search engine. This is the program that sifts through the millions of pages recorded in the index to find matches to a search and rank them in order of what it believes is most relevant.
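
To make the division of labor concrete, here is a toy model of those three parts working over a small, made-up in-memory "web" instead of real pages fetched over HTTP. All URLs and page text are invented for illustration.

  // A small, made-up "web": each page has some text and links to other pages.
  var web = {
    "http://example.com/":      { text: "welcome to our travel site",    links: ["http://example.com/tours"] },
    "http://example.com/tours": { text: "guided travel tours and trips", links: [] }
  };

  // 1. The spider (crawler): visits a page, reads it, then follows its links.
  function spider(url, visited) {
    if (visited[url] || !web[url]) return;
    visited[url] = web[url].text;
    web[url].links.forEach(function (link) { spider(link, visited); });
  }

  // 2. The index (catalog): a copy of what the spider found, organized by word.
  function buildIndex(visited) {
    var index = {};
    Object.keys(visited).forEach(function (url) {
      visited[url].split(/\s+/).forEach(function (word) {
        index[word] = index[word] || [];
        if (index[word].indexOf(url) === -1) index[word].push(url);
      });
    });
    return index;
  }

  // 3. The search software: sifts through the index to find matching pages.
  var visited = {};
  spider("http://example.com/", visited);
  var index = buildIndex(visited);
  console.log(index["travel"]); // both pages mention "travel"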


Major Search Engines: The Same, But Different


All crawler-based search engines have the basic parts described above, but there are differences in how these parts are tuned. That is why the same search on different search engines often produces different results. Information on this page has been drawn from the help pages of each search engine, along with knowledge gained from articles, reviews, books, independent research, tips from others and additional information received directly from the various search engines.