Top Level Category

Table of Contents

  1. Contents of Top Level Category
    1. BuildTraffic.net Membership FAQs (Frequently asked questions concerning...)
    2. BuildTraffic.net SEO Tools (Overview of our Search Engine Optimization...)
    3. Guaranteed Top 10 FAQs (Answers to frequently asked questions...)
    4. Instant Guaranteed Website Traffic FAQs (Answers to frequently asked questions about our...)
    5. Search Engine Optimization Articles (This section contains vital information on the...)

Top Level Category

Parent category to all other categories
There are no articles in this category.

BuildTraffic.net Membership FAQs

Frequently asked questions concerning BuildTraffic.net memberships.

How long have you been in business?

We have been providing Internet marketing services for almost 15 years. Over the past decade and a half we have learned what works and what does not when it comes to having an effective Internet presence. We also know it does not have to cost you a fortune to build traffic to your website, as some Internet marketing companies would like you to believe. That is why we pride ourselves on providing you the most cost-effective Internet marketing solutions available.

What type of support do you offer?

Unlike other SEO companies, we are not hard to get hold of. We offer toll-free phone, help desk, live chat, and email support.

Do I have to sign a long-term contract with your company?

No, unlike other SEO companies we do not lock you into long-term contracts. All of our memberships are sold on a subscription basis (paid up front): month to month, quarterly, every 6 months, or yearly. Should you decide you no longer want to promote your website, you can cancel at any time from within your admin under Billing. Simply log in, click Billing, then select Cancel Autobilling.

Do you offer discounts if I have more than one website?

Yes, the more websites you have to promote, the greater your savings. You can find more information on our pricing plans at:

https://buildtraffic.net/index.fpl?action=signup&op=pricing

What are the payment options? How can I pay?

We accept MasterCard, Visa, American Express, Discover, and PayPal. If you would like to pay by check or money order, please contact us for details.

How long does it take to get listed once I submit?

Once you submit, it can take 4-6 weeks to get listed on most search engines. However, the majority of our clients get listed in a few weeks. Once you are listed you will need to use our SEO tools in order to improve your rankings on the search engines.

While you are waiting, you can also take advantage of our "Guaranteed Top 10" and "Instant Website Traffic". Both of these services are available to our members in their admin. These services will allow you to start getting traffic to your website within 12 hours!

How can I get my lost password/username?

You can have your login information emailed to you by filling out the password lookup form at:

https://buildtraffic.net/index.fpl?action=member::lookup

How do I change my password?

Log into your account, click "My Profile", and you will be able to change your password there.

Do you have an affiliate program?

Yes, you can find more information on our affiliate program at:

https://buildtraffic.net/index.pl?action=affiliate_signup&op=info


BuildTraffic.net SEO Tools

Overview of our Search Engine Optimization (SEO) tools.

SEO Analyzer

SEO Analyzer (additional fees may apply)
The SEO Analyzer is a powerful tool that checks over 20 known elements that affect your rankings in the search engines. The tool will crawl your website, similar to a search engine, report on the strengths and weaknesses of the website's overall structure, and then offer suggestions for improvement. By following the detailed information provided you will be able to optimize your web pages for top search engine rankings. It is like having your own SEO expert with you 24/7, guiding you through the entire search engine optimization process.

Automatic Search Engine Submitter

Our submitter is one of the most powerful promotional tools of our service. With more than 100,000 sites to submit to, your site gets indexed with ease! Just schedule when you want to be submitted and the system does the rest for you. Our built-in spider will grab all the URLs from your site and add them to the submission queue, and it will also extract your keywords and site title directly from your site, saving you lots of time.

With the submitter you can:
* Automatically add all the URLs, titles, descriptions, and keywords from your site to the submission queue.
* Individually select the sites you want to submit to, each broken down by category.
* Have the system automatically submit your site at any time or interval you choose.
* Get a detailed report of all submissions, from your browser or via email.

Guaranteed Search Inclusion

Guaranteed Inclusion (additional fees apply)
Using our guaranteed inclusion system you will get a guaranteed Top 10 listing on over 200 search engines within 6-8 hours. You are charged only a flat fee, with no bidding and no pay-per-click charges. Simply select the keywords you want to get listed for, design your ad, and choose the duration you want to be listed for (3 or 12 months). That's it! Getting a top 10 listing on so many search engines has never been so easy.

Web Site Monitors

Our monitor tool allows you to monitor your website ranking, link popularity, and Google PageRank.
You can set it up to receive detailed email reports weekly or monthly, or simply log in to check your current standings. This allows you to analyze your optimization techniques over time and see what is working and what is not.

(Screens: add and view the ranking, link popularity, and PageRank monitors and their reports.)

Link Popularity Checker

Link popularity is the number of other websites that contain links to your site. Many factors can affect the search results returned by search engines; a major one is "linkage", that is, other websites with links pointing to your site. A website with a larger number of other sites linking to it will score a higher relevance ranking than one with fewer links.

(Screens: check link popularity and view the link popularity report.)

Google PageRank

PageRank is an exclusive technology of Google that rates the popularity of your website's pages on a scale of 0 to 10.

(Screen: check your PageRank.)

Link Relevancy Checker

Find out in one click why pages rank the way they do. The tool presents its data in four different tables so that, usually in a matter of seconds, you can see for yourself why pages rank the way they do. Want to be ranked #1 on the search engines for your targeted keyword? Simply run the Link Relevancy Checker on the website that is currently ranked #1 for that keyword - then make sure your site is at least one link better!

(Screen: link relevancy check results.)

Keyword Density Analyzer

The tool will analyze a web page for selected keywords. Keyword density is how often a keyword appears on a page relative to the total word count; for example, a keyword used 5 times in 100 words of text has a density of 5%. The tool has many options to help you analyze a competitor's site and show you why a given site may be achieving a higher ranking than yours. Simply enter your URL and that of a competitor who outranks you to find out why - simply the best way to gain a higher rank!

(Screen: keyword density results.)

Search Engine Rankings

Find out how your pages are doing in the search engines! This highly sophisticated tool will search the world's engines for a keyword or phrase, finding out just how and where your pages are ranked. Select the keywords, URL, and engines you want to check rankings for.

(Screen: ranking report.)

Search Engine Coverage

Find out how many of your pages are listed in different engines.

(Screen: search engine coverage results.)

Broken Link Check

This tool will spider your website for any broken links. Broken links can adversely affect your search engine rankings.

Keyword Generator

Our keyword generator is designed to query the world's search engine databases for the keyword queries web surfers use most. This tool allows you to get the top keywords and phrases from the major search engines at the click of a button. You can then use these keywords to create both META tags and doorway pages to maximize traffic to your website. If you don't have good keywords, you won't get traffic!

(Screen: generate hundreds of keywords automatically.)

Top Keyword Finder

This powerful tool gives you the ability to find out which keywords are being searched most in the major search engines. Why is this important? If you're competing with 1000 other sites to sell your widget, wouldn't it be nice to know which keywords people use most to find your competitors' widgets? You can then use those keywords in your META tags and doorway pages. This is one of the most underutilized tools on the Net, but one of the most powerful for driving traffic to your website!

(Screen: find keywords easily without any competition.)


Guaranteed Top 10 FAQs

Answers to frequently asked questions concerning our Guaranteed Top 10 program.

What is a Guaranteed Top 10 Listing?

A Guaranteed Top 10 Listing is a paid listing where you pay a flat fee (NO PAY PER CLICK FEES) for an ad box that will appear in our network of over 250 search engines.

How long is my listing active for?

Your ad will remain active unless automatic renewal is cancelled, and you will be billed every 3 months or 1 year (depending on the term you selected at time of purchase).

How much do I pay per click or impression?

There is no pay per click or impression charge, just the single flat fee per listing.

How long does it take for my ad to be displayed?

Your ad will be added to our search results for the keyword phrase you selected within 6 to 8 hours of processing your order.

How can you guarantee me a top 10 listing?

We limit the number of listings that can be purchased and rotate the listings that are displayed so that each ad is treated equally and has an equal chance of appearing at the top of the featured listings. This means that your listing should appear in at least one out of every three searches on that keyword phrase.

Are Google, Yahoo and Bing included?

No, these are pay-per-click search engines, which charge you $1, $2, $5, and even $10 PER CLICK for a sponsored listing. With our submissions you will be indexed by Google, Yahoo, Bing, and ALL major engines. However, with our Guaranteed Top 10 Program you will be guaranteed a first page listing in our network of over 250 smaller search engines that collectively receive over 1 BILLION searches per month. Keep in mind a customer is still a customer whether they come from a major search engine or a smaller one. So why pay costly pay-per-click fees for the same keyword-targeted visitor?

Is my ad displayed the same way on all search engines in your network?

On most engines, your ad will appear as a Google-style ad box in the right column of search results, but on some engines it will appear as a Recommended Listing above regular search results or in the right or left columns of search result pages. Additionally, on some of the directory sites your listing will only appear if your keyword phrase matches one of their directory categories.

I've looked and still can't find my ad, where is it?

If you completed the submission of your ad and it is not being displayed in our search results for your selected keyword term after 8 hours, fill out a support request.

Instant Guaranteed Website Traffic FAQs

Answers to frequently asked questions about our Instant Guaranteed Website Traffic.

Why should I purchase traffic from BuildTraffic.net?

We send quality traffic at the most competitive rates possible with the fastest service and absolutely no hidden fees. Your campaign will go live within 12 hours, but in most cases, much faster. We are the largest traffic provider on the Internet and have been providing Internet marketing for close to 15 years.

Note: Unlike other companies that sell "traffic" or "visitors", we NEVER use software, bots, autosurfs, or redirected domains to send you website traffic. All of our traffic comes from our network of high-quality websites via full browser exit windows / pop-unders.

How do you generate your traffic?

All of our traffic is generated through pop-unders or exit windows. We have a publisher network of over 100,000 high-traffic websites. In exchange for a portion of the revenue we generate, these website owners place HTML code on their sites that opens the pop-under window when someone visits their site.

What is a pop-under / exit window?

A pop-under or exit window is like a pop-up except that it loads underneath the current page rather than over it. This form of advertising is far superior because it is less intrusive.

Do you offer bulk discounts?

Yes, please contact us concerning bulk traffic purchases or information on our traffic reseller program.

What size are your pop-unders displayed?

Our pop-unders are delivered as full browser windows for maximum viewing quality. The pop will automatically adjust itself to fit the viewer's maximum screen resolution.

What if a visitor has a pop-up blocker?

If a user has a pop-up blocker, our traffic server script will not execute. This means that you will never be charged for a visitor using a pop-up blocker, since no pop will be displayed to that person.

Is your traffic unique?

Yes, the traffic that we deliver is 24-hour unique, meaning you will never receive more than one visit from the same IP address within a 24-hour period.

Do you offer stats or any kind of reports with my campaign?

Yes, you will get real-time stats to show you the progress of your campaign.

How can I make my campaign more effective?

Optimize your site to make it load quickly, or design a page that not only loads quickly but makes use of special offers, good color schemes, and anything else that may appeal to your target audience. If you have too much information on your landing page it will distract the visitor, and the visitor will decide whether to stay on your site within a split second. Keep your landing page simple and to the point!

I have just ordered, when will my campaign start?

Your campaign will go live within 1 day, but in most cases, much quicker... normally within 30 minutes.

How long will my traffic campaign run?

It depends on the number of visitors that you order and what you set your daily cap at. We guarantee we will deliver all visitors within 30 days if you max out your daily cap.

What is a daily cap?

A daily cap is the maximum number of visitors we will send to your site within a 24-hour period. You may receive fewer visitors than your daily cap on a given day, but never more.

How do I pause my traffic campaign?

You can pause your traffic campaign simply by setting the daily cap to zero.

Will I make more sales?

This depends on factors such as your site's sales copy, design, load time, color scheme, and overall presentation. With this type of traffic it is best to keep your landing page as simple and to the point as possible.

Why does my hit counter not match up with your stats?

There are many reasons why a visitor may not register with your server:

* A user closes the window before your site loads
* The user is on a slow connection and the request times out
* The user is behind an ISP with a proxy (e.g. AOL)
* The user's browser served the request from cache instead of from your server
* Your site was unavailable at the time of the request
* Your tracking software could not handle concurrent connections
* Your tracking software does not update in real time

Should there ever be a major discrepancy, open a support ticket and attach a copy of your raw server logs (available from your web host) and we will be more than happy to issue a credit if one is due! Please note that we must have your actual server access log files, as third-party tracking is known to be inaccurate.

Are there any traffic campaign restrictions?

Yes, the page that you use for your campaign cannot contain any of the following:
* Additional pops, including JavaScript prompts, windows that load with your website, windows that load when a user exits your website, automatic download boxes, etc.
* Sound (background or flash)
* Frame breaking code
* Scripting that alters a user's browser
* Adult or illegal content

Search Engine Optimization Articles

This section contains vital information on the Search Engine Optimization Process.

The Value of Web Marketing

The Internet is changing the rules of business. As trite as these words may seem now, they are true. Like radio and television, the Internet has brought about an undeniable shift in how business is conducted. Unlimited information, instantaneous communication, and a market more vast than anything before have become available to both massive conglomerates and small home businesses. However, one underlying factor remains the same: advertisement. While the traditional business plan of a "Brick and Mortar" company has always included extensive advertisement, the same cannot be said about many Internet businesses. Many webmasters go to great lengths to craft wonderful web sites, putting little or no thought into advertising them. And any business that doesn't advertise is doomed to failure. Web sites need to advertise and promote themselves to create traffic on their site, to stay in business, and to grow.

Businesses on the Internet share a kind of equality that Brick and Mortar businesses do not. When a visitor arrives at a website, they really have no idea how large or small that company may be. In the Brick and Mortar world, it is easy to distinguish between the two--the larger will have the bigger store, flashier advertising, and a more expansive inventory. On the Internet however, small companies can project an uncharacteristically large presence by creating and properly promoting their web sites. While the Brick and Mortar retail world is consolidating and merging towards giant discount oriented retailers, the Internet is teeming with thousands of small, successful companies who might not otherwise be able to compete, or even start-up, in the Brick and Mortar world.

No matter how large or small your company is, you need to advertise. Unfortunately for most small businesses, Dot-com or otherwise, traditional advertising methods require immense capital and human resources. This is exactly why many smaller businesses fail--they spend too many of their resources in trying to compete on somebody else's turf.

The most obvious medium of advertisement is television. However, television's effectiveness in attracting customers is questionable. Its effectiveness lies in achieving brand recognition. Yet at this point most businesses are trying to increase traffic to their websites and gain more customers--brand recognition can wait until after the IPO. Television advertising is also the most expensive medium. More than one previously unknown Dot-com spent all of their available resources on a Super Bowl spot. They may or may not have been successful, but nevertheless, they spent millions of dollars for 30 seconds of airtime. Most businesses are more shrewd when it comes to spending their dollars, and they want more than just thirty seconds of exposure.

If you thought that targeting an Internet-based audience would be more effective at driving traffic to a website, you would be right. Banner ads and mass mailings are much less expensive than television. However, they have their limitations as well. While remarkably cheaper than a TV spot, a small banner ad on a major portal website can still cost over six thousand dollars a month. Furthermore, it is generally agreed that banners are not a cost-effective way to bring visitors to a website. Most potential customers see banners as just a nuisance. Mass mailings have a similar drawback. If you opt out of spending money for your own list, you can buy space in someone else's. But you'll be competing for the customer's attention with whoever else bought space. Another disadvantage of banner ads and mass mailings is that they aren't targeting the people who are most likely to be interested in the content you have to offer. They may reach a wide audience, but most of that audience will recoil in horror, yell "Spam!", and delete your expensive ad before having the opportunity to realize that you are exactly what they're looking for. While both traditional methods can and do work, they require a lot of what most of us don't have: money.

The most cost-effective advertisement would have to be both cheap and targeted at exactly the right customer. Years ago this was something of a pipe dream--today it is a reality. The Internet is the most comprehensive source of information in human history. But like any library of knowledge, it must be catalogued and organized to be used effectively. And therein lies the perfect solution: search engines. Like the card catalog of a library, search engines are a customer's way of sifting through the Web to filter out what they're looking for. And what better way to make sure you are found than to have an influence on what they find? Search engine listings meet both of our criteria in terms of focus and affordability. No other form of advertising is so focused that the customer is actually searching for you. A search engine user is a highly receptive and targeted audience because you are not trying to sell them on something they don't already want. They have come looking for you and they already want what you specifically have to offer. Furthermore, the great majority of search engine listings are free. Return on investment couldn't be better--with none of your dollars spent, the very first dollar returned is profit.

According to the Georgia Institute of Technology, 88% of Internet users find new web sites through search engine listings [1]. WebCMO data shows that in a side-by-side comparison of different forms of promotion, search engine listings are the number one way to generate traffic on websites [2]. Search engine listings send droves of visitors to your site and they are free. I have personally seen websites where traffic has increased ten-fold as a result of good search engine positioning. Nothing could be better, but there is a catch.

Getting listed on a search engine below 499 other websites simply won't work. You need to get a listing near the top of your category to collect all the traffic a search engine can deliver. But the good news is that you can dramatically improve your positioning with a little bit of elbow grease. All it takes is some key modifications to your website and a little thought. It's not that hard and you can do it. Using a system like BuildTraffic to guide you through this process is the ideal way to simplify your work. It guides you through every aspect of the Search Engine Positioning process so you don't have to be a tech-savvy guru to get excellent results. With good search engine listings a small business can project a large image on the Internet and get the kind of traffic that so many big business sites get.

Search Engine Marketing Guide

TABLE OF CONTENTS

Introduction to search engines
  Search engines, directories and PPC-engines

Optimizing your website
  Search engine optimization
  Links and navigation
  Text and META-data
  Popularity measurement
  Common problems

Submit and index pages
  Getting indexed
  Submitting to search engines
  Excluding pages from getting indexed

Monitor and analyze results
  Traffic analysis
  Ranking analysis

INTRODUCTION

Search engines and directories can be one of the most effective ways to acquire visitors to your website. This document presents a proven approach to effectively promoting your site through search engines and directories.

When you have finished, you should be able to:
  1. List your site with search engines and directories.
  2. Make your site visible (indexable) to search engines and directories.
  3. Prevent problems that could cause your site to be delisted or banned from a search engine or directory.
  4. Begin receiving visitors from search engine and directory referrals.

Search engine and directory systems are constantly changing. The following was written and reviewed by leading search engine experts with the most current information available.


INTRODUCTION TO SEARCH ENGINES

Search engines, directories and PPC-engines

Search engines are usually categorized into the following groups:
  1. Search engines
  2. Directories
  3. PPC-engines - pay per click search engines

Search engines use computer programs ("spiders") to browse through the entire Internet, read pages, and automatically index the content into large searchable databases.

Directories are human powered hierarchical structures of links to websites. The largest directories are Yahoo, LookSmart, and ODP.

PPC-engines have more in common with other online advertising products than with traditional search engines. The biggest PPC-engines are Google, Overture, eSpotting, FindWhat, and Kanoodle. In PPC-engines, users bid for the position they want in the search results. The highest bidder receives the top result, the next highest bidder receives the second result, and so on. You can learn more about PPC-engines at Pay Per Click Search Engines and Search Engine Watch.

The rest of this tutorial will focus on search engines and how to optimize your websites to become more visible and to attract more relevant, interested visitors. We hope you find this a valuable tutorial on how to build a search engine-friendly website.

Work out a long term strategy

Generally speaking, search engine optimization is a long-term strategic tool rather than a tool for short-term campaigns.

There are many ways to increase the number of targeted visitors your website receives from searches. You can optimize your website, use inclusion programs, pay for business express reviews in directories, purchase sponsor links or bid for positions in PPC-search engines.

For each of these options there are even more options. What optimization strategy do you implement? What PPC-engines can benefit your business and what tools do you choose to use?

This tutorial will focus on search engine optimization. However, we encourage you to look at other available methods of acquiring qualified targeted visitors from search engines.


Three steps on the way to success

The following will detail three key steps involved in effective search engine optimization:
  1. Optimizing your website
  2. Submit and index pages
  3. Monitor and analyze results

After the tutorial we have included a FAQ of search engine optimization problems.

OPTIMIZING YOUR WEBSITE

Search engine optimization

Every search engine has its own algorithm that determines the ranking of documents within search results. A search engine algorithm is a complex set of computer-based rules for how to interpret and rank web pages when users do a search for a particular word or phrase. For obvious reasons, search engines do not publicly reveal the detailed logic behind such algorithms.

There are lots of similarities in the way search engine algorithms work. To learn more about how search engines work or how to improve your rankings, visit Search Engine Watch to find information on search engines and their rules.

The three most important elements of search engine optimization are:
  • Links and navigation
  • Text and META-data
  • Popularity measurement

Keep your website online

With the billions of web pages search engines have to visit each month, they usually only have time to request each web page once. If your web server is not accessible when the search engines visit your website, your web pages won't get indexed. Don't panic!

Most search engines crawl the web on a regular basis. If some of your web pages have been dropped because your server was down, they will probably get back into the index the next time the search engine visits your website.

Introduction to search engine spiders

Search engines use small computer programs to browse through and read web pages on the Internet. Such programs are often referred to as "search engine spiders" or "search engine robots". When a search engine spider visits a page it first reads the text on the page and then tries to follow links from that page off to other web pages and websites.

You can think of a search engine spider as a very simple form of automated browser. It traverses the Internet like a human would do with a browser - just many times faster and fully automated.

Unlike ordinary browsers, search engine spiders do not "view" web pages - they simply "read" the content and click the links. Search engine spiders do not enjoy or appreciate the great design and "cool" images on your web pages.

Validate HTML

HTML code is the most fundamental part of every web page on the Internet. In a browser, you don't see the actual HTML code, but only the final result - the web page as it should look.

Search engine spiders, however, can't see the website's final result and instead focus on the raw HTML code.

Most browsers can interpret HTML code that has errors and still show a nice looking page. However, you never know how a search engine spider will interpret the same errors. It may skip the part that uses incorrect syntax, or it may skip the entire page or website. There is no way to predict what the search engine spider will do when it encounters code it doesn't understand.

If search engines cannot decode your website, follow the links, and read the text, your site will not get indexed. It is not enough to validate your web pages by looking at them in your browser. The HTML syntax needs to be correct - you need to validate your HTML.
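
To illustrate (the file name is made up), here is the kind of error a browser may silently forgive but a spider may not - a link missing its closing quote:

<!-- Broken: the closing quote after the URL is missing. A browser may still render this, but a spider may skip the link or the whole page. -->
<a href="products.html>Products</a>

<!-- Corrected, valid HTML -->
<a href="products.html">Products</a>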

TIP:
Some editors have built-in HTML validators. Use them! If you are using Search Engine Submission, you will have access to the built-in HTML syntax checker. The syntax checker is a simple version of a full HTML-validator and will serve as a useful tool to find and correct errors in your HTML code.

If you do not have access to any of these validators, we strongly recommend using the free HTML validator or one of the many validators you can find on the Internet.

Stay away from "alternative HTML"

Some search engine optimization "experts" will advise you to use non-standard HTML to squeeze in extra keywords in hidden places in your HTML code. Do not do that! Search engines penalize websites that use hidden keywords either by de-ranking the websites or erasing them completely from their index. Always use valid HTML.


Links and navigation

This section examines how to build links that search engine spiders can follow.

A good navigation structure makes it easier for search engine spiders and users to browse your website.

If it is easy for users to navigate your site, they will most likely return. If it is easy for search engines to follow your links it is more likely that they will index many of your web pages.

The more web pages you have indexed, the more likely it is that one of them will be found in a search, thus leading a prospect to your website.

Use hyper linking that search engines can follow

Search engine spiders can't follow every kind of link that webmasters use. In fact, the only kind of hyperlinks that you can be certain all search engine spiders can and will follow are standard text-based links.

In HTML code, text-based links are written like this:

<a href="http://www.domain.com">Link text</a>

The link will look like this on a web page: Link text

All other forms of links can pose a problem for search engine spiders.

TIP:
If you are using a WYSIWYG editor or content management system, it will usually output the correct HTML code for you when you insert a text link. You can verify that it is indeed a text link by highlighting the link text in your browser. If you can highlight it letter by letter like most other text on a web page, then you should be safe.

Image maps or navigation buttons

Sometimes designers want to use graphics for navigational links. There are two common ways of making graphical links that are very different: image maps and navigation buttons.

Image maps can have different areas of one image linking to different web pages. That is a very common way to build top- or side-navigation bars or country maps.

Most search engines do not read image maps, so you should always add additional text-based links on each page for search engine spiders to follow. The text-based links will also be good for visitors who do not use a graphic capable browser, or are surfing with graphics turned off to save bandwidth.
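
As a rough sketch (the image, coordinates, and page names are invented for illustration), an image-map navigation bar with duplicate text-based links could look like this:

<img src="navbar.gif" usemap="#mainnav" alt="Main navigation">
<map name="mainnav">
  <area shape="rect" coords="0,0,100,30" href="products.html" alt="Products">
  <area shape="rect" coords="100,0,200,30" href="contact.html" alt="Contact">
</map>
<!-- Duplicate text-based links that every spider can follow -->
<p><a href="products.html">Products</a> | <a href="contact.html">Contact</a></p>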

Navigation buttons are single images that each link to a specific web page. Most search engine spiders will follow this type of link. However, if you want to be absolutely sure that all search engines can and will follow all your links, we still recommend adding pure text-based links in addition to any kind of graphical links.

Flash, Macromedia Director and Java Applets

Flash, Macromedia Director, and Java applets have something in common: they all require the user to have a special program installed on their computer to open and play the files. Most new browsers come with the necessary programs installed to play the most common file types. However, search engine spiders usually do not open such files.

We recommend that you do not build your navigation links using any of these techniques. If you do, you should only serve the high-tech version of your website to users that have the right plug-ins installed, and have a low-tech version ready for all other users. The low-tech version should include regular text-based links. This will work well with the search engine spiders as well as the users that do not have these plug-ins installed.

TIP:
If you are using techniques to present navigational links that require the user to have a certain plug-in installed, you can add standard navigation text-links for the search engine spiders to follow in the footer of each web page.

JavaScript and DHTML

JavaScript is widely used to perform simple client-side operations. The script is loaded into the user's browser along with the HTML and can then be executed without having to contact the server again. That is great for a lot of things, like shopping carts and dynamic navigation schemes. However, there is one big problem: search engine spiders do not read JavaScript!

Thus, do not put important things like your main navigation in JavaScript. And if you do, make sure that the same links are also available as standard text links somewhere on your web pages.

TIP:
If you are using JavaScript or DHTML to present navigational links, you can add standard navigation text-links for the search engine spiders to follow in the footer of each web page.
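
Here is a minimal sketch of that fallback pattern; the function and page names are invented for illustration:

<!-- JavaScript navigation: spiders will not execute this -->
<script type="text/javascript">
function go(url) { window.location.href = url; }
</script>
<input type="button" value="Products" onclick="go('products.html')">

<!-- Footer text links duplicating the same destinations for spiders -->
<p><a href="products.html">Products</a> | <a href="contact.html">Contact</a></p>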

Non-HTML formats (PDF, MS Word etc)

Even though some search engines have started to read and index non-HTML file formats such as Microsoft Word, Microsoft Excel, and Adobe PDF (Portable Document Format) files, none of the search engine spiders will follow links within these formats. If you want search engines to follow links on your website, they should always be placed in your regular web pages - your HTML documents.

Do not place text in such formats if you want people to find it in search engines. At least make the content accessible in HTML format too - as a regular web page. Not all search engines will index these non-HTML formats and many will make more errors indexing them.
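
One simple pattern (file names are illustrative) is to publish the content as a regular web page and offer the download next to it:

<p>
  <a href="annual-report.html">Annual report (web page)</a> -
  <a href="annual-report.pdf">download the PDF version</a>
</p>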


Text and META-data

If you want good rankings in search engines, you need good content. Content is King! The more good content you have the higher your chance of being found. It's like a lottery. For each piece of valuable content you have, you have a ticket in the "search lottery". The more tickets you have, the higher your chance of having someone click on your link.

On a web page, some text can be seen as part of the visual presentation and some text is "hidden", including META-tags, title-tags, comment-tags (something programmers use to help them navigate the code) and ALT-tags (text descriptions of pictures).

Search engines have reduced the amount of importance they place on META-tags, with the exception of the title-tag. More weight is placed on the visual text - the text that users will see when they arrive at the web page.

The title-tag is technically not a META-tag, but it is the most important HTML tag on your site. The title-tag holds the page's title, which appears at the top of the browser window. All major spider-driven search engines consider the keywords found in this tag in their relevancy calculations. You should pay special attention to this tag, as it carries a lot more weight than most other elements of a web page. You can read more about how to write good title-tags in the next section.

Introduction to keyword research

Before optimizing your website, you need to know what keywords your target group is using. Your keywords are the words and phrases that people might use to find your products, brands, services, or information, via search engines.

You can probably develop a few ideas very quickly. If you run a travel portal, keywords such as "travel", "vacation", and "holiday" might be good ideas. In fact, deep research into the topic of "travel" would probably show more than 100,000 different keywords people use when they search for travel information.

Does that mean you have to target all those keywords? No, absolutely not. You should use such research only as a good starting point. Use it to learn the search behavior of your audience. Find out what they call things, how they identify subjects, how precisely or broadly they generally search.

You may discover that the majority of searches for "travel" come from people using combinations of "travel" and a certain city, country, or region. Or you may verify that people search for "travel" more than "vacation".

All this research should be used in the way you tailor your web pages and the way you write your text, titles and META-tags. If most people in your target group use the term "travel" to describe your product, so should you.

TIP:
Your most important keywords will often include some of the following:
  • Your company name, including synonyms and abbreviations
  • All product and brand names that you feature on your website
  • Countries, regions, or cities you have an association with - often in combination with one of the words above
  • Relevant generic terms for your business, brands, or services (e.g. car, house or boat)
  • Combinations of 2 and 3 keywords - many people search for phrases rather than single words

We recommend using our Keyword Generator; this tool will help you find the best keywords to use on your site.

Adding keywords to your website

Now that you know the keywords you want to target, it is time to implement the keywords on your web pages. The big question for many people is: How do I do that?

Basically you implement your keywords by using them on your web pages. It's that simple! The first and most important thing to do is use the words or phrases throughout the text on your web pages.

Some people confuse adding keywords to a web page with the META-keyword tag. As you will learn in the next section, the META-keyword tag does not count much in how well a web page ranks, so adding your keywords here and nowhere else won't help you at all.

Each section of your web page carries a different weight in the overall ranking process. In the next section you will learn the parts of your web pages that require special attention.

Focus on a few keywords for each web page

In most cases a web page will only rank well for a few keywords - the ones that search engines determine to be the most important for that page. If you run a travel portal you cannot expect to rank well for the thousands of travel-related keywords people are using if you only optimize your front page. Instead, you have to optimize each of the web pages on your website that have content people are looking for - the keywords you have developed through your research.

Important places to use your keywords

Search engines weight keywords according to their placement on a web page. Some spots are more important than others - places where you should always remember to use the most important keywords for each web page. A combined example follows this list.
  • Page titles
    The title of your page is the single most important place to have your keywords.
  • Headlines
    Headlines carry more weight than the rest of the text on your pages. This is because text found in headings usually identifies a particular theme or section of content. The headlines have to be formatted using the HTML-tags <H1> to <H6> to be identified by search engines.
  • Body copy
    Most people forget this is the most obvious place a search engine looks for relevant content. You have to use the keywords in the main viewable text on your web page. This is what the users will actually see, whether human or machine. If the keywords are not on the viewable page, then they should probably not be in any other area of the web page.
  • Links
    The words that are hyper linked on your web pages are sometimes weighed more heavily than the rest of the words in the main body text. So if you want to rank well for "pet shop Boston" you should use that phrase as a hyperlink somewhere on your page.
  • META-tags
    META tags should contain the keywords that appear on the page. As a general rule, if it is on the page, include it in the META-tags. However, the page will not rank well on their use alone. You can read more about META-tags in the next section.
  • ALT-texts
    The ALT-tag is also called "alternative text" and is used to describe images. You can see the text when you move your mouse over an image on a web page (that is, if they have added the ALT-tag). Some search engines read and index the text in ALT-tags but the weighting given varies from engine to engine.
Once again, always remember to only use keywords that are relevant to each of your web pages.
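
To pull these placements together, here is a hypothetical page targeting the phrase "pet shop Boston" from the example above; all names and text are invented for illustration:

<html>
<head>
<title>Pet Shop Boston - Supplies for Dogs, Cats and Birds</title>
<META NAME="DESCRIPTION" CONTENT="A pet shop in Boston offering supplies for dogs, cats and birds.">
<META NAME="KEYWORDS" CONTENT="pet shop boston, pet supplies, dog food">
</head>
<body>
<h1>Welcome to Our Pet Shop in Boston</h1>
<p>Our pet shop in Boston stocks food, toys and supplies for every pet.</p>
<a href="supplies.html">Pet shop Boston supplies</a>
<img src="storefront.jpg" alt="Our pet shop in Boston">
</body>
</html>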

Introduction to page titles

The page title is the single most important piece of information on your page when it comes to achieving good rankings in search engines. The words you use in your title carry much more weight than anything else on your page. So use this limited space wisely.

Page titles are written this way in HTML code:

<title>Page title here</title>

The page title-tag is placed inside the header tag at the top of the HTML page:

<head>
<title>Page title here</title>
</head>

TIP:
You should limit your page titles to 50-60 characters as most search engines do not read or list more than this.

How to write good page titles

First, make sure that all your web pages have unique titles. Go through each of your web pages and write a title that makes use of the most important keywords. The page title appears at the very top of your browser; it may or may not be the title on the front of your page.

Keep in mind that the page title is what the users will see as the hyper linked title in search results, so the phrase should trigger users to click on the link and visit your site!

The goal is to make titles that make people click and make use of your primary keywords for each page. If you want a page to rank well on "dental services Chicago" make sure to use those exact words in the title. Maybe a title like "Dental Services in Chicago - open 24 hours a day" would work well for you.

TIP:
To find the right balance between the use of your keywords and writing a catchy title that makes users click, we suggest you write a few titles for each page. Don't think too much about each of them - just write them down on a piece of paper. When you compare them side by side, it will often be easier to pick the best title.

Introduction to META-tags

META-tags are pieces of text placed in the header of your web pages. Text in a META-tag is used to describe the page. META-tags are often referred to as META-data, which means "data about data", or in this case, data about your page.

META-tags can help search engines better understand and present links to your website. Thus, it is highly recommended that you fill out the META-tags on ALL of your web pages.

META-tags that are important to use

There are many kinds of META-tags and standards, but the only two you need when optimizing a website are the META-description tag and the META-keywords tag.

The two META-tags are written this way in HTML code:

<META NAME="DESCRIPTION" CONTENT="Put your description here ….">
<META NAME="KEYWORDS" CONTENT="Put your keywords here ….">

The META-tags are placed inside the header tag at the top of the HTML code and just below the page title:

<head>
<title>Page title here</title>
<META NAME="DESCRIPTION" CONTENT="Put your description here ….">
<META NAME="KEYWORDS" CONTENT="Put your keywords here ….">
</head>

Adding META-tags to your web pages

How to add META-tags to your web pages depends on how you maintain your website. You should refer to the manual of the editor or content management system you are using for further detailed instructions.

In most cases there will be a simple box somewhere with two fields to fill out: The META-description and the META-keywords, or just "description" and "keywords". Your program will usually take care of inserting the necessary HTML code in the right place for you.

If you are coding your web pages by hand or in a simple HTML editor, make sure you do not make any syntax errors. If the META-tags have not been coded correctly search engines will not be able to read and use them.

Writing good META-descriptions

The META-description is used by some search engines to present a summary of the web page along with the title and link to the page.

Not all search engines use the META-description. Some search engines use text from the regular text on your web pages and some use a combination. All you can do is make sure you have valid META-descriptions on ALL your web pages so that the search engines that use them have something to read.

TIP:
You should limit your META-descriptions to 150-200 characters as most search engines do not read or list more than this.
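
For instance, a description for the hypothetical Chicago dental page mentioned earlier might read:

<META NAME="DESCRIPTION" CONTENT="Dental Services in Chicago - open 24 hours a day. General dentistry, teeth whitening and emergency care.">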

Writing good META-keywords

META-keywords are very often misused and misunderstood. Some webmasters put hundreds of keywords in this tag in the completely unrealistic hope of receiving good rankings for everything listed. Some webmasters even include keywords that have absolutely nothing to do with the actual web page or website.

That has forced the search engines to lower the importance of the META-keywords tag in determining what a web page is about. However, some search engines still read and use the META-keywords. Thus, remember to write relevant META-keywords for all your web pages.

You should write a unique set of keywords for each web page. You should NEVER copy the same set of keywords for all your web pages.

Do not add keywords that are not 100% directly related to each web page. To be safe, only use keywords that are found in the visual text on your web pages.

It is often discussed whether or not to use commas between keywords. Most search engines will not care as they remove the commas before reading the keywords, but some search engines might still use the comma to find exact keyword and keyword phrase matches.

In any case, the META-keyword tag does not carry much weight in the overall ranking algorithm for any of the major search engines and you will never get penalized for either using or not using commas.

Some people use commas because it makes it easier to read in the raw code. This can be helpful if you want to edit the META-tags by hand at a later time.

TIP:
The limit for the META-keyword tag is 1000 characters but you should never add that many keywords. Insert the 3 to 5 most important keywords for each given web page in the META-keywords tag - no more! The more keywords you use, the less weight they each carry.
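
Continuing the hypothetical Chicago dental page, a focused META-keywords tag might look like this; note that every phrase should also appear in the visible text of the page:

<META NAME="KEYWORDS" CONTENT="dental services chicago, dentist chicago, emergency dentist">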

Will META-tags give me top rankings?

No, META-tags are not magic bullets. You will not achieve top rankings on keywords if you only use them in your META-tags.

However, we still recommend you fill out your META-tags as some search engines do read and use the data. In those search engines it will give you a better chance of receiving relevant visitors if you have correct, optimized, and relevant META-tags on all your pages. Remember, only those words that appear in the main body of the page should be in the META-tags.


Popularity measurement

For a search engine to understand which pages on the Internet are the most relevant to search users it is simply not enough to look at the content on each web page. Most search engines today analyze both link structures and click-streams on the Internet to determine the best websites and web pages for any given query.

Convincing people to link to your website is not something you do overnight. It takes time. By implementing a strategy to improve your link-popularity, you can gain a long-term advantage. Similar to building a good reputation or a trusted brand, it takes time. But once you have that "trust", the value is high.

Getting the most relevant websites to link to you will not only be good for your link-popularity as measured by search engines; it will most likely also lead to many good visitors coming directly from the referring websites.

Tip:
The following list can serve as inspiration for where to look for relevant links:
  • All the major and local directories, such as Yahoo, ODP or LookSmart
  • All trade, business or industry related directories
  • Suppliers, happy customers, relevant sister companies, and partners
  • Great combinations (if you sell knives and they sell forks, link to each other)
  • Related but not competing sites
  • Competitors (some companies link to all major competitors)

Link-popularity

The link popularity element measures the number of links between web pages on the Internet. There are two kinds of links to focus on: inbound links and outbound links - links to your website and links from your website.

You can look at inbound links as a form of voting. If site A has a link to site B, A is voting for B. The more votes B gets from other sites, the more link-popularity B gains.

Each website that votes (i.e. has outbound links) is a delegate as they themselves have a number of voters behind them - the websites that link to their website. The more voters they have behind them, the more weight their vote carries on other sites. So the most popular websites have the most link-popularity to pass on.

Some search engines can "understand" the theme of related web pages. For these search engines, inbound links from thematically related websites might get a better scoring.

Outbound links are a bit different. They work similar to references in a scientific report. If you have references to the right authorities in your profession, it shows that you know what you are talking about. You acknowledge the "masters" and thereby put yourself in a category above the ones that don't.

TIP:
Not all links count the same! Links from recognized authorities in your industry count more than links from a small private website on a free host.

Do not use free submission software or services to submit to hundreds of thousands of free directories and search engines just to gain more inbound links. Links from most of these places won't do you any good and there is even a risk that some of the links you get this way will harm your rankings.

Do not participate in organized exchanges of unrelated links between websites (link farming) to boost your link-popularity factor. Most search engines consider that to be spam. Instead, focus on getting inbound links from the relevant major players in your field - they are the only ones that really count.

Internal link popularity

In addition to inbound and outbound links, the link structure of your own website plays an important role in determining the value of each of your web pages.

The pages throughout your website that you most often link to will gain the most internal link-popularity. If one of your web pages has 500 internal pages pointing to it and another only has 10, the first web page is more likely to be valuable to users as there are more pages "voting" for that page.

TIP:
Typically a website will have a navigation bar of some kind that points to the 5-8 most important sections of the website. This navigation bar appears on every web page and therefore boosts the internal link-popularity of those sections. Make sure to include links to the web pages you most want to rank well in your cross-site navigation bar, and make sure that the pages the navigation bar links to have good targeted content. The pages you link to in the navigation bar will be easier to rank well in search engines.
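
As a sketch (the section names are illustrative), such a navigation bar built from plain text links could look like this:

<!-- Repeated on every page, so each linked section gains internal link-popularity -->
<p>
<a href="index.html">Home</a> |
<a href="products.html">Products</a> |
<a href="services.html">Services</a> |
<a href="about.html">About Us</a> |
<a href="contact.html">Contact</a>
</p>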

Click-popularity

Clicks are another way of measuring popularity. Of the millions of results that search engines serve, only a few of them get clicked. Sometimes people come back to the search result shortly after they click, and click another link or make a new search.

If analyzed, these clicks can tell the search engine what users find most important, and where they find the best answers to their searches. However, click-popularity is not widely used and not a dominating factor.

Also, click-popularity is one of the parameters of search engine rankings that you should not try to manipulate in any way! Don't try to boost your own click-popularity by doing hundreds of searches and then clicking on all your own links. It won't work. Search engines that apply this method are way too smart for that.


Common problems

There are a number of things that can cause a search engine to be unable to read and index your website or specific web pages correctly. This section will provide a brief introduction to some of the most important things to watch out for.

If you want to learn more about how to solve each of the issues described below, visit Search Engine Watch.

Frames

Frames have always been a problem both for search engines and for some users. Studies show that users have trouble performing simple standard tasks on framed websites that they would be able to complete on a similar website not using frames.

Jakob Nielsen is a well-known and respected usability guru who has some rather extreme views on the use of frames. Read his article "Why Frames Suck (Most of the Time)" to understand the arguments against using frames.

When a website uses frames, users can rarely bookmark specific pages deep inside the website. If they use the browser's bookmark function and click it later to return, they will end up on the front page - even if they did the bookmarking from another web page deeper in the website.

If users can't bookmark pages within your site, neither can the search engines. If users can only bookmark your front page - as is the case with most websites using frames - that is also the only page most search engines will link to.

Some search engines might try to browse and index individual pages on your website outside the normal frame set. However, that will leave you with a whole new set of problems. If the search engines send visitors directly to content pages that are supposed to be viewed inside a frameset it is usually difficult, if not impossible, for users to navigate further into your website.

To solve this, you can implement various sorts of scripts to re-fetch the content page in the right frameset if it is requested outside the frameset. You should refer to your programmers and JavaScript references for further instructions of how to do this.

Some search engines read the content in the NOFRAMES section of a frameset. The NOFRAMES section is a special part of the HTML code that is presented to users with browsers that cannot read frames (e.g. search engines, very old browsers, and homemade viewers).

The code looks like this:

<noframes>Place content here</noframes>

Browsers and search engines that don't support frames can read the content you place between the two NOFRAMES-tags.

To use the space the best way you should …
  • Fill the section with the same text that you have on the page visitors see. Just use the raw text without images and never use text not visible to the users.
  • Remember to also add links in the NOFRAMES section for the search engines to follow. Use the same navigational links as you do on the page you show your visitors, but keep them as normal text-links (see the sketch below).
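
For illustration only, a small frameset that follows both tips might look like this (the file names are made up for the example):

<frameset cols="20%,80%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
    <p>Welcome to ABC Golf - retailer of golf shoes, golf balls, and clubs.</p>
    <p><a href="menu.html">Menu</a> | <a href="content.html">Products</a></p>
  </noframes>
</frameset>
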
Using the two tips above does not in any way guarantee that your web pages will be indexed and rank well. Many search engines will not read the content in the NOFRAMES-section.

There are more advanced options available if you want a frames-based website indexed and ranked well. The easiest solution, however, is still to get rid of the frames.

If you want to use more advanced methods to make your frames-based website more search engine friendly, consult with an experienced search engine optimization expert or firm.

Two more ways of solving the problems of using frames are:
  • Information pages
  • Inclusion programs
Information pages are static HTML pages (no frames) with information about key products or services provided on your website - one information page for each key topic. An information page is optimized with simple HTML, but otherwise looks and feels like the rest of the website. Creating such pages can in some cases produce search engine friendly entry points to your website.

If you can make additional entry points, you can use Pay For Inclusion programs to have the pages indexed in major search engines. You can use the inclusion programs even if the website is framed - as long as you can provide a unique URL for each target or entry page.

TIP:
Some search engines might compare the content you have in the NOFRAMES-section with what is presented to normal users. If the search engines do not find the same text in the NOFRAMES-section as they do on the user pages you might get penalized for spam.

Dynamic websites and web pages

Dynamic websites are very difficult for search engines to read because they use long and strange URLs.

A traditional static website is comprised of a number of individual files usually ending with .html - e.g. index.html, products.html etc. Each page is a unique file and usually has unique content.

A dynamic website very often only has one or a few files called "templates". The template dictates how to present the content but holds no content itself. All the content is stored in a database. To show a page on a dynamic website, you need to tell the template what to load from the database. Technically, that is done by adding "parameters" to the file name.

If the template is called "pages.asp" and you want to load the content with ID #54, the URL could look something like this:

www.YourDomain.com/pages.asp?page_id=54

That part is not so complicated. However, there is often more than one parameter attached to the URL as the database needs to have more information about sort order, navigation etc. This is when it gets complicated for the search engines.

There is simply no way for a search engine to be sure which parameters identify a new page and which are just a sort order of the content, a navigation setting, or something else that does not justify indexing the new URL as a new unique web page.
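
For example (the extra parameters here are hypothetical):

www.YourDomain.com/pages.asp?page_id=54&sort=price&view=print

A spider has no reliable way of knowing whether sort=price identifies new content or merely reorders the page it has already indexed under page_id=54.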

There are many other problems related to having dynamic websites and websites built on content management systems indexed in search engines. It is unfortunately not possible to cover all of it in this tutorial.

The good news is that more and more search engines are now starting to include dynamically created pages in their results.

Flash and Macromedia Director

Search engines do not read Flash and Macromedia files. Content and links in any of these formats will not be accessible by search engines.

Search engines basically only read regular text on a web page. They don't read text or follow links within special formats such as Java Applets or any other formats that require the user to have a certain program installed.

You can read more about Flash, Macromedia and Java Applets in the Links and Navigation section.

Image maps or navigation buttons

Not all search engines read image maps. Most search engines do read navigation buttons. If you use image maps, you should repeat the links they contain as regular text-based links.
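
As a sketch (the file names and coordinates are illustrative), a page using an image map could duplicate its links like this:

<img src="nav.gif" usemap="#nav" alt="Navigation">
<map name="nav">
  <area shape="rect" coords="0,0,100,30" href="products.html" alt="Products">
  <area shape="rect" coords="0,30,100,60" href="contact.html" alt="Contact">
</map>

<p><a href="products.html">Products</a> | <a href="contact.html">Contact</a></p>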

You can read more about image maps and navigation buttons in the Links and Navigation section.

JavaScript and DHTML navigation

Search engines do not read JavaScript. You should not put important content or navigation inside JavaScript unless you also have the same links and content duplicated as regular text and text-based links.
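
For example, if your navigation is written out by JavaScript, you can repeat the same links as plain HTML in a NOSCRIPT section (a minimal sketch; the page name is illustrative):

<script type="text/javascript">
// Navigation generated by script - invisible to most search engine spiders.
document.write('<a href="products.html">Products</a>');
</script>
<noscript>
  <a href="products.html">Products</a>
</noscript>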

IP delivery, agent delivery, cloaking, and personalization

It has become more and more common for websites to implement some degree of personalization, regional targeting, or other forms of individualization in the way web pages are served to individual users.

In its most simple form it can be a program on the server that checks what browser people are using, and serves up a specially tailored version for that browser. The same kind of program can also be used to check if people have specific plug-ins installed so that they get a version of the website they can read.

A more advanced use would be to check what country the user is coming from to serve a local version. Some portals and search engines have been doing that and many other cross-national websites do it. There are many legitimate reasons to do so including business strategies, marketing, and legal issues with products only allowed in certain countries.

The same techniques can also be used to serve different content to search engine spiders and normal human visitors. These techniques are usually referred to as "cloaking".

In most cases search engines do not like the use of cloaking. Thus, do not use cloaking unless you have a very good reason to do so.

Cookies

Spiders do not read cookies. Therefore, if your website uses cookies, make sure that all the web pages you want to have indexed can be accessed without accepting the cookie.

A cookie is a small text file that a web server can save in a user's browser for future retrieval when that same user returns. It can be used to store login information or other preferences to make a particular website easier or better to use.

Cookies are very safe to use in the sense that they cannot be read or shared across different users or websites. If a cookie is set on your browser, the website that wrote it is the only website that can read it. Also, other users of a website cannot get to the information in your cookie - it can't be transferred or shared.

Thus, even if the search engine spiders did accept cookies, there would be no way for them to re-fetch the cookie from the user's browser when that user clicks a link in a search result. Consequently, there is really no point in writing or sending cookies to the search engines.

Tip:
You can turn off cookies in your own browser to test if your website can be accessed without them. Refer to your manual or help files of the browser you are using for instructions on how to turn off cookies. Usually these instructions will appear under "advanced options".

back to top

SUBMIT AND INDEX PAGES

Getting indexed

Search engines do not always include all web pages from a web site. Usually they will only include a sample of your pages - the ones they determine to be most valuable.

Some of your web pages will be more important to have indexed than others. A product information page is far more important to have indexed than a contact form, as it is more likely someone will search for your products than your contact information.

Search engines do not always find the right pages to index by themselves. Sometimes they need a little help and guidance. In this section you will learn how to submit your web pages to search engines to get them indexed.

back to top

Submitting to search engines

Submitting a website to search engines is not always enough to be included or continue to be included. Most search engines analyze the link structures and click streams on the Internet to determine which web pages are most important for them to keep in their indexes.

If no other site is linking to your website it will be harder - in some cases, even impossible - to get indexed. You should therefore try to get as many of the biggest and most relevant websites to link to you.

How to submit to search engines

Most search engines have a form you must fill out to submit your website. You will usually find the link at the bottom of the front page, labeled "Add URL".

Generally, submitting your front page is enough - the search engines will follow links from that page to the rest of your website. That said, it can still help to submit all of your pages. BuildTraffic.net makes this easy by spidering your domain and retrieving all of your pages for submission.

However, if you have important sections of your website that are not directly accessible through the regular navigation you can also submit them. If you have a site map (a page with links to all the web pages on your website) you can submit that too, to help the search engine spiders find all your content.

TIP:
The easiest way to get your website into the search engines is by having it included in the major directories first, such as Yahoo, ODP, and Looksmart. Most search engines will consider a website that is included in the major directories to be of higher value and therefore put a lot more focus on getting such websites included.

Is your site indexed?

In most of the major search engines you can verify if your website or specific web pages have been indexed. This is not the same as your ranking, but serves as the basis for a user to find your site. If your website is not indexed it will never rank and never be found.

For example, in Google, just do a search for your domain name. If you appear in the results, then you are listed.
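
You can also use Google's site: operator to list the pages it has indexed from your domain:

site:www.YourDomain.com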

Do not over-submit

You should never submit a web page to a search engine if it is already indexed, unless you have made changes to the content of your website or have noticed that your site is no longer listed.

back to top

Excluding pages from getting indexed

If there are pages or directories on your website that you do not want search engines to index you can exclude them using one of two different methods:
  • Robots.txt
  • META-robots code

Robots.txt

Robots.txt is a file that you place in the root of your web server. The file uses a simple syntax to exclude specific types of users - in this case search engine spiders - from parts of your website. You can either exclude specific search engine spiders or all spiders. To exclude all search engine spiders from all directories on your web server, you should write the following:

User-agent: *
Disallow: /

Note: This would disallow everything including your home page!
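
More commonly, you only want to exclude part of a site. For example, to keep all spiders out of a /private/ directory (a hypothetical name) while leaving the rest of the site open:

User-agent: *
Disallow: /private/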

Learn more about how to write robots.txt files at SearchTools.com.

TIP:
We recommend that you validate your robots.txt file before uploading it. There is no way to predict how a search engine will interpret a robots.txt file with errors. You can use the free validator at Search Engine World.

META-robots

META-robots are small pieces of code you can place in the header of your HTML documents. You can use META-robots tags if you don't have access to your web server's root or if you want to exclude single pages on your website. You can read more about how to use the META-robots code at SearchTools.com.
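
For example, to keep a single page out of the indexes and stop spiders from following its links, you could place this tag in the page's <head> section:

<meta name="robots" content="noindex,nofollow">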

back to top

MONITOR AND ANALYZE RESULTS

Traffic analysis

There are various ways you can track and monitor the traffic that comes to your website as a result of your search engine optimization. With some of the methods you will be able to see not only how much traffic you received, which search engines it came from, and what keywords people used to find you, but also how users behave after they land on your website and how many of them actually end up as customers.

All that information is available to you! Some of it is quite easy to access. As you scale up and want more information and better, faster reporting, you will have to spend more resources on hardware, software, and - not least - the time to handle it.

In this section we will give you a very brief introduction to each of the main ways you can monitor and analyze the traffic on your website and the results of your search engine optimization.

TIP:
You will find a good, more in-depth tutorial about traffic analysis at DigitalEnterprise.org.

Collecting data, mining data and reporting results

There are three distinct parts to traffic analysis:
  • Collection of data
  • Mining/analysis of data
  • Reporting results
First, you need to collect the visitor data that you want to use as a basis for your mining and reporting. Each of the three methods described in this tutorial has its own advantages and disadvantages.

Second, you need to filter the data - or data mine it - to extract the information you need and make the calculations you want for your reports.

Finally, you need to make some graphical reports of the data you have collected and filtered. This can be real-time or close to real-time reports through a web-browser or printed reports in Word or PDF format.

Sometimes traffic analysis packages come bundled with two or all three elements included. Trackers and network sniffing products usually include all three parts, and server log analysis tools include the last two.

Statistics are not absolute

There are no absolute numbers and standards on the Internet with regards to statistics and traffic analysis. Each method of collecting data multiplied by the endless number of ways to process and mine it makes it virtually impossible to compare the results of analysis across websites.

You should use traffic analysis to track and monitor trends over time for your own website. Traffic analysis can also be used to monitor real-time statistics to tune live ad campaigns.

Trackers - hit counters

Trackers, or hit counters, are essentially small pieces of code placed in a hidden part of your HTML code. Each time your page is loaded the script calls up another server whose database records that your site had a visitor. The tracking server can also record anonymous information about the visitor that requests the web pages at your server.
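
A typical tracker snippet is nothing more than a reference to a script (or a tiny image) on the tracking company's server - something along these lines (the URL is hypothetical):

<script type="text/javascript" src="http://tracker.example.com/count.js"></script>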

From a web interface you will be able to follow the traffic on your website in almost real time. The types of reports and features you get access to vary a lot from one tracker to another. With most of them you can see what search engines people come from, what keywords they used, and in some cases what search engine spiders have been visiting your website.

The advantages of trackers:
  • Easy to get started
    All you have to do is copy and paste a piece of code into your pages and you are ready.
  • No computers or software necessary
    You don't have to invest in either hardware or software - it is all on the computer that hosts the tracker.
  • Count cached page views
    Even if a visitor loads the web page from cache (proxy cache at the ISP or local cache in the browser), and not from your server, it will trigger the script and count as a view. All views therefore will be counted.
The disadvantages of trackers:
  • No raw data
    You get the reports you get - that's it. As you do not have the raw data that is used for the reporting, you cannot perform your own data mining or tweak any numbers. You will have to live with the reporting that comes with the package.
  • Dependency on others
    With a tracker you are dependent on another company's service. If they fail and lose the data, there is nothing you can do about it.
  • The best trackers are expensive
    If you run a large website with many visitors, the best trackers can become very expensive. In that case you can always consider other options.
TIP:
You can sign up for a free tracker at ServuStats.com

Server Monitor Log Analysis

The Server Monitor is a small application that sits on the server and counts who is requesting what from the server. The Server Monitor stores the information in a log file - commonly known as a "server log file". The log file is a standard text file with one line of text for each action.
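
For example, a single request recorded in the widely used common log format looks like this:

127.0.0.1 - - [10/Oct/2012:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326

One line captures the visitor's IP address, the time of the request, the page requested, the status code, and the number of bytes sent.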

This log file can be analyzed using a number of dedicated applications. One of the most well known producers of such applications is Web Trends.

If you are running your own servers or have a professional hosting solution, you should be able to get access to the raw server monitor log files. In this case you can run your own reports using a log file analyzer. However, for large websites log files can become very large, taking up megabytes of data every day.

Some service providers run server monitor log analysis off a central server and will provide you with access to online reports. They generally include good standard reports, but you will not be able to perform your own filtering.

Advantages:
  • Standard format
    Server monitor logs are in a standard format - at least, most servers write in a format that all analysis and reporting programs can read and understand. This, and the fact that there are many reporting tools available, makes it a very flexible solution. Using the same log files, you can easily switch from one reporting package to another.
  • Cheap to get started (for a small website)
    For a small website with a reasonable amount of daily log data this can be a cheap way to get started with traffic analysis.
  • Impressive reporting
    Even the small and inexpensive reporting tools have an impressive selection of printable reports with graphs, tables, and explanations.
Disadvantages:
  • Huge log files on large websites
    If you run a website with millions of page views a month you will have extremely large log files - several megabytes of data a day. All that data has to be transferred, converted, filtered, data mined, and processed for reporting, which requires lots of bandwidth and computer power.
  • One more program is one more problem
    The server monitor application is one more program that has to run on the server. The more programs you run, the higher the risk of a crash. If you use trackers or network package sniffing you can turn off the server monitor application and save up to 20% of the resources on the server and limit the risk of the application crashing.
  • Not real time reporting
    Due to the huge amount of flat text file data that needs to be processed, real time reporting - or even close-to-real-time reporting - is often not an option. Usually you will have to accept reports aggregated every 24 hours at best.
  • Not all visitor actions are available to the Server Monitor
    Some actions, like the "stop request" (when a visitor hits the browser's STOP button before the web page has loaded), are not recorded in the server monitor log.

Network package sniffing

With this technique the raw data packages on your network are monitored as they pass by. There is no need to insert scripts in your web pages or have the server monitor application produce log files. It is all handled on a network level.

This gives you several advantages over all other monitoring methods. First, you get access to data that neither a server monitor nor a tracker can pick up, such as "stop requests" (when a user hits the STOP button before the page is loaded). You will also be able to access extremely large amounts of visitor data much faster. In fact, with some network sniffing products you can get statistics in real time - for any size website.

Network sniffing in itself is just a way of collecting visitor data - like a server monitor producing a log. However, network sniffing products often come bundled with some kind of reporting software as well as modules to output the data as traditional server log files or export it directly to a database for further data mining and reporting.

Advantages:
  • Easy to set-up and run
    No scripts to implement or server logs to maintain.
  • Free up resources on your web server
    When you use network package sniffing you can turn off your server monitor application. This will free up 15-20% of the resources on your servers - that's how much power it usually takes to write the logs.
  • Access to all data
    Network package sniffing is the only method that provides access to all the visitor data.
  • True real time data access
    Network sniffing is the only way to get real time data access and reporting features. When a user sends a request to your server, you actually monitor the data package before it hits your server, so you can react to it before your server answers the user.
Disadvantages:
  • Can be a problem to use on secure servers
    If most of your website is on secure web servers, this may not be the right solution for you. Talk with the supplier of the network sniffing solutions you are interested in to find out more about how they handle secure servers.
  • Can be expensive
    For a small site network sniffing can be expensive compared to other methods. For large sites it can be cheaper.

back to top

Ranking analysis

Don't get obsessed with rankings! Yes, it is important to have good rankings for your primary keywords but there are better ways to monitor the results of your search engine optimization campaign than ranking reports.

One of the best ways is through traffic analysis as described in the last section. With the right reporting tools, you will be able to see what search engines visitors came from and what keywords they used to find you. That is a lot more useful to determine the true value of your work.

If and when you do check your rankings, you should not check thousands of keywords for your website 10 times a day. This places an unreasonable load on the search engines' servers, which the search engine companies do not like.

Within the services you get using BuildTraffic.net, you can see the average ranking on the keywords that brought you visitors.

Rankings change all the time

Rankings change all the time. Sometimes you don't get the same results from a search engine if you test it at two different locations at the very same time.

If you want to make ranking reports using automated tools or services, you should be aware that they only reflect the situation at the second the report was made and from the location it was run. From another location at another time, the results might look very different.

back to top

What Are Search Engines and How Do They Work?

How do search engines work?

Search engines use automated software (known as robots or spiders) to follow links on Websites, harvesting information as they go. When someone submits a query to a search engine, the engine returns a list of sites, ranking them on their relevance to the keywords used in the search. How search engines assess your site and determine the relevance of the words often depends on how the specific search engine works. Some engines, such as Excite, use artificial intelligence to recognize concepts that frequently appear together. Other search engines list the more popular sites first. There is no way to guarantee that your site will come up on top in a search. After all, we can't all come up on top!

Which search engines should I register with?

All of them - well, most of them wouldn't hurt. It is often quoted that the top search engines together account for 95% of web traffic: Yahoo!, Google, Bing, AltaVista, Excite, Infoseek, HotBot, Lycos, and WebCrawler. There are hundreds for you to choose from.

How long does it take to be listed?

Don't expect your site to show up in search engines immediately. It can take anything from 24 hours to 6 weeks or more! It depends on the search engine. Most search engine crawlers typically retrieve only a few pages from each site on each visit and visits can be weeks apart.

How do I get my Website to the top of search engines?

I have read many times that if your Website doesn't appear within the top 10 of search engine listings, people aren't likely to find you. This may be true, but it's not the only place where people can find you, and with millions of Websites out there, we can't all be in the top 10! I have found a lot of new Websites through links from other Websites. There is no simple trick that will get your Website to the top of search engines. Also, the search engines change the way they prioritize the listings every few months, so you can spend all your time trying to beat the system.

What are Search Engines looking for ?

Search engines rank web sites on 5 main criteria:

1) Web page Title
First, and probably most important, is the title of your web page. If the title of your home page, for example, says simply "ABC Company Home Page", you're going to find your listing at the very bottom of any search result list. Each of your web pages has a title (you'll notice it in a blue bar across the top of your browser window). The title of each page should, in one sentence or less, describe the contents of the page with as much detail as possible. For example, don't just put "Homepage ABC Golf Store"; instead use something like "ABC Golf - retailer of golf shoes, new golf balls, used golf balls, designer clubs and more!"

2) Description
Next in importance is the description meta tag. This tag is not seen by visitors but is embedded within the html code of your web pages. The tag should describe the contents of the current page - and where possible, use some of the same keywords that were used in the web page title (this will increase your relevancy ranking in many search engines). The description is often found in search engine listings: first they list your title, then they follow the title with your description. So, think of the page title and the description as a small advertisement. It has to be specific and catchy enough that a user will click on your listing instead of someone else's listing. In both the description and title tags, where possible, you should try to use unique words and phrases - this will help to differentiate your pages from competitors' pages.

3) Keywords
The keyword meta tag isn't used by all search engines; however, it can still be quite effective. Search engines use keywords to categorize web pages. However, simply putting in a long list of words won't do you much good: if they are too general they will be ignored, and if you have too many your page may be black-listed (i.e. removed from the search engine). Instead of using single keywords, use short specific phrases. For example, don't use "golf accessories, golf balls, golf clubs" - instead use "custom golf tees, white golf gloves, used golf balls, golf clubs for left handed players".
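
Putting the first three criteria together, the head of the golf store's home page might contain something like this (an illustrative sketch built from the examples above):

<head>
<title>ABC Golf - retailer of golf shoes, new golf balls, used golf balls, designer clubs and more!</title>
<meta name="description" content="ABC Golf sells custom golf tees, white golf gloves, used golf balls and designer clubs for left handed players.">
<meta name="keywords" content="custom golf tees, white golf gloves, used golf balls, golf clubs for left handed players">
</head>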

4) Popularity
Some search engines will give more popular web pages a higher ranking. How do they determine popularity? The more pages that link to your page, the more popular the search engines will consider your page to be. And if the linking pages have similar content to your page (i.e. they are about the same subject matter, using similar keywords and descriptions), you'll also get a relevancy boost. Wherever possible, you should see if other web site owners (non-competitive, of course) would be willing to put a link to your site. Often they'll link to you if you link to them. Or, if you provide information on your site that would be useful to their readers, they're likely to create a link to your page anyway.

5) Page Content
It's important to make sure that your pages contain a lot of text. Search engines can't index images - so even if some of your content is displayed graphically, you should also have it presented in a text format. The content of your pages should contain some of the same words that you used in the page title, description, and keyword meta-tags. This increases your relevancy positioning. In fact, if it seems to the search engines that the body of your page is not related to the page title or description, they will penalize you by giving you a much lower ranking or dropping your listing altogether. Also, keep important keywords in the content section of your page as close to the top of your web page as possible. The search engines will give your page a boost in the rankings because they will assume your page is more relevant to those keywords than other web pages that contain the exact same keywords lower down in the body of the page text.

Now that you have a basic understanding of what the search engines are looking for you need the tools that will allow you to use your knowledge to skyrocket your listings to the TOP of Search Engines.

BuildTraffic.net offers search engine optimization tools that will get you a top ranking in the search engines.

These are the same tools used by the "Search Engine Experts".

Search Engine Optimization Tutorial

Part 1 - Why Search Engines Are Important To You. Learn how much traffic you could be getting.
Part 2 - The Basics. What's the difference between Yahoo and AltaVista, and more?
Part 3 - How to get listed. What to do and what you can expect to happen.
Part 4 - How to improve your ranking. Just being listed isn't enough. Learn tricks to get a top ranking.

Part 1 - Why Search Engines Are Important To You

You've built your site. You've tested it. You're ready for business. So where are all the people? You need to get the word out about your great site, but with limited resources and no advertising budget, you need some way to attract visitors. What can you do? Try the No.1 way people find out about sites - Search Engines! Surveys show that over 85% of internet users find new Web sites by using search engines. Other surveys show that after email, search engines are the most popular activity on the web.

Instead of blowing your limited resources on banners and ads galore, use search engines to send motivated visitors to your site. Click-through rates on banner ads continue to drop, while search engine traffic is on the rise and growing every day.

Search engine optimization (techniques to improve how your Web site ranks in the search engines) has been called the cheapest and most effective marketing tool available. Expect to pay professional firms that specialize in search engine optimization $2000 for a small site - $10,000 is common for big companies. If you are in an extremely competitive business, plan on shelling out $25,000 and up. Oh, and don't be surprised if you have to sign a six month or even a year contract. That's the norm for a professional shop. You have an option to "do it in-house". With a little help from BuildTraffic, you can do search engine optimization yourself, improve your site's search engine ranking, and save yourself a lot of money, frustration, and time.

Search engine traffic is the kind of traffic you want. Why?

Traffic you receive from search engines is already targeted. Visitors arriving at your site from search engines have entered a keyword relevant to your site, so they are already interested in your product or service. This is the best source of potential customers you can have.

Search engines are the number one way users find new sites. Surveys show that over 85% of users rely on search engines to locate information on the Web. If you optimize your site to do well on the engines, then register your site with search engines, you should see increased traffic to your site.

Search engines are free to users and users know where and how to use them. One of the first things a novice to the Internet learns is how to use Yahoo (actually Yahoo is a directory, but we'll discuss the difference later).

Part 2 - The Basics

Before you can position your site to do well in the search engines, you need to understand search engine fundamentals. A search engine is a giant database that lists sites on the Internet. You access the database when you enter keyword searches and receive a list of relevant sites.

Search sites are the Internet's Yellow Pages.

Think of a search engine as a giant, automated version of the yellow pages. If you need information on "party planning" from the yellow pages, there are several steps to retrieving it: you pick up the book, look up the relevant category, scan the entries, and choose one to contact.

You can repeat this process online using a search engine.

Search Engines Versus Search Directories.

People use both search engines and directories without ever realizing there is a difference between the two.

What is a Search Engine?

Search engines send automated computer programs (called robots or spiders) to crawl the Internet in search of Web pages. Basically these spiders follow links to travel from URL to URL. When they visit your Web site, the robot indexes (or records) the text of your page or pages (if it is a deep crawling spider) and stores it in the search engine's index. Later, when a user enters a search query at the search engine's Web site, the search engine scans Web pages in its index for pages that provide the best match.

In theory, the search engine spider is supposed to be able to find all the sites on the Internet. However, since new sites are being added daily, it's risky to assume that the spider will find you. Expedite the process by submitting your URL to the search engines.

What is a Directory?

Unlike the automated search engine process, each entry in a directory is first reviewed by a human being. You submit a short description to the directory for your entire site, or editors write one for sites they review. A keyword search will only look for matches in these descriptions, so be careful how you describe your site. Techniques to receive a high search engine rating will not work with a directory. While good content is necessary for search engines, both good content AND visual appeal are mandatory in human-edited directories. Remember, manual review takes time! The typical time lag between submission of a site and its actual listing in Yahoo is five months. You can speed up the process at some directories, but expect to pay for that service. This trend will probably continue.

Search Engine and Directory Hierarchy.

There are literally thousands of search engines on the Internet, but naturally you're most concerned about your ranking on the high-traffic sites. Some of the smaller search engines may not bring you a lot of traffic, but your listing gives you another source of links (which can help in your overall link popularity building). If you can do well on the sites listed below, you will probably do well on others too.

Top Search Engines: Google, Bing, AltaVista, iWon, Lycos, HotBot, Teoma, Ask Jeeves, WiseNut, Overture

Top Search Directories: Yahoo (the biggest search site of all - do well here), LookSmart, Open Directory, Zeal, AOL

Part 3 - How to get listed

Don't wait to be discovered! Submit your URL directly to the search engine or directory.

Search Engine submissions. There are two ways to submit to search engines and directories: manually, or using an automated submission tool.

Bottom Line: Submit your site to directories by hand. Save time by using a submission tool to submit to search engines.

How often do I submit?

Your best strategy is to submit weekly until your site gets listed. Check your listing frequently. If your site disappears suddenly, you may be the victim of a search engine database omission. Search engines frequently have multiple versions of their databases and they aren't always in sync. You may be listed in one version of their database and not in another. Your only recourse is to resubmit your site.

To see if you have even been picked up in a search engine, go to the search engine's site and do a search with your company's domain name as the search query.

How long does it take to get listed?

These times vary with search engines and directories. The search engines will optimistically report a very short time, but most sites get listed within the following times:

Google: 4 - 6 weeks
Lycos: 4 - 6 weeks
Bing: 6 weeks

The reality is that most search engines and directories are very backlogged and slow to get listings added to their databases. Don't be surprised if you experience waits much longer than these. As mentioned earlier, you may wait over 5 months to see your site get listed on Yahoo!, if it gets listed at all. We recommend that if your site isn't listed within the time periods above, that you resubmit your site to the search engine where you are not listed.

Is being listed in the search engines good enough?

Unfortunately no.

A listing won't automatically increase your traffic. A good ranking may. You want to be listed on the first three pages of the search engine results. Most web surfers aren't patient enough to look beyond the first 20-40 listed links. In fact, there is a considerable drop-off just going from the first to the second page.

Part 4 - How to improve your ranking.

How do search engines rank pages?

Search engines use a ranking algorithm to determine the order in which matching web pages are returned on the results page. Each web page is graded on the number of the search terms it contains, where the words are located in the document, and other criteria that change frequently.

All search engines have a different method of ranking. That's why you might rank number 1 on one engine and number 25 on another. Robots look for relevance and rank results using a secret, ever-changing algorithm. Some look at the TITLE, some look at META tags, some look for link popularity. Search engine optimization means optimizing the Web site for the best possible positioning based on the page's keywords and description.

BuildTraffic's Search Engine Tools is a search engine optimization package that will walk you through the whole optimization process. BuildTraffic has numerous experts who have conducted extensive tests to help identify what techniques work to better your ranking in the search engines. Additionally, we have identified what can get you into trouble with search engines.

General tips to get a good ranking.

1. Create a good site with good content.

This is critical, especially as search engines grow in sophistication. If your site contains worthwhile material, users will return to your site and will recommend it to others. Other sites will link to you - which will in turn help you by improving your link popularity.

2. Pick keywords visitors will actually use on a search engine query.

3. Include keywords in your TITLE tag.

Pages with keywords appearing in the TITLE are assumed to be more relevant to the topic than those without.

4. Use keywords in META Keyword and Description tags.

Using META tags will not hurt you in search engines that don't use them, and they can definitely help you in search engines that do index them. While they are not as important as the TITLE tag, META tags can give you the edge over your competition since most web sites don't even use them.

5. Use your keywords throughout your page.

Search engines will check to see if the keywords appear near the top of a web page, such as in the headline or in the first few paragraphs of text. They assume that any page relevant to the topic will mention those words right from the beginning.

6. Have a good keyword density on your page.

Keyword density is derived by dividing the frequency of that word by the total words on the page. Frequency is a major factor in how search engines determine relevancy. A search engine will analyze how often keywords appear in relation to other words in a web page. Those with a higher frequency are often deemed more relevant than other web pages. This can turn into a balancing act, as too high a density can be considered spam by some engines. Usually you are safe if your keyword density falls between 1-5%.
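
For example, if the phrase "used golf balls" appears 8 times on a 400-word page, its keyword density is 8 / 400 = 2% - comfortably inside that range.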

7. Continually work on improving your link popularity.

Listings on popular Web sites can increase your traffic significantly. They do this in two ways: visitors can click through to your site directly from those listings, and most search engines use link popularity as a relevance criterion. For example, the Google search engine (not their new directory) is based almost entirely on link popularity.

Summary

Success with search engines and directories is not one magical thing you do specifically. It is the culmination of your whole strategy. It is a time consuming, labor intensive activity with great rewards. Bottom Line: Use BuildTraffic's Search Engine Tools at least once a month to help you get more traffic from the search engines.

BuildTraffic's Search Engine Tools is a search engine optimization powerhouse. It's a mega tool bundle that can help you improve your search engine ranking and bring more traffic to your site. Our system walks you through the whole search engine optimization process: from selecting your most effective keywords and building META tags, to optimizing your pages for peak search engine performance, to submitting your pages to the search engines of your choice, and tracking your site's performance in the search engines over time. BuildTraffic has the one-stop solution that will greatly improve your search engine performance.

Search Engine Spam: Useful Knowledge for the Web Site Promoter

Before getting started on using gateway pages and other HTML techniques
to improve your search engine ranking, you need to know a little about spam
and spamdexing. Spamming the search engines (or spamdexing) is the practice
of using unethical or unprofessional techniques to try to improve search
engine rankings. You should be aware of what constitutes spamming so as
to avoid trouble with the search engines. For example, if you have a page
with a white background, and you have a table that has a blue background
and white text in it, you are actually spamming the Infoseek engine without
even knowing it! Infoseek will see white text and a white page background,
concluding that your text color and your background color are the same, so
you are spamming! It will not be able to tell that the white text is actually
within a blue table and is perfectly legible. It is silly, but that will
cause that page to be dropped off the index. You can get it back on by changing
the text color in the table to, say, a light gray and resubmitting the page
to Infoseek. See what a difference that makes? Yet you had no idea that
your page was considered spam! Generally, it is very easy to know what not
to do so as to avoid being labeled a spammer and having your pages or your
site penalized. By following a few simple rules, you can safely improve
your search engine rankings without unknowingly spamming the engines and
getting penalized for it.

What constitutes spam?

Some techniques are clearly considered an attempt to spam the engines and,
where possible, you should avoid them - for example, text in the same color
as the page background, stuffing pages with long lists of repeated keywords,
and flooding the engines with large numbers of near-duplicate pages.

Gray Areas

There are certain practices that can be considered spam by the search engine
when they are actually just part of honest web site design. For example,
Infoseek does not index any page with a fast page refresh. Yet, refresh
tags are commonly used by web site designers to produce visual effects or
to take people to a new location of a page that has been moved. Also, some
engines look at the text color and background color and if they match, that
page is considered spam. But you could have a page with a white background
and a black table somewhere with white text in it. Although perfectly legible
and legitimate, that page will be ignored by some engines. Another example
is that Infoseek advises against (but does not seem to drop from the index)
having many pages with links to one page. Even though this is meant to discourage
spammers, it also places many legitimate webmasters in the spam region (almost
anyone with a large web site or a web site with an online forum always has
their pages linking back to the home page). These are just a few examples
of gray areas in this business. Fortunately, because the search engine people
know that they exist, they will not penalize your entire site just because
of them.

What are the penalties for spamdexing?

There is a disproportionate amount of fear over the penalties of spamming.
Many webmasters fear that they may spam the engines without their knowledge
and then have their entire site banned from the engines forever. That just
doesn't happen that easily! The people who run the search engines know that
you can be a perfectly legitimate and honest web site owner who, because
of the nature of your web site, has pages that appear to be spam to the
engine. They know that their search engines are not smart enough to know
exactly who is spamming and who happens to be in the spam zone by mistake.
So they do not generally ban your entire site from their search engine just
because some of your pages look like spam. They only penalize the rankings
of the offending pages. Any non-offending page is not penalized. Only in
the most extreme cases, where you aggressively spam them and go against
the recommendations above, flooding their engine with spam pages, will they
ban your entire site. Some engines, like HotBot, do not even have a lifetime
ban policy on spammers. As long as you are not an intentional and aggressive
spammer, you should not worry about your entire site being penalized or
banned from the engines. Only the offending pages will have their ranking
penalized.

Is there room for responsible search engine positioning?

Yes! Definitely! In fact, the search engines do not discourage responsible
search engine positioning. Responsible search engine positioning is good for
everybody - it helps the users find the sites they are looking for, it helps
the engines do a better job of delivering relevant results, and it gets
you the traffic you want!

As a webmaster, you should not be too afraid that you are spamming the search
engines in your quest for higher search engine rankings. No question about
it, though, spam is something that every webmaster should understand thoroughly.
Fortunately, it is easy to understand it. So learn the rules, re-examine
your web pages, resubmit to the engines, then create gateway pages to get
better ranking on the engines, using the rules above. If you need any more
information on search engine spamming and search engine positioning, see
http://www.buildtraffic.net. We wish you the best of fortune in your
web promotional efforts!

The Ultimate Keyword Optimization Guide for Search Engine Positioning

We all know it and we all do it. Whenever the typical web user needs to
find something on the web, he or she will almost always instinctively go
to one of the top search engines and run a search. Then, he or she will
have a look at the first 30 search results returned (usually the first three
pages of results), hardly ever looking beyond that. If nothing looks appealing,
they will run another search using a variation of the keywords they used
on the first search. Again, they will look at the first 30 results. If they
still find nothing of interest, they may switch to a different search engine
and repeat the process. This, believe it or not, is the typical web navigation
behavior of at least 86% of all the 110 million web users out there. The
question is, does your web site capitalize on this behavior? If the answer
is yes for you, then no matter what a user searches for that has any relation
at all to your products, they should find you even if they hadn't thought
of finding you! That translates to a very high amount of daily traffic to
your site. Is this what is happening for you?



We bet the answer is no. Very few web sites really capitalize on this behavior.
However, you can learn how to do so. For this tutorial, we shall assume
that you are in the business of selling wedding gowns. You have a web site
that does a good job of displaying your wedding gowns, but like most other
web site owners, your web traffic is really low and clients are simply not
finding you. You have done everything you can think of, but your pages hardly
come up on the search engines. What next? Well, get ready to learn one of
the most effective online marketing strategies.



Gather Your Keywords


We know that the web user will think of a keyword or phrase and type that
into a search form on an engine. That is his or her first step, and so it
should be ours. This is a very important step! Get a piece of paper or open
up your text editor and keep writing all the keywords and phrases that you
think of as you go through the steps outlined below. Selecting the right
keywords or keyword phrases will skyrocket your traffic. Research has shown
that most people use phrases of 2 to 3 keywords when searching. People have
come to learn that just typing in one keyword in a search will get them
thousands of records that they just do not need. Therefore, they combine
two or three keywords in their searches.



Your list should end up with 50 or more keywords and phrases. You
must write down every word that a person might use to look for your site
PLUS words they may use to look for other related information or products.
This is no easy task. It involves some research and a fair bit of thought.
Remember, thousands of people out there looking for wedding gowns may not
all think alike, look for wedding gowns directly, or even realize at the
time that they can get wedding gowns on the net! To get a good list, here
are the steps that you should go through. Keep your pen and paper handy
so that you keep writing as you go along each step.














Generating Link Popularity

How Does Link Popularity Work?

When some search engines index a page, they will check their database for how many pages are linking to that page (or domain). This will give the page a 'link popularity' rating. In some cases, such as Google, it will also use the link popularity of the site that is linking to that page to determine the quality of the link. For example, a link from C-net would push the rating much higher than a link from a personal page.

When a search is performed, the engine will first scan for relevant results. Once it has the results, it will use the link popularity rating to sort them. You can see great examples of this by performing searches in Google. Most of the results for general queries are from high traffic sites. For example, the top 3 results on Google for an 'entertainment' search are Netscape, Lycos, and Yahoo.

Getting Links

So how is it that you get links?

First, the Website must have quality content. A site with good content will get plenty of links from around the Web through no extra effort by the webmaster.

Another great way to get "quality" links is to get listed in search directories such as Yahoo and the Open Directory. These count as links from high traffic Websites, and will increase your link popularity significantly. Getting a listing in the Open Directory will get your link on several Websites, such as Lycos, AOL, and Altavista, since the data is syndicated on many engines.

You can find out more about getting listed in directories here.

Maintaining an affiliate program is also helpful. Many search engines will follow affiliate links and they will factor into the site's link popularity. An affiliate program can effectively give a site thousands of links on the Web.

Giving Links to the Engines

There could be thousands of pages linking to a site, but they don't help unless the search engines are aware of them.

So What's the Big Deal About Meta Tags?

When anyone publishes a web page, it really doesn't matter if it's a Business
related site or just your own personal page, it's not much fun if you don't
have any visitors, right?

Of course someone who is trying to build an income online is very much more
interested in this than one who is only playing with a "home page".

Ya gotta' have traffic (i.e.: visitors) if you ever hope to make any sales.

According to some industry reports, better than 50% of visitors find your
site by doing a "Search" with the Search Engine they like the best.

Most everyone has their favorite so you never know how someone might find
you.

Every one of these Search Engines has unique ways of handling the additions
of new sites. Some utilize Meta Tags and some do not, but here's the catch
- they can change the way they do any or all of this at any time they decide
to, without any prior notice.

In my opinion, you can drive yourself nuts trying to make sure you are in
the "top" whatever.

There are a whole bunch of services out there loudly proclaiming a "guaranteed"
listing in the "top 10" by using their service. Let's get real here, there
can only be 10 sites in any "top 10" with any Search Engine. If you multiply
that by the number of Search Engines, then maybe... you might be listed
in the top ten with one. And that's a very BIG MAYBE!


The best thing you can do is try to make your site as Search Engine (SE)
friendly as possible. One of the ways to do this is by the use of Meta Tags.

Meta Tag Format

The first time we heard about Meta Tags, we knew that this was most decidedly
something we needed to do, but not one of these references told us what the
danged things were supposed to look like, what the proper format was, what
all we should include, and on and on went the questions in our mind.

So for your information, here is the proper format to use for your Meta Tags.
First, of course you'll have the beginning of your web document. Doesn't
matter if you do this yourself or use a WYSIWYG editor.



<html>

<head>

<title>(Here's where YOUR title would go) </title>



Your "Title" should have no more that 60-60 characters.



Right after that is where the Meta Tags go. Here is the complete format
for these...



<meta http-equiv="title" content="Your Info">



(if you've developed a good title, that describes your site well, insert
the very same thing in between the quote marks where I've inserted "your
info". Do leave the quote marks in place.)

<meta name="resource-type" content="document">

<meta name="revisit-after" content="30 days">

<meta name="classification" content="consumer">

 

These can stay this way. What they are telling the various engines is that:
this is an HTML document, you request they visit every 30 days to update
the changes you've made, your information is to be classified as "Consumer"
information.

<meta name="description" content="YOUR Info">

 

You can repeat your title info here, again if it's a complete title that
really targets what your site is about. However, here you can elaborate
more, up to 150 characters.

<meta name="keywords" content="YOUR KEY WORDS">

This one is really important! What would YOU search for using a SE and trying
to find your site? What is the main focus of your site? Do you offer some
free stuff? Information? What kind? Sell something? - What?

Spend some time deciding what the real key words should be for your site,
up to 1000 characters.

Really this should be what you do first. Then build your "Title", and other
information around these. Do NOT repeat key words more than 1 to 3 times.

Many search engines will consider this "Keyword Stuffing" and will not list
your site at all.

<meta name="robots" content="all">

 

This tag is asking the robot to list all of your site. Some engines list
each page and some only list your entry or index page. But, if this tag
is present, the "robot" will list the maximum it's been programmed to list.

<meta name="distribution" content="global">

This tag tells the robot that your site is of interest to the whole world
and not just "regional" or "limited" in some other way.

<meta name="rating" content="general">

This tag rates your site. Similar to the Movie or TV rating of programs.
If your site is an "Adult" site, you'd better rate it that way.

If your site is intended for children you can check out these locations
and register it for a better rating: register with SafeSurf, Weburbia or
RSAC. We believe there is also an "icon" they'll allow you to include on
your site.

<meta name="copyright" content="1998 - 2012">

This is of course self-explanatory. We've left our own date range in place
because we've had copyright material on the internet for years. If your site
is new this year, just use the current year.

<meta name="web author" content="YOUR info">

Did you create your site, or did someone else do it for you? That's what
will decide what goes here.

</head>

All of this information should be included between the </title> tag
and the </head> tag.

After that comes the <body> tag and then of course all the great information
you've lovingly gathered together for your visitors.

Hope this helps to explain the importance of Meta Tags. Every page you develop
should include these tags formatted with relevance to the page they're on.

One big NO-NO we want to mention here: don't copy and use these tags from
other sites! Remember, everything you see on someone else's site is
copyrighted material, even if you don't see the symbol or any reference
to copyright.

Copyright applies automatically, and you can be sued for using someone
else's material without prior permission. We do as much "peeking" at source
code as anyone out there, and we even save some of what we see, but all
the code we save is for study only. It's a great way to learn more about
what we're trying to do.

The Magic Keywords


What will your potential visitor enter into a search engine to find your
site? If you can find these magic keywords, phrases real people will use,
then optimize your pages for them, you will have taken a key step toward
generating hits. If you use the wrong words, you will waste a good deal
of effort and achieve next to nothing.

Consider a man who has been working with an ex-IRS agent who can be of
significant help to those with tax problems. He decided to search for
clients only in the area in which he lives, the Santa Clarita Valley in
Southern California. It is a snap to get a #1 position on most search
engines with such phrases as "Santa Clarita Tax Expert" or "Santa Clarita
Tax Solutions", and he did so. But he is not getting any hits.

The problem has two parts. First, many people who live in the Santa Clarita
Valley do not know that they do; even those who do tend to feel they live
in Los Angeles. Second, many do not know how to spell Santa Clarita. So
his first-place position is meaningless unless he turns to advertising in
locally circulated newspapers, magazines, and newsletters. That costs
money, and he could have done it without the effort it took to build his
site.

Discovering what potential visitors might enter to find your site is a
challenging problem, one often overlooked in advice about search engine
positioning.
One way to begin is to list a few words you feel will work, go to your favorite
search engine, enter them, and see what comes up. Any phrase that generates
a lot of unconnected listings is not likely a good candidate.

When you find something that ranks your competitors high in the list, check
out the sites. Once the page has fully loaded, take the option in your browser
to view the page source code. Find the keyword meta statement near the top
of the page, and check those listed. Add as appropriate to your list. Also
check the page content to see which keywords are sprinkled throughout it.
These may be the most important ones. In particular, see how the keyword
you used to get this page is handled. You may find clues as to how best
to use it on your page.

What's Next?

Suppose you have found what potential visitors will enter when they want
a product or service such as yours. You still do not *know* these are the
phrases real people will use. You do not know you have the magic keywords.

We have a suggestion. It is not a guaranteed solution, but we have used
it successfully. It goes like this.

We write a good description of the product or service we want to sell, maybe
half a page. We describe what it is, what it does, and how one will benefit
from it. We write much as we would when producing an ad. However, we do
all possible to *avoid* the keywords we feel will be used.

Next we pester everyone we know, asking what they might enter to find this
product. And we give it time; not everyone is as interested in our problem
as we are.

When we have collected replies, we go back and pester these same people
with a list ranked with the most common suggestions up top, including phrases
we found that were not mentioned. We ask them to pick four or five they
feel are best.

We have found some really neat keywords in this way, phrases we would never
have discovered on our own. We hope you can make it work for you.

We sense this is an aspect of search engine positioning often overlooked.
It is easy for us to pick a phrase related to your business and get you
top position on at least some search engines. It is meaningless, though,
unless people actually enter that phrase.

The Truth About Robots - Giving the Robots What They Want

So how do you make the search engine robots give your site a better rating
than the millions of other Web sites trying to do the same thing? Simple:
give them what they want. You can't trick them or make them think you are
better than you are. Think about a visit through the eyes of a robot. He
finds a site, usually from links embedded in Web pages, then loads the
text from the first page.



He looks for the META tags and pulls out the keywords and description. If
they are not there, he takes the first 200 or so characters of text and
uses them as the description.



The Title is extracted.



He extracts the pure text from the page (stripping out the HTML coding),
then removes the common words, leaving what he considers possible keywords.
(Most robots do not do this last step.)



He now extracts the hyperlinks, sorting them into those that belong to
this Web site and those that don't. (He visits the outside links later;
this is how he finds new Web sites.)



He may do the same with the email addresses.



He goes on to the next page and so on until he has visited all of the pages
in your Web site.

Now he stores all of this information.

He now knows how many pages you have and how many outside hyperlinks are
in your site, and he can give your site a score based on how it is set up.
These are the basics.


What do they do with the information? When someone searches for a phrase
or keyword, another program takes over, using the information the robot
found. A person types in the keywords and the search program returns the
256,000 pages matching them. BUT it also considers the following: How old
is the Web site, or how long has the engine known about it? How large is
the Web site? Was it properly constructed? How many hyperlinks are there
to outside Web sites?



VERY IMPORTANT! How many hyperlinks on other Web sites point to this site?
The older and better the Web site, the more links point to it.



These robots know when you are cheating. You can't trick them; it is simple
for the robot developer to incorporate code that negates the tricks. What
about scoring a keyword only once or twice per page or per area (meta,
title, and so on)? Is this page close in size to all the other portal
pages? How many Web pages in the same directory have the word "index" in
them? Does this site have a lot of content? Is any text the same color as
the background? Are there links to outside sites? Each page can be checked
and compared against what the robot considers a statistically normal page.
These are computers, after all.



You need a lot of pages with normal content. Instead of spending the time
to make fake pages, give the real ones content. This will also give your
visitors something to come back to. CONTENT.

The Truth About Robots - Robot Travel

If there is one thing we have learned about robots, it is that there is
absolutely no pattern to them. Most robots are stupid and wander randomly.
For example, 50% of robot hits to our sites ask for the robots.txt page
and then go away without asking for anything else. Then they come back a
week later, ask for the same thing, and go away again. This happens over
and over for months. You will never figure it out. What are they doing?
If they wanted to see whether the Web site was really there, they could
just ping it; that would be much faster and more efficient. They seldom
visit another page, and if they do, they ask for one other page per visit
or so. Some come in and issue rapid-fire requests for every page in the
Web site. How rude! You have to quit worrying so much about robots. It
takes six months before they request enough pages to do you any good. We
quit thinking about them a long time ago. Build a lot of pages correctly
and, if you have reciprocal links to them, the robots will find them
someday.
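
For the record, the robots.txt file they keep asking for is just a small
plain-text file in the root directory of your site. A minimal sketch that
tells all robots the whole site is open to them looks like this:

User-agent: *
Disallow:

The asterisk means "any robot," and an empty Disallow line means nothing
is off limits; putting a path after Disallow: (for example,
Disallow: /private/) would ask robots to skip that area.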

Try this: go to AltaVista and type "link:YourSite.com" into the search box
(leave off the www). This will list the reciprocal links to your Web site.
Try link:crownjewels.com and you get 136 links to it. Think about this:
the robots say to themselves, "Here is a site that must be popular; why
else would so many Web sites SIMILAR to it have its link on their pages?"
Remember that only SIMILAR sites with SIMILAR THEMES are likely to link
to your site. The engines give more importance to this than to you
submitting your link to them. Wouldn't you?

Go to heavily trafficked sites matching your Web site's Themes and use AltaVista
to find out how many reciprocal links they have. This will prove to you
we are right.

Search engines are nothing more than a measure of reciprocal links to your
site. The problem is, you are constantly having to fight for your positioning
in the search query listings. Forget about that. Leave the fighting to people
who are able to spend 24 hours a day trying to trick everybody. Quit trying
to compete with the large organizations pouring millions into their marketing.
Completely forget about Search Engines after submitting to them and go after
the reciprocal links. The Search Engines will then believe you are a heavily
visited site because you will be. You will now be getting the traffic you
so richly deserve.

 

Search engine visitors to your site are often not qualified visitors. Too
many pop into your home page for two seconds and then leave. You know how
it is; we all do it when we use search engines. Either it wasn't the
information we were looking for, or the site had a huge graphic on some
portal page that took forever to load. These visitors shouldn't even count,
but they get counted as 12-18 hits in your server logs. Hits are requests
to the server, and one page view can incur many: the page itself plus each
graphic on it counts as a hit, so a single page with a dozen images logs
13 hits (1 for the page plus 12 for the images).

 

Reciprocal links bring in qualified visitors. These are visitors who were
already on a Web site with Themes matching yours, so they already have a
good idea of what kind of site you are. They will come into your site and
actually stay awhile. These visitors are so good they should count as
double credit.

We know which type of visitor we would rather have.

 

How do you get people to WANT to put your link on their Web sites? Why would
a similar site put a link to your site on theirs? Simple, you have similar
Themes. You are similar, but not competition.

 

There is one very important lesson to be learned from this crazy robot
behavior. You need to make the navigation in your Web site so easy that a
visitor can find any page within two clicks of your home page. One way of
doing this is installing hidden DotLinks: little periods linked to other
pages, hardly noticeable on the page. Although they are not easily seen
by the human eye, they are links a robot can follow through your Web site,
so robots can find your pages faster and more easily; see the example
below.
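
A DotLink is nothing more than an ordinary anchor whose visible text is a
single period. The page name below is a made-up example:

<a href="products.html">.</a>

A few of these on your home page, each pointing to a deeper page, give a
robot a direct path to every corner of your site.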

Finding Your Best Search Keywords

When you want to find something on the web, chances are good that you start
by typing in a keyword or phrase into a search engine. When someone goes
looking for products, services, or content like your site's, you want them
to see your site at the top of the search results. The way to achieve this
is by strategically positioning your site to maximize your search engine
ranking for the most important keywords and phrases. But how do you decide
which keywords matter most to your site?


Tried and True Methods


There are several ways to find your strategic keywords, but there are a
few obvious sources. First of all, think of what words you would use to
search for your site's content. The next obvious place to look is in the
"keywords" META tags and page titles of other web sites. Finally, look at
newsgroups, mailing lists, and discussion groups on similar subjects to
make sure you haven't missed any keywords. Simply browsing the subject
lines of a few days' worth of newsgroup postings will probably give you
all the keywords you need.


Now, there's a Better Way...


So, you've got a list of keywords written down. Now it's time to find out
which ones really matter to Web surfers. You can do this by taking
advantage of a "pay per click" search engine or a "pay for placement"
site: look up each keyword or phrase on your list to see how popular that
search really is.


After finding out how popular each keyword or phrase on your list really
is, you can focus your search engine positioning efforts in the right place.
Nearly every web site operator in your field of interest is trying to maximize
their ranking on the most popular keywords. You may not be able to reach
anywhere near the top of the list for your most prized keywords. Now that
you know how popular each one is, though, you can work your way down the
list until you start to hit pay dirt.


Always try to rank well on the most popular terms, but don't forget that
sometimes the best gold is a little farther down the mine shaft. We've gotten
quite a few hits from the term "Web site promotion toolkit," but it may
take us months to see a top 20 position for "Web site promotion." While
we tweak our doorway pages for the most popular search terms in our field,
we're still getting traffic from several search engines. And that's the
way this game is played.