Yes, you can find more information on our pricing plans at:
Once you submit, it can take 4-6 weeks to get listed on most search engines, though the majority of our clients get listed within a few weeks. Once you are listed, you will need to use our SEO tools to improve your rankings on the search engines.
While you are waiting you can also take advantage of our "Guaranteed Top 10" and "Instant Website Traffic". Both of these services are available to our members in their admin. These services will allow you to start getting traffic to your website within 12 hours!
You can have your login information emailed to you by filling out the password lookup form at:
Yes, you can find more information on our affiliate program at:
SEO Analyzer (additional fees may apply)
The SEO Analyzer is a powerful tool that checks over 20 known elements that affect your rankings in the search engines. The tool will crawl your web site, similar to a search engine, and report on the strengths and weaknesses of the website's overall structure and then offer suggestions for improvement. By following the detailed information provided you will be able to optimize your web pages for top search engine rankings. It is like having your own SEO expert with you 24/7 guiding you through the entire search engine optimization process.
Automatic Search Engine Submitter
Our submitter is one of the most powerful promotional tools of our service. With more than 100,000 sites to submit to, your site gets indexed with ease! Just schedule when you want to be submitted and the system does the rest for you! Not only that, but our built-in spider will grab all the URLs from your site and add them to the submission queue. This spider will also extract your keywords and site title directly from your site, saving you lots of time.
Automatically add all the URLs, titles, descriptions, and keywords from your site to the submission queue.
You also get to individually select the sites you want to submit to.
Each site is broken down by its category.
You can then use the system to automatically submit your site at any time or interval you choose.
You will also get a detailed report of all submissions.
You can select to get this report from your browser or via email.
Guaranteed Inclusion (additional fees apply)
Using our guaranteed inclusion system you will get a guaranteed Top 10 listing on over 200 search engines within 6-8 hours. You will also only be charged a flat fee with no bidding and no pay per click charges. Simply select the keywords you want to get listed for, design your ad, and choose the duration you want to be listed for (3 or 12 months). That's it! Getting a top 10 listing on so many search engines has never been so easy.
Ranking Monitor Report
Add Link Popularity Monitor
Link Popularity Monitor Report
Add Page Rank Monitor
Page Rank Monitor Report
Link popularity checker
Link popularity is the number of other Web sites that contain links to your site. Many factors can affect the search results returned by search engines. A major factor in search engine ranking is "linkage", that is, other web sites with links pointing to your site. A web site with a larger number of other sites linking to it will score a higher relevance ranking than one with fewer links.
Check Link Popularity
Link Popularity Report
PageRank is an exclusive technology of Google which evaluates the popularity of your Website's pages with a score from 0 to 10.
Check Your PageRank
Link Relevancy Checker
Find out in one click why pages rank the way they do. This tool presents this data in four different tables where, usually in a matter of seconds, you can see for yourself why pages rank the way they do. Want to be ranked #1 on the search engines for your targeted keyword? Then simply run the Link Relevancy Checker on the Website that is currently ranked #1 for that keyword - then make sure your site is at least one link better!
Link Relevancy Check Results
Keyword Density Analyzer
The tool will analyze a web page for selected keywords. This tool has many options which will help analyze a competitor's site and show you why a given site may be achieving a higher ranking than you. Simply the best way to gain a higher rank!
Simply enter your URL and that of a competitor who outranks you to find out why.
Keyword Density Results
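As a rough illustration of what a density check does (this is a simplified sketch for this tutorial, not our actual tool), you can count how often a keyword phrase occurs relative to the total word count of a page's text:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of all words in the text, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count occurrences of the (possibly multi-word) keyword phrase.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

sample = "Cheap travel deals. Book travel online and save on travel today."
print(round(keyword_density(sample, "travel"), 1))  # → 27.3
```

A moderate density for a page's primary keyword is an often-cited rule of thumb; very high values can look like keyword stuffing to an engine.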
Search Engine Ranking
Find out how your pages are doing in the search engines! This highly sophisticated tool will search the major engines for a keyword or phrase, finding out just how and where your pages are ranked.
Select the keywords, URL, and engines you want to check rank for
Search Engine Coverage
Find out how many of your pages are listed in different engines.
Search Engine Coverage Results
Broken Link Checker
This tool will spider your website for any broken links. Broken links can adversely affect your search engine rankings.
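Conceptually, a broken-link check has two steps: extract every link from a page, then request each one and flag any that fail. The sketch below (our own simplified example, not the tool itself) shows both steps using only the Python standard library:

```python
from html.parser import HTMLParser
from urllib.request import urlopen, Request
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    """Return True if the URL answers without an error status (this step hits the network)."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

page = '<a href="/about.html">About</a> <a href="http://example.com/x">X</a>'
print(extract_links(page, "http://example.com/"))
```

A full spider would repeat this for every page it discovers, feeding newly found internal links back into the queue.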
Our keyword generator is designed to query the major search engine databases for the keyword queries most used by web surfers! This tool allows you to get the top keywords and phrases from the major search engines at the click of a button. You can then use these keywords to create both meta tags and doorway pages to maximize traffic to your website! If you don't have good keywords, you won't get traffic!
Generate hundreds of keywords automatically
Top Keyword Finder
This powerful tool gives you the ability to find out what keywords are being 'most searched' in the major search engines. Why is this important? If you're competing with 1000 other sites to sell your widget, wouldn't it be nice to know what keywords people are using most to find your competitors' widgets? You can then use those keywords in your meta tags and doorway pages. This is one of the most underutilized tools on the Net, but one of the most powerful for driving traffic to your web site!
Find keywords easily without any competition
We send quality traffic at the most competitive rates possible with the fastest service and absolutely no hidden fees. Your campaign will go live within 12 hours, but in most cases, much faster. We are the largest traffic provider on the Internet and have been providing Internet marketing for close to 15 years.
Note: Unlike other companies who sell "traffic" or "visitors" we NEVER use software, bots, autosurfs, or redirected domains to send you website traffic. All of our traffic comes from our network of high quality websites via full browser exit windows / pop-unders.
There are many reasons why a visitor may not register in your server logs.
* A user cancels out the window before your site loads
* The user is on a slow connection and the request times out
* The user is behind an ISP that uses a proxy (e.g., AOL)
* The user's browser served the request from cache instead of from your server
* Your site was unavailable at the time of request
* Your tracking software could not handle concurrent connections
* Your tracking software does not update in real time
Should there ever be a major discrepancy, open a support ticket and attach a copy of your "raw server logs" (available from your web host) and we will be more than happy to issue a credit if one is due! Please note that we must have your actual server access log files, as third-party tracking is known to be inaccurate.
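If you want to compare numbers yourself before opening a ticket, raw access logs can be summarized with a short script. This sketch (an illustration, assuming the common Apache log format) counts successful hits and unique visitor IPs:

```python
import re
from collections import Counter

# Apache "common log format": IP, identity, user, [timestamp], "request", status, bytes
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d{3}) \S+'
)

def summarize(log_lines):
    """Count total successful hits and unique visitor IPs from raw access-log lines."""
    ips = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        # Only count 2xx responses as delivered page views.
        if m and m.group("status").startswith("2"):
            ips[m.group("ip")] += 1
    return {"hits": sum(ips.values()), "unique_visitors": len(ips)}

logs = [
    '203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 2326',
    '203.0.113.5 - - [10/Oct/2023:13:55:40 +0000] "GET /a.css HTTP/1.1" 200 120',
    '198.51.100.7 - - [10/Oct/2023:13:56:01 +0000] "GET / HTTP/1.1" 304 0',
]
print(summarize(logs))  # → {'hits': 2, 'unique_visitors': 1}
```

Note how the cached response (status 304) is excluded: that is exactly the kind of request third-party tracking scripts miss, which is why raw logs are the reference.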
The Internet is changing the rules of business. As trite as these words may seem now, they are true. Like radio and television, the Internet has brought about an undeniable shift in how business is conducted. Unlimited information, instantaneous communication, and a market more vast than anything before have become available to both massive conglomerates and small home businesses. However, one underlying factor remains the same: advertisement. While the traditional business plan of a "Brick and Mortar" company has always included extensive advertisement, the same cannot be said about many Internet businesses. Many webmasters go to great lengths to craft wonderful web sites, putting little or no thought into advertising them. And any business that doesn't advertise is doomed to failure. Web sites need to advertise and promote themselves to create traffic on their site, to stay in business, and to grow.
Businesses on the Internet share a kind of equality that Brick and Mortar businesses do not. When a visitor arrives at a website, they really have no idea how large or small that company may be. In the Brick and Mortar world, it is easy to distinguish between the two--the larger will have the bigger store, flashier advertising, and a more expansive inventory. On the Internet however, small companies can project an uncharacteristically large presence by creating and properly promoting their web sites. While the Brick and Mortar retail world is consolidating and merging towards giant discount oriented retailers, the Internet is teeming with thousands of small, successful companies who might not otherwise be able to compete, or even start-up, in the Brick and Mortar world.
No matter how large or small your company is, you need to advertise. Unfortunately for most small businesses, Dot-com or otherwise, traditional advertising methods require immense capital and human resources. This is exactly why many smaller businesses fail--they spend too many of their resources in trying to compete on somebody else's turf.
The most obvious medium of advertisement is television. However, television's effectiveness in attracting customers is questionable. Its effectiveness lies in achieving brand recognition. Yet at this point most businesses are trying to increase traffic to their websites and gain more customers--brand recognition can wait until after the IPO. Television advertising is also the most expensive medium. More than one previously unknown Dot-com spent all of their available resources on a Super Bowl spot. They may or may not have been successful, but nevertheless, they spent millions of dollars for 30 seconds of airtime. Most businesses are more shrewd when it comes to spending their dollars, and they want more than just thirty seconds of exposure.
If you thought that targeting an Internet-based audience would be more effective at driving traffic to a website, you would be right. Banner ads and mass-mailings are much less expensive than television. However, they have their limitations as well. While remarkably cheaper than a TV spot, a small banner ad on a major portal website can still cost over six thousand dollars a month. Furthermore, it is generally agreed that banners are not a cost-effective way to bring visitors to a website. Most potential customers see banners as just a nuisance. Mass mailings have a similar drawback. If you opt not to spend money on your own list, you can buy space in someone else's. But you'll be competing for the customer's attention with whoever else bought space. Another disadvantage of banner ads and mass-mailings is that they aren't targeting the people who are most likely to be interested in the content you have to offer. They may reach a wide audience, but most of that audience will recoil in horror, yell "Spam!", and delete your expensive ad before having the opportunity to realize that you are exactly what they're looking for. While both traditional methods can and do work, they require a lot of what most of us don't have: money.
The most cost-effective advertisement would have to be both cheap and targeted at exactly the right customer. Years ago this was something of a pipe dream--today it is a reality. The Internet is the most comprehensive source of information in human history. But like any library of knowledge, it must be catalogued and organized to be used effectively. And therein lies the perfect solution: search engines. Like the card catalog of a library, search engines are a customer's way of sifting through the Web to filter out what they're looking for. And what better way to make sure you are found than to have an influence on what they find? Search engine listings meet both of our criteria in terms of focus and affordability. No other form of advertising is so focused that the customer is actually searching for you. A search engine user is a highly receptive and targeted audience because you are not trying to sell them on something they don't already want. They have come looking for you and they already want what you specifically have to offer. Furthermore, the great majority of search engine listings are free. Return on investment couldn't be better--with none of your dollars spent, the very first dollar returned is profit.
According to the Georgia Institute of Technology, 88% of Internet users find new web sites through search engine listings [1]. WebCMO data shows that in a side-by-side comparison of different forms of promotion, search engine listings are the number one way to generate traffic on websites [2]. Search engine listings send droves of visitors to your site and they are free. I have personally seen websites where traffic has increased ten-fold as a result of good search engine positioning. Nothing could be better, but there is a catch.
Getting listed on a search engine below 499 other websites simply won't work. You need to get a listing near the top of your category to collect all the traffic a search engine can deliver. But the good news is that you can dramatically improve your positioning with a little bit of elbow grease. All it takes is some key modifications to your website and a little thought. It's not that hard and you can do it. Using a system like BuildTraffic to guide you through this process is the ideal way to simplify your work. It guides you through every aspect of the Search Engine Positioning process so you don't have to be a tech-savvy guru to get excellent results. With good search engine listings a small business can project a large image on the Internet and get the kind of traffic that so many big business sites get.
Monitor and analyze results
Search engines and directories can be one of the most effective ways to acquire visitors to your website. This document is a proven approach to effectively promoting your site through search engines and directories. When completed, you should be able to:
Search engine and directory systems are constantly changing. The following was written and reviewed by leading search engine experts with the most current information available.
INTRODUCTION TO SEARCH ENGINES
Search engines, directories and PPC-engines
Search engines are usually categorized into the following groups:
Search engines use computer programs ("spiders") to browse through the entire Internet, read pages, and automatically index the content into large searchable databases.
PPC-engines have greater similarities to other online advertising products than traditional search engines. The biggest PPC engines are Google, Overture, eSpotting, FindWhat, and Kanoodle. In PPC-engines, users bid for the position they want in the search results. The highest bidder receives the top result, the next highest bidder receives the second result, and so on. You can learn more about PPC-engines at Pay Per Click Search Engines and Search Engine Watch.
The rest of this tutorial will focus on search engines and how to optimize your web sites to become more visible and to attract more relevant, interested visitors. We hope you find this is a valuable tutorial on how to build a search engine-friendly website.
Work out a long term strategy
Generally speaking, search engine optimization is a long-term strategic tool rather than a tool for short-term campaigns.
There are many ways to increase the number of targeted visitors your website receives from searches. You can optimize your website, use inclusion programs, pay for business express reviews in directories, purchase sponsor links or bid for positions in PPC-search engines.
For each of these options there are even more options. What optimization strategy do you implement? What PPC-engines can benefit your business and what tools do you choose to use?
This tutorial will focus on search engine optimization. However, we encourage you to look at other available methods of acquiring qualified targeted visitors from search engines.
Three steps on the way to success
The following will detail three key steps involved in effective search engine optimization:
After the tutorial we have included a FAQ of search engine optimization problems.
OPTIMIZING YOUR WEBSITE
Search engine optimization
Every search engine has its own algorithm that determines the ranking of documents within search results. A search engine algorithm is a complex set of computer-based rules for how to interpret and rank web pages when users do a search for a particular word or phrase. For obvious reasons, search engines do not publicly reveal the detailed logic behind such algorithms.
There are lots of similarities in the way search engine algorithms work. To learn more about how search engines work or how to improve your rankings, visit Search Engine Watch to find information on search engines and their rules. The three most important elements of search engine optimization are:
Keep your website online
With the billions of web pages search engines have to visit each month, they usually only have time to request each web page once. If your web server is not accessible when the search engines visit your website, your web pages won't get indexed. Don't panic!
Most search engines crawl the web on a regular basis. If some of your web pages have been dropped because your server was down, they will probably get back into the index the next time the search engine visits your website.
Introduction to search engine spiders
Search engines use small computer programs to browse through and read web pages on the Internet. Such programs are often referred to as "search engine spiders" or "search engine robots". When a search engine spider visits a page it first reads the text on the page and then tries to follow links from that page off to other web pages and websites.
You can think of a search engine spider as a very simple form of automated browser. It traverses the Internet like a human would do with a browser - just many times faster and fully automated.
Unlike ordinary browsers, search engine spiders do not "view" web pages - they simply "read" the content and click the links. Search engine spiders do not enjoy or appreciate the great design and "cool" images on your web pages.
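To make this concrete, a toy spider can be written in a few lines: it keeps the readable text it encounters and queues up every link it finds, while skipping script and style content just as it ignores your design. This is a simplified sketch for illustration, not how any particular engine actually works:

```python
from html.parser import HTMLParser

class MiniSpider(HTMLParser):
    """Reads visible text and records links to follow, like a crawler would."""
    def __init__(self):
        super().__init__()
        self.text_parts = []
        self.to_follow = []
        self._skip = 0  # depth inside <script>/<style>, whose text isn't content
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.to_follow.append(href)
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text_parts.append(data.strip())

spider = MiniSpider()
spider.feed('<h1>Travel deals</h1><script>var x=1;</script>'
            '<p>Book <a href="/offers.html">cheap offers</a> today.</p>')
print(spider.text_parts)   # the "read" content
print(spider.to_follow)    # the links the spider would visit next
```

Notice that the script's contents never show up as text: to the spider, only the readable content and the links exist.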
Validate HTML
HTML code is the most fundamental part of every web page on the Internet. In a browser, you don't see the actual HTML code, but only the final result - the web page as it should look.
Search engine spiders, however, can't see the web site's final result, and instead, focus on the raw HTML code.
Most browsers can interpret HTML code that has errors and still show a nice looking page. However, you never know how a search engine spider will interpret the same errors. It may skip the part that uses incorrect syntax, or it may skip the entire page or website. There is no way to predict what the search engine spider will do when it encounters code it doesn't understand.
If search engines cannot decode your website, follow the links, and read the text, your site will not get indexed. It is not enough to validate your web pages by looking at them in your browser. The HTML syntax needs to be correct - you need to validate your HTML.
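A full validator checks far more than this, but even a minimal tag-balance check catches the most damaging class of error - unclosed or mismatched tags. The following sketch (a simplified illustration, not a substitute for a real validator) flags both:

```python
from html.parser import HTMLParser

# Tags that never take a closing tag in HTML
VOID_TAGS = {"br", "hr", "img", "meta", "link", "input", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    """Flags closing tags that don't match the most recently opened tag."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []
    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)
    def handle_endtag(self, tag):
        if tag in VOID_TAGS:
            return
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def find_tag_errors(html):
    checker = TagBalanceChecker()
    checker.feed(html)
    # Anything still on the stack was opened but never closed.
    checker.errors.extend(f"unclosed <{t}>" for t in reversed(checker.stack))
    return checker.errors

print(find_tag_errors("<p><b>bold text</p>"))
```

A browser would probably render that sample fine; a spider might not, which is exactly the risk described above.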
If you do not already have access to an HTML validator, we strongly recommend using our free HTML validator or one of the many validators you can find on the Internet.
Stay away from "alternative HTML"
Some search engine optimization "experts" will advise you to use non-standard HTML to squeeze in extra keywords in hidden places in your HTML code. Do not do that! Search engines penalize websites that use hidden keywords either by de-ranking the websites or erasing them completely from their index. Always use valid HTML.
Links and navigation
This section examines how to build links that search engine spiders can follow.
A good navigation structure makes it easier for search engine spiders and users to browse your website.
If it is easy for users to navigate your site, they will most likely return. If it is easy for search engines to follow your links it is more likely that they will index many of your web pages.
The more web pages you have indexed, the more likely it is that one of them will be found in a search, thus leading a prospect to your website.
Use hyperlinks that search engines can follow
Search engine spiders can't follow every kind of link that webmasters use. In fact, the only kind of hyperlink that you can be certain all search engine spiders can and will follow is the standard text-based link.
In HTML code, text-based links are written like this:
<a href="http://www.domain.com">Link text</a>
The link will look like this on a web page: Link text
All other forms of links can pose a problem for search engine spiders.
Image maps or navigation buttons
Sometimes designers want to use graphics for navigational links. There are two common ways of making graphical links that are very different: Image maps and Navigation buttons
Image maps can have different areas of one image linking to different web pages. That is a very common way to build top- or side-navigation bars or country maps.
Most search engines do not read image maps, so you should always add additional text-based links on each page for search engine spiders to follow. The text-based links will also be good for visitors who do not use a graphic capable browser, or are surfing with graphics turned off to save bandwidth.
Navigation buttons are single images that each link to a specific web page. Most search engine spiders will follow this type of link. However, if you want to be absolutely sure that all search engines can and will follow all your links, we still recommend adding pure text-based links in addition to any kind of graphical links.
Flash, Macromedia and Java Applets
Flash, Macromedia, and Java Applets have something in common: they all require the user to have a special program installed on their computer to open and play the files. Most new browsers come with the necessary programs installed to play the most common file types. However, search engine spiders usually do not open such files.
We recommend that you do not build your navigation links using any of these techniques. If you do, you should only serve the high-tech version of your website to users that have the right plug-ins installed, and have a low-tech version ready for all other users. The low-tech version should include regular text-based links. This will work well with the search engine spiders as well as the users that do not have these plug-ins installed.
The same caution applies to JavaScript: do not put important things like your main navigation in JavaScript links. If you do, make sure that the same links are also available as standard text-based links somewhere on your web pages.
Non-HTML formats (PDF, MS Word etc)
Even though some search engines have started to read and index non-HTML file formats such as Microsoft Word, Microsoft Excel, and Adobe PDF (Portable Document Format) files, none of the search engine spiders will follow links within these formats. If you want search engines to follow links on your website, they should always be placed in your regular web pages - your HTML documents.
Do not place text in such formats if you want people to find it in search engines. At least make the content accessible in HTML format too - as a regular web page. Not all search engines will index these non-HTML formats and many will make more errors indexing them.
Text and META-data
If you want good rankings in search engines, you need good content. Content is King! The more good content you have the higher your chance of being found. It's like a lottery. For each piece of valuable content you have, you have a ticket in the "search lottery". The more tickets you have, the higher your chance of having someone click on your link.
On a web page, some text can be seen as part of the visual presentation and some text is "hidden", including META-tags, title-tags, comment-tags (something programmers use to help them navigate the code) and ALT-tags (text descriptions of pictures).
Search engines have reduced the amount of importance they place on META-tags, with the exception of the title-tag. More weight is placed on the visual text - the text that users will see when they arrive at the web page.
The title-tag is technically not a META-tag. It is the most important HTML tag on your site. The title-tag holds the page's name, which appears at the top of the browser window. All major spider-driven search engines consider the keywords found in this tag in their relevancy calculations. You should pay special attention to this tag, as it carries a lot more weight than most other elements of a web page. You can read more about how to write good title-tags in the next section.
Introduction to keyword research
Before optimizing your website, you need to know what keywords your target group is using. Your keywords are the words and phrases that people might use to find your products, brands, services, or information, via search engines.
You can probably develop a few ideas very quickly. If you run a travel portal, keywords such as: "travel", "vacation", and "holiday" might be good ideas. In fact, a deep research into the topic of "travel" would probably show more than 100,000 different keywords people use when they search for travel information.
Does that mean you have to target all those keywords? No, absolutely not. You should use such research only as a good starting point. Use it to learn the search behavior of your audience. Find out what they call things, how they identify subjects, how precisely or broadly they generally search.
You may discover that the majority of searches for "travel" come from people using combinations of "travel" and a certain city, country, or region. Or you may verify that people search for "travel" more than "vacation".
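Such combinations are easy to generate mechanically once you have a list of core terms and modifiers. A minimal sketch (the function name and word lists are our own for this example):

```python
from itertools import product

def expand_keywords(core_terms, modifiers):
    """Combine core terms with modifiers in both orders, plus the bare terms."""
    phrases = list(core_terms)
    for term, mod in product(core_terms, modifiers):
        phrases.append(f"{term} {mod}")   # e.g. "travel paris"
        phrases.append(f"{mod} {term}")   # e.g. "paris travel"
    return phrases

print(expand_keywords(["travel", "vacation"], ["paris", "cheap"]))
```

The output is only a candidate list; you would still check each phrase against real search-behavior data before targeting it.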
All this research should be used in the way you tailor your web pages and the way you write your text, titles and META-tags. If most people in your target group use the term "travel" to describe your product, so should you.
TIP:
Your most important keywords will often include some of the following:
We recommend using our Keyword Generator; this tool will help you find the best keywords to use on your site.
Adding keywords to your website
Now that you know the keywords you want to target, it is time to implement the keywords on your web pages. The big question for many people is: How do I do that?
Basically you implement your keywords by using them on your web pages. It's that simple! The first and most important thing to do is use the words or phrases throughout the text on your web pages.
Some people confuse adding keywords to a web page with the META-keyword tag. As you will learn in the next section, the META-keyword tag does not count much in how well a web page ranks, so adding your keywords here and nowhere else won't help you at all.
Each section of your web page carries a different weight in the overall ranking process. In the next section you will learn the parts of your web pages that require special attention.
Focus on a few keywords for each web page
In most cases a web page will only rank well for a few keywords - the ones that search engines determine to be the most important for that page. If you run a travel portal, you cannot expect to rank well for the thousands of travel-related keywords people are using if you only optimize your front page. Instead, you have to optimize each of the web pages on your website that have content people are looking for - the keywords you have developed through your research.
Important places to use your keywords
Search engines weight keywords according to their placement on a web page. There are some spots that are more important than others - places where you should always remember to use the most important keywords for each web page.
Introduction to page titles
The page title is the single most important piece of information on your page when it comes to achieving good rankings in search engines. The words you use in your title carry much more weight than anything else on your page. So use this limited space wisely.
Page titles are written this way in HTML code:
<title>Page title here</title>
The page title-tag is placed inside the header tag at the top of the HTML page:
<head>
<title>Page title here</title>
</head>
How to write good page titles
First, make sure that all your web pages have unique titles. Go through each of your web pages and write a title that makes use of the most important keywords. The page title appears at the very top of your browser; it may or may not be the title on the front of your page.
Keep in mind that the page title is what the users will see as the hyperlinked title in search results, so the phrase should trigger users to click on the link and visit your site!
The goal is to make titles that make people click and make use of your primary keywords for each page. If you want a page to rank well on "dental services Chicago" make sure to use those exact words in the title. Maybe a title like "Dental Services in Chicago - open 24 hours a day" would work well for you.
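These title rules can be checked automatically across a site. The sketch below (a hypothetical helper written for this tutorial, not one of our tools) flags missing, duplicate, and over-long titles, as well as titles that omit the words of the page's primary keyword:

```python
def audit_titles(pages, keyword_map, max_length=65):
    """pages: {url: title}; keyword_map: {url: primary keyword phrase}.

    Returns a list of (url, problem) tuples. Both dicts are illustrative
    inputs you would build by crawling your own site.
    """
    problems = []
    seen = {}
    for url, title in pages.items():
        if not title:
            problems.append((url, "missing title"))
            continue
        if title in seen:
            problems.append((url, f"duplicate of {seen[title]}"))
        else:
            seen[title] = url
        if len(title) > max_length:
            problems.append((url, "title too long"))
        keyword = keyword_map.get(url, "")
        # Every word of the primary keyword should appear in the title.
        if keyword and not all(w in title.lower() for w in keyword.lower().split()):
            problems.append((url, f"keyword '{keyword}' not in title"))
    return problems

pages = {
    "/index.html": "Dental Services in Chicago - open 24 hours a day",
    "/about.html": "Dental Services in Chicago - open 24 hours a day",
}
keywords = {"/index.html": "dental services chicago", "/about.html": "dentist team"}
print(audit_titles(pages, keywords))
```

The 65-character cap is an assumption based on how much title text search results typically display, not a fixed rule.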
Introduction to META-tags
META-tags are pieces of text placed in the header of your web pages. Text in a META-tag is used to describe the page. META-tags are often referred to as META-data, which means "data about data", or in this case, data about your page.
META-tags can help search engines better understand and present links to your website. Thus, it is highly recommended that you fill out the META-tags on ALL of your web pages.
META-tags that are important to use
There are many kinds of META-tags and standards, but the only two you need when optimizing a website are the META-description tag and the META-keywords tag.
The two META-tags are written this way in HTML code:
<META NAME="DESCRIPTION" CONTENT="Put your description here ….">
<META NAME="KEYWORDS" CONTENT="Put your keywords here ….">
The META-tags are placed inside the header tag at the top of the HTML code, just below the page title:
<head>
<title>Page title here</title>
<META NAME="DESCRIPTION" CONTENT="Put your description here ….">
<META NAME="KEYWORDS" CONTENT="Put your keywords here ….">
</head>
Adding META-tags to your web pages
How to add META-tags to your web pages depends on how you maintain your website. You should refer to the manual of the editor or content management system you are using for further detailed instructions.
In most cases there will be a simple box somewhere with two fields to fill out: The META-description and the META-keywords, or just "description" and "keywords". Your program will usually take care of inserting the necessary HTML code in the right place for you.
If you are coding your web pages by hand or in a simple HTML editor, make sure you do not make any syntax errors. If the META-tags have not been coded correctly search engines will not be able to read and use them.
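One way to verify that your META-tags were coded correctly is to parse the page the way a spider would and see what comes back. This sketch (our own simplified example) reads the description and keywords tags:

```python
from html.parser import HTMLParser

class MetaReader(HTMLParser):
    """Collects the description and keywords META-tags from a page."""
    def __init__(self):
        super().__init__()
        self.meta = {}
    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag names, so <META> arrives as "meta".
        if tag == "meta":
            a = {k.lower(): (v or "") for k, v in attrs}
            if a.get("name", "").lower() in ("description", "keywords"):
                self.meta[a["name"].lower()] = a.get("content", "")

def read_meta_tags(html):
    reader = MetaReader()
    reader.feed(html)
    return reader.meta

page = ('<head><title>Travel deals</title>'
        '<META NAME="DESCRIPTION" CONTENT="Cheap travel deals and offers.">'
        '<META NAME="KEYWORDS" CONTENT="travel, deals, offers"></head>')
print(read_meta_tags(page))
```

If a tag has a syntax error, it simply won't appear in the result - the same silent failure a search engine spider would produce.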
Writing good META-descriptions
The META-description is used by some search engines to present a summary of the web page along with the title and link to the page.
Not all search engines use the META-description. Some search engines use text from the regular text on your web pages and some use a combination. All you can do is make sure you have valid META-descriptions on ALL your web pages so that the search engines that use them have something to read.
Writing good META-keywords
META-keywords are very often misused and misunderstood. Some webmasters put hundreds of keywords in this tag in the completely unrealistic hope of receiving good rankings for everything listed. Some webmasters even include keywords that have absolutely nothing to do with the actual web page or website.
That has forced the search engines to lower the META-keywords importance in the determination of what a web page is about. However, some search engines still read and use the META-keywords. Thus, remember to write relevant META-keywords for all your web pages.
You should write a unique set of keywords for each web page. You should NEVER copy the same set of keywords for all your web pages.
Do not add keywords that are not 100% directly related to each web page. To be safe, only use keywords that are found in the visual text on your web pages.
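One way to enforce that rule is to compare your META-keywords against the visible text of the page. A minimal sketch (the keyword list and body text are hypothetical):

```python
def keywords_in_text(keywords, body_text):
    """Return the META-keywords that do NOT appear in the visible page text."""
    body = body_text.lower()
    return [kw.strip() for kw in keywords.split(",")
            if kw.strip().lower() not in body]

body = "We sell used golf balls and white golf gloves to left handed players."
missing = keywords_in_text("used golf balls, white golf gloves, tennis rackets", body)
print(missing)  # → ['tennis rackets']
```

Any keyword the check reports as missing is a candidate for removal - or a hint that the page text itself should mention it.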
It is often discussed whether or not to use commas between keywords. Most search engines will not care as they remove the commas before reading the keywords, but some search engines might still use the comma to find exact keyword and keyword phrase matches.
In any case, the META-keyword tag does not carry much weight in the overall ranking algorithm for any of the major search engines and you will never get penalized for either using or not using commas.
Some people use commas because it makes it easier to read in the raw code. This can be helpful if you want to edit the META-tags by hand at a later time.
Will META-tags give me top rankings?
No, META-tags are not magic bullets. You will not achieve top rankings on keywords if you only use them in your META-tags.
However, we still recommend you fill out your META-tags as some search engines do read and use the data. In those search engines it will give you a better chance of receiving relevant visitors if you have correct, optimized, and relevant META-tags on all your pages. Remember, only those words that appear in the main body of the page should be in the META-tags.
Popularity measurement
For a search engine to understand which pages on the Internet are the most relevant to search users, it is simply not enough to look at the content on each web page. Most search engines today analyze both link structures and click-streams on the Internet to determine the best websites and web pages for any given query.
Convincing people to link to your website is not something you do overnight. By implementing a strategy to improve your link-popularity, you can gain a long-term advantage. As with building a good reputation or a trusted brand, it takes time - but once you have that "trust", the value is high.
Getting the most relevant websites to link to you will not only be good for your link-popularity as measured by search engines - it will most likely also lead to many good visitors coming directly from the referring websites.
Tip: The following list can serve as inspiration for where to look for relevant links.
Link-popularity
The link popularity element measures the number of links between web pages on the Internet. There are two kinds of links to focus on: inbound links and outbound links - links to your website and links from your website.
You can look at inbound links as a form of voting. If site A has a link to site B, A is voting for B. The more votes B gets from other sites, the more link-popularity B gains.
Each website that votes (i.e. has outbound links) is itself a delegate, as it has a number of voters behind it - the websites that link to it. The more voters it has behind it, the more weight its vote carries on other sites. So the most popular websites have the most link-popularity to pass on.
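The voting idea above can be sketched as a small calculation. This is a toy, simplified version of the kind of link-popularity scoring search engines use; the link graph and the 0.85/0.15 weights are illustrative assumptions, not any engine's actual formula.

```python
# Hypothetical link graph: each site lists the sites it links to (its "votes").
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

# Start every site with equal popularity, then repeatedly pass each site's
# popularity along its outbound links. Popular voters carry more weight.
scores = {site: 1.0 for site in links}
for _ in range(20):
    new_scores = {site: 0.15 for site in links}
    for site, outbound in links.items():
        share = scores[site] / len(outbound)
        for target in outbound:
            new_scores[target] += 0.85 * share
    scores = new_scores

ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # site "C" collects the most votes, so it ranks first
```

Notice that "C" wins not just because it has the most inbound links, but because popular sites like "A" and "B" are among its voters.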
Some search engines can "understand" the theme of related web pages. For these search engines, inbound links from thematically related websites might get a better scoring.
Outbound links are a bit different. They work similarly to references in a scientific report: if you reference the right authorities in your profession, it shows that you know what you are talking about. You acknowledge the "masters" and thereby put yourself in a category above those that don't.
Do not use free submission software or services to submit to hundreds of thousands of free directories and search engines just to gain more inbound links. Links from most of these places won't do you any good and there is even a risk that some of the links you get this way will harm your rankings.
Do not participate in organized exchanging of unrelated links between websites (link farming) to boost your link-popularity factor. Most search engines consider that to be spam. Instead, focus on getting inbound links from relevant major players in your field - they are the only ones that really count.
Internal link popularity
In addition to inbound and outbound links, the link structure on your own website plays an important role in determining the value of each of your web pages.
The pages throughout your website that you most often link to will gain the most internal link-popularity. If one of your web pages has 500 internal pages pointing to it and another only has 10, the first web page is more likely to be valuable to users as there are more pages "voting" for that page.
Click-popularity
Clicks are another way of measuring popularity. Of the millions of results that search engines serve, only a few of them get clicked. Sometimes people come back to the search result shortly after they click, and click another link or make a new search.
If analyzed, these clicks can tell the search engine what users find most important, and where they find the best answers to their searches. However, click-popularity is not widely used and not a dominating factor.
Also, click-popularity is one of the parameters of search engine rankings that you should not try to manipulate in any way! Don't try to boost your own click-popularity by doing hundreds of searches and then clicking on all your own links. It won't work. Search engines that apply this method are way too smart for that.
Common problems
There are a number of things that can cause a search engine to be unable to read and index your website or specific web pages correctly. This section will provide a brief introduction to some of the most important things to watch out for.
If you want to learn more about how to solve each of the issues described below, visit Search Engine Watch.
Frames
Frames have always been a problem - for search engines and for some users. Some studies show that users have problems performing simple standard tasks on a framed website that they would otherwise be able to solve on a similar website not using frames.
Jakob Nielsen is a well-known and respected usability guru who has some rather extreme views on the use of frames. Read his article "Why Frames Suck (Most of the Time)" to understand the arguments against using frames.
When a website uses frames, users can rarely bookmark specific pages deep inside the website. If they use the browser's bookmark function and click it later to return, they will end up on the front page - even if they did the bookmarking from another web page deeper in the website.
If users can't bookmark pages within your site, neither can the search engines. If users can only bookmark your front page - as is the case with most websites using frames - that is also the only page most search engines will link to.
Some search engines might try to browse and index individual pages on your website outside the normal frame set. However, that will leave you with a whole new set of problems. If the search engines send visitors directly to content pages that are supposed to be viewed inside a frameset it is usually difficult, if not impossible, for users to navigate further into your website.
Some search engines read the content in the NOFRAMES section of a frameset. The NOFRAMES section is a special part of the HTML code that is presented to users with browsers that cannot read frames (e.g. search engines, very old browsers, and homemade viewers).
The code looks like this:
<noframes>Place content here</noframes>
Browsers and search engines that don't support frames can read the content you place between the two NOFRAMES-tags. To use the space the best way you should …
There are more advanced options available if you want a frames-based website indexed and ranked well. The easiest solution, however, is still to get rid of the frames.
If you want to use more advanced methods to make your frames-based website more search engine friendly, consult an experienced search engine optimization expert or firm.
Two more ways of solving the problems of using frames are:
If you can make additional entry points, you can use Pay For Inclusion programs to have the pages indexed in major search engines. You can use the inclusion programs even if the website is framed - as long as you can provide a unique URL for each target or entry page.
Dynamic websites and web pages
Dynamic websites can be very difficult for search engines to read because their URLs often contain long strings of parameters.
A traditional static website consists of a number of individual files, usually ending with .html - e.g. index.html, products.html, etc. Each page is a unique file and usually has unique content.
A dynamic website very often only has one or a few files called "templates". The template dictates how to present the content but holds no content itself. All the content is stored in a database. To show a page on a dynamic website, you need to tell the template what to load from the database. Technically, that is done by adding "parameters" to the file name.
If the template is called "pages.asp" and you want to load the content with ID #54, the URL could look something like this:
www.YourDomain.com/pages.asp?page_id=54
That part is not so complicated. However, there is often more than one parameter attached to the URL as the database needs to have more information about sort order, navigation etc. This is when it gets complicated for the search engines.
There is simply no way for a search engine to be sure which parameters identify a new page and which are just a sort order, a navigation setting, or something else that does not justify indexing the URL as a new unique web page.
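The ambiguity can be made concrete with Python's standard urllib.parse module. The two URLs below are hypothetical: a human can guess that "sort" only changes the sort order, but a spider has no way to know that both URLs point at the same underlying page.

```python
from urllib.parse import urlparse, parse_qs

# Two made-up URLs from the same dynamic site. Only page_id identifies the
# content; "sort" is just a presentation setting - but a spider can't know that.
url_a = "http://www.YourDomain.com/pages.asp?page_id=54&sort=price"
url_b = "http://www.YourDomain.com/pages.asp?sort=name&page_id=54"

params_a = parse_qs(urlparse(url_a).query)
params_b = parse_qs(urlparse(url_b).query)

print(params_a["page_id"] == params_b["page_id"])  # same content ID...
print(url_a == url_b)                              # ...but different URL strings
```

To a spider comparing raw URL strings, these look like two distinct pages - which is exactly the duplication and indexing problem described above.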
There are many other problems related to having dynamic websites and websites built on content management systems indexed in search engines. It is unfortunately not possible to cover all of it in this tutorial.
The good news is that more and more search engines are now starting to include dynamically created pages in their results.
Flash and Macromedia Director
Search engines do not read Flash and Macromedia Director files. Content and links in these formats will not be accessible to search engines.
Search engines basically only read regular text on a web page. They don't read text or follow links within special formats such as Java Applets or any other format that requires the user to have a certain program installed.
You can read more about Flash, Macromedia and Java Applets in the Links and Navigation section.
Image maps or navigation buttons
Not all search engines read image maps, though most read navigation buttons. If you use image maps, you should also add the same links as regular text-based links.
You can read more about image maps and navigation buttons in the Links and Navigation section.
IP delivery, agent delivery, cloaking, and personalization
It has become more and more common for websites to implement some degree of personalization, regional targeting, or other forms of individualization in the way web pages are served to individual users.
In its simplest form, it can be a program on the server that checks which browser people are using and serves up a specially tailored version for that browser. The same kind of program can also check whether people have specific plug-ins installed, so that they get a version of the website they can read.
A more advanced use would be to check what country the user is coming from to serve a local version. Some portals and search engines have been doing that and many other cross-national websites do it. There are many legitimate reasons to do so including business strategies, marketing, and legal issues with products only allowed in certain countries.
The same techniques can also be used to serve different content to search engine spiders and normal human visitors. These techniques are usually referred to as "cloaking".
In most cases search engines do not like the use of cloaking. Thus, do not use cloaking unless you have a very good reason to do so.
Cookies
A cookie is a small text file that a web server can save in a user's browser for future retrieval when that same user returns. It can be used to store login information or other preferences to make it easier or better for users to use a particular website.
Cookies are very safe to use in the sense that they cannot be read or shared across different users or websites. If a cookie is set on your browser, the website that wrote it is the only website that can read it. Also, other users of a website cannot get to the information in your cookie - it can't be transferred or shared.
Thus, even if the search engine spiders did accept cookies, there would be no way for them to re-fetch those cookies from the user's browser when the user clicks a link in a search result. Consequently, there is really no point in writing cookies for the search engines or sending cookies to them.
SUBMIT AND INDEX PAGES
Getting indexed
Search engines do not always include all web pages from a web site. Usually they will only include a sample of your pages - the ones they determine to be most valuable.
Some of your web pages will be more important to have indexed than others. A product information page is far more important to have indexed than a contact form, as it is more likely someone will search for your products than your contact information.
Search engines do not always find the right pages to index by themselves. Sometimes they need a little help and guidance. In this section you will learn how to submit your web pages to search engines to get them indexed.
Submitting to search engines
Submitting a website to search engines is not always enough to be included or continue to be included. Most search engines analyze the link structures and click streams on the Internet to determine which web pages are most important for them to keep in their indexes.
If no other site is linking to your website it will be harder - in some cases, even impossible - to get indexed. You should therefore try to get as many of the biggest and most relevant websites to link to you.
How to submit to search engines
Most search engines have a form you must fill out to submit your website. You will usually find the link at the bottom of the front page, labeled "Add URL".
Generally, submitting your front page is enough - the search engines will follow links from that page to the rest of your website. However, it is also a good idea to submit all of the pages from your website. BuildTraffic.net makes this easy by spidering your domain and retrieving all of your pages for submission.
However, if you have important sections of your website that are not directly accessible through the regular navigation you can also submit them. If you have a site map (a page with links to all the web pages on your website) you can submit that too, to help the search engine spiders find all your content.
Is your site indexed?
In most of the major search engines you can verify whether your website or specific web pages have been indexed. This is not the same as your ranking, but it is the basis for users finding your site: if your website is not indexed, it will never rank and never be found.
For example, in Google just do a search for your domain name. If you appear in the results, then you are listed.
Do not over-submit
You should never submit a web page to a search engine if it is already indexed, unless you have made changes to the content of your website or have noticed that your site is no longer listed.
Excluding pages from getting indexed
If there are pages or directories on your website that you do not want search engines to index, you can exclude them using one of two different methods:
Robots.txt
Robots.txt is a file that you place in the root of your web server. The file uses a simple syntax to exclude specific types of users - in this case search engine spiders - from parts of your website. You can either exclude specific search engine spiders or all spiders. To exclude all search engine spiders from all directories on your web server, you would write the following:
User-agent: *
Disallow: /
Note: This would disallow everything, including your home page!
Learn more about how to write robots.txt files at SearchTools.com.
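Python's standard library happens to include a robots.txt parser, which you can use to check what a robots.txt file actually blocks before you publish it. A small sketch (the directory names and URLs below are made up):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks all spiders from /private/ but allows the rest.
robots_txt = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch(agent, url) answers: may this spider request this URL?
print(parser.can_fetch("*", "http://www.YourDomain.com/index.html"))      # True
print(parser.can_fetch("*", "http://www.YourDomain.com/private/x.html"))  # False
```

Running a check like this before uploading the file is a cheap way to avoid accidentally disallowing your entire site.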
META-robots
META-robots are small pieces of code you can place in the header of your HTML documents. You can use META-robots tags if you don't have access to your web server's root, or if you want to exclude single pages on your website. For example, <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> tells spiders not to index the page or follow its links. You can read more about how to use the META-robots code at SearchTools.com.
MONITOR AND ANALYZE RESULTS
Traffic analysis
There are various ways you can track and monitor the traffic that comes to your website as a result of your search engine optimization. With some of the methods you will be able to see not only how much traffic you received, from which search engines, and which keywords people used to find you, but also how users behave after they land on your website and how many of them actually end up as customers.
All that information is available to you! Some of it is quite easy to get access to. As you scale up and want more information and better and faster reporting, you will have to spend more resources on computers and software programs - and, not least, the time to handle it.
In this section we will give you a very brief introduction to each of the main ways you can monitor and analyze the traffic on your website and the results of your search engine optimization.
Collecting data, mining data and reporting results
There are three distinct parts to traffic analysis:
First, you need to collect the raw data about your visitors.
Second, you need to filter the data - or data mine it - to extract the information you need and make the calculations you want for your reports.
Finally, you need to make some graphical reports of the data you have collected and filtered. This can be real-time or close to real-time reports through a web-browser or printed reports in Word or PDF format.
Traffic analysis packages sometimes come bundled with two or all three elements. Trackers and network sniffing products usually include all three parts, and server log analysis tools include the last two.
Statistics are not absolute
There are no absolute numbers and standards on the Internet with regards to statistics and traffic analysis. Each method of collecting data, multiplied by the endless number of ways to process and mine it, makes it virtually impossible to compare the results of analysis across websites.
You should use traffic analysis to track and monitor trends over time for your own website. Traffic analysis can also be used to monitor real-time statistics to tune live ad campaigns.
Trackers - hit counters
Trackers, or hit counters, are essentially small pieces of code placed in a hidden part of your HTML code. Each time your page is loaded, the script calls up another server, whose database records that your site had a visitor. The tracking server can also record anonymous information about the visitor requesting the web pages on your server.
From a web interface you will be able to follow the traffic on your website in almost real time. The types of reports and features you get access to vary a lot from one tracker to another. With most of them you can see what search engines people come from, what keywords they used, and in some cases what search engine spiders have been visiting your website.
The advantages of trackers:
You can sign up for a free tracker at ServuStats.com
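The keyword reporting that trackers offer boils down to reading the HTTP referrer of each visit. Here is a rough sketch, assuming the search engine passes the search phrase in a "q" query parameter - many engines do, but the exact format varies:

```python
from urllib.parse import urlparse, parse_qs

def keywords_from_referrer(referrer):
    """Extract the search phrase from a search engine referrer URL.
    Assumes the engine uses a 'q' parameter - an assumption, not a standard."""
    query = parse_qs(urlparse(referrer).query)
    terms = query.get("q")
    return terms[0] if terms else None

ref = "http://www.google.com/search?q=used+golf+balls"
print(keywords_from_referrer(ref))  # → used golf balls
```

Collect these phrases across all visits and you have exactly the "what keywords they used" report described above.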
Server Monitor Log Analysis
The Server Monitor is a small application that sits on the server and counts who is requesting what from the server. The Server Monitor stores the information in a log file - commonly known as a "server log file". The log file is a standard text file with one line of text for each action.
This log file can be analyzed using a number of dedicated applications. One of the most well known producers of such applications is Web Trends.
If you are running your own servers or have a professional hosting solution, you should be able to get access to the raw server monitor log files. In this case you can run your own reports using a log file analyzer. However, for large websites log files can become very large, taking up megabytes of data every day.
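To get a feel for what a log analyzer does, here is a minimal sketch that counts page requests in a few made-up log lines (a simplified version of the common combined log format):

```python
import re
from collections import Counter

# Invented sample log lines: IP, timestamp, request, status, bytes,
# referrer, user agent - one line per server action, as described above.
log_lines = [
    '1.2.3.4 - - [01/Jan/2024:10:00:00] "GET /index.html HTTP/1.0" 200 1043 "http://www.google.com/search?q=golf" "Mozilla"',
    '1.2.3.4 - - [01/Jan/2024:10:00:05] "GET /products.html HTTP/1.0" 200 2210 "http://www.YourDomain.com/index.html" "Mozilla"',
    '5.6.7.8 - - [01/Jan/2024:10:01:00] "GET /index.html HTTP/1.0" 200 1043 "http://www.google.com/search?q=golf+balls" "Mozilla"',
]

# Pull the requested path out of each line and count the most popular pages.
pattern = re.compile(r'"GET (\S+) HTTP')
pages = Counter(pattern.search(line).group(1) for line in log_lines)
print(pages.most_common(1))  # → [('/index.html', 2)]
```

Real analyzers such as those mentioned above do the same thing at scale: parse each line, then aggregate by page, referrer, visitor, or time period.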
Some service providers run server monitor log analysis off a central server and will provide you with access to online reports. They generally include good standard reports, but you will not be able to perform your own filtering.
Advantages:
Network packet sniffing
With this technique the raw data packets on your network are monitored as they pass by. There is no need to insert scripts in your web pages or have the server monitor application produce log files. It is all handled at the network level.
This gives you several advantages over all other monitoring methods. First, you get access to data that neither a server monitor nor a tracker can pick up, such as "stop requests" (when a user hits the STOP button before the page is loaded). You will also be able to access extremely large amounts of visitor data much faster. In fact, with some network sniffing products you can get statistics in real time - for any size of website.
Network sniffing in itself is just a way of collecting visitor data - like a server monitor producing a log. However, network sniffing products often come bundled with some kind of reporting software, as well as modules to output the data as traditional server log files or export it directly to a database for further data mining and reporting.
Advantages:
Ranking analysis
Don't get obsessed with rankings! Yes, it is important to have good rankings for your primary keywords, but there are better ways to monitor the results of your search engine optimization campaign than ranking reports.
One of the best ways is through traffic analysis as described in the last section. With the right reporting tools, you will be able to see what search engines visitors came from and what keywords they used to find you. That is a lot more useful to determine the true value of your work.
If and when you do check your rankings, you should not check thousands of keywords for your website 10 times a day. This places an unreasonable load on the search engines' servers, which the search engine companies do not like.
Within the services you get using BuildTraffic.net, you can see the average ranking on the keywords that brought you visitors.
Rankings change all the time
Rankings change all the time. Sometimes you won't even get the same results from a search engine if you test it from two different locations at the very same time.
If you want to make ranking reports using automated tools or services you should be aware that they only reflect the situation the second the report was made and from the location it was connected. From another location at another time, the results might look very different.
How do search engines work?
Search engines use automated software (known as robots or spiders) to follow links on Websites, harvesting information as they go. When someone submits a query to a search engine, the engine returns a list of sites, ranked by their relevance to the keywords used in the search. How search engines assess your site and determine the relevance of the words depends on how the specific search engine works. Some engines, such as Excite, use artificial intelligence to recognize concepts that frequently appear together. Other search engines list the more popular sites first. There is no way to guarantee that your site will come up on top in a search. After all, we can't all come up on top!
Which search engines should I register with?
All of them - well, most of them wouldn't hurt. It is often quoted that the top search engines combined account for 95% of web traffic: Yahoo!, Google, Bing, AltaVista, Excite, Infoseek, HotBot, Lycos and WebCrawler. There are hundreds for you to choose from.
How long does it take to be listed?
Don't expect your site to show up in search engines immediately. It can take anything from 24 hours to 6 weeks or more! It depends on the search engine. Most search engine crawlers typically retrieve only a few pages from each site on each visit and visits can be weeks apart.
How do I get my Website to the top of search engines?
I have read a lot about how, if your Website doesn't appear within the top 10 of search engine listings, people aren't likely to find you. This may be true, but it's not the only place where people can find you, and with millions of Websites out there, we can't all be in the top 10! I have found a lot of new Websites through links from other Websites. There is no simple trick that will get your Website to the top of search engines. Also, the search engines change the way they prioritize the listings every few months, so you can spend all your time trying to beat the system.
Search engines rank web sites on 5 main criteria:
1) Web page Title
First, and probably most important, is the title of your web page. If the title of your home page for example says simply ABC Company Home Page - you're going to find your listing at the very bottom of any search result list. Each of your web pages has a title (you'll notice it in a blue bar, across the top of your browser window). The title of each page should, in one sentence or less, describe the contents of the page with as much detail as possible. For example, don't just put "Homepage ABC Golf Store", instead use something like, "ABC Golf - retailer of golf shoes, new golf balls, used golf balls, designer clubs and more!"
2) Description Meta Tag
Next in importance is the description meta tag. This tag is not seen by visitors but is embedded within the HTML code of your web pages. The tag should describe the contents of the current page and, where possible, use some of the same keywords that were used in the web page title (this will increase your relevancy ranking in many search engines). The description is often shown in search engine listings: first they list your title, then they follow it with your description. So think of the page title and the description as a small advertisement. It has to be specific and catchy enough that a user will click on your listing instead of someone else's. In both the description and title tags, where possible, you should try to use unique words and phrases - this will help to differentiate your pages from competitors' pages.
3) Keyword Meta Tag
The keyword meta tag isn't used by all search engines, but it can still be quite effective. Search engines use keywords to categorize web pages. However, simply putting in a long list of words won't do you much good: if they are too general they will be ignored, and if you have too many your page may be black-listed (i.e. removed from the search engine). Instead of single keywords, use short specific phrases. For example, don't use "golf accessories, golf balls, golf clubs" - instead use "custom golf tees, white golf gloves, used golf balls, golf clubs for left handed players".
4) Link Popularity
Some search engines will give more popular web pages a higher ranking. How do they determine popularity? The more pages that link to your page, the more popular the search engines will consider your page to be. And if the linking pages have similar content to yours (i.e. they are about the same subject matter, using similar keywords and descriptions), you'll also get a relevancy boost. Wherever possible, you should see if other web site owners (non-competitive, of course) would be willing to put a link to your site. Often they'll link to you if you link to them. Or, if you provide information on your site that would be useful to their readers, they're likely to create a link to your page anyway.
5) Page Content
It's important to make sure that your pages contain a lot of text. Search engines can't index images - so even if some of your content is displayed graphically, you should also present it in text format. The content of your pages should contain some of the same words that you used in the page title, description, and keyword meta-tags. This increases your relevancy positioning. In fact, if it seems to the search engines that the body of your page is not related to the page title or description, they will penalize you by giving you a much lower ranking or dropping your listing altogether. Also, keep important keywords in the content section of your page as close to the top as possible. The search engines will give your page a boost in the rankings because they will assume your page is more relevant to those keywords than other web pages that contain the exact same keywords lower down in the body of the page text.
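The "keywords near the top" idea can be illustrated with a toy scoring function. This is not any engine's real formula - just the principle that earlier occurrences score higher:

```python
def keyword_prominence(text, keyword):
    """Score a keyword higher the closer it appears to the top of the text.
    A rough illustration of keyword prominence, not a real ranking formula."""
    pos = text.lower().find(keyword.lower())
    if pos == -1:
        return 0.0  # keyword not on the page at all
    return 1.0 - pos / len(text)

page = "Used golf balls at discount prices. We also stock gloves and tees."
# A phrase at the very start of the page outscores one near the end.
print(keyword_prominence(page, "used golf balls") > keyword_prominence(page, "tees"))  # → True
```

A page that opens with its most important phrase would score near 1.0 for it, while the same phrase buried at the bottom of a long page would score much lower.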
Now that you have a basic understanding of what the search engines are looking for you need the tools that will allow you to use your knowledge to skyrocket your listings to the TOP of Search Engines.
BuildTraffic.net offers search engine optimization tools that will get you a top ranking in the search engines.
These are the same tools used by the "Search Engine Experts".
Part 1 - Why Search Engines Are Important To You . Learn how much traffic you could be getting.
Part 2 - The Basics. What's the difference between Yahoo and Alta Vista, and more?
Part 3 - How to get listed. What to do and what you can expect to happen.
Part 4 - How to improve your ranking. Just being listed isn't enough. Learn tricks to get a top ranking.
Part 1 - Why Search Engines Are Important To You
You've built your site. You've tested it. You're ready for business. So where are all the people? You need to get the word out about your great site, but with limited resources and no advertising budget, you need some way to attract visitors. What can you do? Try the No.1 way people find out about sites - Search Engines! Surveys show that over 85% of internet users find new Web sites by using search engines. Other surveys show that after email, search engines are the most popular activity on the web.
Instead of blowing your limited resources on banners and ads galore, use search engines to send motivated visitors to your site. Click-through rates on banner ads continue to drop, while search engine traffic is on the rise and growing every day.
Search engine optimization (techniques to improve how your Web site ranks in the search engines) has been called the cheapest and most effective marketing tool available. Expect to pay professional firms that specialize in search engine optimization $2,000 for a small site - $10,000 is common for big companies. If you are in an extremely competitive business, plan on shelling out $25,000 and up. Oh, and don't be surprised if you have to sign a six-month or even a year contract. That's the norm for a professional shop. You do have the option to "do it in-house": with a little help from BuildTraffic, you can do search engine optimization yourself, improve your site's search engine ranking, and save yourself a lot of money, frustration, and time.
Search engine traffic is the kind of traffic you want. Why?
Traffic you receive from search engines is already targeted. Visitors arriving at your site from search engines have entered a keyword relevant to your site, so they are already interested in your product or service. This is the best source of potential customers you can have.
Search engines are the number one way users find new sites. Surveys show that over 85% of users rely on search engines to locate information on the Web. If you optimize your site to do well on the engines, then register your site with search engines, you should see increased traffic to your site.
Search engines are free to users and users know where and how to use them. One of the first things a novice to the Internet learns is how to use Yahoo (actually Yahoo is a directory, but we'll discuss the difference later).
Part 2 - The Basics
Before you can position your site to do well in the search engines, you need to understand search engine fundamentals. A search engine is a giant database that lists sites on the Internet. You access the database when you enter keyword searches and receive a list of relevant sites.
Search sites are the Internet's Yellow Pages.
Think of a search engine as a giant, automated version of the yellow pages. If you need information on "party planning" from the yellow pages, there are several steps to retrieving it.
Go to the yellow pages and look under the alphabetized subject list for "party."
Note the subcategories: "party - planning", "party - children's", "party - rental equipment", and so on.
Examine the companies listed under "party - planning" and decide which company best meets your needs.
You can repeat this process online using a search engine.
Search Engines Versus Search Directories.
People use both search engines and directories without ever realizing there is a difference between the two.
Directories (Yahoo Directory, LookSmart) have human editors who review web pages, rank them, and then organize them into categorized lists with brief descriptions. The categories and descriptions are based on submissions, but are edited by professional editors (real people in the loop reviewing the sites being submitted).
Search Engines (Bing, Yahoo Search, Excite, Google) have automated programs called spiders that index sites and score pages based on proprietary guidelines. There is no human judgment involved. Search engines index all the information on all the Web pages they find. The indexes are generated automatically, based on the words and phrases that are found on Web pages.
What is a Search Engine?
Search engines send automated computer programs (called robots or spiders) to crawl the Internet in search of Web pages. Basically these spiders follow links to travel from URL to URL. When they visit your Web site, the robot indexes (or records) the text of your page or pages (if it is a deep crawling spider) and stores it in the search engine's index. Later, when a user enters a search query at the search engine's Web site, the search engine scans Web pages in its index for pages that provide the best match.
In theory, the search engine spider is supposed to be able to find all the sites on the Internet. However, since new sites are being added daily, it's risky to assume that the spider will find you. Expedite the process by submitting your URL to the search engines.
What is a Directory?
Unlike the automated search engine process, each entry in a directory is first reviewed by a human being. You submit a short description to the directory for your entire site, or editors write one for sites they review. A keyword search will only look for matches in these descriptions, so be careful how you describe your site. Techniques to receive a high search engine rating will not work with a directory. While good content is necessary for search engines, both good content AND visual appeal are mandatory in human-edited directories. Remember, manual review takes time! The typical time lag between submission of a site and its actual listing in Yahoo is five months. You can speed up the process at some directories, but expect to pay for that service. This trend will probably continue.
Yahoo's Business Express Program offers guaranteed consideration of your commercial Web site within seven (7) business days.
LookSmart moved to an exclusive pay-for-listing scheme: all new submissions must pay to have their site reviewed.
Open Directory Project is a free directory. (Hint: If you get your site listed here, it will help you later in Google.)
Search Engine and Directory Hierarchy.
There are literally thousands of search engines on the Internet, but naturally you're most concerned about your ranking on the high-traffic sites. Some of the smaller search engines may not bring you a lot of traffic, but your listing gives you another source of links (which can help in your overall link popularity building). If you can do well on the sites listed below, you will probably do well on others too.
Top Search Engines: Google, Bing, AltaVista, iWon, Lycos, HotBot, Teoma, Ask Jeeves, WiseNut, Overture.
Top Search Directories: Yahoo (the biggest search site of all; do well here), LookSmart, Open Directory, Zeal, AOL.
Part 3 - How to get listed
Don't wait to be discovered! Submit your URL directly to the search engine or directory.
Search Engine submissions. There are two ways to submit to search engines and directories, manually or using an automated submission tool. Here is a summary of both methods.
Manual Submission - Use the Add URL form on the search engine site itself. This way, you have absolute control over where your site is submitted. However, this process is very time-consuming and labor-intensive. Some search engines bury their Add URL form so far down in the site that one wonders if they are intentionally trying to thwart potential applicants.
Automated Submission Tool - Fill in the data forms once and the tool automatically submits your URL to multiple engines in one fast, easy step. There are several features you should look for in an automated submission tool.
The number one item of importance is to make sure the submission tool replicates hand submissions. This is hard to tell from a non-technical standpoint. However, you can be assured that BuildTraffic's system does, and it is the most advanced system available on the Internet today.
The second thing to look for in an automated submission tool is one that allows you to pick and choose which engines to submit to. If you are doing well in one engine but not in another, you may want to submit to the search engine where you need improvement and skip the engine where you are doing well. Again, BuildTraffic's Search Engine Tools gives you the flexibility to submit any page on your Web site to your choice of over 750,000 different search engines.
Find a submission service that allows you to include a junk email address. Because most submission services submit to many search engines, you may receive spam email from some of the smaller engines that occasionally sell their email addresses. Note: it is not the submission tool company that is selling the email addresses; it is the search engines themselves. BuildTraffic's Search Engine Tools allows you to use an alternate email address when submitting your site. Additionally, BuildTraffic monitors the search engines it submits to and purges spam-abusive engines from the submission list.
Bottom Line: Submit your site to directories by hand. Save time by using a submission tool to submit to search engines.
How often do I submit?
Your best strategy is to submit weekly until your site gets listed. Check your listing frequently. If your site disappears suddenly, you may be the victim of a search engine database omission. Search engines frequently have multiple versions of their databases and they aren't always in sync. You may be listed in one version of their database and not in another. Your only recourse is to resubmit your site.
To see if you have even been picked up in a search engine, go to the search engine's site and do a search with your company's domain name as the search query.
How long does it take to get listed?
These times vary with search engines and directories. The search engines will optimistically report a very short time, but most sites get listed within the following times:
Search engines: 4-6 weeks
Directories: 4-6 weeks
The reality is that most search engines and directories are very backlogged and slow to get listings added to their databases. Don't be surprised if you experience waits much longer than these. As mentioned earlier, you may wait over 5 months to see your site get listed on Yahoo!, if it gets listed at all. We recommend that if your site isn't listed within the time periods above, that you resubmit your site to the search engine where you are not listed.
Is being listed in the search engines good enough?
A listing won't automatically increase your traffic. A good ranking may. You want to be listed on the first three pages of the search engine results. Most web surfers aren't patient enough to look past the first 20-40 listed links. In fact, there is a considerable drop-off just going from the first to the second page.
Part 4 - How to improve your ranking.
How do search engines rank pages?
Search engines use a ranking algorithm to determine the order in which matching web pages are returned on the results page. Each web page is graded on the number of search terms it contains, where the words are located in the document, and other criteria that change frequently.
All search engines have a different method of ranking. That's why you might rank number 1 on one engine and number 25 on another. Robots look for relevance and rank results on a secret ever-changing algorithm. Some look at TITLE, some look at META tags, some look for link popularity. Search engine optimization means optimizing the Web site for the best possible positioning based on the page's keywords and description.
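The point above can be made concrete with a toy scorer. This is only a sketch: the signal names and weight values below are invented for illustration, not the actual formula of any real engine, but it shows how the same two pages can rank in opposite orders under two different weightings.

```python
# Toy illustration of why the same page can rank #1 on one engine and
# #25 on another: each engine applies its own secret weights to signals
# like TITLE keywords, META keywords, and link popularity.
# All numbers below are invented for demonstration only.

def score(page, weights):
    return (weights["title"] * page["kw_in_title"]
            + weights["meta"] * page["kw_in_meta"]
            + weights["links"] * page["inbound_links"])

pages = {
    "a": {"kw_in_title": 1, "kw_in_meta": 1, "inbound_links": 2},
    "b": {"kw_in_title": 0, "kw_in_meta": 1, "inbound_links": 9},
}

engine1 = {"title": 10, "meta": 3, "links": 1}  # weighs TITLE heavily
engine2 = {"title": 2, "meta": 1, "links": 4}   # weighs link popularity

order1 = sorted(pages, key=lambda p: score(pages[p], engine1), reverse=True)
order2 = sorted(pages, key=lambda p: score(pages[p], engine2), reverse=True)
```

Page "a" wins under the TITLE-heavy weighting, while page "b" wins under the link-heavy one, which is exactly why optimizing for a single engine's quirks is a moving target.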
BuildTraffic's Search Engine Tools is a search engine optimization package that will walk you through the whole optimization process. BuildTraffic has numerous experts who have conducted extensive tests to help identify what techniques work to better your ranking in the search engines. Additionally, we have identified what can get you into trouble with search engines.
General tips to get a good ranking.
1. Create a good site with good content.
This is critical, especially as search engines grow in sophistication. If your site contains worthwhile material, users will return to your site and will recommend it to others. Other sites will link to you - which will in turn help you by improving your link popularity.
2. Pick keywords visitors will actually use on a search engine query.
If you have keywords that are very competitive, consider narrowing your focus to improve results. The keyword "horse" will return thousands of responses and may not place you near the top, while "Appaloosa" is more focused and targeted to a particular query.
Consider using a keyword phrase instead of just one keyword. Visitors to search engines use phrases to narrow their searches. For example, instead of using a keyword like "horse" that would return too many responses, use a more specific keyword phrase like "Alabama Quarter horse."
Brainstorm a good list of keywords. Tap into other people - a fresh perspective can help uncover words you may have missed. You can also use the BuildTraffic Keyword Generator to generate words for you.
Don't just guess at which keywords are popular, get quantitative feedback using the Keyword Popularity tool included in Search Engine Tools. Remember, if you pick the wrong keywords, all your optimization will be wasted.
3. Include keywords in your TITLE tag.
Pages with keywords appearing in the TITLE are assumed to be more relevant to the topic than those without.
4. Use keywords in META Keyword and Description tags.
Using META tags will not hurt you in search engines that don't use them, and they can definitely help you in search engines that do index them. While they are not as important as the TITLE tag, META tags can give you the edge over your competition since most web sites don't even use them.
5. Use your keywords throughout your page.
Search engines will check to see if the keywords appear near the top of a web page, such as in the headline or in the first few paragraphs of text. They assume that any page relevant to the topic will mention those words right from the beginning.
6. Have a good keyword density on your page.
Keyword density is derived by dividing the frequency of that word by the total number of words on the page. Frequency is a major factor in how search engines determine relevancy. A search engine will analyze how often keywords appear in relation to other words in a web page. Those with a higher frequency are often deemed more relevant than other web pages. This can turn into a balancing act, as too high a density can be considered spam by some engines. Usually you are safe if your keyword density falls between 1% and 5%.
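The density calculation is simple enough to check yourself. Here is a small sketch of the frequency-divided-by-total-words formula; the sample text and keyword are made-up examples.

```python
import re

def keyword_density(page_text, keyword):
    """Occurrences of `keyword` divided by total words, as a percentage."""
    words = re.findall(r"[a-z']+", page_text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Invented sample copy: 2 occurrences of "Appaloosa" in 14 words.
text = ("Appaloosa horses for sale. Our Appaloosa ranch raises "
        "champion show horses and trail horses.")
density = keyword_density(text, "Appaloosa")
```

Note that this sample comes out well above the 1-5% comfort zone mentioned above; on a real page, that is a signal to add supporting copy or trim repetitions.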
7. Continually work on improving your link popularity.
Listings on popular Web sites can increase your traffic significantly. They do this in two ways:
They give potential visitors multiple paths to your Web site.
They can increase your ranking in search engines that use link popularity as part of their formula.
Most search engines use link popularity as relevance criteria. For example, the Google search engine (not their new directory) is based almost entirely on link popularity.
Success with search engines and directories is not one magical thing you do specifically. It is the culmination of your whole strategy. It is a time-consuming, labor-intensive activity with great rewards. Bottom Line: Use BuildTraffic's Search Engine Tools at least once a month to help you get more traffic from the search engines.
BuildTraffic's Search Engine Tools is a search engine optimization powerhouse. It's a mega tool bundle that can help you improve your search engine ranking and bring more traffic to your site. Our system walks you through the whole search engine optimization process, from selecting your most effective keywords and building META tags, to optimizing your pages for peak search engine performance, to submitting your pages to the search engines of your choice, to tracking your site's performance in the search engines over time. BuildTraffic has the one-stop solution that will greatly improve your search engine performance.
Before getting started on using gateway pages and other HTML techniques
to improve your search engine ranking, you need to know a little about spam
and spamdexing. Spamming the search engines (or spamdexing) is the practice
of using unethical or unprofessional techniques to try to improve search
engine rankings. You should be aware of what constitutes spamming so as
to avoid trouble with the search engines. For example, if you have a page
with a white background, and you have a table that has a blue background
and white text in it, you are actually spamming the Infoseek engine without
even knowing it! Infoseek will see white text and see a white page background,
concluding that your background color and your page color are the same so
you are spamming! It will not be able to tell that the white text is actually
within a blue table and is perfectly legible. It is silly, but that will
cause that page to be dropped off the index. You can get it back on by changing
the text color in the table to, say, a light gray and resubmitting the page
to Infoseek. See what a difference that makes? Yet you had no idea that
your page was considered spam! Generally, it is very easy to know what not
to do so as to avoid being labeled a spammer and having your pages or your
site penalized. By following a few simple rules, you can safely improve
your search engine rankings without unknowingly spamming the engines and
getting penalized for it.
Some techniques are clearly considered as an attempt to spam the engines.
Where possible, you should avoid these:
Keyword stuffing. This is the repeated use of a word to increase its
frequency on a page. Search engines now have the ability to analyze
a page and determine whether the frequency is above a "normal"
level in proportion to the rest of the words in the document.
Invisible text. Some webmasters stuff keywords at the bottom of a
page and make their text color the same as that of the page background.
This is also detectable by the engines.
Tiny text. Same as invisible text but with tiny, illegible text.
Page redirects. Some engines, especially Infoseek, do not like pages
that take the user to another page without his or her intervention.
Meta tag stuffing. Do not repeat your keywords in the Meta tags more
than once, and never use keywords that do not apply to your site's content.
Do not create too many doorways with very similar keywords.
Do not submit the same page more than once on the same day to the
same search engine.
Do not submit virtually identical pages, i.e. do not simply duplicate
a web page, give the copies different file names, and submit them all.
That will be interpreted as an attempt to flood the engine.
Code swapping. Do not optimize a page for top ranking, then swap another
page in its place once a top ranking is achieved.
Do not submit doorways to submission directories like Yahoo!
Do not submit more than the allowed number of pages per engine per
day or week. Each engine has a limit on how many pages you can manually
submit to it using its online forms. Currently these are the limits:
AltaVista 1-10 pages per day; HotBot 50 pages per day; Excite 25 pages
per week; Infoseek 50 pages per day but unlimited when using e-mail
submissions. Please note that this is not the total number of pages
that can be indexed, it is just the total number that can be submitted.
If you can only submit 25 pages to Excite, for example, and you have
a 1000 page site, that's no problem. The search engine will come crawling
your site and index all pages, including those that you did not submit.
There are certain practices that can be considered spam by the search engine
when they are actually just part of honest web site design. For example,
Infoseek does not index any page with a fast page refresh. Yet, refresh
tags are commonly used by web site designers to produce visual effects or
to take people to a new location of a page that has been moved. Also, some
engines look at the text color and background color and if they match, that
page is considered spam. But you could have a page with a white background
and a black table somewhere with white text in it. Although perfectly legible
and legitimate, that page will be ignored by some engines. Another example
is that Infoseek advises against (but does not seem to drop from the index)
having many pages with links to one page. Even though this is meant to discourage
spammers, it also places many legitimate webmasters in the spam region (almost
anyone with a large web site or a web site with an online forum always has
their pages linking back to the home page). These are just a few examples
of gray areas in this business. Fortunately, because the search engine people
know that these gray areas exist, they will not penalize your entire site
just because a few of your pages fall into them.
There is an inappropriate amount of fear over the penalties of spamming.
Many webmasters fear that they may spam the engines without their knowledge
and then have their entire site banned from the engines forever. That just
doesn't happen that easily! The people who run the search engines know that
you can be a perfectly legitimate and honest web site owner who, because
of the nature of your web site, has pages that appear to be spam to the
engine. They know that their search engines are not smart enough to know
exactly who is spamming and who happens to be in the spam zone by mistake.
So they do not generally ban your entire site from their search engine just
because some of your pages look like spam. They only penalize the rankings
of the offending pages. Any non-offending page is not penalized. Only in
the most extreme cases, where you aggressively spam them and go against
the recommendations above, flooding their engine with spam pages, will they
ban your entire site. Some engines, like HotBot, do not even have a lifetime
ban policy on spammers. As long as you are not an intentional and aggressive
spammer, you should not worry about your entire site being penalized or
banned from the engines. Only the offending pages will have their ranking
penalized.
So is it still worth working on your search engine positioning? Yes!
Definitely! In fact, the search engines do not discourage responsible
search engine positioning. Responsible search engine positioning is good for
everybody - it helps the users find the sites they are looking for, it helps
the engines do a better job of delivering relevant results, and it gets
you the traffic you want!
As a webmaster, you should not be too afraid that you are spamming the search
engines in your quest for higher search engine rankings. No question about
it, though, spam is something that every webmaster should understand thoroughly.
Fortunately, it is easy to understand it. So learn the rules, re-examine
your web pages, resubmit to the engines, then create gateway pages to get
better ranking on the engines, using the rules above. If you need any more
information on search engine spamming and search engine positioning, see
http://www.buildtraffic.net We wish you the best of fortune in your
web promotional efforts!
We all know it and we all do it. Whenever the typical web user needs to
find something on the web, he or she will almost always instinctively go
to one of the top search engines and run a search. Then, he or she will
have a look at the first 30 search results returned (usually the first three
pages of results), hardly ever looking beyond that. If nothing looks appealing,
they will run another search using a variation of the keywords they used
on the first search. Again, they will look at the first 30 results. If they
still find nothing of interest, they may switch to a different search engine
and repeat the process. This, believe it or not, is the typical web navigation
behavior of at least 86% of all the 110 million web users out there. The
question is, does your web site capitalize on this behavior? If the answer
is yes for you, then no matter what a user searches for that has any relation
at all to your products, they should find you even if they hadn't thought
of finding you! That translates to a very high amount of daily traffic to
your site. Is this what is happening for you?
We bet the answer is no. Very few web sites really capitalize on this behavior.
However, you can learn how to do so. For this tutorial, we shall assume
that you are in the business of selling wedding gowns. You have a web site
that does a good job of displaying your wedding gowns, but like most other
web site owners, your web traffic is really low and clients are simply not
finding you. You have done everything you can think of, but your pages hardly
come up on the search engines. What next? Well, get ready to learn one of
the most effective online marketing strategies.
We know that the web user will think of a keyword or phrase and type that
into a search form on an engine. That is his or her first step, and so it
should be ours. This is a very important step! Get a piece of paper or open
up your text editor and keep writing all the keywords and phrases that you
think of as you go through the steps outlined below. Selecting the right
keywords or keyword phrases will skyrocket your traffic. Research has shown
that most people use phrases of 2 to 3 keywords when searching. People have
come to learn that just typing in one keyword in a search will get them
thousands of records that they just do not need. Therefore, they combine
two or three keywords in their searches.
Your list should end up having about 50 or more keywords and phrases. You
must write down every word that a person might use to look for your site
PLUS words they may use to look for other related information or products.
This is no easy task. It involves some research and a fair bit of thought.
Remember, thousands of people out there looking for wedding gowns may not
all think alike, look for wedding gowns directly, or even realize at the
time that they can get wedding gowns on the net! To get a good list, here
are the steps that you should go through. Keep your pen and paper handy
so that you keep writing as you go along each step.
Think! Brainstorm! Sit down for several minutes and think! Write down
whatever keywords or phrases come to mind. Then look at your list and see
what other word combinations you can make out from the keywords you have
already written down.
Look at your products and services and see what words relating to them
will pop up in your mind. Write the names of each product or service that
you deal in. These product names will also be part of your keyword list.
Ask your family, associates, clients, and friends what words and phrases
they would most likely use to search for products or services like yours.
See what keywords your competitors use. See what keywords are featured
on their pages and in their meta tags.
Be specific AND general in scope. Think of words that also describe things
related to what your site is about. For example, if you sell wedding gowns,
also think about other things and words related to weddings, such as honeymoon,
bridal registry, wedding rings, engagements, and other wedding accessories
and services. The idea here is to capture people who may be looking for
other things related to weddings, whether they are directly looking for
wedding gowns or not at the time. Chances are that someone looking for any
wedding related information will also be interested in knowing about wedding
gowns. Get the idea? This is a very powerful way of getting more targeted
traffic to your site.
Do not forget misspelled words. People often misspell words when they
enter their search terms. Include commonly misspelled words or phrases as
part of your list.
If your site deals with a particular region, remember to include that
in your keyword list. E.g. 'Los Angeles wedding gowns' if you deal in wedding
gowns within Los Angeles. A lot of people will search for wedding gowns,
get a big list of returned search results, and decide to add the words "Los
Angeles" (or wherever else they may be interested in) to get more specific results.
Use the long and short forms of words, e.g. consult, consulting, and consultants.
DO NOT SPAM! Do not use keywords that have nothing to do with your web
site. Do not list the keyword "sex" or "free pics" just because they are
frequently searched for if they do not have anything to do with your web
site. You will only get traffic that is of no use to you anyway because
surfers will come to your site and leave angrily when they realize you just
cheated them. Also, search engines are on the lookout for such deceitful
practices and can sometimes ban your pages for spamming. As long as you
stick with keywords related to your core business, you will be fine, but
do not go way out of line.
Now it's time for some research. You need to look at what people
are actually searching for on the various engines and get more phrases for
your list. Here is a list of sites that will help you out with that. When
doing your research at these sites, remember steps number 5, 6, and 8 above
- people often misspell words, they use different word forms, and they may
be looking for a whole lot of different stuff all related to your products.
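The expansion steps above (misspellings in step 6, regions in step 7, word forms in step 8) lend themselves to a quick script. This is only a sketch: every seed term below is an invented example for the wedding-gown scenario, not output from any real keyword tool.

```python
from itertools import product

# Hypothetical seed lists for the wedding-gown example. Every term is
# an invented illustration, not data from any real keyword service.
regions = ["", "Los Angeles"]                  # step 7: region
modifiers = ["", "discount", "designer"]       # related qualifiers
cores = ["wedding gown", "wedding gowns",      # step 8: word forms
         "bridal gown", "weding gown"]         # step 6: a common misspelling

def expand():
    """Combine the seed lists into candidate search phrases."""
    phrases = set()
    for region, mod, core in product(regions, modifiers, cores):
        phrases.add(" ".join(part for part in (region, mod, core) if part))
    return sorted(phrases)

keywords = expand()
```

Two regions, three modifiers, and four core terms already yield 24 distinct phrases, which is why a 50-item list is easier to reach than it first sounds.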
How Does Link Popularity Work?
When some search engines index a page, they will check their database for how many pages are linking to that page (or domain). This will give the page a 'link popularity' rating. In some cases, such as Google, it will also use the link popularity of the site that is linking to that page to determine the quality of the link. For example, a link from C-net would push the rating much higher than a link from a personal page.
When a search is performed, the engine will first scan for relevant results. Once it has the results, it will use the link popularity rating to sort them. You can see great examples of this by performing searches in Google. Most of the results for general queries are from high-traffic sites. For example, the top 3 results on Google for an 'entertainment' search are Netscape, Lycos, and Yahoo.
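The rate-then-sort idea can be sketched as follows. This is a deliberately simplified model: the site names and link weights are invented stand-ins, not real engine data, and real algorithms are far more involved.

```python
# Toy link graph: who links to each page. All site names are invented
# stand-ins; the weights are assumptions, not real engine data.
links_to = {
    "gowns.example": ["bigportal.example", "blog.example", "dir.example"],
    "rival.example": ["blog.example"],
}
# Assumed value of a link from each source: a link from a big portal
# pushes the rating much higher than one from a personal page.
site_weight = {"bigportal.example": 10, "dir.example": 5, "blog.example": 1}

def link_popularity(page):
    """Sum the weights of every site linking to the page."""
    return sum(site_weight.get(site, 1) for site in links_to.get(page, []))

def rank(relevant_pages):
    """Sort already-relevant results by link popularity, best first."""
    return sorted(relevant_pages, key=link_popularity, reverse=True)

results = rank(["rival.example", "gowns.example"])
```

Even though both sites are equally relevant to the query, the one with heavier inbound links sorts to the top, which is the behavior the Google example above illustrates.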
So how do you get links?
First, the Website must have quality content. A site with good content will get plenty of links from around the Web through no extra effort by the webmaster.
Another great way to get "quality" links is to get listed in search directories such as Yahoo and the Open Directory. These count as links from high traffic Websites, and will increase your link popularity significantly. Getting a listing in the Open Directory will get your link on several Websites, such as Lycos, AOL, and Altavista, since the data is syndicated on many engines.
You can find out more about getting listed in directories here.
Maintaining an affiliate program is also helpful. Many search engines will follow affiliate links and they will factor into the site's link popularity. An affiliate program can effectively give a site thousands of links on the Web.
Giving Links to the Engines
There could be thousands of pages linking to a site, but they don't help unless the search engines are aware of them.
When anyone publishes a web page - it really doesn't matter if it's a
business-related site or just your own personal page - it's not much fun
if you don't have any visitors, right?
Of course, someone who is trying to build an income online is much more
interested in this than one who is only playing with a "home page".
Ya gotta' have traffic (i.e.: visitors) if you ever hope to make any sales.
According to some industry reports, better than 50% of visitors find your
site by doing a "Search" with the Search Engine they like the best.
Most everyone has their favorite, so you never know how someone might find you.
Every one of these Search Engines has unique ways of handling the additions
of new sites. Some utilize Meta Tags and some do not, but here's the catch
- they can change the way they do any or all of this at any time they decide
to, without any prior notice.
In my opinion, you can drive yourself nuts trying to make sure you are in
the "top" whatever.
There are a whole bunch of services out there loudly proclaiming a "guaranteed"
listing in the "top 10" by using their service. Let's get real here, there
can only be 10 sites in any "top 10" with any Search Engine. If you multiply
that by the number of Search Engines, then maybe... you might be listed
in the top ten with one. And that's a very BIG MAYBE!
The best thing you can do is try to make your site as Search Engine (SE)
friendly as possible. One of the ways to do this is by the use of Meta Tags.
The first time we heard about Meta Tags, we knew that this was most decidedly
something we needed to do, but not one of these references told us what the
danged things were supposed to look like, what the proper format was, what
all we should include, and on and on went the questions in our mind.
So for your information here is the proper format to use for your Meta Tags:-
First, of course you'll have the beginning of your web document. Doesn't
matter if you do this yourself or use a WYSIWYG editor.
<title>(Here's where YOUR title would go) </title>
Your "Title" should have no more than 60 characters.
Right after that is where the Meta Tags go. Here is the complete format:
<meta http-equiv="title" content="Your Info">
(If you've developed a good title that describes your site well, insert
the very same thing in between the quote marks where I've inserted "Your
Info". Do leave the quote marks in place.)
<meta name="resource-type" content="document">
<meta name="revisit-after" content="30 days">
<meta name="classification" content="consumer">
These can stay this way. What they are telling the various engines is that
this is an HTML document, you request they visit every 30 days to update
the changes you've made, and your information is to be classified as "Consumer".
<meta name="description" content="YOUR Info">
You can repeat your title info here, again if it's a complete title that
really targets what your site is about. However, here you can elaborate
more, up to 150 characters.
<meta name="keywords" content="YOUR KEY WORDS">
This one is really important! What would YOU search for using a SE and trying
to find your site? What is the main focus of your site? Do you offer some
free stuff? Information? What kind? Sell something? - What?
Spend some time deciding what the real key words should be for your site,
up to 1000 characters.
Really this should be what you do first. Then build your "Title", and other
information around these. Do NOT repeat key words more than 1 to 3 times.
Many search engines will consider this "Keyword Stuffing" and will not list
your site at all.
<meta name="robots" content="all">
This tag is asking the robot to list all of your site. Some engines list
each page and some only list your entry or index page. But, if this tag
is present, the "robot" will list the maximum it's been programmed to list.
<meta name="distribution" content="global">
This tag tells the robot that your site is of interest to the whole world
and not just "regional" or "limited" in some other way.
<meta name="rating" content="general">
This tag rates your site. Similar to the Movie or TV rating of programs.
If your site is an "Adult" site, you'd better rate it that way.
If your site is intended for children you can check out these locations
and register it for a better rating:- Register with SafeSurf, Weburbia or
RSAC. We believe there is also an "icon" they'll allow you to include on
your site.
<meta name="copyright" content="1998 - 2012">
This one is, of course, self explanatory. We've left our own range in place
because we've had copyrighted material on the internet since 1998. If your
site is new this year, use just the current year.
<meta name="web author" content="YOUR info">
Did you create your site, or did someone else do it for you? That's what
will decide what goes here.
All of this information should be included between the </title> tag
and the </head> tag.
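Putting all of the pieces above together, a complete head section might look like this. The site name, description, and key words here are placeholders we've invented for illustration; substitute your own:

```html
<head>
<title>Acme Widgets - Handmade Widgets and Widget Repair</title>
<meta http-equiv="title" content="Acme Widgets - Handmade Widgets and Widget Repair">
<meta name="resource-type" content="document">
<meta name="revisit-after" content="30 days">
<meta name="classification" content="consumer">
<meta name="description" content="Acme Widgets sells handmade widgets and offers widget repair, free widget tips, and more.">
<meta name="keywords" content="widgets, handmade widgets, widget repair, free widget tips">
<meta name="robots" content="all">
<meta name="distribution" content="global">
<meta name="rating" content="general">
<meta name="copyright" content="1998 - 2012">
<meta name="web author" content="Acme Widgets">
</head>
```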
After that comes the <body> tag and then of course all the great information
you've lovingly gathered together for your visitors.
Hope this helps to explain the importance of Meta Tags. Every page you develop
should include these tags formatted with relevance to the page they're on.
One big NO-NO we want to mention here - Don't copy and use these tags from
other sites! Remember, everything you see on someone else's site is copyrighted
material even if you don't see the symbol or any reference to copyright.
Copyright protection applies automatically, and you can be sued for using
their information without prior permission. We do as much "peeking" at source
codes as anyone out there does. We even save some of what we see to our
desktop, but all the code we save is for studying only. It's a great way
to learn more about what we're trying to do.
So how do you make the search engine robots give your site a better rating
than all the other millions of Web sites trying to do the same thing? Simple,
give them what they want. You can't trick them or make them think that you
are better than you are. Think about a visit through the eyes of a robot.
He finds a site, usually from links embedded in Web pages, then loads the
text from the first page.
He looks for the META tags and pulls out the keywords and description. If
they are not there, he takes the first 200 or so characters of text and
uses them as a description.
The Title is extracted.
He extracts the pure text from the page (strips out the HTML coding). He
takes out the common words leaving what he feels may be keywords. (Most
do not do this last step.)
He now extracts the hyperlinks collating them into those that belong to
this Website and those that don't (He visits these later as this is how
he finds new Web sites).
He may do the same with the email addresses.
He goes on to the next page and so on until he has visited all of the pages
in your Web site.
Now he stores all of this information.
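The steps above can be sketched in code. This is only a toy illustration of the process the article describes, not any real engine's robot; the class name, sample page, and URLs are all made up:

```python
# A toy sketch of a robot's visit: pull the title, META description and
# keywords from a page, and sort its hyperlinks into those that belong to
# this Web site (crawled next) and those that don't (new sites found later).
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class RobotSketch(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.title = ""
        self.description = ""
        self.keywords = ""
        self.internal_links = []   # more pages of this site to visit
        self.external_links = []   # other sites, visited later
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            name = attrs.get("name", "").lower()
            if name == "description":
                self.description = attrs.get("content", "")
            elif name == "keywords":
                self.keywords = attrs.get("content", "")
        elif tag == "a" and "href" in attrs:
            url = urljoin(self.base_url, attrs["href"])
            if urlparse(url).netloc == urlparse(self.base_url).netloc:
                self.internal_links.append(url)
            else:
                self.external_links.append(url)

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

page = """<html><head><title>Example Site</title>
<meta name="description" content="A sample page">
<meta name="keywords" content="sample, example">
</head><body>
<a href="/about.html">About</a>
<a href="http://other-site.com/">Friend</a>
</body></html>"""

robot = RobotSketch("http://example.com/")
robot.feed(page)
# If the description META tag were missing, a real robot might fall back
# to the first 200 or so characters of body text instead.
```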
He now knows how many pages you have and how many outside hyperlinks your
site contains, and can give your site a score based on how it is set up.
What do they do with the info? When someone comes to search a phrase or
keyword, another search routine program takes over using the information
the robot found. A person types in the keywords and the search program returns
the 256,000 pages matching their keywords. BUT they also consider the following:
How old is the Web site or how long has the engine known about it? How large
is the Web site? Was it properly constructed? How many hyperlinks are there
to outside Web sites?
VERY IMPORTANT! How many hyperlinks are located on other Web sites to this
site. The older and better the Web site the more links to it.
These robots know when you are cheating. You can't trick them. It is so
simple for the robot developer to incorporate code to negate the tricks.
What about scoring keywords only once or twice per page or area like meta,
title, etc.? Is this page close in size to all the other portal pages? How
many Web pages in the same directory have the word "index" in
them? Does this site have a lot of content? Is any text the same color as
the background? Are there links to outside sites? Each page can be checked
and compared against what the robot feels is a statistically normal page.
These are computers you know.
You need a lot of pages with normal content. Instead of spending the time
to make fake pages, give the real ones content. This will also give your
visitors something to come back to. CONTENT.
If there is one thing we have learned about robots, it is that there is
absolutely no pattern to them. Most robots are stupid and wander randomly.
For example, 50% of robot hits to our sites ask for the robots.txt page and
then go away, never asking for anything else. Then they come back a week
later, ask for the same thing and then go away, again. This happens over
and over again for months. You will never figure it out. What are
they doing? If they wanted to see if the Web site was really a Web site,
they could just Ping it. This would be much faster and much more efficient.
They seldom visit another page and if they do, they ask for one other page
every visit or so. Some come in and issue rapid-fire requests for every
page in the Web site. How rude! You have to quit worrying so much about
robots. It takes 6 months before they request enough pages to do you any
good. We really quit thinking about them a long time ago. Build a lot of
pages correctly, and, if you have reciprocal links to them, the robots will
find them someday.
Try this: Go to AltaVista and type into the search box "link:YourSite.com"
(Leave off the www). This will list the reciprocal links to your Web site.
Try link:crownjewels.com and you get 136 links to it. Think about this now:
The robots say to themselves, "Here is a site that must be popular or why
would so many Web sites SIMILAR to it have its link on their pages?" Remember
that only SIMILAR sites with SIMILAR THEMES would probably have a link to
your site. They give more importance to this than you submitting your link
to them. Wouldn't you?
Go to heavily trafficked sites matching your Web site's Themes and use AltaVista
to find out how many reciprocal links they have. This will prove to you
we are right.
Search engines are nothing more than a measure of reciprocal links to your
site. The problem is, you are constantly having to fight for your positioning
in the search query listings. Forget about that. Leave the fighting to people
who are able to spend 24 hours a day trying to trick everybody. Quit trying
to compete with the large organizations pouring millions into their marketing.
Completely forget about Search Engines after submitting to them and go after
the reciprocal links. The Search Engines will then believe you are a heavily
visited site because you will be. You will now be getting the traffic you
so richly deserve.
Search engine visitors to your site are oftentimes not qualified visitors.
Too many visitors pop into your home page for 2 seconds and then leave.
You know how it is. We all do it when we are using the search engines. Either
it wasn't the information we were looking for, or they had this huge graphic
on a stupid portal page which just took forever to load. These visitors
shouldn't even count, but they get counted as 12-18 hits in your server
logs. Hits are requests to the server. One page request can incur a lot
of hits: the page itself plus each of its graphics counts as a separate hit.
Reciprocal links bring in qualified visitors. These are visitors who were
already on a Web site which had matching Themes to yours. They already have
a good idea of what type of site you are. They will come into your site
and actually stay awhile. These visitors should count as double credit,
they are so good.
We know which type of visitor we would rather have.
How do you get people to WANT to put your link on their Web sites? Why would
a similar site put a link to your site on theirs? Simple, you have similar
Themes. You are similar, but not competition.
There is one very important lesson to be learned from this crazy robot behavior.
You need to make the navigation in your Web site so easy that a visitor
can find any page within 2 clicks of your home page. One way of doing this
is installing hidden DotLinks. DotLinks are little periods linked to other
pages; because the link text is just a period, they are hardly noticeable
on the page. Although they are not easily seen by the human eye, they are
links a robot can follow through your Web site. When you do this, robots
can find your pages faster and more easily.
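As a sketch, a DotLink is nothing more than an ordinary anchor whose visible text is a single period. The page name below is a made-up placeholder:

```html
<!-- A DotLink: the link text is only a period, so visitors barely
     notice it, but a robot follows the href like any other link. -->
<a href="products.html">.</a>
```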