Removed From Google Index, Why?

by: Tiberiu Bazavan

1. Google Statement

"Your page was manually removed from our index, because it did not conform with the quality standards necessary to assign accurate PageRank. We will not comment on the individual reasons a page was removed and we do not offer an exhaustive list of practices that can cause removal. However, certain actions such as cloaking, writing text that can be seen by search engines but not by users, or setting up pages/links with the sole purpose of fooling search engines may result in permanent removal from our index. If you think your site may fall into this category, you might try 'cleaning up' the page and sending a reinclusion request to [email protected]. We do not make any guarantees about if or when we will reinclude your site."

So, first of all: no one other than Google can tell you precisely why your site was removed, and Google doesn't do that sort of thing. The best that can be had is an analysis yielding the red flags that could have led to your site's removal.

2. No SEO can guarantee inclusion in Google after analyzing the site.

3. The search engines are not terribly specific about the practices considered to be SE spam; however, Google offers a short list (a rough sketch of how one item, hidden text, can be detected mechanically follows these guidelines):

Avoid hidden text or hidden links.

Don’t employ cloaking or sneaky redirects.

Don’t send automated queries to Google.

Don’t load pages with irrelevant words.

Don’t create multiple pages, subdomains, or domains with substantially duplicate content.

Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.

Most of "the rest" is covered by:

"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?'"

"Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or 'bad neighborhoods' on the web, as your own ranking may be affected adversely by those links."

"Google may respond negatively to other misleading practices not listed here (e.g., tricking users by registering misspellings of well-known web sites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it."

Webmaster Guidelines | Quality

http://www.google.com/webmasters/guidelines.html
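To make the hidden-text item concrete: detection can be as simple as pattern-matching the raw HTML. The sketch below is a naive Python illustration, not Google's actual method (which is unpublished); the pattern list and the flag_hidden_text helper are my own hypothetical names.

import re

# Crude red flags for hidden text (illustrative only; a real detector
# would render the page and compare computed styles).
HIDDEN_PATTERNS = [
    r'style\s*=\s*"[^"]*display\s*:\s*none',       # inline display:none
    r'style\s*=\s*"[^"]*visibility\s*:\s*hidden',  # inline visibility:hidden
    r'<font[^>]+color\s*=\s*"#?ffffff"',           # white text, suspect on a white page
]

def flag_hidden_text(html: str) -> list:
    """Return the patterns that matched the page source."""
    source = html.lower()
    return [p for p in HIDDEN_PATTERNS if re.search(p, source)]

print(flag_hidden_text('<p style="display:none">free free free</p>'))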

Yahoo Search is even more restrictive in its definitions of spam and undesirables; however, its concentration on detection and removal has not been receiving quite as much focus. It would be an excellent idea to be aware of and conform to Yahoo's restrictions, since Yahoo has no reacceptance policy. Banishment has been, in every case I've heard of, permanent. [http://help.yahoo.com/help/us/ysearch/deletions/deletions05.html]

4. Cross-linking / Interlinking

The latest practice apparently added to the list of undesirables is cross-linking, or interlinking: made-for-the-SE sites are linked together in an attempt to artificially inflate PageRank.

"Cross-linking: If your entire site is sitting at PR0, one possibility is a cross-linking penalty. Sometimes a webmaster who controls two or more websites will place links from every page of one website to every page of the other sites to increase the PageRank of all the sites. If detected, this will quickly incur a penalty, if not an outright ban from the Google index."

Why's My Site's PageRank Now Zero

http://www.rlrouse.com/pagerankpenalty.html

Among other indicators, factors which might prompt discovery of cross-linkage could be (a small sketch of checking one of them, shared hosting, follows this list):

Same content verbatim

Same cookie structure

Javascript function names

Linked CSS and JS files

CSS class names

Same contact information posted on websites

Common name servers

Same/similar images and/or graphics theme

Site hosted on same IP/block

Whois information matching

Alexa contact information matching

Interlinking of domains

Common backlinks (indirect crosslinking)

Same credit card used for anything

Login from same IP to separate accounts

Residual cookies from past logins

Similar file names or linking/directory structures

Code Comments
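As one concrete example from the list above, here is a minimal Python sketch (standard library only) that checks whether two domains resolve to the same /24 IP block. The domain names are placeholders, and on shared hosting this is a weak signal by itself.

import socket

def same_ip_block(domain_a, domain_b):
    # Resolve each domain and compare the first three octets (the /24 block).
    ip_a = socket.gethostbyname(domain_a)
    ip_b = socket.gethostbyname(domain_b)
    return ip_a.rsplit(".", 1)[0] == ip_b.rsplit(".", 1)[0]

print(same_ip_block("example.com", "example.org"))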

It used to be considered relatively safe to have as many inbound links as possible, regardless of source. Over the course of this year, that assumption has spawned link purchasing and hidden cross-linking. Now sites must also be very careful about inbound links. The cross-linked and purchased link networks have been devalued. Sites linking to those types of networks have reported decreasing traffic, and finally, over the past month or so, a number of such sites have been completely dropped from the index.

Whether this was a manual removal or an algorithm shift can't be determined without proprietary information from inside Google, which we already know we won't get. Remember Google's SE spam-fighting philosophy: "Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts."

The cross-linking tactics used here are consistent with those of other sites that are considered spam and may have been reported as such. If Google targets those characteristics based on spam reports for other sites, then it is not surprising that homeboundmortgage would be caught by the same adjustments to the algorithm or filters and be dropped from the index as well.

5. Over-Optimization Penalty

This summer a new term emerged, the Over-Optimization Penalty, which refers to the tweaks most SEOs make to pages to "fine tune" them to the top of their keyword categories. Page length, keyword density, bold, underline, italic, H1 formulas, link text, and various other small elements are manipulated until the perfect balance is struck, and the SEO'd site contains just a small bit more than the other sites in the top 10. It can be a full-time job keeping a site at that level with these small changes. Google has lowered the bar, now effectively saying that high keyword densities and many of the other SEO tweaks are evidence of too much SEO. Filters are created, and such sites drop in the rankings. Sites that have been playing too close to the edge are penalized.
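For illustration, the keyword-density arithmetic being tuned here is simply occurrences of a phrase divided by total word count. The Python sketch below shows the calculation; the thresholds that trip any filter are unknown, and the function name is my own.

import re

def keyword_density(text, keyword):
    # Count whole words, then count occurrences of the keyword phrase.
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return hits / len(words) if words else 0.0

page = "Cheap widgets. Our cheap widgets are the cheapest widgets."
print(round(keyword_density(page, "widgets"), 3))  # 3 hits in 9 words: 0.333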

ALT Text

Keyword in ALT text: not only does this contribute to the ideal word count, it may show keyword stuffing in the ALT tags. To clean up, switch to clear text navigation.

Old Link Exchanges

Pages stored in the Internet Archive may indicate the site was once involved in some questionable link exchanges.

Duplicate Content

All duplicate pages should be eliminated. Link to just one page consistently.

Now, once all these changes have been made, what to do? You can, of course, try writing [email protected]: "We do not make any guarantees about if or when we will reinclude your site."

Webmaster Guidelines | Not Listed

www.google.com/webmasters/2.html

I personally know of only four sites that have been reincluded after manual removal. In each case, the site was crawled regularly but was not included in the index for over six months. I don't know the specific reason for this, of course, but I would imagine it might be some sort of testing period: how strong is a webmaster's resolve to walk the straight and narrow, despite the lack of indexing?

So, assuming the best case, you might be looking at six months or more before your sites are reincluded in the index. Once delisted, I also imagine such sites must stay squeaky clean. An SE might forgive once, but seldom twice.

6. Looking Ahead

VERY worst case: permanent exclusion from organic results on Google. Recovery basically means starting over, nearly from scratch. Plan a 12-18 month Overture or AdWords campaign, initially targeting the current website. Pick more specifically targeted keyword phrases at first, to keep costs down. I realize this is a high-priced keyword neighborhood. You may need to create new, perfectly targeted landing pages to lower acquisition costs.

Select a new domain name. Without reinclusion within a short period of time, the current name will continue to lose value daily.

Build a new, clean site under the new domain name. Text must be fresh, not a duplicate of the current domain. Do not duplicate site structure, filenames, or other elements that could link it to the banned name.

Gradually add organic links. Expect it to take 6-12 months to acquire 1,000 related links. Continue to link to related sites over the next 18 months. Grow the site, adding one new page (250-500 words) each day.

As the new site rises in the SERPs, gradually switch the PPC traffic to the new site, and retire the current site completely. At any point in the process, if the current (old) site should reappear, it shouldn't be an undue amount of work to gradually retarget the newly acquired links to the older site. Encourage natural link text by those linking to the site.

If you're going to optimize at all, test first in a safely isolated site. You're not going to be able to push the optimization envelope for quite some time. The key to long-term survival and growth will need to be the "content is king" model.

The only bright news in the picture is that the site isn't being meta-hijacked, at least not under any of the keywords I've tried (your homepage META keywords list). Google itself wouldn't encourage such a practice, and in fact will likely be glad to deal with any such offenders under the DMCA.

Sources:

Crosslink Detection

www.webmasterworld.com/forum3/25568.htm

Crosslinking Penalty

www.webmasterworld.com/forum3/23890210.htm

Sandbox Effect

www.promodo.com/websitepromotionarticlesen/aboutgooglesearchenginepromotiontips_page1_seo60.html

Innocent Interlinking of Sites

www.webmasterworld.com/forum3/25564.htm

A Statistical and Experimental Analysis of Google's Florida Update

www.linksecrets.com/pub/floridareport.html

Speculation About August Changes

www.webmasterworld.com/forum3/25251.htm

If the sites were dropped due to algorithm changes, then it's possible that they'd come back after cleanup. However, their supporting network of links has, at the very least, been devalued. That alone will cause a change in PR, even if cleanup makes them eligible for reinclusion. Site owners report that in these cases, cleaned-up sites have come back into the results in anywhere from 2-3 to six months. Getting back to the top would take additional time, and new, unblemished linking.

Two schools of thought: after cleanup, resubmit the site, or allow it to be found via linkage. Though Google states that there's no oversubmission penalty, most webmasters tend to be leery of such submission, preferring to let robots find the site via linkages. I've never had a problem with resubmitting URLs that have dropped out for whatever reason. I take Google at their word that it's harmless, as long as it's a manual resubmission for good reason (for example, the page was down when the crawler came through), not submission through a service or promotion program. One submission is enough. Either way, watch the site logs. If robots don't show up, it could well be a case of manual deletion and permanent ban.

The site needs to be squeaky clean for the near term. The days of easy SEO are on hiatus. The old-fashioned methods (content for users, natural linking) are back. Iconocast has quite a backlog of goodwill links left. Those must be preserved and strengthened, while getting rid of harmful inbound linking.

Again, there's no guarantee or certainty of getting back into the Google index. Same procedures, same wait and see.

Sources:

The August Chronicles

www.webmasterworld.com/forum3/25553.htm

Denial of Google Over Optimization Penalty

www.markcarey.com/googleguysays/archives/discussdenialofgoogleoveroptimizationpenalty.html

Future of SEO

http://list.audettemedia.com/SCRIPTS/WA.EXE?A2=ind0408&L=led&D=1&T=0&H=1&O=D&F=&S=&P=266

7. Finally, should you have questions about the information or links provided, please feel free to ask on our SEO Forum.

Best regards,

Tiberiu Bazavan, ZettWalls Media

About The Author

Tiberiu Bazavan

SEO Consultant at ZettWalls Media

zettwalls.com

[email protected]

This article was posted on January 19, 2005

by Tiberiu Bazavan

The Proper Way To Use The robot.txt File

by: Jimmy Whisenhunt

When optimizing your web site, most webmasters don't consider using the robot.txt file. This is a very important file for your site. It lets spiders and crawlers know what they can and cannot index. This is helpful in keeping them out of folders that you do not want indexed, like the admin or stats folders.

Here is a list of variables that you can include in a robot.txt file and their meanings:

User-agent: In this field you can specify a specific robot to describe an access policy for, or a "*" for all robots, as explained further in the examples.

Disallow: In this field you specify the files and folders not to include in the crawl.

The # character marks a comment.

Here are some examples of a robot.txt file:

User-agent: *
Disallow:

The above would let all spiders index all content.

Here is another:

User-agent: *
Disallow: /cgi-bin/

The above would block all spiders from indexing the cgi-bin directory.

User-agent: googlebot
Disallow:

User-agent: *
Disallow: /admin.php
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /stats/

In the above example, googlebot can index everything, while all other spiders cannot index admin.php or the cgi-bin, admin, and stats directories. Notice that you can block single files, like admin.php.
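If you want to verify how a standards-following crawler would read rules like the ones above, Python's standard library includes a robots.txt parser (urllib.robotparser). A quick sketch:

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: googlebot
Disallow:

User-agent: *
Disallow: /admin.php
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /stats/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch("googlebot", "/admin/"))  # True: googlebot may crawl everything
print(parser.can_fetch("somebot", "/admin/"))    # False: all other spiders are blocked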

About The Author

Jimmy Whisenhunt is the webmaster at VIP Enterprises http://www.vipenterprises.org

[email protected]

This article was posted on February 06

by Jimmy Whisenhunt

Submitting Your Website

by: Matt Colyer

Now that your site is built properly, it is time to submit it to be indexed. One common approach is the use of automated submission software, which is not a good idea; instead, I suggest performing manual submissions. Below are links to the major search engines that you should submit to. It could take anywhere from one to six weeks before you get indexed, but if you trade links with other sites, it will speed things up. You can join http://www.linkexchangeit.com to trade links with other webmasters for free.
Altavista
Altavista is now owned by Yahoo and the add URL page is down for now.
http://addurl.altavista.com/sites/addurl/newurl
The Open Directory Project (DMOZ)
You will have to find a category that's related to your site to submit to. It can take a long time before you get into DMOZ, but when you do, it's well worth the wait.
http://dmoz.org/add.html
Fast
Alltheweb is now owned by Yahoo and the add URL page is down for now.
http://www.alltheweb.com/add_url.php
Google
Google is the most powerful search engine out there because so many other search engines, like AOL, use Google's index.
http://www.google.com/addurl.html
MSN
MSN uses Yahoo's index, so you might want to skip this and go to Yahoo's.
http://submitit.bcentral.com/msnsubmit.htm
Yahoo
Yahoo now has its own index, and it is another powerful search engine because, like Google, it provides its index to other search engines.
http://search.yahoo.com/info/submit.html

About The Author

Matt Colyer is the owner of the Marhen.com Network and is a part-time SEO. He is also a PHP, CGI and ASP developer.

This article was posted on May 30, 2004

by Matt Colyer

A Visit with Adam Soroca, a Representative from Terra Lycos

by: Robin Nobles

A Glimpse into the Lycos Search Engine...

Recently, the Academy of Web Specialists (http://www.onlinewebtraining.com) had the good fortune of having Adam Soroca, a representative from Terra Lycos (http://www.lycos.com), attend a chat session to visit with our students. Mr. Soroca provided some excellent information about Terra Lycos as well as their plans for the future.

By way of introduction, Adam is the Group Product Manager for Lycos InSite and is in charge of building a comprehensive suite of search marketing tools. He has general management responsibilities for the vision, execution, and delivery of the Web-based platforms.

Adam explained further,

"I am charged with building a search marketing platform for Lycos. We launched our first product, based on paid inclusion into the FAST Web index, in February of this year. We were very quickly able to prove that, by nature of our brand and presence, search marketers would come to visit for search marketing services.

"Two weeks ago, we launched our second product, InSite AdBuyer, a cost-per-click ad auction engine. Both of these products are designed around the concept of contextual advertising, so that a searcher is presented with highly relevant information when performing a search.

"While our platform is currently focused on our own index, we seek to provide one-stop access for all search marketing needs, including multiple index submission and multiple bid-management capabilities. All of these are designed to drive traffic (i.e., customer acquisition). We are also looking further down the visitor experience to the ROI side to encompass campaign management."

The following are questions asked by Academy students to Mr. Soroca.

Question: "I see when I do a search on Lycos that there is an 'advertisement' on the right side. Is this the InSite AdBuyer?"

Adam: "Correct. Those are the AdBuyer cost-per-click ads. Our advertisers bid to be in the 1st, 2nd or 3rd positions.

"These are highly targeted ads, as they are exact-match keyword driven. So behind the scenes, advertisers are involved in a bidding auction to vie for that space."

Question: "When someone clicks the 'place your ad here' link, does it show what the bids are for the keywords in question, so you know what you have to bid against?"

Adam: "No, actually we don't display the prices on the front end. To view the prices for particular keywords, advertisers must sign up for an account and view that in the advertisers' area."

Question: "What results does FAST supply to Lycos?"

Adam: "FAST currently provides Lycos with our search results. Our paid inclusion platform enables Web site owners and marketers to guarantee that their Web content is included in the Web index and frequently refreshed. This differs from paid placement in that the paid inclusion customers do not necessarily show up at the top of the results just because they paid us. This content is viewed by the relevancy algorithm and displayed appropriately.

"Currently this program is available to small/mid-size companies for an annual subscription. For larger companies (those sites with more than 1,000 URLs), it will be available at a cost-per-click. Paid placement and paid inclusion are two separate products. If you bid for placement, you pay for actual clicks. If you buy paid inclusion, that means that your pages are guaranteed to be in the Web index (spiders can't find everything), and the content will be updated every 24-48 hours. So, if your content changes, the index will reflect that much more quickly. With an index of 2.1 billion documents, it can take FAST a while to recrawl the entire index. This program guarantees you are in there."

Question: "So, you have to pay for inclusion before you bid for placement? Is this what I understand?"

Adam: "They are two separate programs. The paid inclusion relates to FAST and the Web index. The paid placement relates to the three ad panels that display on Lycos Search."

Question: "And your placement number in Lycos is then based on the algorithm of the page submitted?"

Adam: "Correct!"

Question: "Where the paid placement is a cost per click, your placement is then based on the bid placed for a particular keyword?"

Adam: "Yes, exactly."

Question: "If you aren't indexed and would like to buy pay per click, where would the cost per click be linked to?"

Adam: "The cost-per-click ads go to any page that you want, provided that you have content that is relevant to the keyword that you bid on. The cost-per-click ads are completely separate from the index. You do not have to be in the index to bid on the ads. The ads are the 'advertisement' on the right-hand side of the page."

Question: "We are talking about cost per click (ad placement) and paid inclusion. My question is, if you don't pay for inclusion and pay for ad placement, would you be indexed with FAST or Lycos search engines?"

Adam: "No, they're completely independent. Paid inclusion gets you into the Web index. Paid placement buys you space on the Web results page. You can have either or both, but they do not work together per se. However, it helps to purchase both for broad exposure through paid inclusion and targeted exposure through the CPC ads."

Question: "In your new InSite AdBuyer program, FindWhat provides the technology but NOT the results for your pay-per-click ads, is that correct?"

Adam: "We have private-labeled the FindWhat ad auction platform end-to-end. Lycos engages in relationships with the advertisers directly."

Question: "Where are the sponsored listings from?"

Adam: "Those are provided by Overture, which is similar in concept. Advertisers are bidding for the 1st, 2nd, and 3rd positions throughout Overture's network."

Question: "What is the purpose of the 'Fast Forward' button in your search results? I notice that using the button opens up a framed page where the left frame shows the search results, including Overture ads, and the current site is highlighted. Does this button make it easier to visit the current results, and is that its purpose?"

Adam: "That is exactly the purpose. We are trying to give the searchers an easier way to browse through the pages that are served as results."

Question: "Is it necessary to submit to both FAST and Lycos, either through free Add URL or paid inclusion, or does submitting to FAST get your site listed in both indexes?"

Adam: "Web site owners and marketers can submit to the FAST index either through Lycos or directly to FAST. There is no difference in the indexes. Lycos search results are powered by FAST."

Question: "Does the #1 result from Kanoodle still show up in some Lycos searches?"

Adam: "No. Kanoodle does not provide results on Lycos."

Question: "Are the results from the Lycos Web Sites Directory (ODP results) mixed in with the FAST results, or only available through the Web Directory link at the top of the page?"

Adam: "Only FAST results appear in the Web result listings."

In Conclusion

If you don't have visibility in the Lycos search engine (http://www.lycos.com), you have several choices. You can pay for inclusion in the FAST index (http://www.positiontech.com/fast/index.htm), which will get you into the regular search results for Lycos. You can also participate in their InSite AdBuyer program (http://insite.lycos.com/searchservices/), a new cost-per-click advertising program. Or purchase keywords through Overture (http://www.overture.com), which will also get you visibility in Lycos.

At the top of the Lycos page, you'll see a link to the Web Directory, which is where ODP results are found. ODP results can only be found through this link; they're not mixed in with the FAST results. However, making sure that your site is listed in the ODP (http://dmoz.org) is another way to get added visibility through the Lycos search engine.

About The Author

Robin Nobles, Director of Training, Academy of Web Specialists (http://www.academywebspecialists.com), has trained several thousand people in her online search engine marketing courses (http://www.onlinewebtraining.com) and is the content provider for (GRSeo) Search Engine Optimizer software (http://www.seoptimizer.com). She also teaches 4-day hands-on search engine marketing workshops in locations across the globe with Search Engine Workshops (http://www.searchengineworkshops.com).

[email protected]

This article was posted on December 15, 2002

by Robin Nobles

Why you should NOT submit your site to Search Engines

by: Zoran Makrevski

Before answering this question, we have to know the difference between a search engine and a directory. Here is a brief explanation.

The main difference between a search engine and a directory lies in the way websites get entered into their index. People submit their sites to directories, where they are reviewed by human editors. Think of directories as collections of Internet sites organized by subject. Search engines work by sending out a spider to fetch as many documents as possible. Another program, called an indexer, then reads these documents and creates an index based on the words contained in each document.

Having said this, let's see why you should not submit your site to the search engines.

More than 90% of search engine traffic comes from the three major search engines: Google, Yahoo! and MSN.

You can see many ads on the Internet which look like this: "submit your site to X00,000 search engines..."

Submitting your site to hundreds of thousands of search engines wouldn't help and is simply not worth it. Save your money, as you would when you see ads like "lose weight while you sleep."

These three search engines provide results for many other search engines, so if you are listed on these three, your site will be listed on many others too.

Altavista, for example, shows results from the Yahoo! index. Rankings are not the same, because the algorithm is different, but the index is the same. If your site is new and is listed on Yahoo but not on Altavista, be patient and do nothing. Altavista will show your site when they update their index from Yahoo. AOL (America Online) and Netscape use Google's index, and they additionally receive listings from DMOZ. Examples are many.

So-called "Meta" search engines like DogPile or Metacrawler show results from Google and Yahoo, among others. These search engines actually search the top search engines and show combined results.

If your site is not new, there is a chance that search engine spiders have already found and indexed it. So before doing anything, check whether your site is already listed, even if you never submitted it to any search engine.

But what should you do if your site has just been uploaded?

As already explained, search engines are equipped with spiders which will find your site. All you need to do is submit your site to directories, and the search engine spiders will find your link there and index your site. But be sure to provide enough "food" for the spiders; that is, submit your site to enough directories and you will make the process shorter.

While submission to the search engines is a waste of your time, submission to the directories is not. You will benefit from submitting your site to directories not only because search engines will find your link there and index your site, but also because you will increase your link popularity. Search engines consider each link pointing to your site as a kind of vote and give your site "credit" for each "vote."

That's why you should submit your site to as many directories as you can. However, don't expect bulk traffic from the directories. Even the biggest directories, like Yahoo! or DMOZ, are not able to deliver loads of traffic. On the other hand, traffic delivered through the directories is usually quality traffic, because these people are browsing to find the products or services that you offer.

Submitting your site more than once to the search engines might slow down indexing time. If you still want to submit your site to the search engines, it is best to do so just once, and to do it manually. Companies who advertise that they will submit your site to X00,000 search engines will certainly not do it manually. They use automated submission software, even though all the major search engines say in their submission guidelines that you should submit manually. And the major search engines are the only ones you should care about; remember where over 90% of the traffic comes from.

Remember that merely submitting your site to the search engines does, in most cases, nothing to increase your ranking. If optimization is not properly implemented on your web site, or not implemented at all, chances are small that your site will appear on the first few pages of results. And bearing in mind that only about 7% of people look further than the third page, the top is the only place you want to be.

How do you calculate whether search engine optimization is worth investing in? Consider this: what is the annual worth of one customer to you? Is it 25 €, 250 €, or perhaps 2,500 €? How many customers do you need to earn the invested money back?
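The break-even arithmetic behind that question is short enough to sketch in Python (the cost and customer-value figures below are placeholders, not numbers from the article):

from math import ceil

def breakeven_customers(seo_cost, annual_customer_value):
    # How many new customers must the rankings bring in to repay the project?
    return ceil(seo_cost / annual_customer_value)

print(breakeven_customers(2500, 250))  # 10 customers repay a 2,500 EUR project in a year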

Optimization of your website might be one of your best investments ever, if it is planned and implemented right.

About The Author

Zoran Makrevski

Search Engine Optimization Services

SEO.Goto.gr

This article was posted on November 02, 2004

by Zoran Makrevski

The Proper Way To Use The robots.txt File Update

by: Jimmy Whisenhunt

In my last article about the robots.txt file, I spelled it wrong: it should have been robots.txt, not robot.txt. The article should read like this:

When optimizing your web site, most webmasters don't consider using the robots.txt file. This is a very important file for your site. It lets spiders and crawlers know what they can and cannot index. This is helpful in keeping them out of folders that you do not want indexed, like the admin or stats folders.

Here is a list of variables that you can include in a robots.txt file and their meanings:

1) User-agent: In this field you can specify a specific robot to describe an access policy for, or a "*" for all robots, as explained further in the examples.

2) Disallow: In this field you specify the files and folders not to include in the crawl.

3) The # character marks a comment.

Here are some examples of a robots.txt file:

User-agent: *
Disallow:

The above would let all spiders index all content.

Here is another example:

User-agent: *
Disallow: /cgi-bin/

The above would block all spiders from indexing the cgi-bin directory.

User-agent: googlebot
Disallow:

User-agent: *
Disallow: /admin.php
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /stats/

In the above example, googlebot can index everything, while all other spiders cannot index admin.php or the cgi-bin, admin, and stats directories. Notice that you can block single files, like admin.php.

About The Author

Jimmy Whisenhunt is the webmaster at VIP Enterprises http://www.vipenterprises.org.

[email protected]

This article was posted on February 16

by Jimmy Whisenhunt

Get Listed in Google Without Submitting Your Site

by: Mario Sanchez

With Google delivering so much traffic, it is only natural to be eager to submit your page and have it indexed as soon as possible. However, submitting your page is not your only option, and it's not the best one. If this sounds strange, keep reading.

Talking about its indexing process, Google says:

"We add thousands of new sites to our index each time we crawl the Web, but if you like, you may submit your URL as well. Submission is not necessary and does not guarantee inclusion in our index. Given the large number of sites submitting URLs, it's likely your pages will be found in an automatic crawl before they make it into our index through the URL submission form."

We can therefore draw two conclusions:

1. Submitting your site does not guarantee inclusion.

2. Most pages are found and indexed automatically, when Google crawls the web.

The Google folks have also made it clear that Google gives a page more importance when it is found through an automatic crawl. This can be easily verified when we consider how Google's PageRank system works: when page A links to page B, part of page A's PageRank trickles down to page B, increasing page B's PageRank (and, therefore, its importance). A manually submitted page will not enjoy this benefit.
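The "trickle down" can be made concrete with the simplified PageRank formula from the original Brin and Page paper: PR(p) = (1 - d) + d * sum(PR(q) / outlinks(q)) over the pages q linking to p, with the damping factor d conventionally set to 0.85. A toy three-page sketch in Python (the link graph is invented purely for illustration):

d = 0.85
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}  # page -> pages it links to
pr = {page: 1.0 for page in links}

for _ in range(50):  # repeated sweeps converge quickly on a graph this small
    for page in pr:
        inbound = [q for q, outs in links.items() if page in outs]
        pr[page] = (1 - d) + d * sum(pr[q] / len(links[q]) for q in inbound)

print({p: round(v, 3) for p, v in pr.items()})  # C, with two inbound links, scores highest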

Now that you know that manual submission is neither necessary nor the best way to go, what can you do to make Google find your pages?

The best way, at least in my personal experience, is to write an article on your area of expertise and submit it to popular article syndication sites like http://www.marketingseek.com or http://www.ideamarketers.com. These sites will post your article so that online publishers can use it for free, in exchange for including your resource box at the end of the article. A resource box (a.k.a. byline) is a small paragraph about yourself, written by you, which contains a link to your homepage.

In very little time, your article will show up in websites and ezines across the web. It will then be just a matter of time (usually days) before Google crawls those pages and finds your links. If you followed good web design practices and have included a link to a site map on your homepage, Google will follow it as soon as it finds your homepage, and all your pages will be indexed. It's as simple as that.

The most popular articles you can write are those that list a collection of tips related to your area of expertise. One of my most successful articles is called "50 Surefire Web Design Tips," and it is nothing but a checklist of guidelines to follow when designing a website.

Another good way to help Google find your pages is to exchange links with other sites. Google will crawl those sites, find the links to your page, and add it to the index.

Finally, remember to optimize your pages before you try to get them listed, so that you have a better chance of ranking high in the search engine results pages (SERPs). After all, what good would it do to get your pages listed if nobody can find them?

About The Author

Mario Sanchez is a Miami based freelance writer who focuses on web design and Internet marketing topics. He publishes The Internet Digest ( http://www.theinternetdigest.net ), a growing collection of web design and Internet marketing articles, tips and resources. You can freely reprint his weekly articles in your website, ezine, or ebook.

This article was posted on October 18, 2003

by Mario Sanchez

Search Engines 101: Search Engines Explained

by: Kristy Meghreblian

What Are Search Engines?

A search engine is a database system designed to index and categorize internet addresses, otherwise known as URLs (for example, http://www.submittoday.com).

There are four basic types of search engines:

Automatic: These search engines are based on information that is collected, sorted and analyzed by software programs, commonly referred to as "robots," "spiders," or "crawlers." These spiders crawl through web pages collecting information, which is then analyzed and categorized into an "index." When you conduct a search using one of these search engines, you are really searching the index. The results of the search will depend on the contents of that index and its relevancy to your query.

Directories: A directory is a searchable subject guide of Web sites that have been reviewed and compiled by human editors. These editors decide which sites to list, and in which categories.

Meta: Meta search engines use automated technology to gather information from a spider and then deliver a summary of that information as the results of a search to the end user.

Pay-per-click (PPC): A search engine that determines ranking according to the dollar amount you pay for each click from that search engine to your site. Examples of PPC search engines are Overture.com and FindWhat.com. The highest ranking goes to the highest bidder.

There are a few drawbacks you should know about when using PPCs:

The use of PPC search engines as part of your search engine optimization process will not improve your search engine positioning in the regular editorial search results. Instead, your listings will almost always appear in a "Sponsored" or "Featured" area located at the top or side of the regular search page results. Even though your paid listing will appear at the top of the search page, many users will not click on paid listings because they see them as advertisements. In the past, people used to click on banner ads, but now they are seen more as a nuisance, and the same thing is happening with PPC listings. Also, PPC listings are not always as relevant to a query as the editorial search results.

If your site is not effectively search engine optimized before you begin to submit it to a PPC, it will still be poorly advertised afterwards. The optimization of your Web site is critical to the success of your rankings.

When you stop paying for a PPC submission, your listing disappears and so does the traffic.

PPCs can be an effective short-term solution for gaining exposure and driving immediate traffic to your Web site while you wait for full indexing, but they can become expensive if used as a long-term solution.

How Do Search Engines Work?

Search engines compile their databases with the aid of spiders (a.k.a. robots). These search engine spiders crawl the Internet from link to link, identifying Web pages. Once search engine spiders find a Web site, they index the content on those pages, making the URLs available to Internet users. In turn, owners of Web sites submit their URLs to search engines for crawling and, ultimately, inclusion in their databases. This is known as search engine submission.

When you use search engines to find something on the Internet, you’re basically asking the search engine to scan its database and match your keywords and phrases with the content of the URLs they have on file at that time. Spiders regularly return to the URLs they index to look for changes. When changes occur, the index is updated to reflect the new information.
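The first step of that crawl cycle, fetching a page and harvesting its links for the queue, fits in a few lines of standard-library Python. This bare-bones sketch omits error handling, politeness delays, and robots.txt checks, and the URL is a placeholder:

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect absolute link targets from anchor tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

url = "http://example.com/"
collector = LinkCollector(url)
collector.feed(urlopen(url).read().decode("utf-8", errors="replace"))
print(collector.links)  # the URLs a spider would visit next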

What Are The Pros And Cons Of Search Engines?

Pro: With the vast wealth of information available on the Internet, search engines are the most effective and efficient way to find information based on your specific search requests.

Con: Because search engines index mass quantities of data, you are likely to get irrelevant responses to your search requests.

Are Search Engines All The Same?

Search results vary from search engine to search engine in terms of size, speed and content. The results will also vary based on the ranking criteria the search engines use. If you aren’t getting the results you need, try a different search engine. While the results may not be wildly different, you may get a few search results from one search engine that you didn’t from another.

How Do Search Engines Rank Web Pages?

When ranking Web pages, search engines follow specific criteria, which may vary from one search engine to another. Naturally, they want to place the most popular (or relevant) pages at the top of their list. Search engines will look at keywords and phrases, content, HTML meta tags, and link popularity, to name just a few factors, to determine the value of the Web page.
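No engine publishes its real weights, so any formula here is purely illustrative, but the idea of combining those criteria can be sketched as a toy score in Python (the weights and the function name are invented):

def toy_rank_score(title, body, inbound_links, keyword):
    kw = keyword.lower()
    # Invented weights: title matches count most, then body text, then link popularity.
    return 3.0 * title.lower().count(kw) + 1.0 * body.lower().count(kw) + 0.5 * inbound_links

print(toy_rank_score("Naples florist", "Fresh flowers daily from our Naples florist.", 12, "florist"))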

About The Author

As Submit Today’s copywriter and editor, Kristy Meghreblian has written online content for many successful companies, including Monster.com. She has successfully combined her excellence in journalism with the delicate art of keyword density as it relates to search engine optimization. As a result, she has helped many Submit Today clients achieve top ranking. Submit Today is a leading search engine optimization, submission and ranking company located in Naples, Florida.

[email protected]

This article was posted on January 05, 2004

by Kristy Meghreblian