[Sigia-l] Don't submit websites to search engines?

Eric Scheid eric.scheid at ironclad.net.au
Sun May 16 19:49:00 EDT 2004


On 17/5/04 8:36 AM, "Richard Wiggins" <richard.wiggins at gmail.com> wrote:

> So the premise of the article
> is ridiculous on its face

Another ridiculous premise is that the search engines exhaustively crawl the
web on a regular basis. Huge swathes of the web have *never* been crawled,
and not simply because they are blocked by /robots.txt, buried in dynamic
pages, or sitting behind firewalls. Huge swathes of eminently crawlable
webspace remain unindexed.

How do I know? I have Google send me an email alert any time they add a new
page to their index containing the keyword "IAwiki", and I receive a
trickle of alerts for what I know to be very old pages: specifically the
SIGIA-L archives, and even other blogs.

    http://www.google.com/webalerts?hl=en
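
(For the curious, the underlying technique is easy enough to roll
yourself: periodically poll a search source for the keyword, diff the
result URLs against the ones you have already seen, and report anything
new. A minimal sketch in Python; fetch_result_urls() here is a
placeholder for however you actually query the index, since engines
generally frown on scraping their result pages directly:)

    import json, os

    SEEN_FILE = "iawiki_seen_urls.json"

    def fetch_result_urls(query):
        # Placeholder: return the current list of result URLs for
        # `query` from whatever search API or feed you have access to.
        return ["http://iawiki.net/", "http://www.example.com/old-page"]

    def check_for_new_pages(query):
        # Load the URLs we already reported on previous runs.
        seen = set()
        if os.path.exists(SEEN_FILE):
            with open(SEEN_FILE) as f:
                seen = set(json.load(f))
        current = set(fetch_result_urls(query))
        # Anything in the results now that we hadn't seen before is
        # "newly indexed" (from our point of view, anyway).
        for url in sorted(current - seen):
            print("newly indexed:", url)  # or email it to yourself
        # Remember everything seen so far for the next run.
        with open(SEEN_FILE, "w") as f:
            json.dump(sorted(seen | current), f)

    check_for_new_pages("IAwiki")

(Run that from cron and you have a poor man's web alert.)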

e.



