>>One of our competitors has submitted his key words
>>(150 of them) to Infoseek via ftp.
>Infoseek (or any other search engine) doesn't
>have an FTP site to upload to, you'll have to
>give more details so I know what you're talking
It is a little difficult for me to know exactly what the competition did, as they are not going to tell everyone what they do. From what I gather, they were able to access an FTP site and upload a "robots text file" containing their keywords. To do this they had to make many enquiries to Infoseek and obviously convince them that they would not abuse the privilege. If you were to correspond with Infoseek yourself, you are more likely to get an honest answer from them.
>>Does the Spider software do much the same?
>>ie submit keywords via ftp directly into the
>You can write a robots.txt file that tells the
>robot how to index your site, to a certain degree.
If it is possible to have a text file that EXCLUDES certain pages/directories, is it perhaps possible to use the same file to tell the robot which keywords to use? Again, this would prevent your "Welcome" page from being cluttered with keywords.
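For reference, the exclusion file mentioned above follows the Robots Exclusion Standard, and as far as I know that standard only defines exclusion directives; there is no field for supplying keywords to the robot. A minimal sketch of what such a file looks like (the paths are made up for illustration):

```
# robots.txt - placed in the root directory of the web site.
# The standard only lets you tell robots what NOT to index.
User-agent: *           # applies to all robots
Disallow: /private/     # do not index this directory
Disallow: /drafts.html  # do not index this page
```

So whatever the competitor submitted via FTP, it was probably something beyond what a plain robots.txt can express.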
>The registered version of Spider creates
>keyword-loaded pages that you upload to your
>server to give you high listings for certain
In other words, all the Spider does is tell the search engine's robot to index the site. The keywords entered into the Spider are used only to create alternative pages, which then have to be uploaded to the site.
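For context, keyword loading on such alternative pages is usually done with HTML meta tags in the page header. The thread does not show exactly what Spider generates, so this is only a hypothetical sketch of the general technique:

```
<!-- Hypothetical keyword-loaded "alternative" page header -->
<HTML>
<HEAD>
<TITLE>Widgets - Blue Widgets, Discount Widgets</TITLE>
<META NAME="keywords" CONTENT="widgets, blue widgets, discount widgets">
<META NAME="description" CONTENT="Our full range of widgets.">
</HEAD>
```

A page like this is uploaded to the server alongside the normal pages, so the search engine's robot indexes the keywords without them cluttering the "Welcome" page that visitors actually see.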