>The point is this: normally, when a CGI script generates a page, that page doesn't actually "exist"; it's only shown in the browser of the person who has requested it, and the page is not written onto any hard disk. That's what's meant by "on the fly".
>By the sound of it, Tim Schulte's script actually writes HTML pages onto the hard disk. This message board behaves the same way. The CGI script writes HTML files onto the hard disk instead of temporarily "flashing" the output in the browser, which is what most CGI scripts do, and why robots ignore them.
I don't think that is what is happening in Tim's case, or in mine either. I use SMX by the Internet Factory to create most of my pages. You can embed SQL database calls as macros in these template pages to dynamically change the content. Since these pages live in an ordinary directory rather than a cgi-bin, my guess is that the deep engines are filtering on "cgi" and "?" in the URL. Like Tim's, my pages all look like standard HTML. If you click on my home page above, you will see that the featured property changes on every reload. This is all done through server-side scripts that are invisible to the user and also to any spiders or robots.
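To illustrate the guess above: a minimal sketch of the kind of URL filter a crawler of that era might apply. The function name and the example URLs are hypothetical; the only assumption taken from the post is that spiders skip URLs containing "cgi" or "?".

```python
def looks_dynamic(url: str) -> bool:
    """Hypothetical crawler heuristic: treat a URL as dynamically
    generated (and skip indexing it) if it contains a query string
    or a cgi path segment."""
    lowered = url.lower()
    return "?" in lowered or "cgi" in lowered

# Pages served from an ordinary directory pass the filter even though
# their content is assembled server-side on every request:
urls = [
    "http://example.com/cgi-bin/guestbook.pl",  # skipped: cgi path
    "http://example.com/page.html?id=42",       # skipped: query string
    "http://example.com/home.html",             # indexed: looks static
]
for u in urls:
    print(u, "->", "skip" if looks_dynamic(u) else "index")
```

Under this kind of filter, template pages with embedded macros would be indexed like any static file, since nothing in the URL betrays the server-side processing.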