>>FYI: The optimum length for keywords is far less
>>than 200 WORDS (200 characters is one engine's limit,
>>and 255 characters is the usual limit).
>Where do you get this info from?
The techniques used by the search sites are based on
linguistic analysis routines that have been used for
computer archiving and retrieval of other documents for
quite a while. It's never been done on such a grand scale,
but they haven't invented anything new. There are several
incredibly boring archives of academic publications about
experimental search engines. Some of these turned into
commercial products - study their ancestry and you can often
predict their behavior.
>As far as I know you can put up to 1024
>(1000 to be safe) as keywords.
It is not important how many they will accept,
but how many they consider important, and how
the number of keywords affects the algorithms that
calculate the relevancy.
Some algorithms for relevancy divide the "points"
they allow for the keywords section by the number of
words. With 100+ keywords in there, you dilute the
value of each one.
Other algorithms rank the keywords by their position
in the string.
Shorter strings are not penalized by either method, but
lengthy ones are not helped and may be less effective than
their creators think.
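The two scoring schemes described above can be sketched in a few
lines of Python. This is purely illustrative: the point budget,
function names, and weighting formula are invented for the example,
not taken from any actual search engine.

```python
# Hypothetical sketch of two keyword-relevancy schemes:
# 1) dividing a fixed point budget across all keywords (dilution),
# 2) weighting a keyword by its position in the string.
# All numbers here are invented for demonstration.

def diluted_score(keywords, total_points=10.0):
    """Split a fixed point budget evenly across every keyword."""
    return total_points / len(keywords)

def position_score(keywords, query_term):
    """Weight a keyword by position: earlier entries score higher."""
    n = len(keywords)
    for i, kw in enumerate(keywords):
        if kw == query_term:
            return (n - i) / n   # first position -> 1.0, last -> 1/n
    return 0.0

short_list = ["widgets", "gadgets"]
long_list = ["widgets"] + ["filler%d" % i for i in range(99)]

print(diluted_score(short_list))          # each keyword worth 5.0
print(diluted_score(long_list))           # each keyword worth 0.1
print(position_score(long_list, "widgets"))  # first position: 1.0
```

Under either scheme the short list is never worse off: dilution
leaves each of its keywords with more weight, and position ranking
never penalizes a string for ending early.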
>With this number of chars, it's wise to have as many keywords
>in the body text AND especially in the
>H1, H2 etc. tags. This really counts more
>than keywords in the metas. ok?
Yes, the most bullet-proof submission is one with a fully
SGML-compliant structure and the keywords in the appropriate
places. For one thing, it will never get penalized when the
search engines revise their tactics; for another, it has the
structure the robots are trained to recognize.