craig.dunn@conceptdevelopment.net
that have been preprocessed/generated.
By definition this results in "certain url will serve... set format", and
restricts/prevents the 'disappearing META' solution.
I do take your point - but pre-processing/generation does not necessarily
result in "certain urls will serve" formats. In many cases you still want
to sniff (minimal processing, I accept) and then deliver pre-processed
content (which may well be in a different directory to the one "specified"
in the url). If they are serving ads they are certainly doing some
processing (or someone else is).
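To make the "sniff, then serve pre-processed content" idea concrete, here is a minimal sketch. The directory names and the User-Agent prefix check are my assumptions for illustration, not anything from the post - though DoCoMo i-mode handsets did send User-Agent strings beginning "DoCoMo/".

```python
import os

# Assumed layout: two trees of pre-generated pages, one per audience.
IMODE_DIR = "/var/www/imode"    # pre-processed cHTML pages (hypothetical path)
DEFAULT_DIR = "/var/www/html"   # pre-processed desktop pages (hypothetical path)

def resolve_path(user_agent: str, url_path: str) -> str:
    """Minimal 'processing': pick which pre-generated tree serves this URL.

    The URL the client asked for stays the same; only the directory it is
    actually served from changes, based on a cheap User-Agent sniff.
    """
    root = IMODE_DIR if user_agent.startswith("DoCoMo/") else DEFAULT_DIR
    return os.path.join(root, url_path.lstrip("/"))

print(resolve_path("DoCoMo/1.0/P502i", "/news/index.html"))
```

The point is that all the heavy transformation happened at generation time; the per-request work is one string comparison.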
Of course, if you are doing NO "processing" of any kind - even this
minimal kind - then what you say is correct.
The compromise I think a lot of places will go to (I am certainly going
there) is partial pre-processing, with minimal processing to actually
serve the page. If your site has data that goes stale quickly, some kind
of application cache system is a good way to go - other than lots of cron
jobs :-)
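An "application cache" here could be as simple as an in-process cache with a time-to-live, so pages regenerate on demand instead of via cron. This is a hypothetical sketch; the class name, API, and TTL are my assumptions.

```python
import time

class AppCache:
    """Tiny TTL cache: regenerate an entry only when it is missing or stale."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expiry_timestamp, value)

    def get(self, key, regenerate):
        """Return the cached value for key, calling regenerate() if stale."""
        entry = self.store.get(key)
        now = time.time()
        if entry is None or entry[0] < now:
            value = regenerate()
            self.store[key] = (now + self.ttl, value)
            return value
        return entry[1]

cache = AppCache(ttl_seconds=60)
page = cache.get("/news", lambda: "<html>freshly generated</html>")
```

Compared with cron jobs, nothing is rebuilt unless somebody actually requests it, at the cost of one slow "first" request per TTL window.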
I feel this is an opportunity - we can create a simple way to identify an
imode page, and perhaps give it some momentum by having a major search
engine pick it up. This is not just for the benefit of Google - who could
possibly get by with the robots.txt stuff.
Nick
[ Did you check the archives? http://www.appelsiini.net/keitai-l/ ]
Received on Fri Feb 9 13:41:21 2001