On Sun, 15 Jul 2001, Nick May wrote:
> It isn't so much "keeping people out" as controlling how content is
> presented. If you do sniff'n'serve, (identical URLs serving up different
> content based on the client requesting that content)....
Ouch.
I've had to deal with this from the other end in a couple of
situations. One of these was at Blink.com (a site that allows you to store
bookmarks on-line). We were pretty heavily into mobile for a while,
and one of the things we were working on was providing a reasonable
mobile equivalent for a link you'd bookmarked on your PC. Part of what
we relied on was data from our spider.
But spidering becomes much more difficult and expensive when you have
to hit a link multiple times with a bunch of different header sets. If
you want your data to be collected accurately, I strongly
suggest issuing redirects after header testing wherever possible. At
least that way your other URL might get into the spidering systems out
there and be treated as separate content.
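To sketch what I mean by redirect-after-test (this is a hypothetical example, not anything we actually ran at Blink; the hostnames and the User-Agent prefix check are assumptions):

```python
# Minimal sketch: test the request headers once, then send mobile
# clients a redirect to a separate URL rather than serving different
# content at the same URL. Names and URLs here are illustrative only.
MOBILE_URL_PREFIX = "http://mobile.example.com"

def choose_redirect(path, headers):
    """Return a redirect target for mobile clients, or None to serve
    the regular page at the original URL."""
    ua = headers.get("User-Agent", "")
    # DoCoMo i-mode handsets identify themselves like "DoCoMo/1.0/P209i"
    if ua.startswith("DoCoMo/"):
        return MOBILE_URL_PREFIX + path
    return None
```

The point is that the mobile version lives at its own URL: a spider that follows the redirect can record it as separate content, instead of finding one URL that changes shape depending on who asks.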
> Since header faking is so easy, having some kind of additional validation
> that the client can definitely display the content you are serving is
> useful.
This just leaves me gaping in astonishment. Sure, header faking is
easy. But it's hardly likely to be done accidentally, is it? (Or do you
frequently get naive IE/Windows users coming to your site that somehow
happen to be delivering a P209i's headers?)
If someone's gone to the trouble to fake his headers to look like an
i-mode device, he probably really does want the content you'd serve to
an i-mode device.
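Just to show how little "trouble" it actually is (a hedged sketch: the URL is a placeholder, and while "DoCoMo/1.0/P209i" follows the usual DoCoMo User-Agent format, check against a real handset before relying on it):

```python
# Faking an i-mode User-Agent takes exactly one extra line per request.
import urllib.request

req = urllib.request.Request(
    "http://example.com/",
    headers={"User-Agent": "DoCoMo/1.0/P209i"},  # spoofed i-mode UA
)
# urllib.request.urlopen(req) would now receive whatever the server
# decides a P209i should see.
```

Anyone who bothers to do this clearly wants the i-mode content, which is why extra "validation" on top of the headers buys you so little.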
cjs
--
Curt Sampson <cjs@cynic.net> +81 3 5778 0123 de gustibus, aut bene aut nihil
Basically, a tool is an object that enables you to take advantage of the laws
of physics and mechanics in such a way that you can seriously injure yourself.
--Dave Barry
[ Did you check the archives? http://www.appelsiini.net/keitai-l/ ]
Received on Mon Jul 16 05:19:22 2001