Brad Stone wrote:
>>Or, the request may even be coming from some malicious
>>script kiddie
>>using a tool like your GETURI.
> Yes, you're correct.  But then I can nitpick your nitpick
> and say that if someone was using a tool such as GETURI I
> would hopefully recognize that and return XML instead.

But how can you explicitly recognize every possible user agent out
there, and handle each one specially?  Especially when the user
agent can so easily be spoofed?  Checking the user agent has to be
the worst way to tailor web page content.
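Just to illustrate the point: the honest signal for choosing a response format is the Accept header, not the User-Agent string. A minimal sketch in Python (a hypothetical helper, not from any particular CGI framework, and deliberately ignoring q-values):

```python
def pick_format(accept_header):
    """Choose a response format from the HTTP Accept header.

    Content negotiation beats user-agent sniffing: the Accept header
    says what the client can handle, and a client that spoofs it only
    hurts itself.  (Real negotiation would also honor q-values; this
    is a simplified sketch for illustration.)
    """
    accept = (accept_header or "").lower()
    if "application/xml" in accept or "text/xml" in accept:
        return "xml"
    return "html"

# A GETURI-style client asking for XML gets XML; a browser that sends
# no Accept header, or asks for HTML, gets HTML.
```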

> Parsing HTML blows.  Of course, one would charge for such a
> service.

Sure, parsing HTML sucks pond water.  So don't use HTML in your web
pages - use xhtml instead.
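For what it's worth, the payoff shows up immediately in code: well-formed xhtml goes straight through any ordinary XML parser, no tag-soup heuristics required. A quick sketch in Python, standard library only (the markup here is made up for the example):

```python
import xml.etree.ElementTree as ET

xhtml = """<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Demo page</title></head>
  <body><p>Well-formed markup parses cleanly.</p></body>
</html>"""

# Would raise ParseError on tag soup -- which is exactly the point:
# xhtml is either well-formed or rejected outright.
root = ET.fromstring(xhtml)

ns = "{http://www.w3.org/1999/xhtml}"
title = root.find(ns + "head/" + ns + "title").text
paragraph = root.find(ns + "body/" + ns + "p").text
```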

> I agree, Hans.  Thanks again for nitpicking.  ;)  But, I
> shall again nitpick your nitpick and state that the reason
> you need to do server side validation is not (only) because
> the request may come from another source.  But because the
> data could be, well, crap.  And your client side scripting
> may not always work especially when people like testing your
> site with "out of the ordinary" browsers.

Well, yes, semantic validation is yet another level of checking.  In
other words, your data may be well-formed, but still well-formed
crap!  I was just referring to the requirement of CGI programs
having to accept and tolerate ill-formed input, regardless of its
source.  As I said, many published programs do not even meet that
basic requirement.
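That two-level distinction is easy to show in code: one check for well-formedness, a separate check for semantics. A toy sketch (the field name and the valid range are invented for the demo):

```python
def validate_quantity(raw):
    """Validate a CGI 'quantity' field at two levels.

    Level 1: well-formedness -- is it a non-empty digit string at all?
    Ill-formed input must be tolerated and rejected, whatever its source.
    Level 2: semantics -- well-formed input can still be crap, e.g. an
    order for zero items or a million.  Range is invented for the demo.
    """
    if raw is None or not raw.strip().isdigit():
        return None        # ill-formed: reject regardless of source
    qty = int(raw.strip())
    if not 1 <= qty <= 100:
        return None        # well-formed, but well-formed crap
    return qty
```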

> Sure we have standards from W3C, but does everyone follow
> them?  No, and because of this other "standards" are
> accepted (just look at the difference in IE and NS
> javascript handling).  So the accepted standard is to write
> your apps to work with the most popular browsers and their
> "standards".  I know you won't agree with this, Hans, but
> try to contain yourself. <smile>

Please refer to <>. Note
that even Microsoft is a member of the W3C.  (Along with IBM, and
about 478 other companies.)  Very roughly, membership in the W3C
means a commitment to follow the W3C recommendations.

Well, we've been debating this issue for ages.  I would argue simply
that it's no more difficult to target all possible user agents by
following agreed upon standards, than to take advantage of browser
specific features and risk annoying some number of users.  I would
also argue that the best way to avoid trouble in the future with new
user agents is to follow the current agreed upon standards.  In
particular, that means xhtml strict with css.  Indeed, handling
future user agents gracefully is one of the justifications for xhtml!

OK, maybe that means you can't take advantage of all the latest
JavaScript gimmicks and tricks.  But in my own browsing experience,
I've often found that the quality of a web site is inversely
proportional to its use of flashy JS effects and gimmicks.  For the
most part, I've personally found that disabling JavaScript by
default has a net positive effect on browsing web sites.  (Using
Konq, I can enable JS specifically for just those domains that
absolutely require it.)

Oh yeah, believe it or not, I also now say "to heck with NS4
compatibility!".  The browser world has moved well beyond NS4.


Cheers!  Hans

