Brad Stone wrote:
> ...
> Trust me, I get a LOT of emails when I don't do things for
> early releases.  :)  Then again, there's no reason that
> version 2 of my books can't contain these updates.  But
> right now there's no need to update them just yet.

Ah, I see, "planned obsolescence", eh?  ;-)

> But I would also hope that most could make these very minor
> changes should they choose to do so.  As I prefer to teach
> someone how to do something rather than just do it for them
> with cut-and-paste code.
>
> Speaking of which, Hans, you probably have the best idea as
> to this.  What is a ballpark figure (say a percentage) of
> the performance hit of using a 1024 character field and
> using CHECKR as compared to using a varying field and %len?
> Using CHECKR I still have excellent performance.  I'd say on
> average CHECKR is used about 15 times per page.  If it takes
> 1.5 seconds to respond, what type of increase would one
> expect to see?  What does CHECKR use in the "bowels" of the
> OS to do this checking?  Some sort of array manipulation?
> Is that code fine tuned to squeeze more performance out?

I think Jon already responded nicely when he said 'Why teach new
techniques using "old" tools - it doesn't make sense.'

There are several ways of looking at the issue.  On the one hand,
run-time performance of compiled code is rarely an issue in the
overall picture.  Database and network latencies are always going to
be bigger concerns.  I'm reminded of an article published a while
ago discussing the implementation of (I believe) the new pension
plan system in Sweden.  After scrapping their original work, which
was late and buggy, the development team reworked their earlier
prototype written in Perl and got the system up and running in
short order.  Apparently, there was some concern over the
performance of the Perl code since it is interpreted (rather than
compiled).  But they still found that 70% of the CPU time was spent
in the database.

So from that standpoint, the fact that character varying and %LEN()
are faster than using CHECKR or %TRIM() doesn't really matter a whole
bunch.  Barbara uses a great metaphor - it's like standing on a
chair to get closer to the Moon.  (OK, that may be a bit extreme in
this case.)
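
To make the comparison concrete, here's a rough sketch in RPG IV (the
field names are just made up for illustration) of the two ways of
getting at the "used" length of a field.  CHECKR has to scan the
1024-byte field backwards looking for the last non-blank character,
while %LEN() on a varying field simply returns the length that's
already stored with the data:

     D FixFld          S           1024A
     D VarFld          S           1024A   VARYING
     D Len             S             10I 0

      * CHECKR scans FixFld from the right for the first character
      * that isn't a blank, so Len ends up holding the trimmed length.
     C     ' '           CHECKR    FixFld        Len

      * A varying field carries its current length with it, so
      * %LEN() is a simple lookup rather than a scan.
     C                   EVAL      Len = %len(VarFld)

Either way, both of these are cheap next to a trip to the database,
which is really the point above.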

I think the main issue is clarity of code.  I would argue that using
varying length character data makes for more readable code since you
don't have to constantly bridge the semantic gap between fixed
length and varying length string requirements.  CGI is an
application domain that is best handled by programming languages
with strong character string functionality, that is, varying length
strings.  Prior to the introduction of varying length character
variables, programmers constantly used %TRIM() to convert fixed
length to varying length strings, which made the code more difficult
to understand and maintain.  Varying length strings are simply a
much better match to the requirements of CGI programming.
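
As a purely illustrative sketch (again, the field names and the HTML
fragment are invented), compare building a piece of CGI output with
fixed-length fields against doing the same with varying-length
fields:

     D HtmlF           S           1024A
     D NameF           S             50A
     D HtmlV           S           1024A   VARYING
     D NameV           S             50A   VARYING

      * Fixed length: each piece has to be trimmed, or the trailing
      * blanks end up in the middle of the generated HTML.
     C                   EVAL      HtmlF = '<p>Hello, '
     C                                     + %trim(NameF) + '!</p>'

      * Varying length: the value is just the data, so the expression
      * reads the way the HTML is meant to look.
     C                   EVAL      HtmlV = '<p>Hello, ' + NameV
     C                                     + '!</p>'

(And of course HtmlF itself comes back blank-padded, so the trimming
starts all over again at the next step.)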

(Of course, nothing can match a language like Perl when it comes to
character string manipulation, but I'm sure you don't want to hear
that. ;-) )

Cheers!  Hans




