On 22-Oct-2015 00:24 -0500, John Yeung wrote:
On Wed, Oct 21, 2015 at 4:52 PM, CRPence wrote:
On 21-Oct-2015 10:10 -0500, John Yeung wrote:
<<SNIP>>
From where I'm sitting, it feels like the compilation (of
correct *syntax*) should either (1) always succeed whether or not
the file is found, and if there's a runtime error, so be it; or
(2) never succeed when the file is not found.

<<SNIP>>
While those absolutes might fit, in a /logical/ sense,
they tend not to fit, in a /practical/ sense. Perhaps strangely,
most people are [begrudgingly] joyful when the pre-compiler
diagnoses that their reference is problematic. That although they
may have to make changes to their source, and then issue the
pre-compile request again, they usually have fared better than if
they had only learned of the likely failure *after* they also ran
their test suite. In the end, they have saved possibly huge
amounts of time.

Well, on the face of it, all you've said is that "static
compile-time checking can save potentially huge amounts of time".
Fine. I don't think anyone could sensibly argue against that. I
wouldn't want to give that up (given that my code is embedded in
RPG). My proposed behavior (2) *doesn't* give it up. Maybe that
wasn't clear.

That was clear. It seems I did not cover that because, in an attempt at being more succinct, I omitted some background from a prior attempt at replying; I ended up giving up on that draft, as the text had become unwieldy. Unfortunately my unwritten _thoughts_ are not easily conveyed in this medium ;-)


Maybe I need to state it in different terms: I believe that a
missing reference should be treated roughly the same as a
found-but-assumed-erroneous reference. (So if you want to be
permissive, then be permissive with all references; if you want to
be strict, then be strict with all references.)

I don't see how my proposed behavior (2) is any less practical than
what you've described. Unless you are saying that in practice,
missing references are overwhelmingly NOT errors in real-world code,

I made mention, in my above reply [though that part was snipped], of my acceptance of there being "the apparent rationale behind the choice to ignore the missing-references, whilst not ignoring those located-references for which the validity checking fails."

Specifically, what that "rationale behind the choice to ignore the missing-references" portion alludes to is this, for the SQL [from my ditched attempt at replying]:

"... the typical expectation for a reference /not found/ is that eventually, the object will exist; that is to suggest, that typically for a -204, the likely reaction within the program is the respective CREATE statement. Arguably the SQL has the ALTER as well for which a revision could occur, thus similarly the SQL could infer that such changes are possible, but only by overlooking the relative infrequency [of /altered/ as compared to /created new/ as dealt with in the same program source]."
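For illustration [this sketch is not from the thread; the library, table, and column names are invented], that likely /reaction within the program/ to a -204 might look something like:

```sql
-- Illustrative sketch only: the typical reaction to SQLCODE = -204
-- (object not found) is to issue the respective CREATE statement
-- and then retry the failed statement; all names here are invented.
EXEC SQL INSERT INTO WORKLIB.WORKTBL (ID, NOTE) VALUES(1, 'first');
-- ...if SQLCODE = -204, then within the program:
EXEC SQL CREATE TABLE WORKLIB.WORKTBL
           (ID INT NOT NULL, NOTE VARCHAR(64) NOT NULL);
EXEC SQL INSERT INTO WORKLIB.WORKTBL (ID, NOTE) VALUES(1, 'first');
```

Because the pre-compiler cannot know that the CREATE will have run before the INSERT, only syntax checking of the INSERT is sensible when WORKTBL is not found at pre-compile time.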

Essentially then [also from my ditched draft], I somewhat repetitively offer what the presumption is for handling missing-references vs the presumption for handling located-references:

" So anyhow the SQL has decided that, if a reference is found, then that reference will be used to perform the validity checking; presuming the object is persistent and thus validation is appropriate. If the reference is not found, then the statement will only be syntax checked; presuming the object is temporal and the intention of the program must be that the object will be created when required. If there are errors with the found\validated reference, then the errors will be overlooked according to a GENLVL specification suggesting to ignore the error."

Specifically with regard to [again, from my ditched draft] the much rarer "scenario of a program needing to operate in [and thus compile in] effectively the simultaneous dual-world-view of both past and present [or present and future], is just not something that is very typical, and is something easily overcome [...] by upping the severity-level (GENLVL) allowed, so as to still pass the generated HLL code to the HLL compiler [which for the OP, apparently, is non-functional], or overcome by use of dynamic SQL, or overcome by ensuring the references are not found and thus not validity-checked. Important point about overcoming, is that the default behavior ensures that validity-checking failures will fail the compile; i.e. so as to please the most, [whereas] the few will have to overcome the difficulty in their atypical situation" [the "atypical situation" referring to the altered\incompatible vs missing TABLE-reference].
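For reference [again, not from the thread; the object, library, and source names, and the chosen severity value, are assumptions for illustration], the /upping the severity-level (GENLVL) allowed/ option would be effected on the pre-compile request itself, something like:

```
CRTSQLRPGI OBJ(MYLIB/MYPGM) SRCFILE(MYLIB/QRPGLESRC) SRCMBR(MYPGM)
           GENLVL(30) /* tolerate higher-severity pre-compile
                         diagnostics, so the generated RPG source
                         is still passed to the HLL compiler      */
```

The appropriate GENLVL value depends on the severity the failing validity-check diagnostic is issued at; the point is only that the default is strict, and the atypical case must opt out.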

and that putting the onus on the programmer to either ensure the
reference exists or explicitly choose a higher GENLVL is somehow
unreasonably onerous.


By intent: that onus is considered "unreasonably onerous" for the presumed-common missing-references scenario, but not so onerous for the presumed-much-rarer altered-reference scenario, whereby the program references both a post-ALTER version of a TABLE and a pre-ALTER version of that TABLE.

FWiW, ALTER activities are usually done in entirely unrelated phase(s) from the normal run-time, i.e. during upgrades to the application; that alone makes them rarer. Often the ALTER is all that is done, thus implying that a validity-checked statement occurring for both the down-level\pre-altered table and the up-level\post-altered table would be rarer still. On occasion, however, the upgrade feature itself must perform both the ALTER and then an UPDATE [or some other statement(s)] against the altered TABLE to /finish/ the upgrading of the TABLE to the matching\present level of the new application run-time, which will have been compiled against the up-level vs the down-level objects. Probably most of these will encounter a validation error, but not all would, because some altered tables will remain compatible with whatever are the other statement(s) performed against that altered table.
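A minimal sketch of that upgrade-feature scenario [all names invented for illustration]; the statement following the ALTER names the post-ALTER definition, so a pre-compile performed against the pre-ALTER table would fail the validity check:

```sql
-- Illustrative only: an upgrade step that ALTERs the table and then
-- /finishes/ the upgrade with an UPDATE against the altered shape.
EXEC SQL ALTER TABLE APPLIB.ORDERS
           ADD COLUMN STATUS CHAR(1) NOT NULL DEFAULT 'A';
-- The UPDATE names the new STATUS column; pre-compiled against the
-- down-level (pre-ALTER) table, this statement fails validation,
-- which is the rarer case being discussed.
EXEC SQL UPDATE APPLIB.ORDERS
           SET STATUS = 'C'
           WHERE SHIP_DATE IS NOT NULL;
```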

Also FWiW, I acknowledged in a separate post in the thread that the SQL pre-compiler could in theory be modified /to look for/ and ignore failed validity-checks for any file that was named also in an embedded ALTER. I was hesitant to suggest that as a design change to pursue, thinking [esp. due to the rarity, but not only that] the idea, as an implementation, might end up being described as the proverbial /opening of a can of worms/; though mostly that reflects my not having given the idea much thought.

