It has been eons since I did this (I use options 1 and 3) but it should apply the default value for the field to any new records - on update it should be unchanged. Personally I would specify a default value in the DDS for all new fields. I just prefer explicit to implicit.

Jon Paris

On Jan 19, 2011, at 4:47 PM, Jeff Buening wrote:

Thanks Chuck and Jon for input. I have started using SQL for new programs.
Jon or Chuck the only thing I am still wondering about is.....

2) Use logicals to ignore the additional fields

I have read many articles talking about using logicals to minimize
recompiles. So let's say I define a logical file in a stand-alone program
that does Reads, Writes, Rewrites, etc., and then I add a field to the
physical file but don't recompile the program that uses the logical. Since
the program is only dealing with a 100-byte storage allocation, does
rewriting or writing to the physical file (now 120 bytes) avoid overwriting
or creating bad data in the last 20 bytes, because the program doesn't know
about or receive the 20 extra bytes?



I won't answer your points directly, Jeff - rather I'll just try to explain
what you are seeing.

I had forgotten that you were using your UPDATEPRO program to read the file
before the update so ...

Let's assume that you pass the record area (say 100 bytes) as a parm to
UPDATEPRO. What is actually passed is a pointer - note that ZERO
information about length or anything else is passed. In UPDATEPRO (because
it uses the new record format) that same record is (say) 120 bytes. The
only thing the compiler knew about when it compiled UPDATEPRO was the
length it was given (120) so when it moves data to the parm it moves 120
bytes. But back in the caller the storage allocation was only 100 bytes -
so the next 20 bytes belonging to something just got nuked. If you then
change some data in that record in the caller and pass it back to UPDATEPRO
it will still see the corrupted memory and will write that to disk - hence
your 24 value didn't change. In your current scenario it may be that those
20 bytes don't "matter". But they will in another program and it may take
some hapless programmer who comes after you a long time to work it out.

The best example of this was something I saw while working on the COBOL
compiler team. A customer had a program that had been "working" for seven
years. They changed the value of one literal and recompiled. No test needed
- just changed a literal from 0.75 to 0.78. Meanwhile back in production
all hell broke loose. Why? For 7 years the program had been corrupting
storage as described above. But the area of storage corrupted was the print
buffer - and it was space filled before each line was built - so no harm no
foul. But in that 7 years the compiler had changed how it generated its
storage definitions - and now it wasn't a print buffer that was being
corrupted but a collection of pointers used in program calls. Made for an
interesting debug scenario.

There are at least three options I can think of:

1) Recompile all affected objects when you change a file

2) Use logicals to ignore the additional fields

3) Use SQL

The one option you should _not_ use is the one you are using now - i.e.
lying to the compiler <grin>

Jon Paris

This is the COBOL Programming on the iSeries/AS400 (COBOL400-L) mailing list
