Thanks, Chuck and Jon, for the input. I have started using SQL for new programs.
Jon or Chuck, the only thing I am still wondering about is:

2) Use logicals to ignore the additional fields

I have read many articles about using logicals to minimize recompiles. So
let's say I use a logical file in a stand-alone program that does reads,
writes, rewrites, etc., and then I add a field to the physical file. I don't
recompile the program that uses the logical. The program is still dealing
with only a 100-byte storage allocation, so rewriting or writing to the
physical file (now 120 bytes) doesn't overwrite or create bad data in the
last 20 bytes, because the program doesn't know about or receive the extra
20 bytes from anywhere. Is that correct?


Thanks,

Jeff

I won't answer your points directly, Jeff - rather I'll just try to explain
what you are seeing.

I had forgotten that you were using your UPDATEPRO program to read the file
before the update so ...

Let's assume that you pass the record area (say 100 bytes) as a parm to
UPDATEPRO. What is actually passed is a pointer - note that ZERO
information about length or anything else is passed. In UPDATEPRO (because
it uses the new record format) that same record is (say) 120 bytes. The
only thing the compiler knew about when it compiled UPDATEPRO was the
length it was given (120), so when it moves data to the parm it moves 120
bytes. But back in the caller the storage allocation was only 100 bytes -
so the next 20 bytes, which belong to something else, just got nuked. If
you then change some data in that record in the caller and pass it back to
UPDATEPRO, it will still see the corrupted memory and will write that to
disk - hence your 24 value didn't change. In your current scenario it may
be that those 20 bytes don't "matter", but they will in another program,
and it may take some hapless programmer who comes after you a long time to
work it out.
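
Here is a minimal free-form RPG sketch of the caller's side of that call.
The program name UPDATEPRO comes from your post, but the field names and
layout below are invented for illustration - treat it as a sketch of the
mechanism, not your actual code.

**free
// Caller compiled against the OLD 100-byte record layout.
dcl-s custRec  char(100);    // the record area as this program knows it
dcl-s nextDoor char(20);     // whatever the compiler happens to place
                             // in storage after custRec

dcl-pr UpdatePro extpgm('UPDATEPRO');
  rec char(100);             // passed by reference: only the address
end-pr;                      // crosses the call, no length information

UpdatePro(custRec);          // UPDATEPRO was compiled against the NEW
                             // 120-byte format, so it moves 120 bytes
                             // back into this 100-byte area and the
                             // bytes that follow are silently clobbered
*inlr = *on;

The caller's compiler can only check the call against the prototype it was
given, so nothing here gets flagged; the damage only shows up at run time.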

The best example of this was something I saw while working on the COBOL
compiler team. A customer had a program that had been "working" for seven
years. They changed the value of one literal and recompiled. No test needed
- they just changed a literal from 0.75 to 0.78. Meanwhile, back in
production, all hell broke loose. Why? For seven years the program had been
corrupting storage as described above, but the area of storage being
corrupted was the print buffer, and it was space-filled before each line
was built - so no harm, no foul. In those seven years, though, the compiler
had changed how it generated its storage definitions, and now it wasn't a
print buffer being corrupted but a collection of pointers used in program
calls. Made for an interesting debug scenario.

There are at least three options I can think of:

1) Recompile all affected objects when you change a file

2) Use logicals to ignore the additional fields

3) Use SQL (see the sketch below)

The one option you should _not_ use is the one you are using now - i.e.
lying to the compiler <grin>
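
For option 3, the protection comes from the fact that an SQL statement
names the columns it uses, so columns added to the table later are simply
never selected and the program's host variables are untouched. A rough
embedded SQL RPG sketch, with invented table and column names:

**free
// Option 3 sketch: the explicit column list, not the record format,
// defines what this program receives. ORDHDR, CUSTNO, ORDTOT and ORDNO
// are made-up names for illustration.
dcl-s custNo char(7);
dcl-s ordTot packed(9:2);
dcl-s ordNo  char(10);

exec sql
  select CUSTNO, ORDTOT
    into :custNo, :ordTot
    from ORDHDR
    where ORDNO = :ordNo;

// Columns added to ORDHDR later are never selected, so no recompile is
// needed and nothing outside these host variables is touched.

Option 2 gets you the same effect for record-level access: a logical file
that lists only the original fields keeps presenting the old layout to any
program compiled over it, no matter what is later added to the physical.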


Jon Paris

