RE:     CHGPF - Gotcha

Bill,
When you did the CHGPF, what did you do to initialize the 3-byte field?
Did every record get a different value in that field?  It sounds like you
added the field so you would have a unique 3-digit number within each
date (6-byte) field.  In that case, what did you do to initialize the file
so the keys were unique to start with?

I assume the file had multiple records for each date to start with.  Then
you ran this 'FIX' program to increment the 3-byte field and initialize the
file, and it bombed because of number 2 below, which left the file with a
bunch of records per date still carrying zero in the 3-byte field.  Hence
your problem.
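
For illustration, here is a rough free-form ILE RPG sketch of the kind of
one-time 'fix' program I mean (the file, record-format, and field names
ORDERS / ORDERSR / ORDDAT / ORDSEQ are made up, and free-form syntax is more
recent than this thread).  It assumes it runs while the file is still keyed
by the date field only, before the unique (date, sequence) key is put on, so
changing the sequence field does not move records around in the access path
being read, and that no date has more than 999 records:

**free
// Number the records 1, 2, 3, ... within each date so the new
// two-field key will be unique.  All names here are assumptions.
dcl-f ORDERS usage(*update) keyed;

dcl-s prvDat packed(6:0) inz(-1);   // previous date seen (-1 = none yet)
dcl-s seq    packed(3:0) inz(0);    // sequence within the current date

read ORDERSR;                       // read in date (key) order
dow not %eof(ORDERS);
   if ORDDAT <> prvDat;             // first record of a new date group
      seq = 0;
      prvDat = ORDDAT;
   endif;
   seq += 1;
   ORDSEQ = seq;                    // assign the next sequence number
   update ORDERSR;
   read ORDERSR;
enddo;

*inlr = *on;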

Number 2: if the file is opened as output only, the system buffers the
output (you get a message at compile time about block size, etc.).  That
also means the system only checks for duplicates when it writes a whole
BLOCK of data, maybe tens or hundreds of records at a time.  So you don't
get the error on the WRITE statement until you are well past the record
that caused the duplicate in the first place.  (That's a tough one for a
new programmer to figure out: 'But I looked at the dump and the key fields
in the dump were unique!'  The dump may not contain the record that caused
the problem; it might be 50 records earlier.)
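
As a hedged illustration of the blocking point (assumed file name ORDERS;
the BLOCK keyword and free-form syntax come from later ILE RPG releases):
declaring the output-only file with BLOCK(*NO), or doing
OVRDBF FILE(ORDERS) SEQONLY(*NO) before calling the program, makes each
WRITE go to the database immediately, so the duplicate-key error is reported
on the record that actually caused it, at the cost of slower output:

**free
// Output-only keyed file with record blocking turned off, so each
// WRITE is checked for a duplicate key as it happens.
dcl-f ORDERS usage(*output) keyed block(*no);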

John Carr 
-------------------------------------------

Bill wrote:
The other day, I used the CHGPF command to re-map the file's data elements
(using the source file/library parameters).  The new DDS, in addition to
adding fields, also specified that one of the new fields is part of the key.
The original file was keyed by a single 6-digit numeric field and the key was
not unique.  The new additional key field was a 3-digit numeric field (just a
sequence number to force uniqueness), and the DDS was also changed to make
the file uniquely keyed.  After the CHGPF was executed, a newly compiled RPG
program attempted to write records into the new definition of the file.  The
file was defined as output only.  The WRITE op-code had the error indicator
specified to catch duplicate-key problems, and I placed the WRITE in a do
loop that checked the I/O error indicator to ensure a unique key.  Well... it
never sensed the duplicate key (and therefore never incremented the second
key value).  As proof that the file was supposed to be uniquely keyed, I
tried CPYF and UPDDTA against the file; each run of the command reported that
there were duplicate keys in the file.
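
For illustration, here is a rough free-form ILE RPG sketch of the
write-and-retry pattern described above (file, record-format, and field
names ORDERS / ORDERSR / ORDDAT / ORDSEQ are hypothetical, and free-form
syntax postdates this thread).  Note it only behaves as intended when the
output is not blocked (BLOCK(*NO) here, or OVRDBF ... SEQONLY(*NO) before
the call); with blocked output the duplicate-key error may not surface on
the WRITE that caused it, which is exactly the gotcha discussed earlier in
the thread:

**free
dcl-f ORDERS usage(*output) keyed block(*no);

ORDDAT = 980415;                  // the existing 6-digit date key (sample value)
ORDSEQ = 1;                       // the new 3-digit sequence key

dow 1 = 1;
   write(e) ORDERSR;
   if not %error;
      leave;                      // written with a unique key
   elseif %status(ORDERS) = 1021; // status 01021 = duplicate key
      ORDSEQ += 1;                // bump the sequence and try again
   else;
      leave;                      // some other I/O error: don't loop forever
   endif;
enddo;

*inlr = *on;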
    Anyway, I fixed the problem by recreating the file with CRTPF (actually,
I duplicated the file via CRTDUPOBJ from a library where it had previously
been created with CRTPF).  Is this a bug?  Maybe; I have not contacted IBM
about the problem yet.

Bill Greenfield
CAS
+---
| This is the Midrange System Mailing List!
| To submit a new message, send your mail to "MIDRANGE-L@midrange.com".
| To unsubscribe from this list send email to MIDRANGE-L-UNSUB@midrange.com.
| Questions should be directed to the list owner/operator: david@midrange.com
+---