> A company is upgrading from Advanced/36 to AS/400. They have files with
> multiple record types and want to reuse as much code as possible (with
> the necessary changes for the AS/400). Right now the plan is to use
> external data structures to define the fields (for the new code) and
> create logicals to define the fields for use with Query. However, some
> of the fields are packed. What have others done in the past to handle a
> similar situation? Any and all suggestions would be greatly appreciated!
> Direct replies are OK to dkbrengle@juno.com.

When I migrated S/34 & S/36 code to S/38 & AS/400 systems and a flat file had
multiple record formats, I would separate the different formats into their own
physical files.  I would then combine the separate physical files with a
multi-format logical file.  The logical file would need a format selector 
so that any writes to the logical file would go to the correct physical file.
The existing programs could then be recompiled to use the multi-format logical
file.  You could then use any system utilities (Query, DFU etc.) against the
logical file or the individual physical files.
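As a sketch of the file definitions (all file, record, and field names here
are hypothetical), the DDS might look like this: one physical file per record
type, plus a multi-format logical file over both.

```dds
     A* Physical file HDRPF - header records (hypothetical layout)
     A          R HDRREC
     A            RECID          1A
     A            ORDNBR         7P 0
     A            CUSTNO         6P 0
     A          K ORDNBR

     A* Physical file DTLPF - detail records (hypothetical layout)
     A          R DTLREC
     A            RECID          1A
     A            ORDNBR         7P 0
     A            LINNBR         3P 0
     A            ITEMNO        15A
     A          K ORDNBR
     A          K LINNBR

     A* Multi-format logical file ORDLF over both physicals
     A          R HDRREC                   PFILE(HDRPF)
     A          K ORDNBR
     A          R DTLREC                   PFILE(DTLPF)
     A          K ORDNBR
     A          K LINNBR
```

The logical file would then be created with something like
CRTLF FILE(MYLIB/ORDLF) FMTSLR(MYLIB/ORDSLR), where ORDSLR is the format
selector program.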

This solution assumed that the current multi-format flat file has common fields
that are used to keep the file in sequence, and that the numeric fields in the
file are initialized correctly (packed fields in particular must already
contain valid packed data, or the copy into the externally described physical
files will fail with decimal data errors).
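For the format selector itself, here is a minimal CL sketch.  A format
selector program receives the record buffer and must return the name of the
record format to use; the one-character record-type code in position 1 is a
hypothetical layout, not from the original post.

```dds
             PGM        PARM(&RECORD &FMTNAME)
             DCL        VAR(&RECORD)  TYPE(*CHAR) LEN(200)
             DCL        VAR(&FMTNAME) TYPE(*CHAR) LEN(10)

             /* Pick the record format from a type code in position 1 */
             IF         COND(%SST(&RECORD 1 1) *EQ 'H') +
                          THEN(CHGVAR VAR(&FMTNAME) VALUE('HDRREC'))
             ELSE       CMD(CHGVAR VAR(&FMTNAME) VALUE('DTLREC'))
             ENDPGM
```

Whatever field distinguishes the record types in your current flat file would
take the place of the position-1 test here.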

> Darlene Brengle
> Norsoft, Inc.

Chuck Bartholomew
I/NET Inc.

| This is the Midrange System Mailing List!
| To submit a new message, send your mail to MIDRANGE-L@midrange.com.
| To unsubscribe from this list send email to MIDRANGE-L-UNSUB@midrange.com.
| Questions should be directed to the list owner/operator: david@midrange.com

