On 07-Nov-2015 10:39 -0600, John R. Smith, Jr. wrote:
I have a multi-record variable length format IFS file that I need to
copy to a DB2 file. The first few fields are "keys" and a record type
and then there is a "block" data field similar to the layout shown
below. The record length and block data layout vary based on the
record type.

Store
Register
Transaction
Timestamp
RecordType
BlockData

I can't seem to get CPYFRMIMPF or CPYFRMSTMF to work due to various
reasons/errors. Should one of these two commands work and I just
have the parameters set wrong, is there another command that I'm not
thinking of, or am I stuck trying to write my own program to read the
IFS and write to the DB2 file?


If the data in the first five /fields/ is in a format that the Copy From Import File (CPYFRMIMPF) feature understands [i.e. if an import across just those first five columns works, or is verified/tested by the OP to be functional], then I can explain how to use that same feature to complete the import of the BlockData as well. But only *if* the BlockData is similarly formed, in a manner that CPYFRMIMPF can deal with. Although even if the BlockData is not in the same form, that likely can be dealt with too.
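A minimal sketch of what that first-five-fields import might look like, assuming a comma-delimited stream file with CRLF record delimiters and double-quote string delimiters; the path, library, and file names here are placeholders, not the OP's actual names:

```
/* Hypothetical import of the five leading columns; adjust the  */
/* delimiters to match however the stream file was produced.    */
CPYFRMIMPF FROMSTMF('/home/op/transactions.txt') +
           TOFILE(MYLIB/TRANSPF)                 +
           MBROPT(*REPLACE)                      +
           RCDDLM(*CRLF)                         +
           FLDDLM(',')                           +
           STRDLM('"')
```

If even this much fails, the joblog messages from CPYFRMIMPF usually identify which column or delimiter assumption is wrong, which is the information needed before the BlockData can be tackled.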

For lack of a specific example, or confirmation that the first five fields can be imported by that tooling, it is probably not worth the effort to explain. I could whip up an end-to-end example of export-to-import for such /variable/ data, with which the tooling could complete the import even if only by multiple passes over the data; but if that would not apply to the scenario of the OP, I would not spend the time to do so.
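As a rough illustration of the multiple-pass idea (all names, offsets, and the record-type layout here are invented for the sketch, not taken from the OP): land every record once in a staging file whose last column holds the raw BlockData as one wide character field, then carve that column into type-specific columns with SQL, one pass per record type.

```
/* Pass 1: import everything into staging file STGPF, whose last  */
/* column BLKDTA is a single wide VARCHAR; names are hypothetical.*/
CPYFRMIMPF FROMSTMF('/home/op/transactions.txt') +
           TOFILE(MYLIB/STGPF) MBROPT(*REPLACE)  +
           RCDDLM(*CRLF) FLDDLM(',')

/* Pass 2, repeated per record type: split BLKDTA into the        */
/* type-specific target file; offsets/lengths are placeholders.   */
RUNSQL SQL('INSERT INTO MYLIB/TYP01PF                           +
            SELECT STORE, REG, TRN, TS,                         +
                   SUBSTR(BLKDTA, 1, 10), SUBSTR(BLKDTA, 11, 5) +
            FROM MYLIB/STGPF WHERE RECTYP = ''01''')            +
       COMMIT(*NONE)
```

The point of the staging step is that CPYFRMIMPF only has to understand one fixed column list, while the per-type variability is handled afterward in SQL, where SUBSTR and a WHERE on the record type are easy to adjust.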



This mailing list archive is Copyright 1997-2019 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].