


Please consider using dynamic SQL.
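For what it's worth, here is an untested sketch of what the dynamic-SQL approach could look like. All names in it (WRKCSV, ColList, SqlStmt) are illustrative, not from the original post; the idea is to let SQL match the common columns by name at run time instead of coding 40 eval blocks:

 // Sketch only, not compiled. Assumes the parsed CSV has been
 // loaded into a work table (WRKCSV here) whose columns are named
 // FLD01 - FLD15, matching the physical files.

d SqlStmt         s           5000a   varying
d ColList         s           2000a   varying

 /free
   // Build the list of common columns that actually exist in the
   // requested file, e.g. by reading QSYS2/SYSCOLUMNS for infile
   // and keeping only the standard names. Shortened here:
   ColList = 'FLD01, FLD02, FLD03';   // ... through FLD15 as found

   // Because the column list is the same on both sides, SQL lines
   // the fields up by name regardless of their order in the file.
   SqlStmt = 'INSERT INTO ' + %trim(infile)
           + ' (' + ColList + ') SELECT ' + ColList
           + ' FROM WRKCSV';

   exec sql EXECUTE IMMEDIATE :SqlStmt;
 /end-free

Since the statement contains no host variables, EXECUTE IMMEDIATE is enough; there is no need for a separate PREPARE.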


On 7/14/2011 11:30 AM, Versfelt, Charles wrote:
Hi all,

I'm working on a project and I have a challenge.
I know I can do this "the hard way" (over-coded, hard-to-modify) but there's got to be a very easy, short way to do it.

I have a list of files. Let's say they're called "FILE01" through "FILE40". They all have different formats, but there is a list of fields that are standard to all the files, though not necessarily in the same order, and possibly separated by other fields unique to each file. Let's call those "FLD01" - "FLD15". In the physical files, the common fields have the exact same field names.

I have an input file coming in as a CSV spreadsheet, which I bring in with CPYFRMSTMF in my CL, and I have to parse it out using column headings that point to each of these standard fields (say the column headings are *INFLD01 - *INFLD15).

My RPG program has an input parameter, telling me which format I want. (Say, FILE02, or FILE27).

I have to take the file and put it in the format of the file passed by parameter, then send it back to a new CSV in the format of the file requested. I already know how to take the original CSV and turn it into a file and get to all the input fields. I also know how to put them in a data structure and send them back to CSV.

The thing is, to do this for 40 files, I need 40 data structures for 40 file formats... and since they share some field names, the compiler won't let me simply create the data structures without prefixing. So I'd end up with something like this:

d @FILE01 e ds extname(FILE01) prefix(F01)
d @FILE02 e ds extname(FILE02) prefix(F02)
.
.
.
d @FILE40 e ds extname(FILE40) prefix(F40)


Then, to load my data structures prior to writing to output (let's say my input file has INFLD01 - INFLD15), I'd need 40 subroutines (or one long select statement; whatever, I'll do it here as a select) to say:

Select;
  When infile = 'FILE01';
    F01FLD01 = INFLD01;
    F01FLD02 = INFLD02;
    ...
    F01FLD15 = INFLD15;
  When infile = 'FILE02';
    F02FLD01 = INFLD01;
    F02FLD02 = INFLD02;
    ...
    F02FLD15 = INFLD15;
  ...
  ...
  When infile = 'FILE40';
    F40FLD01 = INFLD01;
    F40FLD02 = INFLD02;
    ...
    F40FLD15 = INFLD15;
Endsl;

This seems like a lot of work to move 15 fields into 1 format (which may be any of 40) when all the files have the same field names.

I can't help thinking there's got to be an easier way to do it, like with some kind of likeds or something.
Some way to just drop the fields into the file format I want, without coding 40 different sets of 15 evals.
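Something like eval-corr between qualified data structures is the kind of thing I'm picturing. A sketch only (I haven't compiled this, and InDs / OutDs02 are made-up names):

 // Sketch only. EVAL-CORR (V5R4 and later) copies every subfield
 // whose name and type match between two QUALIFIED data
 // structures, so one statement replaces 15 evals per file.

d InDs            ds                  qualified
d  FLD01                        10a
d  FLD02                        10a
 // ... FLD03 - FLD15, matching the standard field definitions

d OutDs02       e ds                  extname(FILE02) qualified

 /free
   // Load InDs subfields from the parsed CSV columns, then:
   eval-corr OutDs02 = InDs;   // moves every same-named subfield
 /end-free

A select by format name would still pick which target DS to write, but each branch shrinks to a single eval-corr; and since eval-corr only copies subfields that exist on both sides, a format missing one of the standard fields wouldn't even cause a compile error.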

It's easy to code the "long way," of course: just repeatedly copy-paste the code sections and change the F01 to F02... to F40, and I'm done. Which is great if the program never changes, but if I have to add a field later I have to add it in 40 places. I can imagine this being a bear to modify.

There just has to be an easier way. Ideas?
Thanks in advance for any suggestions.

Oh, just to make things a touch more complicated: a few of the file formats may be missing one or two of the standard fields.
In that case, if the field was in my input file, I'd have to format the fields I can, and report the total number of records that contained INFLDxx even though FLDxx doesn't exist in FILEyy.

Charlie :)





This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
