Hello,

On 30.12.2020 at 04:29, smith5646midrange@xxxxxxxxx wrote:

I'll give up and do it the hard way: multiple physicals and multiple CPYFs selecting the record type for each physical, and then process it like any other data set.

Good decision. See Roger’s comment as a start.
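Just as an illustration, a rough CL sketch of that kind of split could look something like the lines below. Every file, library, field, and record-type value here is invented; INCREL assumes the record type sits in a real field of the from-file, and for a program-described flat file INCCHAR against a fixed position would do the same job.

    /* Hypothetical: split MYLIB/MIXEDPF into one physical file per  */
    /* record type. RECTYPE, HDRPF, DTLPF and the 'H'/'D' codes are  */
    /* made-up names. FMTOPT(*NOCHK) copies the data without format  */
    /* level checking so the typed physicals can have their own DDS. */
    CPYF FROMFILE(MYLIB/MIXEDPF) TOFILE(MYLIB/HDRPF) +
         MBROPT(*REPLACE) FMTOPT(*NOCHK) +
         INCREL((*IF RECTYPE *EQ 'H'))
    CPYF FROMFILE(MYLIB/MIXEDPF) TOFILE(MYLIB/DTLPF) +
         MBROPT(*REPLACE) FMTOPT(*NOCHK) +
         INCREL((*IF RECTYPE *EQ 'D'))

Each target physical then carries the layout for its own record type, and the programs can read it like any other data set.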

The thought was to avoid having to split the data from one file into multiple files. I thought I had seen this done, but apparently I'm wrong and the ability doesn't exist. Maybe my brain is going back to my old COBOL days and REDEFINES at the 01 level.

I'm pretty sure you remember your COBOL stuff correctly. I recently read about this feature in a book: changing the code flow depending on the record type. That isn't too far from Alan's suggestion of using substr, but for me, this deviation from a proper database table with a uniform field structure almost hurts. ;-)

There is a lot we don't know: how big the file is, how often it needs to be processed, whether it is mostly static or arrives regularly in updated form from an external source, … All in all, it again comes down to weighing the trade-off between coding time and job run/interactive response time. And, as Dan pointed out, how much mess you want to leave behind for the people who will need to understand what you've done in that particular code. :-)

:wq! PoC

