Are you adding or modifying fields? Don't delete the LFs, just remove their members, then try the CHGPF command. It can insert new fields, delete old fields, and expand numeric or character fields; it will not redefine a character field as numeric or vice versa. If no key fields change and the LFs have their own views, it should be fast: no need to copy the data, and much faster too. You cannot recompile the programs until after you have changed the file, though. If no programs use the PF and the LFs have their own record formats (views), you will not need to bother with the file updates at all. FMTOPT(*MAP *DROP) is very slow. ADDLFM is faster than recompiling, and any LF that uses the PF record format will be updated automagically!

Christopher K. Bipes        mailto:Chris.Bipes@Cross-Check.com
Operations & Network Mgr    mailto:Chris_Bipes@Yahoo.com
CrossCheck, Inc.            http://www.cross-check.com
6119 State Farm Drive       Phone: 707 586-0551 x 1102
Rohnert Park CA 94928       Fax: 707 586-1884

-----Original Message-----
From: Mlpolutta@aol.com [mailto:Mlpolutta@aol.com]

Okay, here's what I have "working" so far - a lot of the suggestions I have received are exactly what I already have. I apologize for the lack of a clear explanation. Here is what the process currently does:

1. Divide the number of records by the number of "parallel jobs" requested. (The file has REUSEDLT(*YES), so I'm ignoring the possibility of large gaps due to deleted records, except that the CPYF for the last "chunk" goes to *END rather than to the record count.)
2. Remove all dependent logical files.
3. Submit a CPYF using FROMRCD(xxxxxxxxxx) TORCD(yyyyyyyyyy) FMTOPT(*MAP *DROP), which copies its record range in parallel with all the other jobs. This CPYF is preceded by an OVRDBF SEQONLY(*YES zzz).
4. Monitor for the completion of all the submitted jobs.
5. Submit parallel rebuilds of all the LFs.

It appears to me that CPYF FMTOPT(*MAP *DROP) is not the fastest critter out there. Is there a dynamic way to do FMTOPT(*MAP *DROP) other than CPYF?
I know COBOL has "MOVE CORRESPONDING", but that would require compiles for each specific version, which I want to avoid if possible. Just as one other data point, the record format is HUGE - over 1800 bytes (an inherited situation, but I have to live with it for now, at least). Thanks for all your replies so far! Michael
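To make step 1 concrete, here is a minimal sketch of the chunking arithmetic that produces the FROMRCD/TORCD ranges for the parallel CPYF jobs. The file names, record count, and job count are hypothetical; the only behavior taken from the post is that records are split evenly and the last chunk copies to *END instead of a computed TORCD.

```python
def chunk_ranges(total_records, jobs):
    """Split 1..total_records into ranges for parallel CPYF jobs.

    Returns (from_rcd, to_rcd) pairs; to_rcd is None for the last
    chunk, meaning the CPYF should use TORCD(*END)."""
    size = total_records // jobs
    ranges = []
    start = 1
    for job in range(jobs):
        if job == jobs - 1:
            # Last chunk runs to *END to pick up any remainder
            ranges.append((start, None))
        else:
            ranges.append((start, start + size - 1))
            start += size
    return ranges

# Hypothetical example: 1,000,000 records split across 4 parallel jobs
for from_rcd, to_rcd in chunk_ranges(1_000_000, 4):
    to_part = "TORCD(*END)" if to_rcd is None else f"TORCD({to_rcd})"
    print(f"CPYF FROMFILE(OLDPF) TOFILE(NEWPF) "
          f"FROMRCD({from_rcd}) {to_part} FMTOPT(*MAP *DROP)")
```

Each printed command string would be wrapped in a SBMJOB (after the OVRDBF SEQONLY override) in the actual process.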
This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page.