Just one comment.

Rather than deleting the LFs, I would do RMVLFM (before) and ADDLFM
(after).  It's safer: no risk of missing source or level checks.
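A rough sketch of that sequence — the file, library, and member names here
are made up for illustration:

```
/* Remove each logical file member so its access path    */
/* is not maintained during the reorg                    */
RMVLFM     FILE(MYLIB/CUSTLF1) MBR(CUSTLF1)

RGZPFM     FILE(MYLIB/CUSTPF) MBR(CUSTPF)

/* Add the member back; this rebuilds the access path    */
ADDLFM     FILE(MYLIB/CUSTLF1) MBR(CUSTLF1) +
             DTAMBRS((MYLIB/CUSTPF (CUSTPF)))
```

Because the LF object itself never goes away, there is nothing to
recreate from source and no level-check surprises later.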


Al Barsa, Jr.
Barsa Consulting Group, LLC
914-251-9406 fax


Evan Harris
Sent by: midrange-l-bounces@xxxxxxxxxxxx
04/29/2004 03:28 AM
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxx>
Subject: RE: File Reorg - Cons

Hi Mike

just to give you even more to think about :)

I was once lucky enough to have to re-organise a 140 million record
monster file, with a 700+ character record length and 25 logicals built
over it, that had 20 million plus deleted records. (Makes me sound like a
Yorkshireman, seeing it written down like that...)

In any case, if time is at a premium some of the things I found that can
help a great deal are to:

- Drop the logical files before doing the re-org, then manually create
them afterwards

- You can run two or more access path rebuilds concurrently by submitting
the CRTLF commands as separate jobs; normally RGZPFM seemed to rebuild
them in an arbitrary order I could never figure out, though it looked to
be original creation order
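The concurrent submission might look something like this — job queue,
source file, and object names are all hypothetical:

```
/* Submit each index rebuild as its own job; on a job    */
/* queue that allows multiple active jobs, these run in  */
/* parallel                                              */
SBMJOB     CMD(CRTLF FILE(MYLIB/CUSTLF1) +
             SRCFILE(MYLIB/QDDSSRC) SRCMBR(CUSTLF1)) +
             JOB(BLDLF1) JOBQ(MYLIB/PARALLELQ)
SBMJOB     CMD(CRTLF FILE(MYLIB/CUSTLF2) +
             SRCFILE(MYLIB/QDDSSRC) SRCMBR(CUSTLF2)) +
             JOB(BLDLF2) JOBQ(MYLIB/PARALLELQ)
```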


- It is possible to somewhat optimise the rebuild sequence, which may save
disk and will almost certainly save time. Analyse the keys on your logical
files to determine which have the most complex keys, or keys that can be
used to assist in building subsequent access paths (some experimentation
may be in order here). If you order the rebuilds from most complex to
least complex and arrange them in job streams appropriate for the keys
(e.g. one job stream for customer number/etc. keys, another for customer
name/etc. keys) you may save some time.
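As a starting point for that analysis, you can list the files built over
the physical and then look at each logical's key fields (the outfile name
below is just an example):

```
/* List every file built over the physical               */
DSPDBR     FILE(MYLIB/CUSTPF) OUTPUT(*OUTFILE) +
             OUTFILE(MYLIB/DBRLIST)

/* Then inspect the access path (keys) of each logical   */
DSPFD      FILE(MYLIB/CUSTLF1) TYPE(*ACCPTH)
```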

- If you are really cramped for time, write a program to write the records
out to a new file. This also provides the option to run parallel job
streams of the read/write operations, by selecting the records to be added
to the new file by relative record number ranges. When the writes have
completed, delete the old file (if you dare :), rename the new as the old,
and start building your indexes. This can significantly speed things up :)
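As one alternative to a hand-written program, CPYF can select by relative
record number, so submitting a job per range gives a similar parallel
copy — file names and the range split below are made up, and note the
records land in arrival order rather than RRN order:

```
/* Each submitted job copies one RRN range into the new  */
/* file; CPYF's default COMPRESS(*YES) drops the deleted */
/* records along the way                                 */
SBMJOB     CMD(CPYF FROMFILE(MYLIB/CUSTPF) +
             TOFILE(MYLIB/CUSTNEW) MBROPT(*ADD) +
             FROMRCD(1) TORCD(70000000)) JOB(COPY1)
SBMJOB     CMD(CPYF FROMFILE(MYLIB/CUSTPF) +
             TOFILE(MYLIB/CUSTNEW) MBROPT(*ADD) +
             FROMRCD(70000001) TORCD(*END)) JOB(COPY2)
```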

Hope this helps

Evan Harris

> > On Behalf Of Mike Berman
> > Subject: File Reorg - Cons
> >
> > I noted a lot of suspicion when I attempted to set up a file reorg,
> > just getting lists of those with large amounts of deleted records. Is
> > there some possible occurrence that would be a detriment? I am of
> > course referring to off-hours running.

This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.
