On 1/14/2016 11:51 AM, Cyndi Bradberry wrote:
We are experiencing issues with files being created with either the default of 10000 records or *NOMAX. I personally like neither (no wars on this, please).

What are best practices for file sizes? How do you address the reuse deleted records option? How do you address file full?

I think it would be interesting if there were a set of best practices
for all files. In my case, I have 'code' tables (like state
abbreviation to full name) which never come close to 10k records, never
have deleted rows, and never need attention. I also have transaction
tables which grow like weeds, but rarely, if ever, have a deleted
record. *NOMAX works great for us. Then I have master tables, which
get many updates, few inserts and deletes. Every so often one of these
will outgrow its allocation and the operator gives it a '1' and tells
the developers about it in case there's a problem.

For new tables, defined through SQL, we accept the defaults which reuse
deleted records. Haven't had a pressing need to change that policy.

My thought is to build a file to contain the number of records and deleted records for each file at end of month. That will, over time, give me an idea of how big things grow.

Look at the SQL catalog table SYSTABLESTATS for some real-time statistics.
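The monthly-snapshot idea can be sketched in a few lines. This is a hypothetical illustration, not anyone's production code: the `snapshot` helper and the per-file counts are invented, and on IBM i the row and deleted-row numbers would come from the SQL catalog statistics rather than being hard-coded.

```python
from datetime import date

def snapshot(history, as_of, stats):
    """Append one end-of-month reading per file to a running history.

    `stats` maps file name -> (active_rows, deleted_rows); in practice
    these counts would be read from the catalog, not supplied by hand.
    """
    for name, (rows, deleted) in stats.items():
        history.append({"file": name, "date": as_of,
                        "rows": rows, "deleted": deleted})
    return history

history = []
snapshot(history, date(2016, 1, 31), {"ORDERS": (120_000, 0)})
snapshot(history, date(2016, 2, 29), {"ORDERS": (250_000, 15)})
```

Each month's run appends one row per file, so the history itself becomes the growth record.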

With respect to end of month, I'd like to share my experience. I have
transaction tables which grow continuously from January through
December, then are reset to 0 rows in January. Were I to size these
tables from any given end-of-month snapshot, they'd be too small for
11 out of 12 months. The point being that in addition to the
current month's statistics, you may want to store an 'all time largest'
statistic too. Along with the date that high tide occurred, so you can
pin down why it was large.
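That "all time largest" figure is one small pass over the accumulated readings. A minimal sketch, with hypothetical names and made-up sample data:

```python
def high_water(readings):
    """Per file, the largest row count ever seen and the month it occurred."""
    peaks = {}
    for r in readings:
        best = peaks.get(r["file"])
        if best is None or r["rows"] > best["rows"]:
            peaks[r["file"]] = {"rows": r["rows"], "date": r["date"]}
    return peaks

# A transaction table that peaks in December and is reset in January:
readings = [
    {"file": "TXNS", "date": "2015-11-30", "rows": 800_000},
    {"file": "TXNS", "date": "2015-12-31", "rows": 900_000},
    {"file": "TXNS", "date": "2016-01-31", "rows": 1_200},
]
peaks = high_water(readings)
```

The December reading survives as the high-tide mark even after the January reset, which is exactly what sizing from a single month's snapshot would miss.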

I would also like a program that checks for files nearing their stated capacity, bumps them by either a set amount or a percentage of active records, and reports that to me for review.
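That bump rule — grow by a set amount or by a percentage of active records, whichever is larger, once a file nears capacity — could look something like the following. The threshold, increment, and percentage defaults are invented for illustration, not recommendations:

```python
def plan_bump(allocated, active_rows, fixed=10_000, pct=0.10, threshold=0.90):
    """Return a proposed new size when a file nears its allocation, else None.

    The proposal adds the larger of a fixed increment or a percentage of
    the active records to the current allocation. All three knobs here
    are hypothetical defaults.
    """
    if active_rows < threshold * allocated:
        return None  # plenty of headroom; nothing to report
    return allocated + max(fixed, int(active_rows * pct))

# 95,000 of 100,000 records used -> propose a larger size for review.
# Half full -> no action.
```

Returning `None` for healthy files keeps the review report down to only the files that actually need attention.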

I don't do any processing by RRN, so I can set any table to reuse
deleted records. The last time I had to manage a VTOC this carefully
was on System/3 in the early 1980s.


This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature.
