A word of warning about using a data queue: space management for data queues is very poor in OS/400. If records are added to a data queue while, for any reason, the program that receives them is not running, the data queue will expand to hold the backlog. Contrary to what you might expect, however, the data queue never shrinks again once those entries are received. At that point you are stuck with a very large data queue object, and as far as I have been able to determine, the only recovery is to delete the data queue and rebuild it from scratch.

Rich Loeber
Kisco Information Systems
http://www.kisco.com

------------------------------------------------------------------------
Don Tully Sr wrote:
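For reference, a CL sketch of the delete-and-recreate recovery described above. The library, queue name, and entry length are made up; substitute your own, and match the CRTDTAQ attributes to the original queue:

```
/* Reclaim data queue storage by recreating the object.          */
/* MYLIB/MYDTAQ and MAXLEN(512) are placeholder values.          */
DLTDTAQ DTAQ(MYLIB/MYDTAQ)
CRTDTAQ DTAQ(MYLIB/MYDTAQ) MAXLEN(512) SEQ(*FIFO)
```

Any jobs that send to the queue should be quiesced first, since entries arriving between the delete and the create would fail or be lost.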
There is absolutely no downside to reusing deleted records. The performance impact is very close to zero. All applications that I have written for the past 13 years have had all files set to reuse deleted records, and it certainly eliminates the downtime problems associated with file reorgs. If you must guarantee that records are processed in write sequence, then you must add a key, perhaps a timestamp; obviously the relative record number will no longer indicate the sequence in which the records were written to the file. If you want to go to the effort of using a data queue, that could certainly also be a way to go. I have written many data queue routines for high-performance requirements.

Don Tully
Tully Consulting LLC

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of James H H Lampert
Sent: Monday, September 25, 2006 5:24 PM
To: midrange-l@xxxxxxxxxxxx
Subject: File that has records constantly being added and deleted

Here's the situation: We have a file. Any arbitrary number of jobs can put records into the file; a single dedicated job reads the records in arrival sequence, processes them, and deletes them. We thus have a file that rarely has more than a few active records, but accumulates lots and lots of deleted ones.

Is there a way to squeeze out deleted records without having to grab an exclusive lock on the file? Or would it be more sensible to set it to reuse deleted records and modify the processing program to read by key? Or are there other ideas?

--
JHHL

--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options, visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives at http://archive.midrange.com/midrange-l.
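For anyone wanting to try the reuse-deleted-records approach discussed above, the change is a single CL command. The file and library names here are placeholders:

```
/* Make new writes fill slots left by deleted records.           */
/* MYLIB/WORKFILE is a placeholder name.                         */
CHGPF FILE(MYLIB/WORKFILE) REUSEDLT(*YES)
```

Once REUSEDLT(*YES) is in effect, relative record number no longer reflects arrival order, so if processing sequence matters, the processing job would read through an access path keyed on something like a timestamp field populated at insert time, as suggested in the thread.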