And the doc linked to previously...

From the Redbook Striving for Optimal Journal Performance on DB2
Universal Database for iSeries (SG24-6286):

Obviously this approach should not be carried to extremes. If your batch
job is going to add and update 100,000 rows, this single-commit-cycle
approach probably makes sense. If your batch job is going to add and
update 40 million rows, placing all of those into a single commit cycle
would be foolish, since the working set of locks resident in main memory,
employed to keep track of so many tentatively changed rows, will surely
degrade rather than enhance your ultimate performance.
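The trade-off the Redbook describes can be sketched with any SQL database; here sqlite3 stands in for DB2, and the table name, row generator, and the commit threshold parameter are all illustrative assumptions, not from the original text:

```python
import sqlite3

def batch_update(rows, commit_every=100_000):
    """Apply a large batch of inserts, committing every `commit_every` rows
    so the set of pending row locks stays bounded, instead of letting one
    commit cycle cover the whole job (the 40-million-row case above)."""
    # In-memory database purely for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")
    pending = 0
    for row_id, qty in rows:
        conn.execute("INSERT INTO orders (id, qty) VALUES (?, ?)",
                     (row_id, qty))
        pending += 1
        if pending >= commit_every:
            conn.commit()   # end this commit cycle; pending changes released
            pending = 0
    conn.commit()           # commit the final partial chunk
    return conn

# Small numbers for demonstration: 250 rows, committed in chunks of 100.
conn = batch_update(((i, i * 2) for i in range(250)), commit_every=100)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # → 250
```

The point is only the shape of the loop: commit boundaries are chosen by row count, not by job boundaries, so the lock working set never grows with the total job size.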


On Fri, Jan 18, 2013 at 3:20 PM, D*B <dieter.bender@xxxxxxxxxxxx> wrote:

In the CL
program that calls the batch program, specify the files that use commitment
control and open them. Start a commit cycle in the CL program before
calling the batch program. In the application program(s), change the file
description to specify that commitment control is in use. Once the program
returns to the CL program, end the commit cycle to force any pending file
I/O to complete.
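A rough sketch of that division of responsibility, with a Python caller standing in for the CL program and sqlite3 standing in for the commitment-controlled files (all names here are illustrative, not from the post):

```python
import sqlite3

def batch_program(conn):
    """The called 'batch program': does its file I/O under the commitment
    control the caller started, and never commits on its own."""
    for i in range(5):
        conn.execute("INSERT INTO items (id) VALUES (?)", (i,))

# The 'CL program': open the file, start the commit cycle, call the
# batch program, then end the cycle to make the pending I/O permanent.
conn = sqlite3.connect(":memory:", isolation_level=None)  # manual control
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY)")
conn.execute("BEGIN")   # start the commit cycle before the call
batch_program(conn)
conn.commit()           # end the commit cycle after the program returns
print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # → 5
```

The design choice being illustrated is that the transaction boundary lives in the caller, so the called program can stay oblivious to commitment control, which is exactly what makes the warning below about concurrent workloads relevant: every lock the callee takes is held until the caller commits.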

This might not work if there is a concurrent workload using the same
file! Lock contention from the long-lasting record locks slows everything
down, potentially to a standstill, or breaks your batch job. It also
won't work with concurrent save activity. I would be very, very careful
about following such recommendations!

This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
