This is another tree for you to bark up:

We have seen performance gains from record blocking when processing a high volume of records. I'm not sure what your F-specs look like, but if you can block your input or output, you might find it helpful.
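In free-form RPG, blocking on a sequentially-processed input file can be requested with the BLOCK(*YES) file keyword; here is a minimal sketch (the file name SALESHST and format name SALESR are made up):

```rpgle
**free
// Ask data management to block records on this input-only file.
// BLOCK(*YES) is only honored for files that qualify for blocking
// (e.g., sequential input-only processing with no random access).
dcl-f saleshst usage(*input) block(*yes);

dcl-ds rec likerec(SALESR : *input);   // result DS for the read

read SALESR rec;
dow not %eof(saleshst);
  // ... process rec ...
  read SALESR rec;
enddo;

*inlr = *on;
```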

Here's a document on how to control block sizes (IBM's default block size is only about 4K):
https://www-912.ibm.com/s_dir/slkbase.NSF/1ac66549a21402188625680b0002037e/d6738e1cd37e1f33862565c2007cef79?OpenDocument

I ended up writing a little service program utility that determines the maximum block size (in records) for the file passed to it. It's been very helpful in our batch processing environment.
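The core of such a utility is just dividing the transfer-buffer limit by the file's record length. A hypothetical sketch of that calculation (the procedure name and the 128K buffer constant are assumptions; check the IBM document above for the limit that applies to your release):

```rpgle
**free
// Hypothetical helper: maximum records per block for a given record length.
dcl-proc maxBlockRcds export;
  dcl-pi *n int(10);
    rcdLen int(10) const;      // record length of the file, in bytes
  end-pi;

  dcl-c BUFFER_BYTES 131072;   // assumed 128K transfer buffer -- verify!

  if rcdLen <= 0;
    return 0;
  endif;

  return %div(BUFFER_BYTES : rcdLen);
end-proc;
```

The result can then be supplied to the OVRDBF command's SEQONLY(*YES nnn) parameter before the blocked read.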

I also read into data structures. I've heard in lectures that read-to-DS is faster because you're performing one large bulk data move instead of a lot of little field moves. However, I'm a huge fan of qualified fields and love using file I/O DSes for that reason alone, so I've never run a test to verify what I was told about their performance.
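For what it's worth, here's the qualified style I mean, as a sketch (the file CUSTFILE, format CUSTR, and field names are made up):

```rpgle
**free
// QUALIFIED on the file means the record format is referenced as
// custFile.custR, and LIKEREC gives a DS whose subfields must all
// be qualified -- no name collisions with program variables.
dcl-f custFile usage(*input) qualified;
dcl-ds cust likerec(custFile.custR : *input);

read custFile.custR cust;
dow not %eof(custFile);
  // every field is unambiguous: cust.name, cust.balance, ...
  dsply cust.name;
  read custFile.custR cust;
enddo;

*inlr = *on;
```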

-Kurt

-----Original Message-----
From: rpg400-l-bounces@xxxxxxxxxxxx [mailto:rpg400-l-bounces@xxxxxxxxxxxx] On Behalf Of James H. H. Lampert
Sent: Tuesday, October 26, 2010 12:31 PM
To: RPG programming on the IBM i / System i
Subject: Question on file I/O optimization (in a "nightly refresh" program that currently takes all night AND ALL DAY)

Comparing old-style RPG reads from externally-described files (i.e., a
target DS may not be specified at all) to the new option of specifying a
target DS that's defined on the file:

When I first learned of the new option, I was told that it was more
efficient, because it replaces a whole bunch of individual field data
transfers with one big blit of the entire record.
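[For concreteness, the two forms being compared look roughly like this; the file CUSTF and format CUSTR are illustrative names only:]

```rpgle
**free
dcl-f custf usage(*input);
dcl-ds rec likerec(CUSTR : *input);

// Old style: the read populates the individual program fields
// defined by the external description, field by field.
read CUSTR;

// New option: one move of the entire record image into the result DS.
read CUSTR rec;
```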

The file in question (opened for input only) has 314 fields. But we are
only looking at 11 of those fields, and 7 of them are currently defined
in 2 other structures, and in one of those, it effectively makes fields
that are non-contiguous in the file contiguous in the structure the
program refers to.

I started looking into converting this to the new "read-to-DS," but then
I realized that I might be barking up the wrong tree, replacing implicit
individual field transfers with explicit ones, and/or replacing
comparisons of entire structures with comparisons of the individual
fields therein.

The obvious question is whether or not RPG, when doing an old-style
externally-described read, skips the fields that are never referenced,
or transfers them anyway.

So am I barking up the wrong tree here?

--
JHHL
