Thanks for your reply, John.

We just had a meeting concerning this situation.
One of the other programmers was actually in STRDBG while a user was running the program, and it looks like the culprit is a join logical file.
At one point the code takes a full 20 seconds to read the next record from the join logical.
This join logical is over 3 separate files.
Personally, I hate such files.

We believe the access paths on this file may not be the most efficient, what with deleting records, running re-orgs (RGZPFM), and so on.
So we plan on deleting this logical and recreating it.
However, because it's month end, we have decided not to do this until early morning on the 2nd.

Alan Shore
Solutions Architect
IT Supply Chain Execution



60 Orville Drive
Bohemia, NY 11716
Phone [O] : (631) 200-5019
Phone [C] : (631) 880-8640
E-mail : ASHORE@xxxxxxxxxxxxxxxxxxxx

‘If you're going through hell, keep going.’
Winston Churchill


-----Original Message-----
From: John Yeung [mailto:gallium.arsenide@xxxxxxxxx]
Sent: Tuesday, May 31, 2022 4:56 PM
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
Cc: Rob Berendt <rob@xxxxxxxxx>; Alan Shore <ashore@xxxxxxxx>
Subject: [EXTERNAL] Re: Is there a limit as to the number of files an RPG ILE program can process

On Tue, May 31, 2022 at 1:48 PM Alan Shore via MIDRANGE-L <midrange-l@xxxxxxxxxxxxxxxxxx> wrote:

> I do agree with your thought as to it being a hard halt, but right
> now I am clutching at straws as to what is causing this slowdown.

Back in the day, it used to be 50 files. To me that sounds like a ridiculously high number already, but indeed our old codebase has some gigantic legacy programs that use that many, and you can tell that a subsequent programmer (still from ages ago) split off part of the processing to another program because they needed more than that.
Maybe IBM should *tighten* the file limit rather than remove it, to encourage "modular" programming. ;)

But as for the performance problem, my guesses would be: (1) when the program was changed, some algorithmic inefficiency was introduced, like inadvertently reading a file inside a loop when it really only needs to be read once before or after the loop; or (2) maybe the file that was added is often in use, so the program often has to wait for it to be freed up, or it's chaining with lock when it could be chaining without lock.
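A minimal free-format RPG sketch of both ideas follows; the files, keys, and field names (CTLREC, ORDDTL, and so on) are hypothetical stand-ins for illustration, not taken from the program under discussion.

**free
ctl-opt dftactgrp(*no);

// Hypothetical keyed files, standing in for whatever the real program uses.
dcl-f ctlrec usage(*input) keyed;
dcl-f orddtl usage(*update) keyed;

dcl-s ctlKey char(10);
dcl-s ordKey packed(9:0);

// (1) Hoist an invariant read out of the loop: if the control record never
//     changes during the run, chain to it once before the loop instead of
//     on every iteration.
ctlKey = 'DEFAULTS';
chain ctlKey ctlrec;

// (2) Read without a record lock: on an update-capable file a plain CHAIN
//     acquires a record lock and waits if another job already holds it.
//     The (n) extender reads the record without locking it, which is enough
//     when the program only needs to look at the data.
for ordKey = 1 to 100;
    chain(n) ordKey orddtl;
    if %found(orddtl);
        // ... read-only use of the ORDDTL record ...
    endif;
endfor;

*inlr = *on;
return;

Note that chaining without the lock is only appropriate when the program does not turn around and update the record it just read; an update still needs the locked read.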

John Y.

