On 11-Dec-2015 17:42 -0600, Vinay Gavankar wrote:
We have a program which receives some values thru a data queue read and
updates a file with the values received.
The problem is that there are multiple copies of the file it needs
to update.
Let us say it is updating FILEA, so that is what is in the compiled
code.
On the system there are currently 15 files which are exact copies of
this file. They have different file names, but the same record
format name, field names, etc.
The program reads a 'control' file, which has the names of these 15
files. For every file, it does an OVRDBF of FILEA to the actual file
name, opens the file, updates/writes the record, and then closes the
file.
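A minimal free-form RPG sketch (7.1 TR7 or later syntax) of the pattern
described above; FILEA, CTLFILE, TARGETFILE, and the record format name
FILEAR are illustrative assumptions, not the actual objects:

       // Sketch only: one pass of the per-record work as described.
       // The data-queue receive loop that feeds this is omitted.
       ctl-opt option(*srcstmt);

       dcl-f FILEA usage(*output) usropn;       // compiled against FILEA
       dcl-f CTLFILE usage(*input);             // 'control' file of target names

       dcl-pr QCMDEXC extpgm('QCMDEXC');        // IBM command-execution API
          pCmd    char(3000) const options(*varsize);
          pCmdLen packed(15:5) const;
       end-pr;

       dcl-s cmd varchar(300);

       // values received from the data queue are assumed to already be
       // mapped into the FILEA record-format fields at this point

       read CTLFILE;                            // TARGETFILE comes from here
       dow not %eof(CTLFILE);
          cmd = 'OVRDBF FILE(FILEA) TOFILE(*LIBL/' + %trim(TARGETFILE) + ')';
          QCMDEXC(cmd : %len(cmd));
          open FILEA;                           // full open, every time
          write FILEAR;                         // update/write the record
          close FILEA;                          // full close, every time
          cmd = 'DLTOVR FILE(FILEA)';
          QCMDEXC(cmd : %len(cmd));
          read CTLFILE;
       enddo;

       *inlr = *on;

Every entry received thus costs one full open and one full close per
control-file entry, which is where the CPU cost noted below comes from.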
The system was built for 'flexibility' years back, when they started
with 2 or 3 files. It is flexible in the sense that it did not need a
code change as the files being updated grew from 3 to 15.
This is a very 'busy' program, meaning it is constantly receiving
data thru the data queue. Actually there are multiple copies of this
program running simultaneously receiving data from the same data
queue and all of them are pretty busy.
Now they are finding that all these open/closes are seriously
impacting the CPU usage.
An architectural change is obviously indicated, but that would take
time to implement.
As usual, they are looking for a 'short term', quick-and-dirty fix
(to be implemented in 2 weeks) to eliminate these opens and closes.
The only thing we could think of was to define the current 15 files
separately in the program and then update them as needed, losing all
flexibility.
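That hard-coded fallback amounts to something like the following RPG
sketch (only three of the copies shown; FILE01/FILE02/FILE03 and the
shared record format name FILEAR are assumptions). Because the copies
share one record format name, each declaration needs a RENAME, and
adding a sixteenth copy would mean a code change and recompile:

       dcl-f FILE01 usage(*output) rename(FILEAR:FMT01);
       dcl-f FILE02 usage(*output) rename(FILEAR:FMT02);
       dcl-f FILE03 usage(*output) rename(FILEAR:FMT03);
       // ...one declaration per physical file, through the 15th copy...

       // every copy is opened once at program start and stays open; each
       // received entry is then written to the copies as needed
       write FMT01;
       write FMT02;
       write FMT03;

       *inlr = *on;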
Any other ideas or suggestions would be greatly appreciated.
TIA
I was composing a big reply offering various ideas, but the simplest
[effectively what Jon alludes to] really is quite simple, and probably
can be put in place in that scenario; likely with a very simple code
_addition_ requiring just a couple of hours of testing after
implementation, to confirm that the function is not impacted *except*
for a performance benefit from eliminating all full-opens and
full-closes. Note again that the suggestion, if it is what I expect,
would not be a design change nor even a code change [except either in a
preface CL or an effective initialization pass over the control file].
The situation, though, is that I will not be able to fully vet most of
what I have already typed in a timely fashion, and providing follow-up
replies is not going to be easily accomplished at this time of year.
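The suggestion itself is not spelled out above, so the following is
only a guess at it: one well-known way on IBM i to eliminate repeated
full opens and closes, and one that fits the "preface CL or an
initialization over the control file, no design change" description, is
to pre-establish a shared open data path for each target file with
OVRDBF ... SHARE(*YES) plus OPNDBF. A rough sketch of such an
initialization, written here as RPG issuing the commands through
QCMDEXC (CTLFILE/TARGETFILE are the same assumed control-file names as
in the earlier sketch):

       // Run once, before the data-queue processing loop begins.
       dcl-f CTLFILE usage(*input);

       dcl-pr QCMDEXC extpgm('QCMDEXC');
          pCmd    char(3000) const options(*varsize);
          pCmdLen packed(15:5) const;
       end-pr;

       dcl-s cmd varchar(300);

       read CTLFILE;
       dow not %eof(CTLFILE);
          // force a shared open data path for this target, job-wide
          cmd = 'OVRDBF FILE(' + %trim(TARGETFILE)
              + ') SHARE(*YES) OVRSCOPE(*JOB)';
          QCMDEXC(cmd : %len(cmd));
          // pre-open the file and leave its ODP open for the life of the job
          cmd = 'OPNDBF FILE(*LIBL/' + %trim(TARGETFILE)
              + ') OPTION(*ALL) OPNSCOPE(*JOB)';
          QCMDEXC(cmd : %len(cmd));
          read CTLFILE;
       enddo;

The existing per-record OVRDBF of FILEA would also need SHARE(*YES) so
that its open attaches to the pre-opened ODP as a cheap shared open
rather than a full open, and the matching close becomes a shared close.
Whether this is actually the fix being alluded to, and whether the
override and open scoping behave correctly for this particular job
structure, is exactly the sort of detail the poster says he has not
been able to vet.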
So if you could create a very pared-down program source that has an
effective receiveDQ() that obtains sample data from somewhere [perhaps a
couple of files defined by DDL and populated with a simple INSERT] and
an effective processDQ() that performs the insert/write (delete?)
activity against the various FILEA files [for which there is also DDL
and DML], I can probably compose what I believe to be that simple
resolution much more simply and quickly. The data can be limited to
examples for just two files, because obviously any number greater than
one will be sufficient to apply to the next and beyond, but the
/received/ data must direct operations to at least two files so I can
see how the decision is driven to direct to the alternate file; one
insert and one update might be most useful to get a better picture. The
compiler invocation would also be required, explicitly specifying
anything that might be changed from the command defaults; meaning the
H-specs must be there too. Note also that I cannot compile anything
newer than IBM i 7.1.
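For a rough idea of the skeleton being asked for (the data queue name
and library, the entry layout, and the wait time below are placeholders;
QRCVDTAQ is the standard receive-data-queue API):

       ctl-opt main(testMain) option(*srcstmt);

       dcl-pr QRCVDTAQ extpgm('QRCVDTAQ');      // receive data queue API
          dqName char(10) const;
          dqLib  char(10) const;
          dqLen  packed(5:0);                   // returned entry length
          dqData char(512) options(*varsize);   // returned entry data
          dqWait packed(5:0) const;             // wait time in seconds
       end-pr;

       dcl-pr processDQ;
          entryData varchar(512) const;
       end-pr;

       dcl-proc testMain;
          dcl-s entry    char(512);
          dcl-s entryLen packed(5:0);

          dow *on;
             QRCVDTAQ('TESTDQ' : 'QGPL' : entryLen : entry : 60);
             if entryLen = 0;                   // wait expired, nothing arrived
                leave;
             endif;
             processDQ(%subst(entry : 1 : entryLen));
          enddo;
       end-proc;

       dcl-proc processDQ;
          dcl-pi *n;
             entryData varchar(512) const;
          end-pi;
          // parse entryData, decide which FILEA copies it applies to, then
          // do the insert/write (and update) against those files -- the
          // part whose open/close cost is at issue in this thread
       end-proc;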
p.s. Due to time limitations, I have also not read the most recent
replies to know what has been further discussed, or whether this is
perhaps already resolved; I will try to get through the entire thread by
tomorrow evening. While some of the already noted ideas that suggest
changing much of the way things currently work probably offer a better
overall strategy, I think they might miss the mark for a Q&D change;
i.e. they align more with the already alluded-to eventuality of the
re-architect path than with the more immediate pain-reduction path.