

  • Subject: RE: Dynamic arrays/SETOBJACC
  • From: Glenn Birnbaum <gbirnba@xxxxxxx>
  • Date: Fri, 30 Apr 1999 13:06:15 -0700

FYI,

We have used SETOBJACC for several long-running (7 days of continuous 
processing) updates (OK, fixes) of prior-year databases.  By selecting the 
files with the highest read usage and placing them in memory via SETOBJACC 
(we carved out a separate 150 MB pool guaranteed to hold them), we saw as 
much as a 30% decrease in clock time (CPU seconds really don't change).  This 
method also has the advantage that the data is always current.  If you load 
an array and the data is then changed by some other process, the array will 
no longer be current.
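
For anyone who wants to try it, the setup is only a few CL commands.  A rough 
sketch follows; the library, file, pool number, and activity level are made 
up for illustration, and the exact parameter forms should be checked against 
the command prompts on your release:

  /* Size a dedicated shared pool at roughly 150 MB (SIZE is in KB;      */
  /* *SHRPOOL1 and the activity level are examples only).                */
  CHGSHRPOOL POOL(*SHRPOOL1) SIZE(153600) ACTLVL(10)

  /* The shared pool must be allocated by a subsystem before it can be   */
  /* used, for example through a subsystem description you control.      */
  CHGSBSD    SBSD(MYLIB/MYSBS) POOLS((1 *BASE) (2 *SHRPOOL1))

  /* Pull the heavy-read file (data and access paths) into that pool.    */
  SETOBJACC  OBJ(MYLIB/HISTPF) OBJTYPE(*FILE) POOL(*SHRPOOL1) MBRDATA(*BOTH)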

We plan to research this method for day-to-day operations as well.  For 
example, if you have a lookup file (e.g., Customer Master, SKU Master) and 
it's reasonable to load the PF/LFs into memory, then every job on the system 
that uses that file will benefit.  There will be no programming changes 
necessary to any applications, and all jobs will see the most current data.  
In this world of GB memory sizes, this really is now feasible.
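
As a sketch of what that might look like for a Customer Master (again, the 
object and pool names here are invented, and the MBRDATA values should be 
verified against the command help):

  /* Load the physical file's data and access path into the pool.        */
  SETOBJACC OBJ(PRODLIB/CUSTMAST) OBJTYPE(*FILE) POOL(*SHRPOOL1) MBRDATA(*BOTH)

  /* Load only the access path of a logical file used for the lookups.   */
  SETOBJACC OBJ(PRODLIB/CUSTMASTL) OBJTYPE(*FILE) POOL(*SHRPOOL1) MBRDATA(*ACCPATH)

  /* When the memory is needed for something else, purge the object.     */
  SETOBJACC OBJ(PRODLIB/CUSTMAST) OBJTYPE(*FILE) POOL(*PURGE)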

Glenn Birnbaum


> Larry Bolhuis wrote:
> 
> > I haven't tested this but would not the use of SETOBJACC to move the
> > file entirely into main storage give you almost the same speed boost
> > with *ZERO programming?
> 
> Yes, SETOBJACC would likely be a much better performer than an array.  And
> with 50 records the amount of memory you'd have to carve out would be
> negligible.  I believe the minimum that you could carve out for a subsystem
> is 32K... more than enough for this file.
> 
> But it may not even be worth doing at all because the AS/400 does such a
> darn good job of memory management.  It naturally will keep frequently used
> data in main storage anyway, and again, with such a small file it is very
> likely that the entire file will remain in main storage for the duration of
> the program anyway.
> 
> It would be interesting to benchmark, but I would suspect that your array
> program would perform significantly worse than the program that just lets
> Data Management handle things.
> 
> jte
> 
> >
> >
> >  - Larry
> >
> > lg - Loyd Goodbar wrote:
> > >
> > > I have a work file (about 50 records) that is accessed very often by an
> > > RPG IV program. I was thinking about reading the file into an array to
> > > make lookups faster. The problem is the file changes in size; sometimes
> > > it's 50 records, sometimes 60, or 40, etc. I'd like to create a
> > > dynamically-sized array at runtime. I briefly looked at the
> > > ALLOC/DEALLOC/REALLOC opcodes, but they really didn't make much sense.
> > >
> > > Is there a relatively easy way of creating dynamic arrays in RPG, or am I
> > > forced to create an arbitrary upper limit?
> > >
> > > Thanks,
> > > Loyd
> > >
> > > --
> > > 
+---
| This is the Midrange System Mailing List!
| To submit a new message, send your mail to MIDRANGE-L@midrange.com.
| To subscribe to this list send email to MIDRANGE-L-SUB@midrange.com.
| To unsubscribe from this list send email to MIDRANGE-L-UNSUB@midrange.com.
| Questions should be directed to the list owner/operator: david@midrange.com
+---


