If the desire is to have the tables in question as of a single point in
time, maybe temporal tables could help?
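For example, a minimal sketch in Python with the ibm_db driver, assuming
the tables have been enabled as Db2 for i system-period temporal tables;
the connection string, table name, and columns here are hypothetical:

import ibm_db

# Placeholder connection string; fill in for your own system.
conn = ibm_db.connect("DATABASE=...;HOSTNAME=...;UID=...;PWD=...", "", "")

# Read the orders table as it looked at a given moment, with no
# physical copy of the data ever being made.
sql = ("SELECT order_id, item, price "
       "FROM orders "
       "FOR SYSTEM_TIME AS OF TIMESTAMP('2024-02-27-08.30.00') "
       "WHERE order_id = ?")
stmt = ibm_db.prepare(conn, sql)
ibm_db.execute(stmt, (12345,))
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row)
    row = ibm_db.fetch_assoc(stmt)
ibm_db.close(conn)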

I'm still trying to digest the use case for why a copy of the data is
needed. The problem statement seems to be the time needed to make the
copy, so why make it in the first place?

--
Jim Oberholtzer
Chief Technical Architect
Agile Technology Architects


On Tue, Feb 27, 2024 at 8:30 AM Richard Schoen <richard@xxxxxxxxxxxxxxxxx>
wrote:

Depending on HOW LARGE the memory chunk will be, you would want to do some
heavy testing before removing the tables from the mix.

If the process crashes during processing, what is the fallout/recovery?

Maybe you've already thought that out.

If you're writing to tables, it shouldn't really be inefficient.

Hard to know without more specifics.

If it makes sense, you could always move your temp tables to a library
called TEMP or TMP; then the tables can be shared and potentially
permanent.

Again, not knowing the potential for process collisions could make things
interesting.

Another thought could be to re-create that process in something like
Python and try using something like MariaDB or even SQLite if only
temporary storage is needed. SQLite is a smoking fast local db.
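As a minimal sketch of the SQLite idea, using Python's built-in sqlite3
module; the table layout and values are made up for illustration:

import sqlite3

# An in-memory database: scratch space that lives only for this process.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE order_lines ("
    " order_id INTEGER, item TEXT, qty INTEGER, price REAL)"
)
conn.execute("CREATE INDEX ix_order ON order_lines(order_id)")

# Load the order, then run the pricing query against memory.
lines = [(1, "WIDGET", 10, 2.50), (1, "GADGET", 3, 9.99)]
conn.executemany("INSERT INTO order_lines VALUES (?, ?, ?, ?)", lines)
total, = conn.execute(
    "SELECT SUM(qty * price) FROM order_lines WHERE order_id = ?", (1,)
).fetchone()
print(total)  # 54.97
conn.close()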

These are just ideas to think about.

Regards,
Richard Schoen
Web: http://www.richardschoen.net
Email: richard@xxxxxxxxxxxxxxxxx


----------------------------------------------------------------------

message: 1
date: Tue, 27 Feb 2024 09:02:28 -0500
from: Brian Garland via MIDRANGE-L <midrange-l@xxxxxxxxxxxxxxxxxx>
subject: Passing large amount of data between programs

This is sort of a best practice question.

We have an RPGLE-based web service that prices an order. Since we already
had pricing routines that worked off database tables containing order
information, we take what comes into the web service, create a QTEMP
version of the tables with the order, and run our existing pricing routines.

This has proven to be fairly slow, so we started analysing where.
Currently the largest chunk of time is spent clearing and writing to these
tables.
We do build them once per job, which makes a difference. We also include
some indexing. Our initial thought is that if we could keep the
information in memory and modify our pricing routine to work off that
memory, we could improve the performance. The fallout from this would be
updating our routine that runs off tables to copy the info needed into
memory before pricing and then update the tables afterward. That is to
avoid having two routines that do the same thing. This bit of extra time
on that process is okay.

So, the question is: what is the best way to pass that chunk of memory
between programs so that several programs can each update their part?
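To illustrate the kind of thing I mean, here is the pattern in Python
terms, using multiprocessing.shared_memory. The names are made up, and
this is only an analogy for whatever the RPG equivalent would be (a user
space or a teraspace pointer shared between programs, perhaps):

from multiprocessing import shared_memory

# Program A: create a named block of memory and fill in its part.
shm = shared_memory.SharedMemory(name="pricing_buf", create=True, size=1024)
shm.buf[0:5] = b"hello"

# Program B: attach to the same block by name and update its own slice.
shm_b = shared_memory.SharedMemory(name="pricing_buf")
shm_b.buf[5:11] = b" world"

print(bytes(shm.buf[0:11]))  # b'hello world'

shm_b.close()
shm.close()
shm.unlink()  # release the block once all programs are done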

Brian





As an Amazon Associate we earn from qualifying purchases.

This thread ...

Replies:

Follow On AppleNews
Return to Archive home page | Return to MIDRANGE.COM home page

This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].

Operating expenses for this site are earned using the Amazon Associate program and Google Adsense.