If I execute 100 individual INSERT statements against a table (no commitment control), does the system write each record out individually, or does blocked inserting occur?
A little background:
We receive files with all sorts of layouts, and we mediate them and output them to our standard file. The mediation programs have a lot of similarities, and I have moved most of those similarities into service programs. However, one thing they all still share is that each program writes to the standard file. I'd like to encapsulate that standard file, and in doing so I'd like to use SQL to write the records, but I can't sacrifice blocked writing because of the high volume of transactions.
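Roughly what I have in mind for the wrapper is something like the sketch below. The names (STDFILE, WriteStdRec, stdRec) are just placeholders, not our real objects:

    // Hypothetical wrapper procedure in the service program.
    dcl-ds stdRec extname('STDFILE') qualified end-ds;

    dcl-proc WriteStdRec;
      dcl-pi *n;
        rec likeds(stdRec) const;
      end-pi;

      // One row per call -- this is the pattern I'm asking about:
      // do 100 of these get blocked, or written one at a time?
      exec sql INSERT INTO STDFILE VALUES(:rec);
    end-proc;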
I'm aware of inserting multiple rows in one statement using a host array; however, I don't want to have to control how many records the system should write out at once. In addition, there will already be an element of complexity in using arrays, because one incoming record may result in more than one standard record (so doing a blocked insert in that situation would be fine).
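For reference, the multi-row form I'm referring to is roughly this (again with placeholder names, and the 100 is an arbitrary array size):

    // Hypothetical multi-row (blocked) insert using a host structure array.
    dcl-ds stdRecs extname('STDFILE') qualified dim(100) end-ds;
    dcl-s rowCount int(10);

    // ...fill stdRecs(1..rowCount) from the incoming data...

    exec sql INSERT INTO STDFILE :rowCount ROWS VALUES(:stdRecs);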
I'm on IBM i 7.1.
Thanks,
Kurt Anderson
Sr. Programmer/Analyst
CustomCall Data Systems, a division of Enghouse Systems Ltd.