Better yet, batch insert with PreparedStatement

http://www.mkyong.com/jdbc/jdbc-preparedstatement-example-batch-update/
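For illustration, a minimal sketch of that pattern (connection details, table, and column names here are placeholders): rows are queued client-side with addBatch() and flushed in chunks with executeBatch(), so the driver makes one round trip per chunk instead of one per row.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertSketch {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; substitute your own server and credentials.
        String url = "jdbc:sqlserver://myserver:1433;databaseName=MYDB";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            conn.setAutoCommit(false); // commit per chunk, not per row

            String sql = "INSERT INTO CALL_DETAIL (CALL_ID, DURATION) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 0; i < 10000; i++) {
                    ps.setInt(1, i);
                    ps.setInt(2, i % 600);
                    ps.addBatch();            // queue the row client-side

                    if (i % 1000 == 999) {    // flush every 1,000 rows
                        ps.executeBatch();    // one round trip for the whole chunk
                        conn.commit();
                    }
                }
                ps.executeBatch();            // flush any remainder
                conn.commit();
            }
        }
    }
}

The 1,000-row chunk size is arbitrary; tune it for your driver and network.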


-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Charles Wilt
Sent: Monday, February 25, 2013 11:33 AM
To: Midrange Systems Technical Discussion
Subject: Re: JDBCR4 and Inserts

I'd suggest extending Scott's JDBCR4 to support batch inserts...

http://www.roseindia.net/jdbc/Jdbc-batch-insert.shtml
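One thing such an extension would also need to surface is batch failures: executeBatch() throws BatchUpdateException, which reports how far the batch got. A hedged sketch of the Java side (the helper name is mine, not part of JDBCR4):

import java.sql.BatchUpdateException;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchErrors {
    // Hypothetical helper, not part of JDBCR4: run a queued batch and
    // report what succeeded if it fails part-way through.
    static int[] runBatch(PreparedStatement ps) throws SQLException {
        try {
            return ps.executeBatch(); // per-row update counts on success
        } catch (BatchUpdateException e) {
            // Depending on the driver, getUpdateCounts() covers either the
            // rows before the failure or all rows with failures flagged.
            int[] counts = e.getUpdateCounts();
            System.err.println("Batch failed; driver reported "
                    + counts.length + " update counts: " + e.getMessage());
            throw e;
        }
    }
}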


Charles


On Mon, Feb 25, 2013 at 11:11 AM, Anderson, Kurt <KAnderson@xxxxxxxxxxxx> wrote:

At IBM i 7.1, we've been using Scott Klement's JDBCR4 for some time now.
It's been great, although we haven't been doing inserts the way the
JDBCR4 presentation shows them; instead we send the SQL statement to
SQL Server, which then uses a Linked Server to select the records to
insert. But we're finding limitations in doing it that way when
inserting millions of records - every so often we get a "Connection
Reset" message. I'm told that the Linked Server can't reliably handle
such a large volume of records, so we're looking at other options.

One option is to instead have the RPG program use JDBCR4 to insert
directly into the SQL Server database. From my understanding of the
presentation, this method inserts one record at a time instead of
writing a block of records. I ran a test insert and it took about 5
minutes to insert 10,000 records - way too slow for our purposes. The
test was block-reading the file, but it was preparing and executing the
insert statement for every record read.
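For what it's worth, the per-record prepare is avoidable even without batching: prepare the statement once outside the read loop, then just rebind and execute inside it. A minimal sketch of the two patterns (table and column names are invented; batching, shown earlier in the thread, is faster still):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class InsertPatterns {
    // The pattern the test describes: a fresh prepare plus a network
    // round trip for every single row.
    static void preparePerRow(Connection conn, int[][] rows) throws SQLException {
        for (int[] row : rows) {
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO CALL_DETAIL (CALL_ID, DURATION) VALUES (?, ?)")) {
                ps.setInt(1, row[0]);
                ps.setInt(2, row[1]);
                ps.executeUpdate();
            }
        }
    }

    // Prepare once, rebind per row: still one round trip per row, but the
    // statement is parsed only once instead of 10,000 times.
    static void prepareOnce(Connection conn, int[][] rows) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO CALL_DETAIL (CALL_ID, DURATION) VALUES (?, ?)")) {
            for (int[] row : rows) {
                ps.setInt(1, row[0]);
                ps.setInt(2, row[1]);
                ps.executeUpdate();
            }
        }
    }
}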

Presentation:
http://www.scottklement.com/presentations/External%20Databases%20from%20RPG.pdf
See page 25 for the prepared statement insert.

Another suggestion was using Client Access to create a file for a bulk
insert by SQL Server.

http://blog.stevienova.com/2009/05/20/etl-method-fastest-way-to-get-data-from-db2-to-microsoft-sql-server/

We also tried using CPYTOIMPF and then FTPing the file to a location
for SQL Server to load with a bulk insert. The bulk insert only took a
minute for 2 million records; the copy and FTP took about 10 minutes.
That speed was great compared both to the test I mentioned above
(inserting directly into the database from RPG) and to the Linked
Server statement method we have been using (when it would work).
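For the loading step, the bulk insert can even be kicked off over the same JDBC connection once the file has landed, which trims one manual step. A hedged sketch (the path and table are invented, and BULK INSERT reads the file from the SQL Server side, e.g. the FTP drop directory):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class BulkLoadSketch {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details and file path.
        String url = "jdbc:sqlserver://myserver:1433;databaseName=MYDB";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement()) {
            stmt.execute(
                "BULK INSERT dbo.CALL_DETAIL"
                + " FROM 'D:\\loads\\call_detail.csv'"
                + " WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')");
        }
    }
}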
I was hoping to use JDBCR4 to perform the insert so there would be
fewer steps involved in the process. It's not necessary, but I thought
I might draw upon others' experience.

Thanks,

Kurt Anderson
Sr. Programmer/Analyst
CustomCall Data Systems, a division of Enghouse Systems Ltd.