If performance were the primary concern, you might need to look into parallel processing on the MS SQL Server side, using multiple threads to load the data.
Have an IBM i process that outputs records to, say, 4 text files, FTP them to the SQL Server machine, then run some kind of multi-threaded process (or multiple processes) on that box to perform the inserts simultaneously.
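A rough sketch of that "split the extract four ways, load in parallel" idea, in Java. The loader body here is a stand-in (it just counts records); in a real run each worker would open its own connection and insert its slice. The record data and thread count are made up for illustration.

```java
import java.util.*;
import java.util.concurrent.*;

public class ParallelLoad {
    // Placeholder for the real per-file loader: in practice this would
    // open a Connection and INSERT each record into SQL Server.
    static int loadFile(List<String> records) {
        int inserted = 0;
        for (String rec : records) inserted++;   // stand-in for one INSERT per record
        return inserted;
    }

    public static void main(String[] args) throws Exception {
        // Fake extract standing in for the FTP'd text files.
        List<String> all = new ArrayList<>();
        for (int i = 0; i < 1000; i++) all.add("rec" + i);

        int threads = 4;                         // one worker per split file
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Future<Integer>> results = new ArrayList<>();
        int chunk = all.size() / threads;
        for (int t = 0; t < threads; t++) {
            List<String> slice = all.subList(t * chunk,
                    t == threads - 1 ? all.size() : (t + 1) * chunk);
            results.add(pool.submit(() -> loadFile(slice)));
        }
        int total = 0;
        for (Future<Integer> f : results) total += f.get();
        pool.shutdown();
        System.out.println("inserted " + total);  // prints "inserted 1000"
    }
}
```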
As you've discovered, sending millions of "messages" via JDBC has too much overhead. And no matter how you slice and dice this, you're still subject to SQL Inserts on the SQL Server box.
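Standard JDBC batching is one way to cut that per-row overhead: prepare the statement once, then use addBatch()/executeBatch() so each network round trip carries a chunk of rows instead of one. The table, columns, and batch size below are hypothetical, and the jTDS URL in the comment is only an example.

```java
import java.sql.*;
import java.util.*;

public class BulkLoad {
    // Hypothetical target table and columns; adjust to your schema.
    static final String SQL =
        "INSERT INTO CallDetail (CallId, Duration) VALUES (?, ?)";
    static final int BATCH_SIZE = 1000;

    // Prepare once, then addBatch/executeBatch in chunks, with one
    // commit per batch rather than per row.
    static void load(Connection con, List<Object[]> rows) throws SQLException {
        con.setAutoCommit(false);
        try (PreparedStatement ps = con.prepareStatement(SQL)) {
            int n = 0;
            for (Object[] r : rows) {
                ps.setObject(1, r[0]);
                ps.setObject(2, r[1]);
                ps.addBatch();
                if (++n % BATCH_SIZE == 0) {
                    ps.executeBatch();           // one round trip for the chunk
                    con.commit();
                }
            }
            ps.executeBatch();                   // flush any remainder
            con.commit();
        }
    }

    public static void main(String[] args) throws Exception {
        if (args.length == 0) {                  // no database handy: just show the statement
            System.out.println(SQL);
            return;
        }
        // e.g. jdbc:jtds:sqlserver://host:1433/mydb;user=u;password=p
        try (Connection con = DriverManager.getConnection(args[0])) {
            load(con, Collections.emptyList());
        }
    }
}
```

Even batched, every row still goes through a SQL insert on the server side, which is why the file-plus-BULK-INSERT route tends to win at the millions-of-rows scale.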
If the positions were reversed, and you were importing data into IBM i files, you could increase performance GREATLY by using RPG block reads and writes.
----- Original Message -----
From: "Anderson, Kurt" <KAnderson@xxxxxxxxxxxx>
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxx>
Sent: Monday, February 25, 2013 10:49 AM
Subject: RE: JDBCR4 and Inserts
I'm using the jTDS driver.
Yep, I'm re-preparing the statement every time. I'll test the idea of preparing once and see how it goes. Though it sounds like the JDBC connection might not be the answer here, and that's fine. Good idea about mapping to the IFS.
In the scenario where I'm creating a file on the IFS, Bulk Insert is how the file is getting loaded into the database.
Thanks for the responses,