No journaling needed. The file is processed, then cleared for the next load.
The file on the System i is a regular DDS-created PF.
BTW, there were a few older posts about this issue, one describing it as a "bulk load", but no solution that I could see.
I can't seem to capture which server job is handling this request to see if anything else is going on.
jim franz
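
A quick sketch of how that setup can be checked from a 5250 session, assuming a hypothetical file name of MYLIB/LOADFILE (substitute the real library and file). DSPFD shows the file's attributes, including whether it is journaled; DSPDBR lists any dependent logical files; WRKOBJLCK, run while the load is in progress, shows which jobs hold locks on the file (with the OLE DB provider this is typically one of the QZDASOINIT database host server prestart jobs); and CLRPFM is the clear-before-reload step described above.

DSPFD     FILE(MYLIB/LOADFILE) TYPE(*ATR)     /* file attributes, including journaling status      */
DSPDBR    FILE(MYLIB/LOADFILE)                /* any dependent logical files over the PF           */
WRKOBJLCK OBJ(MYLIB/LOADFILE) OBJTYPE(*FILE)  /* run during the load to see which jobs hold locks  */
CLRPFM    FILE(MYLIB/LOADFILE)                /* clear the member before the next load             */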

----- Original Message ----- From: "Dan Kimmel" <dkimmel@xxxxxxxxxxxxxxx>
To: "Midrange Systems Technical Discussion" <midrange-l@xxxxxxxxxxxx>
Sent: Wednesday, January 13, 2010 6:44 PM
Subject: RE: ole db/400 performance large file


This should run very quickly if you have no index and no key (uniqueness
constraint). Just make sure you have plenty of room on the box, both
memory and disk. How are you creating the file? An SQL create will
probably allocate more room, but it also will journal the file by
default. For this kind of operation, I'll usually create with SQL and
then delete the journal. If I need journaling, I'll start it after the
load.
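
A rough sketch of what Dan describes, using placeholder names (MYLIB/LOADFILE for the table, MYLIB/QSQJRN for the journal an SQL schema sets up by default; if the library was created some other way, substitute whatever journal is actually attached) and a placeholder column list. It uses ENDJRNPF to end journaling on the file rather than deleting the journal object itself, and assumes system naming for the CREATE TABLE:

CREATE TABLE MYLIB/LOADFILE (COL1 CHAR(10), COL2 DEC(9, 0))     -- no primary key or unique constraint
ENDJRNPF FILE(MYLIB/LOADFILE) JRN(MYLIB/QSQJRN)                 /* stop journaling before the bulk load */
/* ... run the OLE DB load here ... */
STRJRNPF FILE(MYLIB/LOADFILE) JRN(MYLIB/QSQJRN) IMAGES(*AFTER)  /* restart only if journaling is needed */

If the file only ever holds the latest load and is cleared each cycle, as Jim describes above, there may be no need to restart journaling at all.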

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx
[mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Jim Franz
Sent: Wednesday, January 13, 2010 4:29 PM
To: MIDRANGE-L@xxxxxxxxxxxx
Subject: ole db/400 performance large file

I've searched the archives for this and not found a good answer.
One of the network programmers is using the iSeries Access V5R4 OLE DB
connection to move 500,000 records from a network server to the System i
(V5R4). The native file has no logicals and no key.
It is taking many minutes to load the first 10,000 records and hours to load
all 500,000.
It is a local LAN connection, and I cannot see why it is so slow.
The file is not in use on either end, and the record layout is not huge.
How can we speed this up? It needs to run on a regular basis (weekly or
even daily).
We have considered FTP, but the FTP server is turned off on the production server.

The same OLE DB connection is used for many applications, but those are
generally random access and have no problems. Only the large file is a problem.
We have verified that the network side accumulates the records very quickly.
Jim Franz