Is this being done by JNI?

Or a client / server approach? I ask because with JNI, if even one object
created inside a loop is not destroyed manually (by freeing its local
reference, e.g. freeLocalRef, or by freeing a whole object group), then you
have a memory leak.

Also, for such a big task (i.e. 466,000 records) garbage collection isn't
really going to help you anyway, because every cell object belongs to a row,
every row to a sheet, and every sheet to a workbook. As long as you have a
reference to your workbook object (which you need if you're going to save the
spreadsheet), then you have a reference to all of the other objects (cells,
rows, etc.) as well, so GC can't delete them.
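
A minimal POI HSSF sketch of that reachability chain (class names from the
2008-era org.apache.poi.hssf.usermodel package; exact createRow/createCell
signatures vary by POI version, and the loop bound is arbitrary):

import java.io.FileOutputStream;
import org.apache.poi.hssf.usermodel.HSSFCell;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class ReachabilityDemo {
    public static void main(String[] args) throws Exception {
        HSSFWorkbook wb = new HSSFWorkbook();
        HSSFSheet sheet = wb.createSheet("data");
        for (int i = 0; i < 65000; i++) {
            HSSFRow row = sheet.createRow(i);
            HSSFCell cell = row.createCell((short) 0); // older POI used a short column index
            cell.setCellValue(i);
            // row and cell go out of scope here, but both remain reachable
            // through wb -> sheet -> row -> cell, so the garbage collector
            // cannot reclaim any of them while wb is still referenced.
        }
        // The whole object graph must stay in memory until the write.
        FileOutputStream out = new FileOutputStream("data.xls");
        wb.write(out);
        out.close();
    }
}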

Neill



-----Original Message-----
From: java400-l-bounces@xxxxxxxxxxxx [mailto:java400-l-bounces@xxxxxxxxxxxx]
On Behalf Of Blalock, Bill
Sent: 20 October 2008 18:09
To: Java Programming on and around the iSeries / AS400
Subject: RE: A question about JAVA JVM and large EXCEL spreadsheet creation



> But more importantly, is there anything I can do short of multiple
> files, to help the program process the file faster?

You could create a data area holding the number of records to process per
run and the starting record. Say those were 30,000 and 1 initially.

Run the program. When 30,000 records have been processed, update the data
area's starting place to 30,001 and end the program. Be sure to set on LR
and do all the cleanup.

The next execution picks up at 30,001 and runs through 60,000.

Sounds hokey, I know. But if the program is doing everything you want and
just hits a wall at around 35,000 records, then stopping before you hit the
wall and letting the OS clean up is the fastest way to fix it.

The more elegant ways to fix this will involve Java programming.
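
A rough Java sketch of the same checkpoint idea, using a properties file in
place of the data area (the file name, batch size, and the processRecords
worker are all hypothetical stand-ins):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.Properties;

public class CheckpointedRun {
    static final int BATCH = 30000;                   // records per run (assumed)
    static final File CKPT = new File("cvtxls.ckpt"); // stand-in for the data area

    public static void main(String[] args) throws Exception {
        // Read the starting record number; default to 1 on the first run.
        Properties p = new Properties();
        if (CKPT.exists()) {
            FileInputStream in = new FileInputStream(CKPT);
            p.load(in);
            in.close();
        }
        int start = Integer.parseInt(p.getProperty("start", "1"));

        // Process one batch, then stop and record where to resume.
        int last = processRecords(start, start + BATCH - 1);

        p.setProperty("start", String.valueOf(last + 1));
        FileOutputStream out = new FileOutputStream(CKPT);
        p.store(out, "next record to process");
        out.close();
        // Ending the JVM here lets the OS reclaim all memory at once,
        // which is the "let the OS clean up" step described above.
    }

    // Hypothetical worker: convert records first..last, return the last one done.
    static int processRecords(int first, int last) {
        /* ... read the file, write rows to the spreadsheet ... */
        return last;
    }
}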

-----Original Message-----
From: java400-l-bounces@xxxxxxxxxxxx
[mailto:java400-l-bounces@xxxxxxxxxxxx] On Behalf Of Marvin Radding
Sent: Monday, October 20, 2008 11:21 AM
To: java400-l@xxxxxxxxxxxx
Subject: A question about JAVA JVM and large EXCEL spreadsheet creation

I have created a command (CVTXLS) to convert a file into an Excel
spreadsheet using the HSSF/POI classes from the Jakarta project. (Thanks to
Scott Klement for his documentation.)

I am using this command to convert a 466,000-record file into a
spreadsheet. It has 113 columns of mostly numeric data. The code is already
able to break every 65k records and start a new tab.
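
A minimal HSSF sketch of that 65k break (the record loop and the single-cell
write are simplified stand-ins; the .xls format caps each sheet at 65,536
rows, which is where the break comes from):

import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class SheetRollover {
    static final int MAX_ROWS = 65536; // BIFF8 (.xls) row limit per sheet

    public static void main(String[] args) {
        HSSFWorkbook wb = new HSSFWorkbook();
        HSSFSheet sheet = null;
        int rowInSheet = 0;
        int sheetNum = 0;

        for (int rec = 0; rec < 466000; rec++) {   // record count from the post
            if (sheet == null || rowInSheet == MAX_ROWS) {
                sheet = wb.createSheet("Tab" + (++sheetNum)); // start a new tab
                rowInSheet = 0;
            }
            HSSFRow row = sheet.createRow(rowInSheet++);
            row.createCell((short) 0).setCellValue(rec); // 113 columns in reality
        }
        // ... write wb to an output stream as usual ...
    }
}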

The problem is that after processing the first 35k records quite quickly,
it is now working very slowly, and I am wondering why. Is this a memory
allocation problem? Can I speed things up by saving the file every now
and then?

Can anyone tell me why it suddenly went from 30k records in the first
hour to only a few hundred records in the next hour? But more
importantly, is there anything I can do, short of multiple files, to help
the program process the file faster?

Thanks,

Marvin


