This could be a garbage collection problem.

Java objects which are no longer needed hang around in memory until
garbage collection sweeps them up and returns their memory to the
pool. The JVM does garbage collection when it has nothing else to do
... when it is waiting for something else.

It sounds like your program is running full throttle until heap space
is depleted, which forces garbage collection. I could be wrong about that.

You have some control over garbage collection for Java programs executed
by RUNJVA:
GCHINL (garbage collection initial size)
GCHMAX (garbage collection maximum size)

Prompt up RUNJVA, get into extended help, and you can read about garbage
collection.
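
For example, something like this (the class name is made up and the
sizes are only illustrative; use whatever your CVTXLS command actually
runs):

    RUNJVA CLASS('com.example.CvtXls') GCHINL(65536) GCHMAX(*NOMAX)

If I remember right, GCHINL and GCHMAX are specified in kilobytes, so
GCHINL(65536) asks for a 64 MB initial heap.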

When Java is run from QSH you can indirectly affect garbage collection
by setting the initial and maximum heap sizes. This will affect when
Java starts to run short of heap memory (if that is the problem).
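
Something along these lines (the path and class name are just
placeholders):

    java -Xms64m -Xmx512m -cp /home/me/poi.jar:/home/me/classes com.example.CvtXls

-Xms sets the initial heap size and -Xmx sets the maximum heap size.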

In the Java code itself you can use the method
System.gc();
to "suggest" to the JVM that it should collect the garbage.

I hope that helps.

Bill Blalock

-----Original Message-----
From: java400-l-bounces@xxxxxxxxxxxx
[mailto:java400-l-bounces@xxxxxxxxxxxx] On Behalf Of Marvin Radding
Sent: Monday, October 20, 2008 11:21 AM
To: java400-l@xxxxxxxxxxxx
Subject: A question about JAVA JVM and large EXCEL spreadsheet creation

I have created a command (CVTXLS) to convert a file into an EXCEL
spreadsheet using the HSSF/POI classes from the Jakarta project. (Thanks
to Scott Klement for his documentation.)

I am using this command to convert a 466,000 record file into a
spreadsheet. It has 113 columns of mostly numeric data. The code is
already able to break every 65k records and start a new tab.
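
In outline it does something like this (a simplified sketch, not the
actual code; assumes a POI 3.x-style HSSF API, and getValue() stands in
for the real field mapping):

    import org.apache.poi.hssf.usermodel.*;
    import java.io.FileOutputStream;

    HSSFWorkbook wb = new HSSFWorkbook();
    HSSFSheet sheet = null;
    final int ROWS_PER_SHEET = 65536;   // .xls per-sheet row limit

    for (int i = 0; i < totalRecords; i++) {
        if (i % ROWS_PER_SHEET == 0) {
            // start a new tab every 65k records
            sheet = wb.createSheet("Tab" + (i / ROWS_PER_SHEET + 1));
        }
        HSSFRow row = sheet.createRow(i % ROWS_PER_SHEET);
        row.createCell(0).setCellValue(getValue(i));
    }

    FileOutputStream out = new FileOutputStream("/tmp/output.xls");
    wb.write(out);
    out.close();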

The problem is that after processing over 35k records quite quickly, it
is now working very slowly, and I was wondering why. Is this a memory
allocation problem? Can I speed things up by saving the file every now
and then?

Can anyone tell me why it suddenly went from 30k records in the first
hour to only a few hundred records in the next hour? But more
importantly, is there anything I can do, short of multiple files, to
help the program process the file faster?

Thanks,

Marvin


