Sorry, I am repeating the message because the previous one had a badly formed
subject. The magic of copy & paste.

Since the last message we have done some more work, and it works a little
better.
As Don Brown suggested, we upgraded POI to version 3.14 and changed some of
the API calls. We had not been closing the XSSF workbook that is the
template base for our SXSSFWorkbook object. Running our function in a loop,
we found that closing that underlying workbook was required. Now we can
execute our functions in a repeated cycle without problems.
But it still fails when we reach about 200,000 rows, far short of what Don
gets (a million records).
We found that the error is produced when the Excel file is written, as if
the temporary files had exhausted the JVM space. In fact, it throws an
out-of-memory error. Our code to do this is:
SXSSFStr = new_String(%trimr(&(1:)));
SXSSFile = new_FileOutputStream(SXSSFStr);
SXSSF_write(ExcelSXSSFBook:SXSSFile);
FileOutputStream_close(SXSSFile);
hssf_freeLocalRef(SXSSFile);
hssf_freeLocalRef(SXSSFStr);

and SXSSF_write is defined this way:
*-----------------------------------------------------------
D SXSSF_write PR EXTPROC(*JAVA
D :SXSSF_WORKBOOK_CLASS
D :'write')
D tgtExcelFile like(jOutputStream)

At the end, we first dispose of the temporary files, and then close the
original template.
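For readers who do not know our RPG wrappers: the Java side of this whole
sequence (write the workbook, dispose of the SXSSF temporary files, close
the template) would look roughly like the sketch below. The paths, the
window size of 100 rows and the class name are only placeholders, not our
real code.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class WriteSketch {
    public static void main(String[] args) throws Exception {
        // Open the XSSF template and wrap it in a streaming workbook;
        // only the row window stays in memory, older rows go to temp files.
        XSSFWorkbook template = new XSSFWorkbook(new FileInputStream(args[0]));
        SXSSFWorkbook wb = new SXSSFWorkbook(template, 100);

        // ... rows are populated here ...

        FileOutputStream out = new FileOutputStream(args[1]);
        wb.write(out);      // the step where our out-of-memory error appears
        out.close();
        wb.dispose();       // delete the SXSSF temporary files
        template.close();   // close the underlying template workbook
    }
}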

&(1:) is a parameter passed to our API from CA Plex; in this case, a string
with the file name and full path.
All our programs are designed and generated with the CA Plex generator.

For our Java execution, we declare an environment for the starting job. A CL
program declares the required environment variables, basically the classpath
and the Java home. However, while the job runs, a complete list of PASE
variables is also declared; I suppose these are defined by the system when
it starts Java. As far as I can see in the IBM Java manuals, those
definitions do not interfere with our job.
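To check which JVM the job is really using, beyond the job messages, we
could print a few system properties from inside the same JVM (a small
sketch; the class name is only an example, the property names are the
standard ones):

public class JvmInfo {
    public static void main(String[] args) {
        // These properties identify the JVM that actually got loaded.
        System.out.println("java.home       = " + System.getProperty("java.home"));
        System.out.println("java.version    = " + System.getProperty("java.version"));
        System.out.println("os.arch         = " + System.getProperty("os.arch"));
        System.out.println("java.class.path = " + System.getProperty("java.class.path"));
    }
}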

In sum, our definitions seem to work, but we get an out-of-memory error when
a certain number of rows is reached. The error is produced when calling the
write method.

We have some questions:
I have seen some developers (on StackOverflow) calling the flush() method
once the window row limit is reached. Does it write to the Excel file or to
a temporary file? In the latter case, the problem does not change.
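As far as I understand the SXSSF model (my reading of the POI javadoc, so
please correct me), flushRows() only moves rows from memory into the
temporary files; the final .xlsx is still assembled by the write() call.
Roughly, in Java (the sheet name and row counts are only illustrative):

import org.apache.poi.xssf.streaming.SXSSFSheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class FlushSketch {
    public static void main(String[] args) throws Exception {
        // -1 disables the automatic sliding window, so we flush by hand.
        SXSSFWorkbook wb = new SXSSFWorkbook(-1);
        SXSSFSheet sheet = (SXSSFSheet) wb.createSheet("data");
        for (int r = 0; r < 200000; r++) {
            sheet.createRow(r).createCell(0).setCellValue(r);
            if (r % 1000 == 0) {
                // Pushes the rows held in memory out to the temp file;
                // nothing is written to the target .xlsx at this point.
                sheet.flushRows();
            }
        }
        // wb.write(...) would still be needed to produce the real .xlsx file.
        wb.dispose();
        wb.close();
    }
}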
A colleague asked if we are using 64-bit Java. As far as I can see, yes, we
do; it is what I see when the dump executes.
He also says:


*The problem is how to ensure your jobs running ILE RPG IV and using "POI"
(via that JNI interface) will use the 64-bit version. The 64-bit JVM gives
you the possibility to have a much larger "address space" for the Java
Virtual Machine, and thus a much larger heap size. This will ultimately
impact the application for things like a very large spreadsheet, if you run
out of heap space.*

which suggests two questions:
1. We are starting a defined version of Java, and we can see that it is the
Java version in use when we look at our job messages and at the dump data.
Could JNI with POI call a different Java?
2. If we are indeed using 64-bit Java 8, how can we improve memory
management?
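On question 2, and this is only what I gather from the ILE RPG documentation
(we have not tried it yet): the JVM that RPG starts for JNI calls takes its
startup options from the SystemDefault.properties file or from the
QIBM_RPG_JAVA_PROPERTIES environment variable, so adding something like
-Xmx2048m there should enlarge the heap. A small Java check (the class name
is just an example) that could be called from the same job to confirm the
bit mode and the heap ceiling:

public class HeapCheck {
    public static void main(String[] args) {
        // IBM JVMs report "32" or "64" here; other vendors may not set it.
        System.out.println("bitmode       = " + System.getProperty("com.ibm.vm.bitmode"));
        // maxMemory() is the ceiling the heap can grow to (the -Xmx value).
        System.out.println("max heap (MB) = " + Runtime.getRuntime().maxMemory() / (1024 * 1024));
    }
}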

Any help would be welcome.
With regards
Jorge Ubeda
