Hi Jerry,

1) The SET OPTION statement always has to be the first SQL statement in your program -- nothing new about that; it's always been the case.

2) You should not be using http_url_get(), as that was retired in 2017.  Though I don't think it relates in any way to the problem.

3) The error "automatic storage overflow" means exactly that: you are running out of automatic storage.  All of the variables you have in automatic storage, added together, exceed 16mb.  Automatic storage variables are the "normal" variables defined in RPG within a subprocedure (as opposed to those based on a pointer and allocated from the heap).  I suspect it's all of these CLOB variables you've made; I would try to minimize them.  If you use YAJL and base64_decode you can eliminate SQL, which will help.  You could also manage the storage for some of them yourself from the heap -- especially the ones that are just "temporary" -- which will also help.
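For example (a rough, untested sketch -- the names are made up), one of those big buffers could be based on a pointer and allocated from the heap, so it doesn't count against the automatic storage stack at all:

dcl-s pBigBuf pointer;
dcl-s bigBuf  char(2000000) based(pBigBuf);

pBigBuf = %alloc(%size(bigBuf));   // heap storage, not automatic storage

// ... work with bigBuf here ...

dealloc pBigBuf;                   // give it back when you're done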


On 1/26/2024 1:02 PM, (WalzCraft) Jerry Forss wrote:
I have it working in a test pgm. I am just working with the existing json on the IFS.
I can save to IFS and send to printer. Works great.

Code from test pgm

dcl-s db64 sqltype(clob:2000000);
dcl-s pdf sqltype(clob:2000000);
dcl-s DecodedReportBinary sqltype(blob:2000000);

// SET OPTION must come before any other SQL statement
exec sql SET OPTION COMMIT = *CS, NAMING = *SYS, CLOSQLCSR = *ENDMOD;

// read the JSON document from the IFS into a CLOB
exec sql set :db64 = get_clob_from_file(:ResponseFile, 0);

// get_clob_from_file requires commitment control; the commit ends that unit of work
exec sql commit;

// pull the base64 PDF string out of the JSON
exec sql set :pdf = json_query(:db64, '$.summarypdfb64' omit quotes);

// decode the base64 string into binary
clear DecodedReportBinary;
exec sql values QSYS2.BASE64_DECODE(:pdf) into :DecodedReportBinary;

When I copy the code to the main pgm, it fails because of the
exec sql SET OPTION COMMIT = *CS, NAMING = *SYS, CLOSQLCSR = *ENDMOD;

Apparently it needs to be at the top of the pgm, which is where I have it now. If it isn't there, the pgm won't compile.

When I call the web service to get the JSON, it blows up (see the Automatic storage overflow message at the bottom).
It hasn't even gotten to my new code yet.

When I call the pgm without the new code, it retrieves the JSON perfectly.

What am I doing wrong?

Rc = http_url_get( Request : ResponseFile);

Program: HTTPAPIR4 Library: LIBHTTP Module: HTTPAPIR4
 *********************************************************
 * Receive response chain from server
 *********************************************************
c                   eval      rc = RecvResp( peComm
c                                          : wwRespChain
c                                          : %size(wwRespChain)
c                                          : peTimeout
c                                          : *Off )
c                   if        rc < 1
c                   callp     SetRespCode(rc)
c                   return    rc
c                   endif

 /if defined(MEMCOUNT)
Automatic storage overflow.

Additional Message Information

Message ID . . . . . . : MCH4429
Date sent . . . . . . : 01/26/24 Time sent . . . . . . : 12:49:36

Message . . . . : Automatic storage overflow.

Cause . . . . . : One of the automatic storage stacks X'00000002' for the
thread has overflowed or a storage access beyond the maximum size of a
teraspace automatic storage stack was attempted. Further program execution
within the thread is not possible. Automatic storage stack values and their
meanings follow:
1 -- System stack in single level storage.
2 -- User stack in single level storage.
3 -- System stack in teraspace.
4 -- User stack in teraspace.
Technical description . . . . . . . . : Attempt to reduce the automatic
storage used by programs running in the thread.

-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Scott Klement
Sent: Thursday, January 25, 2024 12:40 PM
To: RPG programming on IBM i <rpg400-l@xxxxxxxxxxxxxxxxxx>
Subject: Re: External RE: Base64Decode

Hi Jerry,

Okay, looks like I have a small mistake in the web page... I didn't know, and nobody mentioned it to me. It's fixed now.

But I'm kinda puzzled as to why this was a show-stopper for you. I mean, it's open source, publicly available code, but you refuse to download it unless you can get an encrypted connection? It's important that this code that's available free to everyone is encrypted so nobody can view it while you download it?

Well, it's fixed now. If you have more problems, please tell me.

-SK


On 1/25/2024 11:35 AM, (WalzCraft) Jerry Forss wrote:
Scott

Thank you for the explanation.

The first thing I tried to do was download your base64 decode.

All it gives me is
base64-v1.1.zip can't be downloaded securely.

That’s when I moved on to SQL.

-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Scott Klement
Sent: Wednesday, January 24, 2024 3:35 PM
To: rpg400-l@xxxxxxxxxxxxxxxxxx
Subject: Re: External RE: Base64Decode

Jerry,

I'm trying to understand the context, so I'm going to distill your message down to just that (i.e. removing all the code that seems largely irrelevant).

You say it works with a tiny little 3k document, but you're having trouble with a 153k document. To me, 153k is still quite small... So I guess I don't understand where the problem is. You say you're thinking to use SQL because of the data size -- and using SQL field types like BLOB, apparently. Which is odd because RPG supports fields up to 16mb in size -- and indeed when you code a BLOB, it gets converted by the precompiler into a regular RPG field, so obviously RPG supports it, right?
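To illustrate (this is from memory, so the exact names and types the precompiler generates may differ slightly on your release), a declaration like

dcl-s pdf sqltype(clob:2000000);

is replaced by the precompiler with an ordinary data structure, roughly:

dcl-ds pdf;
  pdf_len  uns(10);          // current length of the value
  pdf_data char(2000000);    // the data itself -- just a regular RPG field
end-ds;

So each of those CLOBs is simply a full-size local RPG variable.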

I tend to agree that in many cases RPG can be inefficient for large data. This is mostly because RPG insists on everything having fixed-length sizes. In most programming languages a string is not a fixed-length thing -- you don't say "varchar(100000)" you just say "string" (or "varchar" which essentially means the same thing) and it takes care of managing the size of the string for you. It will start out very small, and expand as needed. Whereas in RPG you have to know the size at compile time, and if you have an array (which is also a fixed length) it doesn't take long before you are running out of space. The way you solve that is by working with pointers and managing the memory yourself -- but then you have to really know what you are doing, and often can't take advantage of RPG opcodes or BIFs since they don't understand your dynamically allocated memory structures.
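To make that concrete, here's a rough, untested sketch (names made up) of the pointer-and-heap pattern for a buffer that can grow:

dcl-s pBuf       pointer;
dcl-s bufSize    int(10) inz(65535);
dcl-s neededSize int(10);
dcl-s buf        char(16000000) based(pBuf);  // declared max; only bufSize bytes exist

pBuf = %alloc(bufSize);                       // start small

// when more room is needed, grow the allocation (contents are preserved)
if neededSize > bufSize;
  bufSize = neededSize;
  pBuf = %realloc(pBuf : bufSize);
endif;

// only the first bufSize bytes of buf are safe to touch
%subst(buf : 1 : bufSize) = *blanks;

dealloc pBuf;

It works, but as I said, you're now doing the bookkeeping yourself, and the string opcodes and BIFs won't protect you from running past what you actually allocated.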

You suggest using SQL to overcome this, and I don't really see how that helps. You still have to bring the data into RPG variables. SQL's
base64 decoder also has a 4096 limit on it, which makes life more
difficult than it really should be. (You can technically get around
it by decoding data in a loop, but...)
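For what it's worth, the loop looks roughly like this (untested sketch, variable names made up; it assumes the 4096-character-per-call limit described above, and that the base64 string has no embedded line breaks, so each 4096-character chunk -- a multiple of 4 -- decodes cleanly on its own):

dcl-s b64Data  varchar(2000000);   // the whole base64 string, however you got it
dcl-s chunk    varchar(4096);
dcl-s decoded  sqltype(blob:3072); // 4096 base64 chars decode to 3072 bytes
dcl-s pos      int(10) inz(1);
dcl-s chunkLen int(10);

dow pos <= %len(b64Data);
  chunkLen = %len(b64Data) - pos + 1;
  if chunkLen > 4096;
    chunkLen = 4096;
  endif;
  chunk = %subst(b64Data : pos : chunkLen);

  exec sql values QSYS2.BASE64_DECODE(:chunk) into :decoded;

  // append decoded_data (the first decoded_len bytes of it) to your output --
  // for example, write it to a stream file

  pos += chunkLen;
endow;

It works, but you can see why I say it's more difficult than it should be.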

I'm not completely convinced that your document is large enough for the above to be a major concern, though. 153k isn't very big, imho. I think you need to give some thought though about what the maximum really is... is 200k really the maximum you'll ever need? Would something like a 2mb field be better?

But if it is a concern, I would suggest using YAJL's subprocedures and my open source base64 service program, which will handle large data much more efficiently. It's not as simple, perhaps, since you have to code the subprocedure calls individually for each part, but it allows you to deal with single array elements and single fields at a time, so you don't need the whole shebang loaded into an array or structure. And the base64 routines handle up to 2gb (at least in theory), which is a lot larger than 4k.
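Very roughly, the YAJL side looks like this (untested, off the top of my head -- check the YAJL_H copy member and the base64 service program's copy member for the exact prototypes and size limits; the IFS path here is made up, and the field name is just the one from your example):

/include yajl_h

dcl-s docNode like(yajl_val);
dcl-s pdfNode like(yajl_val);
dcl-s errMsg  varchar(500);
dcl-s b64     varchar(2000000);

// load the JSON document straight from the IFS -- no SQL involved
docNode = yajl_stmf_load_tree('/path/to/response.json' : errMsg);
if errMsg <> '';
  // handle the parse error here
endif;

// pull out just the one field that holds the base64 PDF
pdfNode = yajl_object_find(docNode : 'summarypdfb64');
b64 = yajl_get_string(pdfNode);   // check your YAJL version's limit on long string values

yajl_tree_free(docNode);

// then hand b64 to base64_decode() from the base64 service program
// (see its copy member for the parameters) to get the binary PDF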

Have you thought about hiring someone on a consulting basis to look at this with you and figure out a solution, then teach you about it?

On 1/24/2024 2:23 PM, (WalzCraft) Jerry Forss wrote:
Scott

It is working using DATA-INTO with YAJL for a base64 ZPL file, which is only 3760 bytes long.

The new problem is a JSON document that contains a base64 PDF that is over 153,000 bytes long.
I am trying different data types like SQLType(BLOB : 200000).

I am thinking I need to use SQL to consume the JSON because of the data size.




