Jerry,

I'm trying to understand the context, so I'm going to distill your message down to the essentials (i.e. removing all of the code, which seems largely irrelevant).

You say it works with a tiny little 3k document, but you're having trouble with a 153k document.  To me, 153k is still quite small... so I guess I don't understand where the problem is.  You say you're thinking of using SQL because of the data size -- and using SQL field types like BLOB, apparently.  Which is odd, because RPG supports fields up to 16 MB in size -- and indeed, when you code a BLOB, the precompiler converts it into a regular RPG field, so obviously RPG supports it, right?
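
Just to illustrate (this is a sketch from memory, not the exact code the precompiler generates, and the field names are made up):

      // The SQL precompiler expands SQLTYPE(BLOB : 200000) into
      // something roughly like this plain RPG data structure:
      dcl-ds myBlob;
         myBlob_len  uns(10);
         myBlob_data char(200000);
      end-ds;

      // And plain RPG can declare a large variable-length field
      // directly, up to about 16 MB:
      dcl-s pdfData varchar(16000000);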

I tend to agree that in many cases RPG can be inefficient for large data.  This is mostly because RPG insists on everything having a fixed length.  In most programming languages a string is not a fixed-length thing -- you don't say "varchar(100000)", you just say "string" (or "varchar", which means essentially the same thing) and the language manages the size of the string for you.  It starts out very small and expands as needed.  Whereas in RPG you have to know the size at compile time, and if you have an array (which also has a fixed length) it doesn't take long before you run out of space.  The way you solve that is by working with pointers and managing the memory yourself -- but then you have to really know what you are doing, and often can't take advantage of RPG opcodes or BIFs, since they don't understand your dynamically allocated memory structures.
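
Here's roughly what that pointer-based approach looks like.  This is just a sketch of the pattern with made-up names, not tested code:

      dcl-pr memcpy pointer extproc('memcpy');
         dest   pointer value;
         src    pointer value;
         length uns(10) value;
      end-pr;

      dcl-s buf     pointer inz(*null);
      dcl-s bufSize int(10) inz(0);
      dcl-s bufUsed int(10) inz(0);
      dcl-s chunk   varchar(4096);     // whatever piece just arrived

      // grow the buffer in 64k increments as the data comes in
      if bufUsed + %len(chunk) > bufSize;
         bufSize += 65536;
         if buf = *null;
            buf = %alloc(bufSize);
         else;
            buf = %realloc(buf: bufSize);
         endif;
      endif;

      // copy the new piece in with memcpy(), since RPG's opcodes and
      // BIFs don't know anything about this dynamically allocated
      // storage
      memcpy(buf + bufUsed: %addr(chunk: *data): %len(chunk));
      bufUsed += %len(chunk);

      // ... when you're finished with it ...
      dealloc buf;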

You suggest using SQL to overcome this, and I don't really see how that helps.  You still have to bring the data into RPG variables.  SQL's base64 decoder also has a 4096 limit on it, which makes life more difficult than it really should be.  (You can technically get around it by decoding the data in a loop, as sketched below, but...)
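
For what it's worth, the loop looks something like this.  I'm going from memory on SYSTOOLS.BASE64DECODE, so double-check the details.  The important point is that each chunk of encoded data must be a multiple of 4 characters so the pieces decode independently (and this assumes the encoded data has no line breaks in it):

      dcl-s b64      varchar(250000);      // base64 string from the JSON
      dcl-s piece    varchar(4096);
      dcl-s binPiece varchar(3072) ccsid(*hex);  // 4096 b64 chars = 3072 bytes
      dcl-s result   varchar(200000) ccsid(*hex);
      dcl-s pos      int(10);
      dcl-s chunkLen int(10);

      result = '';
      pos = 1;

      dow pos <= %len(b64);
         chunkLen = %len(b64) - pos + 1;
         if chunkLen > 4096;
            chunkLen = 4096;          // 4096 is a multiple of 4
         endif;
         piece = %subst(b64: pos: chunkLen);

         exec sql set :binPiece = systools.base64decode(:piece);

         result += binPiece;
         pos += chunkLen;
      enddo;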

I'm not completely convinced that your document is large enough for the above to be a major concern, though.  153k isn't very big, imho.  But I think you need to give some thought to what the maximum really is... is 200k really the largest you'll ever need?  Would something like a 2 MB field be better?

But if it is a concern, I would suggest using YAJL's subprocedures together with my open source base64 service program, which will let you handle large data much more efficiently.  It's not as simple, perhaps, since you have to code the subprocedure calls individually for each part, but... it lets you deal with single array elements and single fields at a time, so you don't need the whole shebang loaded into one array or structure.  And the base64 routines handle up to 2 GB (at least in theory), which is a lot larger than 4k.
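
The general shape would be something like this.  The names are from memory, so check the YAJL and BASE64 copybooks for the exact prototypes, and 'pdfData' is just a made-up field name for whatever your JSON actually calls it:

      /include yajl_h
      /include base64_h

      dcl-s tree   like(yajl_val);
      dcl-s node   like(yajl_val);
      dcl-s errMsg varchar(500);
      dcl-s b64    varchar(250000);
      dcl-s pdf    char(200000);
      dcl-s pdfLen int(10);

      // load the whole JSON document into a tree
      tree = yajl_stmf_load_tree('/tmp/label.json': errMsg);
      if errMsg <> '';
         // handle the parse error
      endif;

      // pull out just the one field you care about
      node = yajl_object_find(tree: 'pdfData');
      b64  = yajl_get_string(node);
      // (yajl_get_string may have a size limit -- for really huge
      // values you may need to work with the node's data pointer
      // directly)

      // decode it with the BASE64 service program, which handles
      // up to 2 GB rather than 4k
      pdfLen = base64_decode( %addr(b64: *data) : %len(b64)
                            : %addr(pdf) : %size(pdf) );

      yajl_tree_free(tree);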

Have you thought about hiring someone on a consulting basis to look at this with you and figure out a solution, then teach you about it?

On 1/24/2024 2:23 PM, (WalzCraft) Jerry Forss wrote:
Scott

It is working using DATA-INTO with YAJL for a base64 ZPL file, which is only 3,760 bytes long.

The new problem is a JSON document containing a base64 PDF that is over 153,000 bytes long.
I am trying different data types, like SQLType(BLOB : 200000).

I am thinking I need to use SQL to consume the JSON because of the data size.


