On Jul 20, 2023, at 11:22 AM, mlazarus <mlazarus@xxxxxxxxxxxx> wrote:
Jon,
That *is* nasty. How would you approach processing that?
-mark
On 7/20/2023 10:56 AM, Jon Paris wrote:
Nasty.
The nastiest one I encountered was one where they basically made up the data names based on the query. If the query (for example) returned two products, then instead of an array of data containing something like:
{
  "product": [
    {
      "productname": "Loveseat",
      "price": 395.99
    },
    {
      "productname": "Sofa",
      "price": 1159.99
    }
  ]
}
It would instead be:
{
  "product": [
    {
      "Loveseat": 395.99
    },
    {
      "Sofa": 1159.99
    }
  ]
}
Bad enough by itself, but a similar system was used for other features. Truly horrible "design".
In the JSON world, there seems to be rather too much of a "because we can" attitude.
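A minimal sketch of why that shape hurts, using SQL's JSON_TABLE with the values from the example above (the column type is a guess): because the product name *is* the key, every lookup has to hard-code a name into the path - there is no way to ask for "whatever key is present".

-- Each product needs its own hard-coded path; the "Sofa" element
-- simply comes back NULL here.
select price
  from json_table(
    '{"product": [{"Loveseat": 395.99}, {"Sofa": 1159.99}]}',
    'lax $.product[*]'
    columns (
      price decimal(7, 2) path '$.Loveseat'
    )
  ) as jt;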
Jon P.
On Jul 20, 2023, at 8:32 AM, Brad Stone <bvstone@xxxxxxxxx> wrote:
I had one just the other day that had an array of descriptions for an
item:
item: x
descriptions: [
desc1, desc2, desc3, desc4
]
Then in another document they did:
item: x
description_1: xxx,
description_2: xxx,
...
up to 23. Why 23? Who knows.
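A rough sketch of what consuming that second shape looks like with JSON_TABLE (made-up values, guessed types): every numbered key has to be enumerated by hand.

-- Numbered key names force one hard-coded column per possible key.
select *
  from json_table(
    '{"item": "x", "description_1": "red", "description_2": "large"}',
    'lax $'
    columns (
      item          varchar(50)  path '$.item',
      description_1 varchar(100) path '$.description_1',
      description_2 varchar(100) path '$.description_2'
      -- ... and so on, up to description_23
    )
  ) as jt;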
On Wed, Jul 19, 2023 at 5:00 PM Jon Paris <jon.paris@xxxxxxxxxxxxxx> wrote:
The ones I hate are those that introduce arbitrarily named elements into
the stream - apparently "because they can".
Jon P.
On Jul 19, 2023, at 5:09 PM, Brad Stone <bvstone@xxxxxxxxx> wrote:
Don't you love how you can deal with 100 different JSON endpoints and get
100 different interpretations of JSON formatting? lol.
On Wed, Jul 19, 2023 at 4:00 PM Greg Wilburn <gwilburn@xxxxxxxxxxxxxxxxxxxxxxx> wrote:
Vern,
Thank you! That works perfectly...
However, the API developer added a parameter to the call that causes it
to respond with element names. I was going to try something new, but it
was too easy to complete the task with DATA-INTO.
Thanks to all! I learned something.
Greg
-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
VERNON HAMBERG Owner via RPG400-L
Sent: Tuesday, July 18, 2023 11:00 AM
To: rpg400-l@xxxxxxxxxxxxxxxxxx
Cc: VERNON HAMBERG Owner <vhamberg@xxxxxxxxxxxxxxx>;
rpg400-l@xxxxxxxxxxxxxxxxxx
Subject: RE: JSON un-named arrays
Greg
You need to add WITH CHG at the end of the statement - there might be
other commit settings one could use, but that's what I and others here
have used.
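Spelled out, a sketch of the finished statement (the IFS path and the single column are placeholders):

-- The isolation clause gives the statement a commit level other than
-- *NONE, which the LOB locator behind GET_CLOB_FROM_FILE requires -
-- that is the SQL0443 error Greg hit.
select records
  from json_table(
    get_clob_from_file('/home/greg/latest_response.json'),
    'lax $'
    columns (
      records integer path '$.records'
    )
  ) as jt
  with chg;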
Good luck
Vern
On Tue, 18 Jul, 2023 at 8:06 AM, Greg Wilburn <gwilburn@xxxxxxxxxxxxxxxxxxxxxxx> wrote:
To: rpg programming on ibm i
Vern,
That's what I tried... I get an SQL error: SQLSTATE 42926, SQL0443 - LOB
and XML locators are not allowed with COMMIT(*NONE).
I can't find how to overcome that.
Greg
-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of VERNON HAMBERG Owner via RPG400-L
Sent: Monday, July 17, 2023 6:17 PM
To: rpg400-l@xxxxxxxxxxxxxxxxxx
Cc: VERNON HAMBERG Owner <vhamberg@xxxxxxxxxxxxxxx>; rpg400-l@xxxxxxxxxxxxxxxxxx
Subject: RE: JSON un-named arrays
Hi Greg
As for how to do this with a stream file - if you mean how do you process a
JSON file in the IFS, there is an SQL function that can pull the JSON
in to be processed - it's used where Daniel has "insert your json here".
Here's an example - JSON_TABLE(Get_CLOB_From_File(:xmlFilePath), ...
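Filled out into a runnable sketch (the path is a placeholder - :xmlFilePath above assumes embedded SQL, so in Run SQL Scripts use a literal instead; the columns are the top-level ones from Daniel's query):

-- Read the document from the IFS instead of pasting it inline.
select jt.*
  from json_table(
    get_clob_from_file('/home/greg/latest_response.json'),
    'lax $'
    columns (
      records integer       path '$.records',
      fields  varchar(1000) path '$.fields'
    )
  ) as jt;
-- Note: GET_CLOB_FROM_FILE uses a LOB locator, so the job needs
-- commitment control - hence the WITH CHG advice in the later replies.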
HTH
Vern
On Mon, 17 Jul, 2023 at 4:05 PM, Greg Wilburn <gwilburn@xxxxxxxxxxxxxxxxxxxxxxx> wrote:
To: rpg programming on ibm i
Thanks Daniel!
With this API, the "data" portion is only dynamic based on my call to the
REST web service (I passed the value in "fields" as a parameter to the
web service).
I'm willing to try something new (and I like Run SQL Scripts)...
Just curious how I would use this with a stream file on the IFS? I like
to save my latest response on the IFS.
Thx,
Greg
-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Daniel Gross
Sent: Monday, July 17, 2023 4:19 PM
To: RPG programming on IBM i <rpg400-l@xxxxxxxxxxxxxxxxxx>
Subject: Re: JSON un-named arrays
Hi Greg,
I don't know if it helps, but you can walk the whole JSON document with
SQL without problems.
select *
  from json_table(
    '{...insert.your.json.here...}',
    'lax $'
    columns (
      records integer path '$.records',
      fields varchar(1000) path '$.fields',
      nested path 'lax $.data[*]'
        columns (
          trano   integer        path '$[0]',
          ref     varchar(50)    path '$[1]',
          duedate decimal(8, 0)  path '$[2]',
          name    varchar(50)    path '$[3]',
          amount  decimal(10, 2) path '$[4]'
        )
    )
  ) as json_doc;
As you can see, the named parts are referenced by ".name", whereas the
"unnamed" JSON elements have no named reference, but only the "[index]".
The nested path reference is the JSON equivalent of a table JOIN.
Of course - if the "data" part of the JSON is dynamic, and you have to
detect the columns by the "fields" element, it can get a bit complicated.
But even then, you can first split the "fields" list into its elements,
then construct a dynamic SQL to extract the "data" parts.
So from my POV - SQL is often the easier solution - especially because
you can test it with Run SQL Scripts until everything is "perfect".
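A rough sketch of that two-step idea, assuming the SYSTOOLS.SPLIT table function is available on the system (the inline JSON is just the "fields" element from Greg's document; the generated types are guesses):

-- Step 1: break the "fields" element into one row per column name.
select s.element as field_name
  from json_table(
    '{"fields": "trano,ref,duedate,name,amount"}',
    'lax $'
    columns (
      fields varchar(1000) path '$.fields'
    )
  ) as jt,
  table(systools.split(jt.fields, ',')) as s;

-- Step 2: concatenate those rows into a column list such as
--   trano varchar(100) path '$[0]', ref varchar(100) path '$[1]', ...
-- then prepare and execute the resulting JSON_TABLE statement from
-- embedded SQL or an SQL procedure.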
HTH and kind regards
Daniel
On 17.07.2023 at 17:31, Greg Wilburn <gwilburn@xxxxxxxxxxxxxxxxxxxxxxx> wrote:
Typically, I use DATA-INTO and the YAJLINTO parser... But I'm not sure
if that's possible with the JSON below. The array "data" contains an
un-named array of values.
Any suggestions on how to handle reading this?
Do I need to use the YAJL "tree-based" method?
{
  "records": 4,
  "fields": "trano,ref,duedate,name,amount",
  "data": [
    [ 1, "40061433", 20230719, "Mudpie", 187.2 ],
    [ 2, "40061349", 20230719, "kiddywampus", 156.21 ],
    [ 3, "40062677", 20230721, "Chase", 70.8 ],
    [ 4, "40061703", 20230723, "Stamford Toys", 110.7 ]
  ]
}
[Logo]<https://www.totalbizfulfillment.com/>
Greg Wilburn
Director of IT
301.895.3792 ext. 1231
301.895.3895 direct
gwilburn@xxxxxxxxxxxxxxxxxxxxxxx
1 Corporate Dr
Grantsville, MD 21536
www.totalbizfulfillment.com
This is the RPG programming on IBM i (RPG400-L) mailing list
To post a message email: RPG400-L@xxxxxxxxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: https://lists.midrange.com/mailman/listinfo/rpg400-l
or email: RPG400-L-request@xxxxxxxxxxxxxxxxxx
Before posting, please take a moment to review the archives
at https://archive.midrange.com/rpg400-l.
Please contact support@xxxxxxxxxxxxxxxxxxxx for any subscription related questions.