Hey Jay
I've used the *NIX sed utility to replace things within IFS files. I
don't know if there is an SQL way to make that call - the QShell version
doesn't do the replace in place, the PASE one does, as I recall.
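For illustration, since QShell's sed can't edit in place, the usual pattern is write-to-temp-and-swap; the file path and pattern here are made up:

```shell
# Hypothetical IFS file and replacement - QShell's sed lacks -i,
# so write to a temp file and move it over the original.
printf '%s' '{"orders":[{"dtNr":"4711"}]}' > /tmp/orders.json
sed 's/"4711"/"0815"/' /tmp/orders.json > /tmp/orders.json.tmp &&
  mv /tmp/orders.json.tmp /tmp/orders.json
cat /tmp/orders.json
```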
Just a thought, man!
Regards
Vern Hamberg
IBM Champion 2025
CAAC (COMMON Americas Advisory Council)
IBM Influencer 2023
On 7/16/2025 10:06 AM, Jay Vaughn wrote:
also while we are on the topic - and I have never seen this before but was
wondering if possible...
can you update a JSON document?
is there such a db2 sql statement where you can say... update myJson,
path/key value = 'foo'?
or can db2 json only be derived from a db2 table?
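For illustration, the kind of path/key update described, done client-side in Python rather than in SQL (document and key names are hypothetical):

```python
import json

# Hypothetical document - update one key at a path, then re-serialize.
doc = json.loads('{"orders": [{"dtNr": "4711", "status": "open"}]}')
doc["orders"][0]["status"] = "foo"   # the "path/key value = 'foo'" idea
updated = json.dumps(doc)
print(updated)
```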
thanks
Jay
On Wed, Jul 16, 2025 at 10:53 AM Jay Vaughn<jeffersonvaughn@xxxxxxxxx>
wrote:
Daniel - that was right on the money!
I sincerely appreciate your help on this.
thanks
Jay
On Wed, Jul 16, 2025 at 10:16 AM Daniel Gross<daniel@xxxxxxxx> wrote:
Hi Jay,
as Birgitta and I said repeatedly - it's quite hard to "imagine" what you
have and want - you should at least provide some "example" data. So - here
is what I "interpret" from the given information:
select *
from json_table('{
"orders":[
{"dtNr":"4711", "other_item":"aaa"},
{"dtNr":"0815", "other_item":"bbb"},
{"dtNr":"78714", "other_item":"ccc"}
]
}',
'$.orders[*]' columns(
"dtNr" varchar(20),
"json_data" varchar(32000) format json path '$'
)
)
where "dtNr" = '78714';
This gives you 2 columns - the "dtNr" and a column "json_data" that
contains the whole "orders" array element which has the given "dtNr".
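For comparison, the same filter expressed over the sample document in plain Python - a cross-check of what the query returns, not Db2 itself:

```python
import json

# Same sample document as in the json_table example above.
doc = json.loads('''{
  "orders":[
    {"dtNr":"4711", "other_item":"aaa"},
    {"dtNr":"0815", "other_item":"bbb"},
    {"dtNr":"78714", "other_item":"ccc"}
  ]
}''')

# One row per matching array element: the dtNr plus the whole element as JSON.
rows = [(o["dtNr"], json.dumps(o)) for o in doc["orders"] if o["dtNr"] == "78714"]
print(rows)
```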
But your mileage may vary - as I simply have "interpreted" what you have
written. So if it works - fine - if not - please send more information
about your JSON - shorten it, change the data, whatever makes sense.
HTH
Daniel
On 16.07.2025 at 15:12, Jay Vaughn <jeffersonvaughn@xxxxxxxxx> wrote:
errr - actually that is not it - it is pulling the orders array for all
order arrays and not just the orders array that contains dtNr = 78714.
here is a very watered down version of the json that would illustrate the
paths I'm after...
{
  "orders": [
    {
      "dtNr"
    }
  ]
}
so I ultimately only want to retrieve the entire orders array that has
dtNr = 78714
Jay
On Wed, Jul 16, 2025 at 8:58 AM Jay Vaughn <jeffersonvaughn@xxxxxxxxx>
wrote:
got it to work - and yes Daniel I totally agree that providing the json
would have been more helpful - sorry I couldn't, it was just too big with
too much sensitive data...
this is what I was looking for...
select *
from json_table(jvaughn.tempjson1
        ,'$'
        Columns (orders varchar(32000) format json Path '$.orders'
                ,nested '$.orders[*]'
                 Columns("dtNr" char(10)
                )
        )
) as t
where "dtNr" = '78714';
Jay
On Wed, Jul 16, 2025 at 7:48 AM Jay Vaughn <jeffersonvaughn@xxxxxxxxx>
wrote:
Daniel - silly of me thinking ArrayData was some kind of reserved word...
But the answer to your question is in the path... the key name of the
array is orders.
so If I replace ArrayData with "orders" - I still get null...
Select *
From JSON_TABLE(jvaughn.tempjson1
        ,'$' columns(NESTED '$.orders[*]'
                Columns("dtNr" char(10)
                        ,"orders" VarChar(32000) format json Path '$.orders'
                )
        )
)
where "dtNr" = '78714';
thanks
Jay
On Wed, Jul 16, 2025 at 12:02 AM Daniel Gross <daniel@xxxxxxxx> wrote:
So what's the "name" (key) of the JSON element containing the array?
Really "ArrayData"? Maybe the JSON path '$.orders' of the element is
wrong?
Without seeing at least some part of the structure it's hard to say
where the error is. From what you write, it seems like it looks like
this:
{
  "orders": [
    {
      "dtNr": "xxxxxx",
      "ArrayData": [
        ...
      ]
    }
  ]
}
That the "dtNr" column is populated is a good sign - so it seems correct
until there.
Regards,
Daniel
On 15.07.2025 at 22:57, Jay Vaughn <jeffersonvaughn@xxxxxxxxx> wrote:
this is great Birgitta (or anyone avail to answer)..
but my arraydata comes back null... however my dtNr is found! I need the
associated array that dtNr is found in...
Select *
From JSON_TABLE(jvaughn.tempjson1
        ,'$' columns(NESTED '$.orders[*]'
                Columns("dtNr" char(10)
                        ,ArrayData VarChar(32000) format json Path '$.orders'
                )
        )
)
where "dtNr" = '78714';
tia
Jay
On Tue, Jul 15, 2025 at 4:26 PM Jay Vaughn <jeffersonvaughn@xxxxxxxxx>
wrote:
I actually think I got it...
lots of thanks Birgitta!
Jay
On Tue, Jul 15, 2025 at 4:06 PM Jay Vaughn <jeffersonvaughn@xxxxxxxxx>
wrote:
thanks - this looks like interesting stuff I've never used before so I
have hope...
however both of these statements fail as-is (when replacing json_document
with my actual json resource)
Jay
On Tue, Jul 15, 2025 at 1:01 PM Birgitta Hauser <Hauser@xxxxxxxxxxxxxxx>
wrote:
If you only want to have a string that contains everything in an object
or array, you just have to add FORMAT JSON:
Select *
From JSON_TABLE(Json_Document, '$'
        Columns(ArrayData VarChar(4096) FORMAT JSON Path '$.Orders'));
You could also use JSON_QUERY:
Values(JSON_QUERY(Json_Document FORMAT JSON, '$.Orders'));
Mit freundlichen Grüßen / Best regards
Birgitta Hauser
Modernization – Education – Consulting on IBM i
Database and Software Architect
IBM Champion since 2020
"Shoot for the moon, even if you miss, you'll land among the stars."
(Les Brown)
"If you think education is expensive, try ignorance." (Derek Bok)
"What is worse than training your staff and losing them? Not training
them and keeping them!"
"Train people well enough so they can leave, treat them well enough so
they don't want to." (Richard Branson)
"Learning is experience … everything else is only information!" (Albert
Einstein)
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Jay Vaughn
Sent: Tuesday, 15 July 2025 18:48
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
Subject: Re: json_table - get entire array
Birgitta,
Yes, and you are kinda making the point for me.
That could get quite verbose based on everything under this json I'm
targeting.
So was hoping for a simple way to just point at that array (which I'm
doing in my statement), and then say, bring it all back as a single
string (for example)
thanks
Jay
On Tue, Jul 15, 2025 at 12:29 PM Birgitta Hauser <Hauser@xxxxxxxxxxxxxxx>
wrote:
It would have been helpful to see the JSON ... have you tried this:
select *
from json_table(trim(jvaughn.tempjson1)
        , '$.Orders[*]'
        Columns (Column1 ... Path ...,
                 Column2 ... Path ...,
                 Nested '$.Positions[*]'
                 Columns(....)
        )
) as t
where "dtNr" = '78714';
where "dtNr" = '78714';
Mit freundlichen Grüßen / Best regards
Birgitta Hauser
Modernization – Education – Consulting on IBM i
Database and Software Architect
IBM Champion since 2020
"Shoot for the moon, even if you miss, you'll land among the stars."
(Les Brown)
"If you think education is expensive, try ignorance." (Derek Bok)
"What is worse than training your staff and losing them? Not training
them and keeping them!"
"Train people well enough so they can leave, treat them well enough so
they don't want to." (Richard Branson)
"Learning is experience … everything else is only information!" (Albert
Einstein)
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Jay Vaughn
Sent: Tuesday, 15 July 2025 17:55
To: Midrange Systems Technical Discussion
<midrange-l@xxxxxxxxxxxxxxxxxx>
Subject: json_table - get entire array
I'm curious about a concept with json_table...
Given the below statement (which is invalid due to the columns('$'))...
Is there any possible way to have the statement select the entire
orders array, sub array/objects, etc, without having to define the
entire array/subarray/objects DS for that orders array?
select *
from json_table(trim(jvaughn.tempjson1)
        ,'$' columns(NESTED '$.orders[*]'
                columns('$')
        )
) as t
where "dtNr" = '78714';
tia
Jay V
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: https://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxxxxxxxx
Before posting, please take a moment to review the archives at
https://archive.midrange.com/midrange-l.
Please contact support@xxxxxxxxxxxxxxxxxxxx for any subscription related
questions.