Hi again

I looked at this again and tried it at work today. It doesn't quite fit my needs, because path_name does not contain any identifying text - all the names of the IFS files are GUIDs.

What I need is to get a list of XMLs that have certain values inside them.
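
A minimal sketch of one way to adapt Niels' query below to that situation: keep his get_xml_file / xml2clob helpers and his /tmp/ placeholder path, drop the file-name test (the GUID tells us nothing) and filter on the payload instead. The <Service> root and the 'DGDO5778' value are just the examples that appear further down this thread - a sketch, not tested here.

-- Sketch only: classify GUID-named stream files by their content, not their name
with xmldata as (
    select path_name,
           xml2clob( xmlparse (document get_xml_file(path_name))) payload
    from table (
        qsys2.ifs_object_statistics(
            start_path_name => '/tmp/',
            subtree_directories => 'NO',
            object_type_list => '*STMF'
        )
    )
    where lower(path_name) like '%.xml'   -- keep or drop, depending on whether the GUID names carry an extension
)
select path_name
from xmldata
where payload like '%<Service>%'          -- document type, by its top-level element
  and payload like '%DGDO5778%'           -- the value you are hunting for
with ur;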

Lots of good stuff to learn, though! Thanks again!

Vern

On 8/16/2022 4:01 AM, Niels Liisberg wrote:
Good to hear Vern;

So you can basically boil it down to something like:

with xmldata as (
    select path_name , xml2clob( xmlparse (document get_xml_file(path_name))) payload
    from table (
        qsys2.ifs_object_statistics(
            start_path_name => '/tmp/',
            subtree_directories => 'NO',
            object_type_list => '*STMF'
        )
    )
    where lower(path_name) like '%cust%.xml'
)
select path_name ,payload
from xmldata
where payload like '%ID%'
with ur;



On Tue, Aug 16, 2022 at 9:57 AM Vern Hamberg via MIDRANGE-L <midrange-l@xxxxxxxxxxxxxxxxxx> wrote:

Hi Niels

All this is well and good but not needed for what I am doing. We already use XMLTABLE in our programs to consume the information and put it into our application tables.

One of the main things I am interested in is to get a list of the XMLs that contain a certain kind of information - it could be about a service done in the field, a status message from the field, or something else. The XML has a top-level element like <Service> to identify these. The names of the XMLs are GUIDs - not a clue about __what__ the content is about. (In actual use, we get a list of GUIDs for each type, I believe - I am not the person doing most of this development - so we know what they are the first time we see them. But once they have arrived, I don't know what type they are from the name.)

So I am interested in a list of file names that are service information. I am also interested to pull a certain value within the XML. I can see that XMLTABLE could be useful there, but I'm not interested here in a full shredding of the content into tables.

So using locate and substring works very well, both for the column list and for the WHERE clause.
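
For the record, here is a sketch of the column-list side of that, built on the same ifs_object_statistics / ifs_read join as the script further down the thread. The <AccountCode> tag is a made-up name for illustration (13 is the length of that opening tag), and it assumes the opening and closing tags sit on the same line - treat it as a sketch rather than a drop-in query.

-- Sketch only: LOCATE/SUBSTR in the column list pulls the value between the
-- made-up <AccountCode> tags; the WHERE clause keeps only lines that look
-- like Service documents and contain that tag.
SELECT a.path_name,
       b.line_number,
       SUBSTR(b.line,
              LOCATE('<AccountCode>', b.line) + 13,
              LOCATE('</AccountCode>', b.line)
                  - LOCATE('<AccountCode>', b.line) - 13) AS account_code
    FROM TABLE (
             qsys2.ifs_object_statistics(
                 start_path_name => '/Lvl1/Lvl2/Guids',
                 object_type_list => '*STMF')
         ) a
         CROSS JOIN LATERAL (
             SELECT line_number, line
                 FROM TABLE (
                          qsys2.ifs_read(
                              path_name => a.path_name,
                              end_of_line => 'ANY')
                      )
         ) b
    WHERE LOCATE('<Service>', b.line) > 0
      AND LOCATE('<AccountCode>', b.line) > 0;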

Again, this is a kind of ad hoc thing - I have a custom "example" in RSS that I can use, setting values of a couple global variables, and off I go!!

I do like what you present; it is just more than I need, for now, anyhow!

Vern

On 8/16/2022 2:33 AM, Niels Liisberg wrote:
> Here is my take on that:
>
> You can paste the following directly into an ACS SQL script and play with it:
>
> ___________________________
>
> -- Let's say your XML is a representation of a simple table with rows like this one:
> Select *
> from qiws.qcustcdt;
>
> -- Now let's save that table (id and name) as an XML file to the IFS so we have something to play with:
> call ifs_write_utf8 (
>      path_name =>'/tmp/mycust.xml',
>      line =>  xml2clob((
>          Select
>          xmlelement( name "CUSTOMERS" ,
>              xmlagg(
>                  xmlelement( name "CUSTOMER" ,
>                      xmlelement( name "ID" , cusnum),
>                      xmlelement( name "NAME" , trim(LSTNAM))
>                  )
>              )
>          )
>          from qiws.qcustcdt
>      ))  ,
>      overwrite => 'REPLACE',
>      end_of_line => 'NONE'
> );
>
> -- Now to your real question:
> -- First, if you need to look into the XML (any XML), you can use this approach:
> -- Note the "with ur", that is "uncommitted read" - you need commitment control since the XML is streamed
> values (
>      xmlparse (document get_xml_file('/tmp/mycust.xml'))
> ) with ur;
>
>
> -- You could be tempted to simply use POS or INSTR in the above to find a specific value.
> -- However, since SQL supports access to the XML as a table, we can use this much more robust way to do it:
> Select *
> from  xmltable(
>      '/CUSTOMERS/CUSTOMER'
>      passing xmlparse (document get_xml_file('/tmp/mycust.xml'))
>      columns
>          id      bigint path 'ID',
>          name    varchar(32)  path  'NAME'
> )
> with ur;
>
> -- So now we have the XML data as tabular info. But you need to
> -- read it from a directory of XML files.
> -- Here you can combine the CTE list of XML files
> -- and join/pass each file
> with XML_files as (
>      select path_name
>      from table (
>          qsys2.ifs_object_statistics(
>              start_path_name => '/tmp/',
>              subtree_directories => 'NO',
>              object_type_list => '*STMF'
>          )
>      )
>      where lower(path_name) like '%cust%.xml'
> ),
> customer_from_XML_files as (
>      select *
>      from  XML_files , xmltable(
>          '/CUSTOMERS/CUSTOMER'
>          passing xmlparse (
>              document get_xml_file(path_name)
>          )
>      columns
>          id      bigint path 'ID',
>          name    varchar(32)  path  'NAME'
>      )
> )
> select *
> from customer_from_XML_files
> with ur;
>
> -- Voila!! XML files as rows - you can do all the SQL magic on them that you can imagine
>
>
>
>
>
>
> On Sun, Aug 14, 2022 at 10:59 AM Birgitta Hauser <Hauser@xxxxxxxxxxxxxxx> wrote:
>
>> Vern,
>>
>> You had to use the CROSS JOIN and LATERAL because you used IFS_READ in a way that returns multiple rows.
>> I used Get_Clob_From_File(Path_Name), which returns a LOB locator on the IFS file. LOB locators can be used like character columns, so I used it in the WHERE condition in combination with the LIKE predicate.
>> If you only want to know in which file the data is located, using GET_CLOB_FROM_FILE is the easiest way (no need for any CROSS JOIN or LATERAL).
>> If you want to see exactly the row and the content of the row where you find the string, your solution is the better one.
>>
>> Mit freundlichen Grüßen / Best regards
>>
>> Birgitta Hauser
>> Modernization – Education – Consulting on IBM i
>>
>>
>> "Shoot for the moon, even if you miss, you'll land among the
stars." (Les
>> Brown)
>> "If you think education is expensive, try ignorance." (Derek Bok)
>> "What is worse than training your staff and losing them? Not
training them
>> and keeping them!"
>> „Train people well enough so they can leave, treat them well
enough so
>> they don't want to.“ (Richard Branson)
>>
>>
>> -----Original Message-----
>> From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Vern Hamberg via MIDRANGE-L
>> Sent: Saturday, 13 August 2022 18:17
>> To: midrange-l@xxxxxxxxxxxxxxxxxx
>> Cc: Vern Hamberg <vhamberg@xxxxxxxxxxxxxxx>
>> Subject: Re: Find string in multiple IFS files using sql services
>>
>> Thanks, Birgitta! I finally got a solution, similar to yours - it might be the first real use I've made of LATERAL.
>>
>> The problem was, I got a message from Ifs_Read that said '/YourFolder/Dir2/Dir3' was not valid for the operation - well, obviously, because it is a directory!! I did not have the named parameter Object_Type_List - now I do. Here's what I came up with using CROSS JOIN LATERAL with Ifs_Read - and a couple of global variables.
>>
>> This doesn't need to be run under commitment control, although that isn't usually a problem, as I recall. Performance, in any case, is probably very dependent on how many stream files there are and their size.
>>
>> -- category: Custom
>> -- description: Find IFS files for 2 specific values
>> create or replace variable typeofxml varchar(50);
>> set typeofxml = '<Service>';
>> create or replace variable testvalue varchar(32);
>> set testvalue = 'DGDO5778';
>>
>> SELECT path_name,
>>        line_number,
>>        line
>>     FROM TABLE (
>>             qsys2.ifs_object_statistics(
>>                 start_path_name => '/Lvl1/Lvl2/Guids',
>>                 OBJECT_TYPE_LIST => '*STMF')
>>         ) a
>>         CROSS JOIN
>>             LATERAL (
>>                 SELECT line_number,
>>                        line
>>                     FROM TABLE (
>>                             qsys2.ifs_read(
>>                                 path_name => a.path_name,
>>                                 end_of_line => 'ANY',
>>                                 maximum_line_length => default,
>>                                 ignore_errors => 'NO')
>>                         )
>>             ) b
>>     WHERE LOCATE(UPPER(testvalue), UPPER(line)) > 0
>>       and LOCATE(UPPER(typeofxml), UPPER(line)) > 0;
>>
>> I do so appreciate the efforts of the database team at IBM for giving us these really useful tools, these SQL services. Kudos, Scott!! and all!
>>
>> Cheers
>> Vern
>>
>> On 8/13/2022 4:09 AM, Birgitta Hauser wrote:
>>> Try something like this:
>>> Select Path_Name, Get_Clob_From_File(Path_Name) "IFS File Content"
>>>     From Table (Qsys2.Ifs_Object_Statistics(
>>>                     Start_Path_Name     => '/YourFolder/Dir2/Dir3',
>>>                     Subtree_Directories => 'YES',
>>>                     Object_Type_List    => '*ALLSTMF')) x
>>>     Where     Right(Trim(Path_Name), 4) = '.xml'
>>>           and Get_Clob_From_File(Path_Name) like '%Whatever you are looking for%'
>>>     Order By Path_Name;
>>>
>>> Note: GET_CLOB_FROM_FILE has to be run under Commitment Control
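
A side note on that restriction: in an ad hoc Run SQL Scripts session it is often enough to put an isolation clause on the statement itself, the same "with ur" Niels uses with get_xml_file elsewhere in this thread. A sketch only, reusing Birgitta's query from just above; not verified here.

Select Path_Name
    From Table (Qsys2.Ifs_Object_Statistics(
                    Start_Path_Name     => '/YourFolder/Dir2/Dir3',
                    Subtree_Directories => 'YES',
                    Object_Type_List    => '*ALLSTMF')) x
    Where     Right(Trim(Path_Name), 4) = '.xml'
          and Get_Clob_From_File(Path_Name) like '%Whatever you are looking for%'
    Order By Path_Name
    With UR;   -- statement-level isolation instead of relying on the session setting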
>>>
>>> Mit freundlichen Grüßen / Best regards
>>>
>>> Birgitta Hauser
>>> Modernization – Education – Consulting on IBM i
>>>
>>>
>>> "Shoot for the moon, even if you miss, you'll land among the
stars."
>>> (Les Brown) "If you think education is expensive, try ignorance."
>>> (Derek Bok) "What is worse than training your staff and losing
them? Not
>> training them and keeping them!"
>>> „Train people well enough so they can leave, treat them well
enough so
>>> they don't want to.“ (Richard Branson)
>>>
>>>
>>> -----Original Message-----
>>> From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Vern Hamberg via MIDRANGE-L
>>> Sent: Saturday, 13 August 2022 00:55
>>> To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
>>> Cc: Vern Hamberg <vhamberg@xxxxxxxxxxxxxxx>
>>> Subject: Re: Find string in multiple IFS files using sql services
>>>
>>>
>>> I know how to use grep and all; this is not another chapter in that unpublished rag. For my purposes an SQL solution serves me more directly - I wanted to avoid a command-line solution.
>>> Cheers
>>> Vern
>>>
>>>
>>> -----Original Message-----
>>>
>>> From: Jack <midrange-l@xxxxxxxxxxxxxxxxxx>
>>> To: Midrange <midrange-l@xxxxxxxxxxxxxxxxxx>
>>> Cc: Jack <jwoehr@xxxxxxxxxxxxxxxxxxxxxxxx>
>>> Date: Friday, 12 August 2022 4:10 PM CDT
>>> Subject: Re: Find string in multiple IFS files using sql services
>>>
>>> On Fri, Aug 12, 2022 at 10:38 AM Roger Harman <roger.harman@xxxxxxxxxxx> wrote:
>>>
>>>> As opposed to using "grep"?
>>>>
>>> The only sensible way to do it. And if you are on IBM i and not planning to move off-platform, it's half-past time to learn grep, sed, tr, etc. and your basic pipeline tools.
>>>
>>> *Another chapter in the continuing saga of "Open Source Is Not An Option, It's A Necessity".*
>>>
>>>
>>>> -----Original Message-----
>>>> From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Vern Hamberg via MIDRANGE-L
>>>> Sent: Friday, August 12, 2022 8:15 AM
>>>> To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
>>>> Cc: Vern Hamberg <vhamberg@xxxxxxxxxxxxxxx>
>>>> Subject: Find string in multiple IFS files using sql services
>>>>
>>>>
>>>> I have XML files in a folder, and I want to see which ones have a certain value in them, such as an account code. I tried ifs_object_statistics with start_path_name ==> set, joined to ifs_read, with the path_name from ifs_object_statistics in path_name ==>.
>>>>
