Daniel,

I was trying to experiment with your code below, but I have two issues.

First, the compiler says object IFS_WRITE in QSYS2 was not found. I'm pretty sure I have the latest PTF groups loaded on V7R3.
Second, I don't quite understand where "cust_group_id" comes from... I'm assuming it is an incrementing number.

Thx,
Greg

-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Daniel Gross
Sent: Thursday, February 3, 2022 1:18 AM
To: RPG programming on IBM i <rpg400-l@xxxxxxxxxxxxxxxxxx>
Subject: Re: YAJLDTAGEN with SQL into DS

Hi Jon,

no - I didn’t miss these parts - and both are easily solved using SQL.

As Birgitta mentions, you can SELECT the JSON string INTO a string variable (or even a 2 GB BLOB field if needed) and write it out using RPG - very easy.
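A minimal sketch of that variant (the host variable name and size are illustrative assumptions; if I remember correctly, plain CLOB host variables top out around 16 MB, so for really big documents a CLOB_FILE or a locator is the way to go):

dcl-s jsonResult sqltype(clob:1000000);   // assumption: ~1 MB is enough here

exec sql
   select json_object(
            'Customers' : json_arrayagg(
              json_object('addr_line1' : addr_line1,
                          'addr_line2' : addr_line2,
                          'city'       : city,
                          'state'      : state,
                          'zip'        : zip,
                          'phone'      : phone,
                          'email'      : email)))
     into :jsonResult
     from mytable;

// the precompiler generates jsonResult_Len and jsonResult_Data subfields,
// which you can then write out to the IFS from RPG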

Or you can wrap a QSYS2.IFS_WRITE(….) around the JSON_OBJECT function. That way you could also write multiple batches into multiple IFS files, using a GROUP BY clause and either joining a table that holds the output file names or deriving the file names somehow at the result-set level.

exec sql select qsys2.ifs_write(:ifs_file_name,
                json_object(
                  'Customers' : json_arrayagg(
                    json_object(
                      'addr_line1' : addr_line1,
                      'addr_line2' : addr_line2,
                      'city'       : city,
                      'state'      : state,
                      'zip'        : zip,
                      'phone'      : phone,
                      'email'      : email
                    )
                  )
                )
              )
           from mytable
           where ….;

Or even somewhat more dynamically, like this:

exec sql select qsys2.ifs_write('/path/to/your/file/output_file_' || char(cust_group_id) || '.json',
                json_object(
                  'Customers' : json_arrayagg(
                    json_object(
                      'addr_line1' : addr_line1,
                      'addr_line2' : addr_line2,
                      'city'       : city,
                      'state'      : state,
                      'zip'        : zip,
                      'phone'      : phone,
                      'email'      : email
                    )
                  )
                )
              )
           from mytable
           where ….
           group by cust_group_id;

But I think that's not really the point. The reason I mentioned an SQL solution is that SQL on IBM i is definitely the future - and using SQL merely as a READ/CHAIN replacement is not how SQL should be used.

I once read somewhere “if you have to use a SQL cursor, you haven’t understood the problem” - even if I wouldn’t go that far, it’s clear that you have to let go of "procedural thinking" when using SQL, because of SQL's “anti-procedural” and “result-descriptive” nature.

Kind regards,
Daniel

P.S.: Of course Birgitta is 100% right about always checking SQLCODE after doing SQL in RPG programs - even a simple "if sqlcode <> *zero; …" is better than nothing.
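A minimal sketch of that pattern (the handling itself is just a placeholder - do whatever fits your error-handling strategy):

dcl-s rowCount int(10);

exec sql select count(*) into :rowCount from mytable;

if sqlcode <> *zero;
   // placeholder: log the failure, send a message, or bail out
   dsply ('SQLCODE = ' + %char(sqlcode));
   return;
endif;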


On 03.02.2022 at 05:42, Birgitta Hauser <Hauser@xxxxxxxxxxxxxxx> wrote:

plus I think he needs to use different output file names in the IFS, which your code doesn't do.
Writing to the IFS is just a SELECT ... INTO the IFS file:
Here is the complete code necessary to execute the SQL statement and write the data into the IFS.

DCL-S YourIFSFile SQLTYPE(CLOB_FILE);

YourIFSFile_Name = '/YourDir/YourDir2/YourIFSFile.json'; //IFS File Name
YourIFSFile_NL = %Len(%Trim(YourIFSFile_Name)); //IFS File Name Length
YourIFSFile_FO = SQFOVR; //File Operation - Write the data to the IFS - Replace if exists

Exec SQL select json_object(
                  'Customers' : json_arrayagg(
                    json_object(
                      'addr_line1' : addr_line1,
                      'addr_line2' : addr_line2,
                      'city'       : city,
                      'state'      : state,
                      'zip'        : zip,
                      'phone'      : phone,
                      'email'      : email
                    )
                  )
                )
           Into :YourIFSFile
           from mytable;
//Check SQL Code or SQL State here ... and then continue

... and done!
... there are also SQL stored procedures for writing directly into the IFS: IFS_WRITE, IFS_WRITE_UTF8, IFS_WRITE_BINARY
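For what it's worth, a minimal sketch of calling one of them (parameter names as I remember them from the IBM documentation - please verify against your release and Db2 PTF group level):

// sketch only - assumption: jsonDocument is a character/CLOB host variable
// that already contains the JSON produced by the SELECT above
exec sql
   call qsys2.ifs_write_utf8(
        path_name   => '/YourDir/YourDir2/YourIFSFile.json',
        line        => :jsonDocument,
        overwrite   => 'REPLACE',
        end_of_line => 'NONE');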

... and for those who do not need more than this, I've even written a piece of open source code.
https://github.com/BirgittaHauser/Generate-XML-and-JSON

Mit freundlichen Grüßen / Best regards

Birgitta Hauser


"Shoot for the moon, even if you miss, you'll land among the stars." (Les Brown)
"If you think education is expensive, try ignorance." (Derek Bok)
"What is worse than training your staff and losing them? Not training them and keeping them!"
„Train people well enough so they can leave, treat them well enough so they don't want to.“ (Richard Branson)


-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Jon Paris
Sent: Thursday, February 3, 2022 00:43
To: RPG programming on IBM i <rpg400-l@xxxxxxxxxxxxxxxxxx>
Subject: Re: YAJLDTAGEN with SQL into DS

I think you missed the bits about starting and stopping at arbitrary points/batches, Daniel - plus I think he needs to use different output file names in the IFS, which your code doesn't do.


Jon P.

On Feb 2, 2022, at 4:55 PM, Daniel Gross <daniel@xxxxxxxx> wrote:

Hi Greg,

after fiddling around for 10 minutes or so, I came up with this SQL statement:

select json_object(
         'Customers' : json_arrayagg(
           json_object(
             'addr_line1' : addr_line1,
             'addr_line2' : addr_line2,
             'city'       : city,
             'state'      : state,
             'zip'        : zip,
             'phone'      : phone,
             'email'      : email
           )
         )
       )
  from mytable;

This results in the following JSON string (which I have formatted here for readability):

{
  "Customers":[
    {
      "addr_line1":"64a N Williams St",
      "addr_line2":"",
      "city":"Crystal Lake",
      "state":"IL",
      "zip":"60014",
      "phone":"7792204179",
      "email":"marvinstoys@xxxxxxxxx"
    },
    {
      "addr_line1":"267 Main St",
      "addr_line2":"",
      "city":"Huntington",
      "state":"NY",
      "zip":"11743",
      "phone":"6315498743",
      "email":"lsdolls@xxxxxxx"
    }
  ]
}

I hope I understood your problem correctly.

Of course I understand that it is always a "problem" to make big changes to existing code - but IMHO that is exactly where modernization starts.

HTH
Daniel

On 02.02.2022 at 22:06, Greg Wilburn <gwilburn@xxxxxxxxxxxxxxxxxxxxxxx> wrote:

Chris,

Thank you for your response. I did not realize that I could SELECT "INTO" a qualified data structure. But it worked perfectly.

I set up my SQL cursor in a procedure called OpenCursor(), then:

OpenCursor();

Dow FetchCursor(AllCustomers.Customers);
   WriteJson(Stmf);
   jsonCount += 1;
enddo;

CloseCursor();

...in WriteJson()
DATA-GEN AllCustomers
         %DATA(inStmf: 'doc=file output=clear countprefix=num_')
         %GEN('YAJLDTAGEN');

//------------------------------------------------------------------------------
// Fetch the Next Record
//------------------------------------------------------------------------------

dcl-proc FetchCursor;

   dcl-pi FetchCursor ind;
      rec likeds(AllCustomers.Customers) dim(50);
   end-pi;

   sqlRows = 0;

   // Multi-row fetch: read up to maxrows rows directly into the DS array
   Exec Sql
      Fetch Next From sqlcsr for :maxrows rows
      Into :rec;

   if SqlStt <> sqlOK;
      return *off;
   endif;

   // Number of rows actually fetched on this pass
   Exec Sql Get Diagnostics :sqlRows = ROW_COUNT;

   AllCustomers.num_Customers = sqlRows;

   return *on;

end-proc;
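
For completeness, the OpenCursor() procedure in this pattern would presumably look something like this (the SELECT list and table name are assumptions based on the fields shown above):

dcl-proc OpenCursor;

   // sketch only - declare and open the cursor that FetchCursor() reads from
   Exec Sql
      Declare sqlcsr Cursor For
         Select addr_line1, addr_line2, city, state, zip, phone, email
           From mytable;

   Exec Sql Open sqlcsr;

end-proc;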


-----Original Message-----
From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Hiebert, Chris
Sent: Wednesday, February 2, 2022 1:19 PM
To: RPG programming on IBM i <rpg400-l@xxxxxxxxxxxxxxxxxx>
Subject: RE: YAJLDTAGEN with SQL into DS

Try:

dcl-ds AllCustomers qualified;
   num_Customers uns(5);

   dcl-ds Customers qualified dim(50) inz;
      addr_line1 varchar(40) inz('');
      addr_line2 varchar(40) inz('');
      city       varchar(40) inz('');
      state      varchar(3)  inz('');
      zip        varchar(10) inz('');
      phone      varchar(22) inz('');
      email      varchar(50) inz('');
   end-ds;
end-ds;

AllCustomers.num_Customers = sqlRows;

DATA-GEN AllCustomers
         %DATA(inStmf: 'doc=file output=clear countprefix=num_')
         %GEN('YAJLDTAGEN');



If that doesn't work with the SQL, you can keep the other Customers array, continue to load it with SQL, and then:

%Subarr(AllCustomers.Customers:1:sqlRows) = %Subarr(Customers:1:sqlRows);

Then run the DATA-GEN.

Chris Hiebert
Senior Programmer/Analyst
Disclaimer: Any views or opinions presented are solely those of the author and do not necessarily represent those of the company.

From: RPG400-L <rpg400-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Greg Wilburn
Sent: Tuesday, February 1, 2022 3:03 PM
To: RPG programming on IBM i <rpg400-l@xxxxxxxxxxxxxxxxxx>
Subject: YAJLDTAGEN with SQL into DS


So I have a working application where I've used SQL data access to load a data structure.



dcl-ds Customers qualified dim(50) inz;
   addr_line1 varchar(40) inz('');
   addr_line2 varchar(40) inz('');
   city       varchar(40) inz('');
   state      varchar(3)  inz('');
   zip        varchar(10) inz('');
   phone      varchar(22) inz('');
   email      varchar(50) inz('');
end-ds;



I then use YAJLDTAGEN to create the JSON file:

DATA-GEN %SUBARR(customers:1:sqlRows)
         %DATA(inStmf: 'doc=file output=clear')
         %GEN('YAJLDTAGEN');



Output looks like this:

[
  {
    "addr_line1": "64a N Williams St",
    "addr_line2": "",
    "city": "Crystal Lake",
    "state": "IL",
    "zip": "60014",
    "phone": "7792204179",
    "email": "marvinstoys@xxxxxxxxx"
  },
  {
    "addr_line1": "267 Main St",
    "addr_line2": "",
    "city": "Huntington",
    "state": "NY",
    "zip": "11743",
    "phone": "6315498743",
    "email": "lsdolls@xxxxxxx"
  }
]





The endpoint has asked for this:

{
  "Customers": [
    {
      "addr_line1": "64a N Williams St",
      "addr_line2": "",
      "city": "Crystal Lake",
      "state": "IL",
      "zip": "60014",
      "phone": "7792204179",
      "email": "marvinstoys@xxxxxxxxx"
    },
    {
      "addr_line1": "267 Main St",
      "addr_line2": "",
      "city": "Huntington",
      "state": "NY",
      "zip": "11743",
      "phone": "6315498743",
      "email": "lsdolls@xxxxxxx"
    }
  ]
}



I know how to create the DS... but then I'm not sure how to do Exec SQL INTO that DS.

I'm pretty sure the "into variable" cannot be "json.Customers".



What am I missing?



TIA,

Greg

