


Be VERY VERY cautious about using allowmissing=yes.

In general it is a very bad idea.

The reason is that there is no granularity - anything and EVERYTHING could be missing and DATA-INTO would simply say "Yup - all looks good to me!".

Back in the mists of time, with the initial release of XML-INTO, allowmissing was all we had - but we now have a better option: countprefix. By using countprefix you can tell DATA-INTO exactly which fields are optional, and you get an easy way of detecting whether they are present or not.

Using part of your DS as an example:

Dcl-ds billing_address;
  name char(40);
  street_address char(40);
  city char(40);
  state char(2);
  zip char(10);
End-ds;

Assume that name and street_address are both expected but that the other fields are optional. You could add the option countprefix=count_ and change the definitions to:

Dcl-ds billing_address;
  name char(40);
  street_address char(40);
  count_city int(5);
  city char(40);
  count_state int(5);
  state char(2);
  count_zip int(5);
  zip char(10);
End-ds;
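
For reference, the countprefix option itself goes in the %DATA options string of the DATA-INTO operation. A minimal sketch - the file name and variable names here are illustrative, not from the original post:

  data-into billing_address
    %DATA('/some/path/addr.json' : 'doc=file case=any countprefix=count_')
    %PARSER('YAJLINTO');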

Now if city, state or zip is omitted, even without allowmissing=yes, no error will be flagged, and the relevant count_ field will be 0 or 1 depending on the absence or presence of the field in question.

You can apply the same thinking to the name and street_address fields too and then test their count_xxxx fields to ensure that they were present and issue your own appropriate error if they were missing.
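
Putting that into code, a sketch of the check after the DATA-INTO completes - assuming count_name and count_street_address fields have been added as above; the error handling shown is only a placeholder for whatever suits your application:

  if count_name = 0 or count_street_address = 0;
    // A required field was absent from the document -
    // issue your own, more specific, error here
    dsply 'Required billing address field missing';
  endif;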

So - if your mother had known about you using allowmissing=yes, she'd have told you to STOP IT before it damages you permanently!


Jon Paris

www.partner400.com
www.SystemiDeveloper.com

On May 29, 2018, at 8:42 AM, Steve Jones <sjones@xxxxxxxxxxxxxxx> wrote:

After seeing Scott Klement's session at POWERUp18 on using JSON and the new
DATA-INTO, I wanted to start using it.

I am getting the error:

Message . . . . :   The document for the DATA-INTO operation does not match
  the RPG variable; reason code 5.
Cause . . . . . :   While parsing a document for the DATA-INTO operation, the
  parser found that the document does not correspond to RPG variable "result"
  and the options do not allow for this. The reason code is 5. The exact
  subfield for which the error was detected is "result.customers(1)". The
  options are "doc=file case=any countprefix=num_". The document name is
  /paytrace/889_output.json; *N indicates that the document is not an external
  file. The parser is 'YAJLINTO'. *N indicates that the parser is a procedure
  pointer.

So that leads me to believe my DS does not match, but I am not seeing where
it is not matching, unless you need to define fields you are not even going
to use in the pgm.

Here is my DS:
Dcl-ds result qualified;
  success ind;
  response_code int(3);
  status_message varchar(500);
  num_customers int(10);

  Dcl-ds customers dim(10);
    Dcl-ds credit_card;
      masked_number char(16);
      expiration_month int(2);
      expiration_year int(2);
    End-ds;
    Dcl-ds billing_address;
      name char(40);
      street_address char(40);
      city char(40);
      state char(2);
      zip char(10);
    End-ds;
    email char(70);
  End-ds;
End-ds;

Here is the raw JSON I am trying to read from the IFS:
{
  "success": true,
  "response_code": 1,
  "status_message": "Your request has been successfully completed.",
  "customers": [
    {
      "customer_id": "100-1 ",
      "credit_card": {
        "masked_number": "************1111",
        "expiration_month": 12,
        "expiration_year": 20
      },
      "shipping_address": {
        "name": "",
        "street_address": "",
        "street_address2": "",
        "city": "",
        "county": "",
        "state": "",
        "zip": "",
        "country": "US"
      },
      "billing_address": {
        "name": "Transfer Order",
        "street_address": "Internal Use Only",
        "street_address2": "",
        "city": "Anywhere",
        "state": "OH",
        "zip": "44721",
        "country": "US"
      },
      "email": "testing@xxxxxxxxxx",
      "phone": "",
      "fax": "",
      "created": {
        "at": "5/25/2018 7:26:13 AM",
        "by": "demo123",
        "from_ip": "x.x.x.x"
      },
      "discretionary_data": {
        "Site IDs": "",
        "Company Name": "",
        "Location": ""
      }
    }
  ]
}

The data-into statement is:

data-into result
  %DATA(outFile : 'doc=file case=any countprefix=num_')
  %PARSER('YAJLINTO');

Anyone see what I am missing that is causing the error?

Thanks!!
--
Steve Jones
H-P Products, Inc
330-871-2054

--
This is the RPG programming on the IBM i (AS/400 and iSeries) (RPG400-L) mailing list
To post a message email: RPG400-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: https://lists.midrange.com/mailman/listinfo/rpg400-l
or email: RPG400-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at https://archive.midrange.com/rpg400-l.

Please contact support@xxxxxxxxxxxx for any subscription related questions.
