I'm trying to use DATA-INTO with the YAJLINTO parser. My test is based on one of Scott Klement's examples (https://www.common.org/scotts-iland-blog/parsing-json-data-into/).

This is the test code:

dcl-s json varchar( 5000 ) inz(
  '{"Code":400,"Message":"Error","Data":["ConsigneeAddress.Zip -+
  Validation failed for property."]}' );

dcl-s Msg varchar( 47 ) ;
dcl-s x int( 10 );

dcl-ds MyData qualified;
  Code     varchar( 5 );
  Message  varchar( 50 );
  num_Data int( 10 );

  dcl-ds Data dim( 50 );
    *n char( 50 );
  end-ds;
end-ds;

data-into MyData %DATA( json : 'case=any countprefix=num_ +
                                allowextra=yes' )
                 %PARSER( 'YAJLINTO' );

Msg = MyData.Message ;
dsply ( 'Code: ' + MyData.Code ) ;
dsply ( 'Msg: ' + Msg ) ;
dsply ( 'Num: ' + %EditC( MyData.num_Data:'1' ) ) ;
for x = 1 to MyData.num_Data;
  dsply MyData.Data( x );
endfor;

*INLR = *On ;

With allowextra=yes I receive RNX0356 (the document for the DATA-INTO operation does not match the RPG variable), reason code 15:

Cause . . . . . : While parsing a document for the DATA-INTO operation, the
parser found that the document does not correspond to RPG variable "mydata"
and the options do not allow for this. The reason code is 15. The exact
subfield for which the error was detected is "mydata". The options are
"case=any countprefix=num_ allowextra=yes". The document name is *N; *N
indicates that the document is not an external file. The parser is
'YAJLINTO'. *N indicates that the parser is a procedure pointer.

Reason code 15: A call to QrnDiStartArray was made, but the matching RPG variable or subfield is not an array.
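For comparison, I also wondered whether the array could be declared as a plain array subfield instead of a nested data structure with an unnamed subfield. A sketch of that variant (not tested against this exact document):

  dcl-ds MyData qualified;
    Code     varchar( 5 );
    Message  varchar( 50 );
    num_Data int( 10 );
    Data     varchar( 50 ) dim( 50 );  // plain array subfield
  end-ds;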


Without allowextra=yes I get reason code 5 instead, which is:
"The document contains extra names that do not match subfields."
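As I understand it, reason code 5 is the generic "extra names" failure, which you can reproduce with any document name that has no matching subfield. A contrived sketch (names made up):

  // JSON document: {"Code":400,"Extra":"x"}
  // "Extra" has no matching subfield in the DS below, so without
  // allowextra=yes DATA-INTO would end with RNX0356 reason code 5.
  dcl-ds Tmp qualified;
    Code varchar( 5 );
  end-ds;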

Is this because the Data array's subfield is unnamed (*N)? If so, is there a way around it?



This mailing list archive is Copyright 1997-2019 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].