On 19/08/2008, at 6:11 AM, Charles Wilt wrote:

IIRC, some of the APIs tell you whether the parameter is signed or
unsigned. But most don't.

The ones that don't tell you expect signed values.

If it's not specified, do you choose signed, or do you look at the
description and decide if unsigned is a better choice?

In general, if the API does not specifically state unsigned then you should use signed. The "Data Types and APIs" section of the Information Centre says that BINARY data types must be signed. Unsigned parameters will be explicitly flagged as such using BINARY(x) UNSIGNED.

For example, a trigger buffer has the following:

  Dec  Hex
   36   24  BINARY(4)  CCSID of data
   48   30  BINARY(4)  Original record offset
   52   34  BINARY(4)  Original record length

CCSIDs range from 0 to 65535, so signed binary is sufficient.

Record length (without LOBs) ranges from 1 to 32766 (reducing for each null-capable or varying-length field), so signed binary is sufficient.

Neither of these values is likely to change without major disruption.

I think the primary reason for using BINARY(4) where BINARY(2) would suffice is simply to minimise cross-language differences, specifically to avoid widening issues with C (and its derivatives). It is also easier to use 10I 0 (or int) for all numeric parameters than to constantly check the documentation for size and sign.

Simon Coulter.
FlyByNight Software OS/400, i5/OS Technical Specialists

Phone: +61 2 6657 8251 Mobile: +61 0411 091 400 /"\
Fax: +61 2 6657 8251 \ /
ASCII Ribbon campaign against HTML E-Mail / \


This mailing list archive is Copyright 1997-2020 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].