Just curious as to what guidelines you use when choosing between a
signed or unsigned integer to represent the BINARY data type in an API.

IIRC, some of the APIs tell you whether the parameter is signed or
unsigned. But most don't.

If it's not specified, do you default to signed, or do you look at the
description and decide whether unsigned is a better choice?

For example, a trigger buffer has the following:
Dec  Hex  Type       Field
 36   24  BINARY(4)  CCSID of data
 48   30  BINARY(4)  Original record offset
 52   34  BINARY(4)  Original record length

None of which would ever be negative as far as I can tell.

Just curious,

Charles Wilt
Software Engineer




This mailing list archive is Copyright 1997-2020 by David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].