On Mon, 2008-08-18 at 16:11 -0400, Charles Wilt wrote:
All,

Just curious as to what guidelines you use when choosing between
signed or unsigned integer to replace the binary data type in API
calls.

IIRC, some of the APIs tell you whether the parameter is signed or
unsigned. But most don't.

If it's not specified, do you choose signed, or do you look at the
description and decide whether unsigned is a better choice?

For example, a trigger buffer has the following:

Offset
Dec  Hex
36   24   BINARY(4)   CCSID of data
48   30   BINARY(4)   Original record offset
52   34   BINARY(4)   Original record length

None of which would ever be negative as far as I can tell.


Just curious,


Charles Wilt
--
Software Engineer
CINTAS Corp.

Charles,

I remembered that the C standard leaves the signedness of some
integral types (plain char, for one) implementation-defined, but I
make a point of not remembering which ones, and I needed a break
from reading. So ...

#include <stdio.h>

int main(void)
{
    int x = -1;

    if (x < 0)
        printf("int is signed\n");
    else
        printf("int is unsigned\n");

    return 0;
} /* main */


As the other responses tell you,

int is signed

Enjoy.

Terry.



This mailing list archive is Copyright 1997-2019 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].