two's-complement hex to dec problem

Hi everybody!

Hope somebody can help me out with this; I can't seem to find the right approach:

I'm trying to encode/decode GPS information into a hex protocol to be sent over a radio link. The protocol is fixed and I have to work according to it. Here is the example for encoding latitude:

The latitude should be encoded as a '4 byte long': the decimal degrees converted to radians and multiplied by 100 000 000. A positive number represents the Northern Hemisphere, a negative number the Southern Hemisphere.

For example:

51° N ---> (51 * 3.14) / 180 = 7.225 in radians; 7.225 * 100 000 000 = 722 500 000 = 2B 10 79 A0

Say it had been 51° S: then I would have to multiply 722 500 000 by -1, giving -722 500 000 (FFFF FFFF D4EF 8660 on a 64-bit calculator; the low 4 bytes, D4 EF 86 60, are what actually go into the message).
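For reference, the byte-packing step on the encoding side can be sketched like this (packInt32 is a hypothetical helper name; I'm assuming the most significant byte is sent first, as in the example above):

```cpp
#include <stdint.h>

// Pack a signed 32-bit value into 4 bytes, most significant byte first.
// Casting to uint32_t keeps the two's-complement bit pattern, so
// +722500000 yields 2B 10 79 A0 and -722500000 yields D4 EF 86 60.
void packInt32(int32_t value, uint8_t out[4]) {
  uint32_t bits = (uint32_t)value;  // same bits, unsigned view
  out[0] = (uint8_t)(bits >> 24);
  out[1] = (uint8_t)(bits >> 16);
  out[2] = (uint8_t)(bits >> 8);
  out[3] = (uint8_t)(bits);
}
```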

This seems to work fine on the encoding side. I have bought a device (which came with the description of the protocol) that can decode those 4 bytes and displays the right hemisphere.

Now I'm trying to write an Arduino sketch that can decode those 4 bytes too, but I cannot figure out how to determine whether the original number was negative or positive. For example:

Decimal 5 is 0x05 (0000 0101). Decimal -5 is 0xFFFFFFFB (1111 1111 1111 1111 1111 1111 1111 1011), applying the rules of two's complement: toggle all the bits and add 1.
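To make the ambiguity concrete, here is a small illustrative Arduino sketch that prints the very same 32-bit pattern both ways:

```cpp
#include <stdint.h>

void setup() {
  Serial.begin(9600);
  uint32_t raw = 0xFFFFFFFBUL;   // the bit pattern of -5 from above
  Serial.println(raw);           // 4294967291: read as a huge positive number
  Serial.println((int32_t)raw);  // -5: same bits, read as signed
}

void loop() {}
```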

How would I know if I'm dealing with a negative, two's-complement encoded number, or just a huge positive number?

Thanks a lot! Jens

How would I know if I'm dealing with a negative, two's-complement encoded number, or just a huge positive number?

The biggest positive two's-complement number does not have its most significant bit set. In two's complement the most significant bit is the sign bit: if bit 31 of your 4-byte value is 1, the number is negative. You can't tell from the bits alone; the protocol tells you the field is signed, so interpret it as signed.
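In practice you rarely test the bit yourself: assemble the 4 bytes into a signed 32-bit type and the compiler applies the two's-complement interpretation for you. A minimal sketch, assuming the bytes arrive most significant first in a 4-byte array (unpackInt32 and buf are hypothetical names):

```cpp
#include <stdint.h>

// Rebuild the signed 32-bit value from 4 bytes, most significant first.
// Assembling into a uint32_t avoids sign-extension surprises; the final
// cast to int32_t reinterprets the bits as two's complement, so
// D4 EF 86 60 comes back as -722500000 and 2B 10 79 A0 as +722500000.
int32_t unpackInt32(const uint8_t buf[4]) {
  uint32_t bits = ((uint32_t)buf[0] << 24)
                | ((uint32_t)buf[1] << 16)
                | ((uint32_t)buf[2] << 8)
                |  (uint32_t)buf[3];
  return (int32_t)bits;
}

// If you do want an explicit sign test, check bit 31:
bool isNegative(const uint8_t buf[4]) {
  return (buf[0] & 0x80) != 0;  // MSB set => negative in two's complement
}
```

From there, divide by 100 000 000.0 to get radians back; a negative result means the Southern Hemisphere.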