
Topic: hex/byte to int conversion

EVP

#15
Jun 08, 2012, 12:31 am
Code: [Select]
int convertFromHex(int ascii){
 if(ascii > 0x39) ascii -= 7; // adjust for hex letters upper or lower case
 return(ascii & 0xf);
}


This receives an int and returns an int. Is this a joke? What is its purpose?

Ohh, it adjusts for upper/lower case. What does return(ascii & 0xf); do? & means the address of something, so what does ascii & 0xf mean?

majenko

An int is just a bigger storage area than a byte.  A char is a byte.  You can store a char in an int, but you waste space.

You can replace the "int"s with "char"s.
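
For example (untested, but it's the same logic with the types narrowed):

Code: [Select]
char convertFromHex(char ascii){
 if(ascii > 0x39) ascii -= 7; // adjust for hex letters upper or lower case
 return(ascii & 0xf);
}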

EVP

Ahh yep, get that, but I still don't quite understand what the return actually means. Cheers

majenko

& has different meanings in different situations.

"&variablename" is the address of variablename.

"variable & variable" is a logical and between two variables or literals.

"ascii & 0xf" is the value in ascii anded with 0xf.

If ascii contains 0x8B, that is a binary value 10001011.  0xf is 00001111.

AND the two together and you get 00001011.

Or, 0xB.  A quirk (or a design) of the ASCII character set places certain character groups on aligned boundaries, which can be isolated with simple mathematical operations like this.  ASCII '3' is 0x33.  AND that with 0xf (which is the same as 0x0f) and you get 0x03 (or 3 in decimal).
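
If you want to watch that happen, a quick throwaway sketch along these lines (untested, just for illustration) prints the masked values to the serial monitor:

Code: [Select]
void setup(){
 Serial.begin(9600);
 Serial.println('3' & 0xf, HEX); // '3' is 0x33, so this prints 3
 Serial.println('B' & 0xf, HEX); // 'B' is 0x42, so this prints 2 - which is why the -7 adjustment is needed first
}

void loop(){}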

EVP

#19
Jun 08, 2012, 12:49 am
Ahh nice one majenko. That answers my question exactly. Thanks.




Not sure why I started the last two posts with 'Ahh', other than I have a very literal, phonetic way of typing.

AWOL

Quote
"variable & variable" is a logical and between two variables or literals.

No, that's the bitwise AND of two variables.
"&&" is logical AND
"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.
Do not send technical questions via personal messaging - they will be ignored.

majenko

Sorry, bitwise, that's what I meant.  I know that now that I have had some sleep.

AWOL

Quote
Ohh, it adjusts for upper/lower case.

No, it doesn't.
It simply ignores upper or lower case.
ASCII digits '0' to '9' have hex values 0x30 to 0x39.
ASCII letters 'A' to 'F' have hex values 0x41 to 0x46, or 'a' to 'f' 0x61 to 0x66.
If an ASCII hex digit is greater than '9' (that is the if(ascii > 0x39) test), then subtracting 7 from it (assuming it is A to F or a to f) will convert the least significant four bits of it to 0x0A to 0x0F, so that ANDing it with 15 will yield the hex digit represented by the ASCII character.

Of course, you can give it any ASCII character (including punctuation), and it will attempt to turn it into hex.
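
If that matters to you, one way around it (a range-checked sketch of my own, untested, and not the code posted earlier) is to validate before converting:

Code: [Select]
// Returns 0 to 15 for a valid hex digit, or -1 for anything else.
int convertFromHexChecked(int ascii){
 if(ascii >= '0' && ascii <= '9') return ascii - '0';
 if(ascii >= 'A' && ascii <= 'F') return ascii - 'A' + 10;
 if(ascii >= 'a' && ascii <= 'f') return ascii - 'a' + 10;
 return -1; // not a hex digit
}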
"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.
Do not send technical questions via personal messaging - they will be ignored.


Grumpy_Mike

Quote
and it will attempt to turn it into hex.

No, not attempt; it will actually turn it into a 4-bit number.
Whether it makes any sense to you is another matter, but bit patterns are just bit patterns. Truly learn that and you are halfway to understanding most of what computers do.
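
For instance (worked by hand, not from the thread): feed it '*', which is 0x2A. That isn't above 0x39, so nothing is subtracted, and 0x2A & 0xf is 0xA, or 10 decimal. A perfectly good 4-bit number, whatever you meant by the '*'.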

AWOL

Quote
Truly learn that and you are halfway to understanding most of what computers do.

Ah! That's where I've been going wrong these thirty-odd years.

I meant to say that it will turn it into a bit pattern, and that bit pattern will be a hex digit, but that hex digit won't have any particularly meaningful arithmetic relationship to the input character as far as the OP is concerned; but the ex-PFC Wintergreen in my head said "Too prolix".
"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.
Do not send technical questions via personal messaging - they will be ignored.

majenko

Quote
that bit pattern will be a hex digit


Close...

That bit pattern can be represented as a hex digit.  :P

AWOL

Four binary bits form a hex digit, by definition.
"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.
Do not send technical questions via personal messaging - they will be ignored.
