Help, I am stuck converting a 16-bit hex value to 2 characters

Please help!
This does not work.
I am just trying to store the high byte in a char array element
and the low byte in the next char array element.
The two char array elements must contain the hex representation of the high and low bytes.

Example: the highByte value AF hex would be stored in CharResponse[1] as the "A" character,
and the lowByte value AF hex would be stored in CharResponse[2] as the "F" character.

 if (addr[i] < 16) {Serial.print('0');serial_number_string[++serial_number_Postion_Pointer] = '0';
 serial_number_string[++serial_number_Postion_Pointer] = lowByte(addr[i]); 
}
         
  if (addr[i] >= 16) {serial_number_string[++serial_number_Postion_Pointer] = highByte(addr[i],HEX);
   serial_number_string[++serial_number_Postion_Pointer] = lowByte(addr[i], HEX); }

Thank You!
Ron Derkis
rderkis@hotmail.com

rderkis: This does not work

You know that you can use all the text/string formatting functions of the AVR libc library with Arduino, don't you?

  Serial.begin(9600);
  uint16_t number= 0xAB;
  char buf[3];
  snprintf(buf,sizeof(buf),"%02X",number);
  Serial.println(buf);

OP: Look at your mess of code. Look at jurs' code. The ONLY correct number of statements on one line is ONE! You are writing code, not a novel. You are not being charged by the line. Use lots of them!

I can make no excuses for my coding mess. I don't do too badly for a 67-year-old learning a new language. When I started programming on the 2K VIC-20, using assembly language, I was very careful about code length. But now we have multi-terabyte hard drives and gigabytes of memory, on an Intel i7 processor.

Compact, fast code is no longer necessary in most cases, but I grant you that proper coding is fast becoming a lost art. And one I envy you for! :-)

PaulS: OP: Look at your mess of code. Look at jurs' code. The ONLY correct number of statements on one line is ONE! You are writing code, not a novel. You are not being charged by the line. Use lots of them!

How's your Dutch language :grin: We have a few of those "creative" programmers over here (Holland) as well, who believe all code should stay as far left as possible, comments are costly, and a program should be limited to as few lines as possible so it will run faster :grin: :grin:

Hello,

I don't understand why you say 16 bits; 0xAF is 8 bits, not 16.

rderkis: Compact, fast code is no longer necessary in most cases, but I grant you that proper coding is fast becoming a lost art. And one I envy you for! :-)

I disagree with you (although I am 11 years younger :grin: ) that compact, fast code is no longer necessary. That proper coding is becoming a lost art, however, I absolutely agree with. Although, I suspect it will change again as outsourcing becomes less popular and companies lose the inside knowledge of the low-level operation of their systems, resulting in increasing failure of critical application systems. If the art is lost, some will have no clue anymore what their systems are actually doing. And I am not talking about small companies.

If you want to convert a 16-bit number into its 2 bytes, why not use a union?

union {
    unsigned int number;
    struct {
        uint8_t byte_low;   // low byte first: AVR is little-endian
        uint8_t byte_high;
    };
} convert;

test:

    convert.number = 0xabcd;
    Serial.print(convert.number, HEX);
    Serial.print(" - ");
    Serial.print(convert.byte_high, HEX);
    Serial.print(" - ");
    Serial.print(convert.byte_low, HEX);

Edit: I made a "little endian mistake" :) Fixed.

nicoverduin: Although I suspect it will change again as outsourcing becomes less popular and companies lose the inside knowledge of the low-level operation of their systems, resulting in increasing failure of critical application systems. If the art is lost, some will have no clue anymore what their systems are actually doing. And I am not talking about small companies.

I would strongly disagree with you! Computing power is growing exponentially. Man will not be able to keep up, even a little. The computer will soon be writing programs so powerful that we won't be able to follow the program logic.

But on the lighter side, before then, computers will be able to rewrite my code using super-compact source code and even fix the errors :-)

If you think I am talking about the far future, you could not be further from the truth. In ten years or less the computer will have the computing power of the human mind. After that all bets are off as to how fast they will grow in intelligence.

We are in a race for our very existence, and not winning. Check out what Stephen Hawking, Bill Gates and a few other very intelligent and knowledgeable people are saying about computers.

Here is a quick function to convert a nibble to a hex char; it should be much faster and cost much less memory than sprintf.

char nibbleToHexChar( const uint8_t nibble )
{
	return nibble + (nibble < 0xA ? '0' : 55); // 55 == 'A' - 10
}

then you can use it like so

uint8_t byte = 0xAF;
CharResponse[0] = nibbleToHexChar( (byte >> 4) & 0xF );
CharResponse[1] = nibbleToHexChar( byte & 0xF );

guix:
Here is a quick function to convert a nibble to a hex char, that should be much faster and cost much less memory than sprintf.

char nibbleToHexChar( const uint8_t nibble )
{
	return nibble + (nibble < 0xA ? '0' : 55);
}

then you can use it like so

uint8_t byte = 0xAF;
CharResponse[0] = nibbleToHexChar( (byte >> 4) & 0xF );
CharResponse[1] = nibbleToHexChar( byte & 0xF );

Don't know why you would write it like that... MUCH clearer to write:

char nibbleToHexChar( const uint8_t nibble )
{
	return nibble + (nibble < 10 ? '0' : 'A' - 10);
}

Regards,
Ray L.

rderkis:
I would strongly disagree with you! Computing power is growing exponentially. Man will not be able to keep up, even a little. The computer will soon be writing programs so powerful that we won't be able to follow the program logic.

But on the lighter side, before then, computers will be able to rewrite my code using super-compact source code and even fix the errors :slight_smile:

If you think I am talking about the far future, you could not be further from the truth. In ten years or less the computer will have the computing power of the human mind. After that all bets are off as to how fast they will grow in intelligence.

We are in a race for our very existence, and not winning. Check out what Stephen Hawking, Bill Gates and a few other very intelligent and knowledgeable people are saying about computers.

This is the debate that keeps coming back... Those same people said the same things in the '70s, '80s, '90s, etc.
IBM announced their first biological flip-flop: within 5 years we would have biological computers. IBM announced at the beginning of this century that they had developed their first laser-based flip-flop: within 5 years we would have computers working on light. Recently they announced a massively parallel processor based on the workings of the neurons in our brains, and had configured boards with millions of neurons and billions of synapses. The "Blue Brain Project" has shown extensive simulations and models based on the workings of the brain. The program director only needed some 12 billion US$ to build a full-size computer to simulate the human brain.
We have a saying here in Holland, "De soep wordt nooit zo heet gegeten als dat hij wordt opgediend", meaning "The soup is never eaten as hot as it is served". In other words, it is true that we invent technology exceeding our own capacity, but somehow we never give ourselves the time to use it.

nicoverduin: This is the debate that keeps coming back... Those same people said the same things in the '70s, '80s, '90s, etc. IBM announced their first biological flip-flop: within 5 years we would have biological computers. IBM announced at the beginning of this century that they had developed their first laser-based flip-flop: within 5 years we would have computers working on light. Recently they announced a massively parallel processor based on the workings of the neurons in our brains, and had configured boards with millions of neurons and billions of synapses. The "Blue Brain Project" has shown extensive simulations and models based on the workings of the brain. The program director only needed some 12 billion US$ to build a full-size computer to simulate the human brain. We have a saying here in Holland, "De soep wordt nooit zo heet gegeten als dat hij wordt opgediend", meaning "The soup is never eaten as hot as it is served". In other words, it is true that we invent technology exceeding our own capacity, but somehow we never give ourselves the time to use it.

I agree with you and your philosophy, I feel much safer about things I can't control by burying my head in the sand. :-)

RayLivingston:
Don't know why you would write it like that... MUCH clearer to write:

char nibbleToHexChar( const uint8_t nibble )
{
	return nibble + (nibble < 10 ? '0' : 'A');
}

Regards,
Ray L.

No, that's clearly wrong!

	return nibble + (nibble < 10 ? '0' : 'A'-10);

Actually I prefer the table lookup since it's more readable:

    return "0123456789ABCDEF"[nibble & 0xF];

rderkis:
I agree with you and your philosophy, I feel much safer about things I can't control by burying my head in the sand. :slight_smile:

Sorry :slight_smile: already occupied by ostriches :grin:

I have been thinking about what you said. No idea should be tossed out without thinking about it. Why would someone as intelligent as you not truly understand the idea of exponential growth? You mentioned going back as far as the 70s. To you and me, with our linear thinking, that seems like a long time. But think about it: 44 years at most (since 1970). Let's put this in perspective. The latest research indicates Homo sapiens have been around for about 200,000 years. It took about 198,000 years to get to the era when Christ lived (donkeys and carts). 2,000 years later we are walking on the moon and visiting Mars. In World War 2, knowledge was doubling every 25 years. As of last year, all human knowledge is doubling every year. Very soon knowledge will be doubling every 6 months.

In other words, what significance do the 44 years you mention have today?
