# Problem with String/Char to int conversion

I have the following problem. I have

```
minuty = myRTC.minutes;
```

which, if I send it to the serial output with

```
Serial.println(minuty);
```

returns

"34"

I can use charAt to get "3" and "4", which I want to send to the following function:

```
void Display1to9 (int Digit){
  if (Digit == 0) DisplayDigit(1, 1, 1, 0, 1, 1, 1);
  if (Digit == 1) DisplayDigit(0, 0, 1, 0, 0, 1, 0);
  if (Digit == 2) DisplayDigit(1, 0, 1, 1, 1, 0, 1);
  if (Digit == 3) DisplayDigit(1, 0, 1, 1, 0, 1, 1);
  if (Digit == 4) DisplayDigit(0, 1, 1, 1, 0, 1, 0);
  if (Digit == 5) DisplayDigit(1, 1, 0, 1, 0, 1, 1);
  if (Digit == 6) DisplayDigit(1, 1, 0, 1, 1, 1, 1);
  if (Digit == 7) DisplayDigit(1, 0, 1, 0, 0, 1, 0);
  if (Digit == 8) DisplayDigit(1, 1, 1, 1, 1, 1, 1);
  if (Digit == 9) DisplayDigit(1, 1, 1, 1, 0, 1, 1);
}
```

If I run this function like this

```
Display1to9(1);
```

it nicely displays 1 on my 7-digit display. But when I run

```
Display1to9(minuty.charAt(0));
```

where minuty = "34" and minuty.charAt(0) returns "3", the function does not light up 3. I cannot find a clear function to convert either a String or a char to an int, and I am pretty much stuck.

There is a difference between the number 3 and the character ‘3’.

Hi

MorganS is correct: the ASCII character '3' has a decimal value of 51 (try doing a Google search for "ASCII table").

You can either subtract 48 (the value of the character '0') from Digit before doing the comparison, like this:

```
void Display1to9 (int Digit){
  Digit = Digit - 48;
  if (Digit == 0) DisplayDigit(1, 1, 1, 0, 1, 1, 1);
  if (Digit == 1) DisplayDigit(0, 0, 1, 0, 0, 1, 0);
  if (Digit == 2) DisplayDigit(1, 0, 1, 1, 1, 0, 1);
  if (Digit == 3) DisplayDigit(1, 0, 1, 1, 0, 1, 1);
  if (Digit == 4) DisplayDigit(0, 1, 1, 1, 0, 1, 0);
  if (Digit == 5) DisplayDigit(1, 1, 0, 1, 0, 1, 1);
  if (Digit == 6) DisplayDigit(1, 1, 0, 1, 1, 1, 1);
  if (Digit == 7) DisplayDigit(1, 0, 1, 0, 0, 1, 0);
  if (Digit == 8) DisplayDigit(1, 1, 1, 1, 1, 1, 1);
  if (Digit == 9) DisplayDigit(1, 1, 1, 1, 0, 1, 1);
}
```

or compare characters rather than values, like this:

```
void Display1to9 (int Digit){
  if (Digit == '0') DisplayDigit(1, 1, 1, 0, 1, 1, 1);
  if (Digit == '1') DisplayDigit(0, 0, 1, 0, 0, 1, 0);
  if (Digit == '2') DisplayDigit(1, 0, 1, 1, 1, 0, 1);
  if (Digit == '3') DisplayDigit(1, 0, 1, 1, 0, 1, 1);
  if (Digit == '4') DisplayDigit(0, 1, 1, 1, 0, 1, 0);
  if (Digit == '5') DisplayDigit(1, 1, 0, 1, 0, 1, 1);
  if (Digit == '6') DisplayDigit(1, 1, 0, 1, 1, 1, 1);
  if (Digit == '7') DisplayDigit(1, 0, 1, 0, 0, 1, 0);
  if (Digit == '8') DisplayDigit(1, 1, 1, 1, 1, 1, 1);
  if (Digit == '9') DisplayDigit(1, 1, 1, 1, 0, 1, 1);
}
```

Either method will display the correct digit.

> Digit = Digit - 48;

It would be much simpler if it was written:

```
Digit = Digit - '0';
```

This way you don't have to know the magic number 48, and if it ever changes for any reason (say you move to a system with the EBCDIC character set) you don't have to search your code and replace all the 48s.

In a programming-language context, the difference between the two symbols 3 and '3' can also be understood from the following example:

```
byte x1 = 3;
byte x2 = '3';
```

In response to the top definition, the compiler assigns the 8-bit natural binary value 00000011 (0x03) to the variable x1.

In response to the bottom definition, the compiler assigns the 8-bit ASCII code 00110011 (0x33) for the digit (character) 3 (as per Fig-1) to the variable x2.

Figure-1: ASCII chart for the characters of the English language alphabet