It isn't a formula, it is an assignment.

The way you tell the difference is that whilst

x = x + 1;

makes sense to a programmer, it is clearly mathematically nonsensical - how can "x" be the same as "x + 1"?

Now look at the line in question:

val = val * 10 + ch - '0';

At the start, before any digits have been read, "val" is zero.

An ASCII digit '0'..'9' is in the variable "ch".

Subtracting '0' from the character converts it to the numeric value it represents; this works because the character codes for '0'..'9' are consecutive.

So, '3' - '0' = 3, for example.

Each time a new character is entered, the assignment takes the previous value of "val", multiplies it by ten, and adds the decimal value of that new digit.

Imagine I typed '1' and then '6'.

First time through, val = 0 * 10 + 1 = 1.

Second time through, val = 1 * 10 + 6 = 16.