Not understanding toggle code

GoForSmoke:
Is there a binary logic chip made that uses other than 0 for FALSE/LOW and 1 for TRUE/HIGH?
Memory, controller, processor, register?

But we are not talking about chips.
We are talking about how to write software that uses a software API function.
The s/w API says to use the defines "HIGH" and "LOW" to get things done.
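
For instance (the pin number here is just for illustration):

digitalWrite(13, HIGH);  // using the API as defined
digitalWrite(13, 1);     // assumes HIGH happens to be 1 - works today, but it is outside the API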

Don't get hung up on this specific digitalWrite() function call and how HIGH and LOW
map to the output states of the pin.

BTW, there are some chips that use separate set and clear registers.
In that situation, 1 bits written to a clear register will actually clear bits, not set them.
And there are even examples where true was defined as 0 and false as 1 in software
from the 80's on at least one RISC processor - but let's not delve down there.
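
As a rough sketch of the set/clear register idea (the register names and addresses below are invented for illustration, not taken from any real part):

#include <stdint.h>

#define PORT_SET (*(volatile uint32_t *)0x40000004)  /* a 1 bit written here sets the pin   */
#define PORT_CLR (*(volatile uint32_t *)0x40000008)  /* a 1 bit written here CLEARS the pin */

void pin3_high(void) { PORT_SET = (1u << 3); }  /* writing a 1 turns the pin on  */
void pin3_low(void)  { PORT_CLR = (1u << 3); }  /* the very same 1 turns it off */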

Again, it is the concept of properly using an API as it is defined vs stepping outside the defined API
to use it in ways that are not defined.

You guys are getting all hung up over this one simple function.
I'm trying to express a reason to follow an API, which is a much bigger concept.

To help illustrate, let's look at some other API function.
Maybe something like:

setpixel(int x, int y, int color);

Where color can be WHITE or BLACK.
Without looking, could you be sure what the values for BLACK or WHITE were? No....
If you were to take a sneak peek at what BLACK and WHITE are, maybe you could write some
slightly better code that uses their raw values rather than the defines.
Or code that took advantage of the library assuming one color when the value was zero and
using the other color for any non-zero value.
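
For example, suppose (purely hypothetically) that BLACK happened to be 0 and WHITE happened to be 1:

setpixel(10, 20, BLACK);   // using the define - safe
setpixel(10, 21, 0);       // peeking: assumes BLACK is 0
setpixel(10, 22, !color);  // assumes 0 is one color and anything non-zero is the other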

But then I re-write the library to get ready for additional colors and I change the values for
BLACK and WHITE to other values. Those that use BLACK and WHITE vs their raw values will
be fine. Those that went around the defines or took advantage of the library
working a particular way when, say, the color was non-zero will get burned.
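
To make that concrete, the hypothetical version 2 of the library might ship with values like:

#define BLACK 0x00
#define WHITE 0x07
#define RED   0x04
#define GREEN 0x02
#define BLUE  0x01

setpixel(x, y, WHITE) still draws white, but setpixel(x, y, 1) now draws blue, and !color no longer flips between the two colors.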

C++ could solve this by creating a function that only accepted the proper defines/enums
to ensure that people didn't bypass the proper types on functions like digitalWrite(),
but today digitalWrite() is a C function using only native types, and so there is no way to enforce
that the "val" argument is of the proper type vs some calculated numeric value made from assuming
the values of the defines.
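
A rough sketch of what that could look like (the names here are invented for illustration; this is not the actual core API):

#include <stdint.h>

enum class PinState : uint8_t { Low, High };

void digitalWriteTyped(uint8_t pin, PinState val)
{
    /* ... write (val == PinState::High) to the pin's register ... */
    (void)pin; (void)val;
}

void example()
{
    digitalWriteTyped(13, PinState::High);  // compiles
    // digitalWriteTyped(13, 1);            // compile error: no implicit conversion from int
}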

--- bill