Bafflingly, Serial.write(arrString); returns exactly the same result as Serial.write(&arrString[0]);...surely, it should return the address of the first element of the array?
I'm sure I've done this in C (years ago) and it worked then - is it something to do with .write() trying to be too clever for its own good?
Bafflingly, … returns exactly the same result as … surely, it should return the address of the first element of the array?
Why are you using Serial.write() to send data to the Serial Monitor? What do you expect the BINARY value being sent to mean to an application that expects ASCII data?
Both of those expressions are character pointers (the address of a character in memory). If you want to print the value of the pointer, you should cast it to an unsigned integer type. If you use .write() on an integer you will get a single character; to display an integer, use .print(). Addresses are traditionally displayed in hexadecimal.
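For example, a minimal sketch (arrString is taken from your post; the rest is illustrative) that prints the pointer's value in hex, assuming a core where uintptr_t maps to one of .print()'s unsigned integer overloads:

    char arrString[] = "hello";

    void setup() {
      Serial.begin(9600);
      // .write() would send these bytes raw; .print() formats them as text.
      // Cast the pointer to an integer type wide enough to hold an address,
      // then print it in hex.
      Serial.print((uintptr_t)arrString, HEX);      // address of the array
      Serial.println();
      Serial.print((uintptr_t)&arrString[0], HEX);  // same address
      Serial.println();
    }

    void loop() {}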
0x10c:
Bafflingly, Serial.write(arrString); returns exactly the same result as Serial.write(&arrString[0]);...surely, it should return the address of the first element of the array?
It's function overloading that works this magic: Serial.write() "knows" the data type of what you send to it. In C++ an array name decays to a pointer to its first element, so arrString and &arrString[0] have exactly the same type and value, and both calls resolve to the same overload.
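A minimal desktop C++ sketch showing this (the write() functions below are hypothetical stand-ins, not the real Serial.write() implementation): overload resolution picks the same function for both spellings, because the array name decays to a char*, which is exactly what &arrString[0] already is.

    #include <cstdio>
    #include <cstddef>
    #include <cstdint>

    // Hypothetical stand-ins for two of Serial.write()'s overloads.
    size_t write(uint8_t b)       { std::puts("write(uint8_t)");      return 1; }
    size_t write(const char *str) { std::puts("write(const char *)"); return 0; }

    int main() {
      char arrString[] = "hello";
      write(arrString);      // array decays to char* -> const char * overload
      write(&arrString[0]);  // already a char*       -> same overload
      write(arrString[0]);   // a single char         -> uint8_t overload
    }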