Going slightly crazy...

I'm having a hell of a time sending data to my Arduino. You wouldn't think this is rocket science, but apparently it is. sigh

When I send the string "GAZ1{0}{127}{0}" (where each value in {} is a single byte, 0 to 255), my program reports:

71 65 90 49 0 127 0 10

The data is sent like this: Serial.WriteLine(txtBoardID.Text & Chr(chanCmd) & Chr(127) & Chr(0))
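For context, the PC-side setup looks roughly like this (a trimmed sketch of my code; Serial is a System.IO.Ports.SerialPort on the form, and the port name and baud rate here are just examples):

Imports System.IO.Ports
Imports System.Text

' Serial is a SerialPort instance on the form; txtBoardID.Text holds "GAZ1"
Serial.PortName = "COM3"        ' example port
Serial.BaudRate = 9600          ' example rate
Serial.Encoding = Encoding.UTF8 ' set explicitly (more on this below)
Serial.Open()

Dim chanCmd As Integer = 0
Serial.WriteLine(txtBoardID.Text & Chr(chanCmd) & Chr(127) & Chr(0))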

However, if I bump that to 128, this is what I get:

71 65 90 49 0 226 130 172 0 10

I've set the encoding via Serial.Encoding = Encoding.UTF8.

I'm just going in circles at this point. Can someone point out what I've got going on? I'm really puzzled by the Arduino reporting back more bytes than I sent...

Thanks!

g.

On the PC side, are you using Write(String) or Write(Byte(), Int32, Int32)?

The 0 and 10 at the end are the ASCII values for CR LF (carriage return and line feed), coming from the fact that you use WriteLine, which always appends these two chars to whatever you write to the serial port.

Being picky here, but carriage return is 13, not zero.

Coding Badly: I'm using WriteLine() as shown above.

MikMo: Let me try to clarify a bit...

I'm sending (including the LF/CR sequence):

71 65 90 49 0 128 0 10 13

I get echoed back:

71 65 90 49 0 226 130 172 0 10
G A Z 1 0 ................. 0 10

The three bytes in the middle (226 130 172) are appearing instead of the single byte value of 128 that was sent. The byte value 13 is not shown because in this particular example it's being "eaten" by a ReadLine() call (afaik, ReadExisting() would return the CR).

The input buffer on the Arduino side is declared thus:

int buffer[128];

I fill it like this:

void loop() {
  if (Serial.available() > 0) {
    int inByte = Serial.read();
    if (bufIndex < 128) {   // guard so we never write past the end of buffer
      buffer[bufIndex] = inByte;
      bufIndex++;
    }
    if ((inByte == '*') || (inByte == '\n')) {
      processBuffer();
      clearBuffer();
      bufIndex = 0;
      //havePrefix = false;
    }
  }
}

I just don't understand where the extra bytes are coming from and it's driving me out of my mind. :)

tnx.

g.

I'd be looking into the Chr() function. I suspect that it only operates the way you want when the input value is in the range 0 to 127. Perhaps values outside that range, like 128, cause it to send an escape sequence or something.

PaulS: It was my understanding that setting the encoding to UTF8 should have fixed any .Net induced weirdness. I know that if I leave the encoding at the default, it translates anything greater than 127 to 63 (the ASCII code for '?').

tnx.

g.

I just don't understand where the extra bytes are coming from and it's driving me out of my mind

According to Microsoft's documentation, it's the "high bit characters" being converted to UTF-8. In other words, Serial is converting a Unicode String to a stream of UTF-8 bytes.

Also according to the documentation, the solution is to output an array of bytes [using Write(Byte(), Int32, Int32)] instead of outputting a String.
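Something along these lines might do it (an untested sketch; adjust the names to match your project):

Imports System.Collections.Generic
Imports System.Text

' Build the frame as raw bytes so no string encoding is involved
Dim payload As New List(Of Byte)
payload.AddRange(Encoding.ASCII.GetBytes(txtBoardID.Text)) ' "GAZ1" is plain ASCII, safe to encode
payload.Add(CByte(chanCmd)) ' channel command byte
payload.Add(128)            ' goes out as one raw byte -- no UTF-8 expansion
payload.Add(0)
payload.Add(10)             ' the LF that WriteLine was appending for you
Serial.Write(payload.ToArray(), 0, payload.Count)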

I have no idea what you need to change but hopefully you're now pointed in the right direction.

I've tried using Write() before and it didn't work. I'll try it again...

I'll try pulling the Serial.Encoding... call and see if that helps.

I feel sorry for the poor jerk that wants to implement something like ZModem with this framework. sigh

I don't know what jackass at Microsoft thought forcing character encoding on the serial control was a good idea, but he needs to be brutally and publicly executed as a warning to others. :)

g.

It is .NET... everybody needs to be able to create applications by just dragging some components around. What? You mean coding is meant for coders? :-? Are you sure? Microsoft begs to differ... they even think errors shouldn't be part of an application, which is why Visual Basic has that lovely line 'On Error Resume Next'. Surely you can't mean that coding isn't as easy as Microsoft says it is! :-X

Silliness aside, .NET does some really funky stuff under the hood; even if you explicitly set the encoding, it might still revert to a default (I've had this happen). Check the stream on the PC side as well and make sure it sends the data in the format you expect. Another tip is to remove as many default libraries as possible. Visual Studio automatically adds libraries such as LINQ to your project, and these can potentially screw up some code as well (especially LINQ, as it allows for some new syntax structures).

What kills me is that everything I've read about UTF8 says that it will deliver an unsigned 8-bit character, while the ASCIIEncoding setting delivers only 7-bit characters.
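To convince myself, I wrote a quick test of what each encoding actually does with Chr(128). On a machine whose default code page is Windows-1252 (an assumption, but the usual case here), Chr(128) comes back as the euro sign "€" (U+20AC), and the output matches every byte pattern in this thread:

Imports System.Text

Dim s As String = Chr(128) ' via the Windows-1252 code page this is "€" (U+20AC)
Console.WriteLine(String.Join(" ", Encoding.UTF8.GetBytes(s)))    ' 226 130 172 -- the mystery bytes
Console.WriteLine(String.Join(" ", Encoding.ASCII.GetBytes(s)))   ' 63 -- the "?" substitution
Console.WriteLine(String.Join(" ", Encoding.Default.GetBytes(s))) ' 128 -- the byte I actually wanted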

It's also entirely possible that I've over-thought this thing to the point where I should probably scrap it and start over. laughs

Thanks all. [Edit - Setting Serial.Encoding = Encoding.Default fixed the issue]

g.