I've searched around but can't find a solution to this problem. I'm using the simplest example I can think of to test this. This is the Arduino code:
void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int i = 48; i < 51; i++) {
    Serial.write(i); // sends the raw byte values 48-50, which are the ASCII characters '0'-'2'
  }
}
If I view the output in the Arduino serial monitor, it correctly prints "012012012012012012012" and so on without any problems.
I wrote a simple sketch in Processing to view the data:
import processing.serial.*;

Serial myPort;

void setup() {
  String[] ports = Serial.list();
  myPort = new Serial(this, ports[1], 9600); // index 1 happens to be the Arduino's port on my machine
}

void draw() {
  if (myPort.available() >= 10) { // wait until at least 10 bytes have arrived
    byte[] serialIn = new byte[10];
    myPort.readBytes(serialIn);
    for (int i = 0; i < serialIn.length; i++) {
      println(serialIn[i] + " binary:" + binary(serialIn[i]));
    }
  }
}
Most of the time it prints junk:
-126 binary:10000010
-118 binary:10001010
-110 binary:10010010
-126 binary:10000010
-118 binary:10001010
-110 binary:10010010
And occasionally it prints the correct values:
48 binary:00110000
49 binary:00110001
50 binary:00110010
48 binary:00110000
49 binary:00110001
50 binary:00110010
It looks like each byte gets shifted to the left by 3 bits, although I can't work out why it behaves differently each time I run the program. Interestingly, if I get the Arduino to send 0, 1, 2 it never prints garbage.
Really I want to read the data in C#. This is the meat of the C# code, which is based on this example:
http://msmvps.com/blogs/coad/archive/2005/03/23/SerialPort-_2800_RS_2D00_232-Serial-COM-Port_2900_-in-C_2300_-.NET.aspx
if (serialPort.BytesToRead > 0)
{
    // read and print one byte at a time as it arrives
    byte temp = (byte)serialPort.ReadByte();
    Console.WriteLine(temp + "\t binary: " + byte2Binary(temp));
}
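
byte2Binary isn't shown here; it's just a small formatting helper, roughly along these lines (the exact implementation isn't important):

// Formats a byte as an 8-character binary string, e.g. 48 -> "00110000".
// Convert.ToString drops leading zeros, so PadLeft restores the full 8 bits.
// (Shown for completeness only.)
static string byte2Binary(byte b)
{
    return Convert.ToString(b, 2).PadLeft(8, '0');
}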
The C# program prints some of the correct values, but it often misses numbers and the output is interspersed with garbage:
130 binary: 10000010
138 binary: 10001010
48 binary: 00110000
49 binary: 00110001
50 binary: 00110010
146 binary: 10010010
49 binary: 00110001
50 binary: 00110010
49 binary: 00110001
50 binary: 00110010
146 binary: 10010010
49 binary: 00110001
50 binary: 00110010
48 binary: 00110000
138 binary: 10001010
146 binary: 10010010
The binary values of the incorrect data are exactly the same as the binary values I see in Processing. Here the 3-bit shift seems to happen at random during a run instead of being consistent for each run. I have tried putting a delay of up to 200 ms between sending serial commands; it helps a bit, but I still get at least 10% junk data. I'm trying to run a control loop, so the longest delay I can afford is 3 ms.
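
In case the problem is in how the port is opened rather than in the read loop: is there anything I should be configuring explicitly? This is roughly the kind of setup I have in mind (the port name "COM3" is just a placeholder, and the explicit 8-N-1 / no-handshake settings are guesses on my part rather than something taken from the example above):

using System;
using System.IO.Ports;

class SerialTest
{
    static void Main()
    {
        // "COM3" is a placeholder - in practice the name would come from
        // SerialPort.GetPortNames() or Device Manager.
        // 9600 baud, 8 data bits, no parity, one stop bit to match
        // Serial.begin(9600) on the Arduino side.
        SerialPort serialPort = new SerialPort("COM3", 9600, Parity.None, 8, StopBits.One);
        serialPort.Handshake = Handshake.None;
        serialPort.Open();

        // Tight polling loop, just for this test.
        while (true)
        {
            int available = serialPort.BytesToRead;
            if (available > 0)
            {
                // Read everything that has arrived in one go instead of
                // one byte per iteration.
                byte[] buffer = new byte[available];
                int read = serialPort.Read(buffer, 0, buffer.Length);
                for (int i = 0; i < read; i++)
                {
                    Console.WriteLine(buffer[i] + "\t binary: " + Convert.ToString(buffer[i], 2).PadLeft(8, '0'));
                }
            }
        }
    }
}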
What can I do to fix this?
Thanks