I am trying to determine if an Arduino will solve my problem! Any help for a new user is highly appreciated!!
I have a TOD clock that puts out an ASCII RS232 data stream that I want to send to a digital clock that has an RS485 input. I can convert the RS232 to RS485 easily. I cannot convert the ASCII strings easily.
My question is can I convert a once per second ASCII text stream from:
*RQTS_U,ddd:hh:mm:ss.0,Q_
* = on-time point
_ = denotes space
U = UTC
ddd = DAY OF YEAR (1 to 365/366)
hh = HOUR (00 to 23)
mm = MIN (00 to 59)
ss = SEC (00 to 59)
0 = DECIMAL SECONDS
Q = QUALITY BYTE
The format for the remote reading clock is:
I_ ddd_hh:mm:ss _xx
I = CLOCK SYNC INDICATION (SPACE WHEN SYNC'D TO GPS, ? WHEN SYNC IS LOST)
_ = denotes space
hh = HOUR (00 to 23)
mm = MIN (00 to 59)
ss = SEC (00 to 59)
xx = time zone offset (0 to 23)
Q = quality byte (0 = unknown, 4 = <1 microsec offset)
Any thoughts or ideas are greatly appreciated, as I am very interested in learning more about the Arduino devices and thought this would be a great entry project.
I don't want to dedicate a PC to running a software conversion, as this would defeat the purpose of the master clock. The accuracy should stay at the seconds-per-day rate, with minimal drift or offset from the conversion process.
Yes, it is possible. You will need a board/chip that has two UARTs. The accuracy will not be a problem, but you will have some latency and jitter. What's your required spec?
If you mean that using a UART you need to receive an ascii string like this "*RQTS 2,FRI:23:09:56.0,2 \r\n", convert it and transmit it like this "\r\n FRI 23:09:56 2\r\n" once every second, then all you need is a small Arduino board and a bit of code.
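If it helps, here is a minimal sketch of that conversion. It assumes the fixed field positions quoted above, SoftwareSerial on pins 10/11 driving the RS485 transceiver, and that quality byte '4' means locked; the sync indicator mapping, the exact output spacing, and the "00" UTC offset are guesses to be checked against the display clock's manual:

```cpp
#include <SoftwareSerial.h>
#include <stdio.h>   // snprintf

// Hypothetical wiring: RS232 from the master (via a level shifter) on the
// hardware UART, RS485 to the display (via a transceiver) on pins 10/11.
SoftwareSerial rs485(10, 11);  // RX, TX

char buf[32];
byte pos = 0;

// Turn "RQTS U,ddd:hh:mm:ss.0,Q" into "I ddd hh:mm:ss xx".
// The field offsets assume the fixed format quoted above.
void translate() {
  if (strncmp(buf, "RQTS", 4) != 0) return;   // not a time message
  const char *t = buf + 7;                    // start of "ddd:hh:mm:ss.0,Q"
  char q = t[15];                             // quality byte
  char sync = (q == '4') ? ' ' : '?';         // assumption: '4' means locked
  char out[24];
  // ddd and hh:mm:ss pass straight through; "00" is a placeholder UTC offset.
  snprintf(out, sizeof(out), "%c %.3s %.8s 00\r\n", sync, t, t + 4);
  rs485.print(out);
}

void setup() {
  Serial.begin(9600);   // from the master clock
  rs485.begin(9600);    // to the display clock
}

void loop() {
  while (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '*') {                 // on-time mark: start of a new message
      pos = 0;
    } else if (c == '\n') {         // end of message: convert and forward
      buf[pos] = '\0';
      translate();
      pos = 0;
    } else if (pos < sizeof(buf) - 1) {
      buf[pos++] = c;
    }
  }
}
```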
thanasis:
If you mean that using a UART you need to receive an ascii string like this "*RQTS 2,FRI:23:09:56.0,2 \r\n", convert it and transmit it like this "\r\n FRI 23:09:56 2\r\n" once every second, then all you need is a small Arduino board and a bit of code.
True, but if the chip doesn't have two UARTs, one would need to be implemented in software (bit banging), which isn't ideal.
I am collecting the ASCII data via serial (RS232) output from the master clock that is sync'd to GPS. The output format is fixed as stated.
The remote display clock is ASCII RS485 or RS422 data. The formats cannot be changed on either side. I want to perform a data stream conversion and will try to do this with an Arduino if it will work.
I don't have a specific spec in mind, as this will only be as accurate as the time output from the master, which is updated continuously via GPS and referenced to its internal master oscillator for accuracy and holdover if GPS goes away. The accuracy of the clock is set at 1x10^-11 when tracking and stabilized.
The once per second ASCII time will always be from the master output. Once into the Arduino and converted it should continue with some level of accuracy that will not drift substantially. I cannot estimate (or measure) the time it will take for the received ASCII to be interpreted, converted and then spit out to the remote clock. The RS232 and RS485 will be configured for a 9.6kbps rate.
Any product recommendations? There is a Fry's store nearby that has some of these in stock, I will take a look over the weekend.
I am collecting the ASCII data via serial (RS232) output from the master clock that is sync'd to GPS. The output format is fixed as stated.
Still not going to show us how, though, I see.
The remote display clock is ASCII RS485 or RS422 data. The formats cannot be changed on either side. I want to perform a data stream conversion and will try to do this with an Arduino if it will work.
It can.
Getting an Arduino Mega will make this process easier because it has more than one UART.
On the other hand, NewSoftSerial/SoftwareSerial is perfectly usable for bit-banging serial output on any two digital pins.
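For instance, a bare byte-for-byte pass-through, with the hardware UART on the RS232 side and SoftwareSerial on the RS485 side, could be as small as this (pins 10/11 are an arbitrary choice; wire them to suit your transceiver):

```cpp
#include <SoftwareSerial.h>

SoftwareSerial rs485(10, 11);  // RX, TX (hypothetical pin choice)

void setup() {
  Serial.begin(9600);   // hardware UART: RS232 from the master clock
  rs485.begin(9600);    // software UART: RS485 to the display clock
}

void loop() {
  // Echo every byte received on the hardware port out the software port.
  if (Serial.available() > 0) {
    rs485.write(Serial.read());
  }
}
```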
What's the difference between the hardware UART and the software-based ones in terms of the latency and jitter?
Define latency. If you mean the time between starting to shift out a byte and that byte being completely shifted out, that is a function of the baud rate, not of hardware vs. software.
I have no idea what you mean by jitter with respect to serial output.
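For a ballpark figure at the 9600 baud mentioned earlier, assuming 8-N-1 framing: each character takes 10 bit times, about 1.04 ms, so a roughly 25-character message needs about 26 ms to shift in and about the same to shift out, while the parsing and reformatting in between takes only microseconds on a 16 MHz AVR. The result is a fixed delay of a few tens of milliseconds behind the on-time mark, the same every second rather than accumulating.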
I never used the software-based UARTs. While transmitting, can the CPU do other things? I was assuming the worst case, where the library wouldn't return until all the bytes were transmitted.
The master clock device is a TRAK Systems GPS-Master Oscillator that provides an ASCII time code. The time code occurs every second and is synchronized to the oscillator, which is locked to the GPS signal. There is not much more to tell. The RS232 ASCII code uses the * as the on-time marker, which occurs within +/-50 ns of the on-time point. The master also outputs NTP over Ethernet to sync the clocks on all of the computers on the network.
The remote clock is a SPECTRACOM 8176 display clock that has an RS485 input that displays the TOD (Time of Day) from the ASCII time code. It also has an internal oscillator that keeps accurate time should the link disappear.
When I said the formats cannot be changed, I was referring to the internal coding of these devices. I cannot change their software to give the two devices a common time code format. They are from two different manufacturers and do not supply a standardized ASCII time code format.
As for the latency: once the master sends the on-time pulse, there will be a small, fixed amount of transfer time to the Arduino, then the code conversion, and then the output to the remote. This could add up over time, but since the time is always accurate from the master, it should be a consistent offset that should not amount to a noticeable difference.
Sorry to get so long winded!! I will buy an Arduino and try to get it to connect to the master and then see what the output looks like. I will be back for more...
Thanks again for all the comments - you have all been helpful.
While transmitting, can the CPU do other things? I was assuming the worst-case where the library wouldn't return until all the bytes were transmitted.
It can not. But, until 1.0, the hardware serial ports couldn't either. I suspect that the software serial class could be modified to buffer output data, too.
I never said he couldn't use bit banging for his application. I said it would be easier. What I meant by easier is that UARTs are naturally a hardware problem and not a software one.
PaulS:
It can not. But, until 1.0, the hardware serial ports couldn't either. I suspect that the software serial class could be modified to buffer output data, too.
Not totally. The hardware serial write checked whether the send buffer was empty and, if not, looped until it was. However, during this time interrupts would still allow incoming data. So the write was blocking, but it didn't disable interrupts.
The software serial, however, did (and probably still does) disable interrupts, because the bit timing needs to be exact. That also means bytes arriving on the hardware UART can be missed while the software port is clocking bits out, which is a good reason to receive a complete message before transmitting the converted one.
Certainly. That's why I have several Arduinos. Choose the right one for the job, and you don't even need to think about the problems that might arise from choosing less than ideal hardware.
But, sometimes you need to save time, money, or space. Then, you need to make compromises.
But, since OP doesn't have any Arduino, the whole point is moot.