Project: Dyno. Check it out for criticism and advice

PeterH:

cntntn:
Yes, I need to pass the measured data to the PC, to save and analyze it. I think I can do it with the Serial Monitor tool of the Arduino IDE.
I only need to read a sensor to measure torque.
The final output I need is torque as a function of engine RPM.

What data will you upload to the PC? Just tuples of timestamp, rpm, roadspeed and torque? Will you be recording throttle angle or MAP? Any EGO readings? You'd really need these if you're going to tune the engine (as opposed to just recording the performance). Are you taking any environmental measurements? For example, pressure, temperature and humidity will all affect the power output and you would need to know them to make any meaningful comparison with readings taken under different conditions.

How will you be measuring torque, engine speed, road speed? I mean in terms of the inputs you're going to present to the Arduino, and what method you're going to use to read them.

I just need to make a graph of torque as a function of engine RPM. Maybe I will add a temperature sensor, if it doesn't give me too many problems. The other environmental measurements are nonessential. To read the Arduino output on the PC I'll use the standard Serial Monitor tool.
Do you think that's OK?

sbright33:
@Goforsmoke- You can make the Uno clock accurate to about 1ppm by calibrating it. I did this with GPS. It's the same from day to day. I assume it's sensitive to temperature, but I have not tested this.

With the RTC, or only with the Arduino board?

sbright33:
@Goforsmoke- You can make the Uno clock accurate to about 1ppm by calibrating it. I did this with GPS. It's the same from day to day. I assume it's sensitive to temperature, but I have not tested this.

From ATMEL: the internal clock can be calibrated to within 1%. The calibration is digital, so you might be 1% off and the next calibration step won't get you any closer. That doesn't mean you can't get closer than 1%: by sheer luck of temperature and Vcc you might hit 1 ppm or less.

Using an external crystal you can get closer than 1%, and manufacturers do indeed make 1 ppm (and better) crystals; really they just sort what comes off the line into different grades. The 1 ppm crystals are a bit pricey.

So you go and get the closest clock crystal you can find, and a few Serial.print()s later, guess what? Your timer is just a bit off. Can you guess why?

If you want to be certain of accurate time then get an RTC.

cntntn:
I just need to make a graph of torque as a function of engine RPM. Maybe I will add a temperature sensor, if it doesn't give me too many problems. The other environmental measurements are nonessential. To read the Arduino output on the PC I'll use the standard Serial Monitor tool.
Do you think that's OK?

So I take it you don't need to supply enough data to do any engine tuning, or to be able to correlate readings taken on different days. You could just output the parameters you want to collect in CSV format and redirect to a file which can then be opened in your favorite spreadsheet program. Overall you need to sample the RPM at different times and calculate the acceleration over the preceding interval, then produce an X/Y scatter chart of acceleration versus RPM. It is up to you to decide how much of that analysis to do on the Arduino and how much on the PC. The smaller the interval the more accuracy you'll get but the more data you'll have to deal with, and of course if you're outputting the results in real time you need to make sure you keep within the speed of the serial port.
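The analysis step described above can be sketched in plain, PC-side C++ (where you do it is your choice, as the post says): compute the acceleration from two RPM samples and emit one comma-separated record. The function names and the exact CSV field layout are my assumptions, not anything from the thread.

```cpp
#include <cstdio>
#include <cstring>

// Angular acceleration in RPM per second from two RPM samples taken dtMs apart.
// A smaller interval gives finer resolution but noisier numbers and more data.
double rpmPerSecond(double rpmPrev, double rpmNow, unsigned long dtMs) {
    return (rpmNow - rpmPrev) * 1000.0 / dtMs;
}

// Format one CSV record (timestamp_ms,rpm,accel). Redirect the output to a
// file and open it in a spreadsheet, as suggested above.
int formatCsvLine(char *buf, size_t n, unsigned long tMs, double rpm, double accel) {
    return snprintf(buf, n, "%lu,%.1f,%.1f", tMs, rpm, accel);
}
```

For example, going from 3000 to 3100 RPM over a 20 ms interval works out to 5000 RPM/s, logged as `20,3100.0,5000.0`.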

You'll need an RPM input. Ideally you'd also have a 'road speed' (inertial mass speed) input although you could avoid that if you can assume the gearing is known. What algorithm are you going to use to sample the RPM value, and how frequently do you intend to sample it?
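One common way to answer the RPM-sampling question is to time the interval between tachometer pulses (on the Arduino, micros() differences captured in an external-interrupt handler) and convert the period to RPM. Here is just the conversion as a testable sketch; the pulses-per-revolution count is an assumption about the sensor setup, not something stated in the thread.

```cpp
// Convert the measured period between successive tach pulses into RPM.
// pulsesPerRev depends on your sensor (one magnet or slot on the flywheel
// gives 1). On the Arduino, periodMicros would come from micros() differences
// taken inside an interrupt service routine.
unsigned long periodToRpm(unsigned long periodMicros, unsigned int pulsesPerRev) {
    if (periodMicros == 0 || pulsesPerRev == 0) return 0;  // guard divide-by-zero
    return 60000000UL / (periodMicros * pulsesPerRev);     // 60e6 us per minute
}
```

For example, 20,000 µs between pulses with one pulse per revolution is 3000 RPM.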

You'll need to think about how much of the analysis you do on the PC versus on the Arduino.

@Goforsmoke-
My DS1307 RTC is less accurate BEFORE I calibrate either one.

I understand what you're saying, and you're half right. You cannot make individual ticks run 1/73,000 faster. But you can multiply the total count since the program started (micros()/millis()) by that factor, which gives you within 1 ppm in my case. You can even make the output itself correct in some cases. For example: say you want a stepper motor to run within 1 ppm of its target speed. You do a leap step, like a leap day in our calendar, every 73,500 steps until the error value is near 0.
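The leap-step idea above can be sketched as scaling the accumulated count rather than the tick length. The 73,500 figure is the example constant from the post; your own constant would come from calibrating against GPS or an RTC, and the correction would be subtracted instead if your clock runs fast rather than slow.

```cpp
// Add one extra count (a "leap step") every leapInterval raw counts, instead
// of trying to change the length of an individual tick. 73500 is the example
// figure from the post; it is a per-board calibration constant.
unsigned long calibratedCount(unsigned long rawCount) {
    const unsigned long leapInterval = 73500;
    return rawCount + rawCount / leapInterval;
}
```

The per-tick error is untouched, but the accumulated total stays within one leap interval of the true count, which is the 1 ppm-style claim being made.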

Please correct me if I'm wrong, but interrupts are used for Serial input in IDE versions before 1.0, not for printing? So Serial.print() will NOT affect the clock counter? I know this because I've tried it experimentally. If you need to occasionally input a character on a regular basis, simply run that code while you're doing your calibration. At a higher baud rate, with 1 character every second, the results are still consistently within 1 ppm. If Serial.print() does use interrupts, do the same thing with it. IDE v1.0 vs 0022? This method works with Serial.read() given minimal, consistent usage.

I've tested it. My clock was originally way off, by 20 ppm, so I didn't just get lucky.

1%? Come on, we can do better!
That's 10,000 ppm!

Are you talking about using the 8 MHz Internal Oscillator or an External Crystal?

I measured the time with millis(); here's the code to write the sensor value and time:

int sensorValue = digitalRead(2);      // read the sensor on pin 2
if (sensorValue == HIGH) {
  digitalWrite(ledPin, HIGH);
} else {
  digitalWrite(ledPin, LOW);
}
unsigned long time = millis();         // timestamp in milliseconds
Serial.print(sensorValue);             // value and time on one line, comma-separated
Serial.print(",");
Serial.println(time);

Is that OK, or does someone know a better way?
What about the <MsTimer2.h> library?

The one on the Uno board.

That's a crystal. Maybe not the best.

What I wrote:

With the internal oscillator calibrated you can be within 1% and with a crystal I think you can get closer but I dunno just how much -- and still the suckers won't stay in synch.

From ATMEL docs I see you can calibrate the internal oscillator which ships more like within 10%. I don't see a way to calibrate the external clock beyond changing the crystal.

I'm not suggesting we change any hardware. I'm talking about calibrating it in the code.
You're talking about 10%.
I'm talking about 10ppm.
We're not even in the same ballpark.
I have proven it works for my applications nearly 1ppm.

Now, I'm not into all the stuff the Uno can do simultaneously, but for this project it needs to do 2 basic functions.
1] Get an accurate timestamp via the RTC.
2] Measure the RPM with one optical sensor.
If you can make these 2 basic functions work, you can later upgrade to suit your needs.

With all the knowledge from the forum members swirling around, could this become the INERTIA dyno section?
Put all the info here, and then after post 100 we'll have a working setup and code for the basics as open source?

Paco

sbright33:
I'm not suggesting we change any hardware. I'm talking about calibrating it in the code.

I think I know what you are getting at. In the olden days ships used to carry chronometers for navigation. Now the important thing was not that they showed the correct time of day. The important thing was that any error they had was a constant error. For example, if the clock lost a minute (exactly) a day, then you could compensate by adding one minute per day you were at sea. So this in effect is a "code calibration".

Also, interrupts should not affect the internal timer. It is running in the background anyway (it's a bit of hardware). Doing serial reads or serial prints should not affect it at all.

sbright33:
I'm not suggesting we change any hardware. I'm talking about calibrating it in the code.
You're talking about 10%.
I'm talking about 10ppm.
We're not even in the same ballpark.
I have proven it works for my applications nearly 1ppm.

From Atmel I learned that the uncalibrated internal clock in shipped chips has a tolerance of something like 10%, but it can be calibrated to within 1%. Add to that that Arduino boards don't run on the chip's internal clock; the 10%/1% figures apply to standalone chips.

And the millis() counter not being updated while interrupts are disabled (like during IRQs) is something I got (shown in docs) from Arduino. Why do you suppose that is?

Neither says you can't engineer around them. Step one: identify a need. Step two: identify and quantify the parts. And so on, until all the troubles are shot and it's done, or it proves impossible.
Please don't take parts of step 2 as a statement that the task is impossible.

And again, the easy way that should be good is to use an RTC.

GoForSmoke:
And the millis() counter not being updated while interrupts are disabled (like during IRQs) is something I got (shown in docs) from Arduino. Why do you suppose that is?

All they are saying is that if you disable interrupts (eg. by being in another ISR) for too long, then it might miss a tick. But since the timer fires every 1024 uS (roughly once a millisecond) you would have to disable interrupts for a long time for that to be a worry.

If you consider that at 115200 baud you would get an interrupt every 86.8 uS (being 1/11520) then the serial interrupt handler would have to be done in that time or it would miss incoming serial data ... which it doesn't.

I would guess that servicing an incoming serial byte would take around 15 uS, so you could do that, and send a byte (say, another 15 uS) and still have a lot of free time before your timer interrupt fired, once every 1024 uS.
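For reference, the 86.8 µs figure above comes from 10 bits per serial frame (start bit + 8 data bits + stop bit) at 115200 baud; a quick sketch of that arithmetic:

```cpp
// Time budget between serial byte interrupts: bitsPerFrame bits at the given
// baud rate. With 10 bits per frame at 115200 baud this is ~86.8 us, leaving
// plenty of headroom before the ~1024 us timer tick.
double microsPerByte(unsigned long baud, unsigned int bitsPerFrame) {
    return 1.0e6 * bitsPerFrame / baud;
}
```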

It's people who try to do massive amounts of calculations inside their ISRs that might miss timer ticks, but there is a way around that ... don't do it. Leave the processing for the main loop, once the ISR has exited.

To miss a tick it only has to be processing another IRQ during the 'tick'. But for 99.999% of uses I don't think it's a problem. If you somehow got interrupts even partly synced to the millis() tick, though, it might indicate a software problem.

IIRC, micros() keeps counting during IRQs.

We still use that method as a backup to GPS, in case it fails, even today. All you have to do in preparation is have a decent watch and a book about navigation. Write down the date and the error compared to GPS every day at the same time. After a while you don't have to do it anymore, maybe once a month. Now you have an accurate time from a $5 watch. It's the same concept with our 10% Uno clock. I tested 6 different Unos, all well within 100 ppm, before calibration.

Nick, my results agree with your theory.

Regardless of how interrupts affect millis(), there are no interrupts with the old IDE during Serial.print().

I disagree. My DS1307 RTC is NOT MORE ACCURATE.

It will not miss a tick if it's coincidentally processing another interrupt during that tick, so long as it's not doing so during 2 in a row and the whole time in between. What Nick said.

Again, Serial.print does not cause an interrupt.

To put it into terms that matter: by calibrating in code, my Uno loses 1 second every 11 days when it is not synced to GPS. My PC loses 30 seconds. Not many RTCs can beat that.

The UNO clock is not 10%. A standalone chip running on the factory default clock is 10%.

Serial is not the only thing that uses interrupts and I think it's been established that 1.0 does not use interrupts.

If an IRQ is running when the millis update IRQ is supposed to be running then how doesn't it miss the tick?

When you have a 1 ppm crystal and a more-than-1 ppm RTC, the technical term to look up is 'tolerance'. And if your 1 ppm is due to software correction, then why can't the RTC, which should be as regular as it gets, be corrected in the same way?

What part of "easy way" is turning into "only way to be accurate"?
What part of hardware limitations really means nothing can be done to correct them?

The more I see about the dyno the less I think it's going to need 1ppm timing anyway, it's in the 99.99%.

I am really ever so sorry I brought up problems that others have related on UNO timing. Sue me.

Whew,

What a flow of words.
What is needed:
50 times a second (50 Hz, every 20 ms) you need a timestamp with the corresponding RPM value.
If this timestamp drifts 1 second over 24 hours, that can be dismissed as negligible for the 4-second measurement you will do.
A dyno run should last, depending on the inertia weight, 4 to 5 seconds.
That means 200 to 250 data points.

Paco

It's a friendly conversation. v1.0 DOES use interrupts for print; you got it backwards. It does not miss a tick, but I believe the update to the millis() clock will be late. Still, the total count will be exactly correct after many minutes of this happening.

You can correct an RTC in code, the same way as the Uno clock, but what would be the point of this for timing events? They are both accurate after being calibrated in code. An RTC is used if you need to know the time/date after powering down and up.

There are many ways to achieve 1 ppm accuracy. Software is easier than hardware if it fits your criteria. I'm trying to explain one way to correct the hardware limitations by using code to calibrate the millis() clock. There aren't many ways to achieve this specific goal once you have the constant calculated, but there are many ways of getting that number in the first place. 99.99% would be good for this application, I agree. 10% is unacceptable for many applications.

Then don't run uncalibrated chips without crystals.

It seems to me that calibrating the clock is the least of the problems here - the whole system is going to be one big unknown and unless the whole thing is calibrated there seems little point worrying about whether one component is accurate to better than a few percent. (How accurate are the torque measurements going to be? How accurately do we know the inertia of the mass? What are the effects of friction, air drag etc? How accurately can we measure RPM and roadspeed given that they are varying continually?)
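For context, the underlying inertia-dyno relation is T = I·α, where I is the moment of inertia of the mass and α its angular acceleration. A minimal sketch of that calculation follows; the inertia value in the usage note is purely illustrative, and friction and drag corrections are ignored, which is exactly part of the accuracy concern raised above.

```cpp
// Inertia-dyno relation: torque = I * alpha. The RPM-rate input would come
// from successive roller-speed samples; the inertia must be measured or
// computed for your particular flywheel. Friction and drag are ignored here.
double torqueNm(double inertiaKgM2, double rpmPerSec) {
    const double PI = 3.14159265358979;
    double alphaRadPerSec2 = rpmPerSec * 2.0 * PI / 60.0;  // RPM/s -> rad/s^2
    return inertiaKgM2 * alphaRadPerSec2;
}
```

With an illustrative 1.5 kg·m² roller accelerating at 600 RPM/s, this gives roughly 94 N·m at the roller; gearing that back to the crank is a further step.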

Since the dyno isn't going to provide enough data to correct the readings to standard conditions the accuracy is going to vary day by day, even if it was properly calibrated in the first place.

This dyno will be useful for doing back-to-back comparisons, but I don't expect it to produce figures that are meaningful in absolute terms.