So I've implemented interpolation and it works well! For now, I'm keeping the OCR calculation, mainly because it lets me easily play around with the signal capture time and look for the best relation with the sample frequency. The results are quite good, with resolutions below 1 Hz.

However, there are some strange errors. When testing with the MyDAQ at a constant frequency, the Serial Monitor prints, for example, 10 good results that are exactly the same, followed by 3 or 4 results with a large error that also differ wildly from each other. This repeats at a constant rate. What could be the problem?

To work around it, I might run the calculation a few times and pick the frequency that comes up repeatedly, or use the average (see the sketch after the code). Here is the interpolation code:
else if (pdState == 2 && (currentSum - previousSum) <= 0) {
  // The autocorrelation just stopped rising, so the peak is near lag i.
  // Compute the sum at lag i + 1 as well, then fit a parabola through the
  // sums at lags i - 1, i and i + 1 for a sub-sample period estimate.
  nextSum = 0;  // make sure we accumulate from zero
  for (int k = 0; k < arraySize - i - 1; k++) {  // keeps k + i + 1 in bounds
    nextSum += (rawData[k] - mean) * (rawData[k + i + 1] - mean);
  }
  // vertex of the parabola, expressed as an offset from lag i
  float interpolationValue = 0.5 * (nextSum - previousSum) / (2 * currentSum - previousSum - nextSum);
  period = i + interpolationValue;
  pdState = 3;
}
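
To make the workaround concrete, here is a minimal sketch of the outlier-rejection idea, using a running median over the last few estimates as a robust stand-in for the "pick the value that repeats" / averaging approach. The window size N_ESTIMATES and all names here are placeholders I made up, not part of the existing code:

const int N_ESTIMATES = 7;       // assumed window size, tune to taste
float estimates[N_ESTIMATES];    // circular buffer of recent estimates
int estimateIndex = 0;
int estimateCount = 0;

// Store a new frequency estimate and return the median of the window.
float medianFrequency(float newEstimate) {
  estimates[estimateIndex] = newEstimate;
  estimateIndex = (estimateIndex + 1) % N_ESTIMATES;
  if (estimateCount < N_ESTIMATES) estimateCount++;

  // copy and insertion-sort the window (N is tiny, so this is cheap)
  float sorted[N_ESTIMATES];
  for (int i = 0; i < estimateCount; i++) sorted[i] = estimates[i];
  for (int i = 1; i < estimateCount; i++) {
    float v = sorted[i];
    int j = i - 1;
    while (j >= 0 && sorted[j] > v) { sorted[j + 1] = sorted[j]; j--; }
    sorted[j + 1] = v;
  }
  return sorted[estimateCount / 2];
}

The nice property of a median here is that 3 or 4 bad readings in a row of 10 never reach the output at all, whereas an average would be pulled toward them.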
Bart