Thanks! Here is what I was thinking. The sample rate determines the resolution, but it is limited by the number of samples I can store, since I don't want to use a lot of memory and the buffer has to span at least two times the largest period (i.e. the lowest string's fundamental) for the autocorrelation to work. What if I use a timer interrupt to sample at the maximum rate needed for the highest string, and lower the effective rate for the other strings by ignoring some of the samples (decimation)? If I'm not mistaken, that would give me a fairly good and, above all, roughly constant resolution relative to each string's frequency, since the number of samples per period stays about the same. Combined with interpolation it might be accurate enough for the project to work. A rough sketch of what I mean is below. What do you think of this idea?
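
To make the decimation idea concrete, here is a minimal sketch of how I imagine it: a timer ISR fires at the fixed maximum sample rate, and a per-string decimation factor decides which samples actually go into the fixed-size buffer, so the same buffer always covers at least two periods of whichever string is selected. All the names (`read_adc`, `BUF_LEN`, `select_string`, the factors themselves) are placeholders, not anything from a specific platform:

```c
#include <stdint.h>
#include <stdbool.h>

#define BUF_LEN 256              /* fixed buffer keeps RAM use bounded */

extern uint16_t read_adc(void);  /* placeholder for the platform's ADC read */

static volatile uint16_t buf[BUF_LEN];
static volatile uint16_t buf_pos    = 0;
static volatile uint8_t  decim      = 1;   /* 1 = highest string, larger = lower strings */
static volatile uint8_t  skip_count = 0;
static volatile bool     buf_full   = false;

/* Called at the fixed maximum sample rate, e.g. from a timer interrupt. */
void sample_isr(void)
{
    uint16_t s = read_adc();

    if (++skip_count < decim)    /* ignore samples to lower the effective rate */
        return;
    skip_count = 0;

    if (!buf_full) {
        buf[buf_pos++] = s;
        if (buf_pos == BUF_LEN) {
            buf_full = true;     /* main loop can now run the autocorrelation */
            buf_pos  = 0;
        }
    }
}

/* Select a string: the decimation factor is chosen so that BUF_LEN samples
 * at the reduced rate still span at least two periods of that string. */
void select_string(uint8_t decimation_factor)
{
    decim      = decimation_factor;
    skip_count = 0;
    buf_pos    = 0;
    buf_full   = false;
}
```

For the interpolation step I would then fit a parabola through the autocorrelation values around the peak lag m, something like offset = (r[m-1] - r[m+1]) / (2*(r[m-1] - 2*r[m] + r[m+1])), to get a sub-sample estimate of the period.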