How does the micros() function relate to the Arduino reference frequency?

I am trying to test how the reference frequency varies with temperature, as well as the frequency error between individual Arduinos.

The way I plan to do it is to have the Arduino report micros() timing over a reference amount of time at different temperatures, and compare the difference in reported timing -- delta t.

Now I have delta t (I assume that

    unsigned long time1 = micros();
    Serial.println(time1);

reports microseconds; correct me if I am wrong), but I am not sure how to convert that into an oscillator frequency. Does anyone have an equation for that? Much appreciated. :o
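
For reference, the kind of measurement sketch I have in mind is below. The serial start/stop bytes are just a placeholder for whatever reference timing source is used, and serial latency adds some error of its own, so it is only a rough starting point:

unsigned long t_start;

void setup() {
  Serial.begin(9600);
  while (Serial.available() == 0) {}   // wait for a "start" byte from the host
  Serial.read();
  t_start = micros();                  // open the reference window
}

void loop() {
  if (Serial.available() > 0) {        // a second byte closes the window
    unsigned long t_end = micros();
    Serial.read();
    Serial.print("delta t (us): ");
    Serial.println(t_end - t_start);   // unsigned math handles micros() rollover
    while (true) {}                    // done; press reset to run again
  }
}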

The ratio between what micros() reports and the actual elapsed microseconds is (actual oscillator frequency)/(nominal frequency), so f_actual = f_nominal * (reported delta t)/(actual delta t).
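
As a quick worked example with made-up numbers: if the board reports 1,000,500 us of micros() time over a window that a trusted reference says is exactly 1,000,000 us, a nominal 16 MHz oscillator is actually running about 0.05% fast:

void setup() {
  Serial.begin(9600);
  const double F_NOMINAL   = 16000000.0;  // Hz, what the core assumes (F_CPU)
  const double REPORTED_US = 1000500.0;   // hypothetical micros() delta
  const double ACTUAL_US   = 1000000.0;   // hypothetical true window length
  double f_actual = F_NOMINAL * (REPORTED_US / ACTUAL_US);
  Serial.println(f_actual);               // ~16008000 Hz
}

void loop() {}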

A better approach though, if you have a scope, is to set up Timer1 (assuming a normal 8-bit AVR) in WGM 14 (Fast PWM with TOP = ICR1), with ICR1 = (F_CPU/1000)-1 and OCR1A = (F_CPU/2000)-1, set the OC1A pin as an output, and monitor the frequency on the scope. That will get you a nominal 1 kHz square wave. That's what I did this past week to better understand the voltage dependence of the internal oscillator on the 841/441/1634/828 for the ATTinyCore 1.4.0 release.
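
Something like this minimal sketch, assuming an ATmega328P-style part where pin 9 is OC1A (adjust the pin for other parts):

void setup() {
  pinMode(9, OUTPUT);                  // OC1A on an Uno/ATmega328P
  TCCR1A = _BV(COM1A1) | _BV(WGM11);   // non-inverting output on OC1A
  TCCR1B = _BV(WGM13) | _BV(WGM12)     // WGM 14: Fast PWM, TOP = ICR1
         | _BV(CS10);                  // prescaler /1
  ICR1  = (F_CPU / 1000UL) - 1;        // one period = F_CPU/1000 ticks -> nominal 1 kHz
  OCR1A = (F_CPU / 2000UL) - 1;        // compare at half the period -> ~50% duty
}

void loop() {}                         // nothing to do; the timer free-runs

Any deviation of the measured frequency from 1 kHz is the fractional error of the oscillator itself, since the output is derived directly from the system clock.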