I am attaching code which prints the values of angles, in degrees and radians, read from a magnetic sensor.
I was wondering how to find and serial print the time taken for a loop to execute?
This would be subtracted from the previous time taken, but I am unsure how to use the millis() function, or another method, to achieve this.
kind regards
unoproject
#include <SPI.h>

// Set the Slave Select pin.
// MOSI, MISO, and CLK are handled automatically by the SPI library.
int CSN = 10;
int SO = 12;
int SI = 11;
int CLK = 13;

unsigned int angle;

void setup() {
  Serial.begin(115200);
  // Set pin modes
  pinMode(CSN, OUTPUT);
  pinMode(SI, OUTPUT);
  pinMode(SO, INPUT);
  pinMode(CLK, OUTPUT);
  // Set Slave Select high to start, i.e. disable the chip
  digitalWrite(CSN, HIGH);
  // Initialize SPI
  SPI.begin();
}

void loop() {
  SPI.beginTransaction(SPISettings(10000000, MSBFIRST, SPI_MODE1));
  // Send the command frame
  digitalWrite(CSN, LOW);
  delayMicroseconds(1);
  SPI.transfer16(0xFFFF);
  digitalWrite(CSN, HIGH);
  // Read the data frame
  digitalWrite(CSN, LOW);
  delayMicroseconds(1);
  angle = SPI.transfer16(0xC000);
  digitalWrite(CSN, HIGH);
  SPI.endTransaction();
  // Keep the 14-bit angle value
  angle = angle & 0x3FFF;
  int pos = ((unsigned long)angle) * 360UL / 16384UL;
  // Use a float so the radians value is not truncated to an integer
  float dis = pos * (3.1415 / 180.0);
  Serial.print("Angle in degrees =\t");
  Serial.print(pos);
  Serial.print("\t Angle in Rads =\t");
  Serial.println(dis);
  delay(1000);
}
I was wondering how to find and serial print the time taken for a loop to execute?
Record the value returned by "micros()" at the start of "loop()".
Record the value returned by "micros()" at the end of "loop()" (but before the delay).
Do some simple arithmetic on the two values, and print the result, as in the sketch below.
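A minimal sketch of those three steps, assuming the SPI read and angle calculation from the posted code go in the marked spot:

unsigned long tStart;

void setup() {
  Serial.begin(115200);
}

void loop() {
  tStart = micros();               // 1. record at the start of loop()

  // ... the SPI read and angle calculation go here ...

  unsigned long tEnd = micros();   // 2. record at the end, before the delay
  Serial.print("Loop time (us) = ");
  Serial.println(tEnd - tStart);   // 3. subtract and print; unsigned
                                   //    subtraction also survives rollover
  delay(1000);
}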
I was wondering how to find and serial print the time taken for a loop to execute?
You don't find it. You measure it.
It's rather pointless to talk about determining how long a process takes, in the hope of optimizing the time, when you spend several orders of magnitude longer than that with your head stuck in the sand.
TolpuddleSartre:
it depends on whether you want to measure the time setup() takes to execute, or how long loop() takes to execute.
It's the time for the code in loop() to execute before the delay, until the next pass of the loop occurs.
To calculate angular velocity, for example, I require the calculation ω = angle/t. However, the times I previously coded used the millis() function and were an accumulation of time since the start of the entire sketch.
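A minimal sketch of that calculation using the differences between successive readings and successive micros() timestamps (readAngleDegrees() is a hypothetical stand-in for the SPI read above):

float lastAngle = 0.0;
unsigned long lastTime = 0;

// Hypothetical stand-in for the magnetic-sensor SPI read in the posted code
float readAngleDegrees() {
  return 0.0;
}

void setup() {
  Serial.begin(115200);
  lastTime = micros();
}

void loop() {
  float angleNow = readAngleDegrees();
  unsigned long now = micros();

  float dt = (now - lastTime) / 1.0e6;     // seconds between readings
  float w = (angleNow - lastAngle) / dt;   // deg/s; note this does not
                                           // handle the 360 -> 0 wrap-around
  Serial.print("Angular velocity (deg/s) = ");
  Serial.println(w);

  lastAngle = angleNow;
  lastTime = now;
  delay(100);
}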
PaulS:
Using millis() to time something that takes microseconds to execute is like using a calendar to make sure you get to work/school/home on time.
So I changed the code, following jremington's suggestion, to use micros() instead of millis(), and it reports 4 microseconds for each reading printed on the serial monitor. Does this sound reasonable to you guys?
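For illustration, a minimal standalone sketch contrasting the two functions on the same short operation (the 50 µs delay is a hypothetical stand-in for the timed code); anything much shorter than a millisecond reads as 0 via millis(), while micros() resolves it:

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long m0 = millis();
  unsigned long u0 = micros();

  delayMicroseconds(50);                  // stand-in for the code being timed

  unsigned long mElapsed = millis() - m0; // almost always 0 at this scale
  unsigned long uElapsed = micros() - u0; // roughly 50, plus a little overhead

  Serial.print("millis() elapsed: ");
  Serial.print(mElapsed);
  Serial.print(" ms\tmicros() elapsed: ");
  Serial.print(uElapsed);
  Serial.println(" us");

  delay(1000);
}

Note that on a 16 MHz board such as the Uno, micros() has a granularity of 4 µs, so elapsed times are reported in 4 µs steps.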