In one part of the code I tried doing this:
delay_time_ms = delay_time * 60 * 1000;
The delay_time variable stores a positive integer value, usually between 1 and 20 depending on what the user selects through the touch interface. delay_time_ms is simply a conversion of that value to milliseconds. Yet the result always comes out as something huge, around 4xxxxxxx, which I later discovered by printing the variable over serial.
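For context, here is a stripped-down sketch of the situation. I am guessing the types here (delay_time as int, delay_time_ms as unsigned long), since the actual declarations are in the attached files:

int delay_time = 1;           // positive value, usually 1-20, from the touch interface
unsigned long delay_time_ms;  // the same value converted to milliseconds

void setup() {
  Serial.begin(9600);
  delay_time_ms = delay_time * 60 * 1000;  // the line that misbehaves
  Serial.println(delay_time_ms);           // prints the huge 4xxxxxxx-style value
}

void loop() {}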
If I split the line up and do this instead, the result is calculated just fine:
delay_time_ms = delay_time * 60;
delay_time_ms = delay_time_ms * 1000;
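Presumably the single-line version could also be written with UL suffixes on the constants, although that is just a guess on my part and I have not tried it:

delay_time_ms = delay_time * 60UL * 1000UL;  // untested variant of the one-liner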
I can't figure out why this strange behaviour is happening.
The entire code is also attached. The required libraries are hosted separately due to file size restrictions: 7.2 MB file on MEGA
sequence_controller_3_beta_GSLC.h (16.7 KB)
sketch_jan11a.ino (25.2 KB)