If the timing of a response to an input trigger is absolutely critical, use an interrupt. That's exactly why interrupts exist.
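For example, a minimal external-interrupt sketch might look something like this (pin 2, the handler name, and the trigger edge are just placeholders I've chosen):

volatile bool triggered = false;

// Keep the ISR as short as possible; just set a flag and get out.
void onTrigger() {
    triggered = true;
}

void setup() {
    pinMode(2, INPUT_PULLUP);
    attachInterrupt(digitalPinToInterrupt(2), onTrigger, FALLING);
    Serial.begin(9600);
}

void loop() {
    if (triggered) {
        triggered = false;
        Serial.println("triggered");
    }
}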
That being said, it is possible to work out how long a line or block of code in your program will take to execute, but it is far from easy. You need an understanding of assembly and machine language, and of how your C code gets compiled down to it.
Something like:
byte test = 10;
if (test < 20)
    test = 20;
may only take a handful of instructions to execute (a dozen or so, for example), whereas something like:
int angle = 0;
while (angle < 180)
    Serial.println(sin(angle++ / 57.295));
can literally take millions of instructions to execute.
Even though each is only 3 lines of code (two if you don't count the variable definition/initialization), they compile to drastically different machine code. The first is just a load, a compare, a conditional branch, and a store. The second is a whole other ballgame, pulling in floating-point math and blocking serial I/O.
There is, however, an easier way. Simply benchmark your code. Wrap it in some timing code and see how long it takes to execute.
unsigned long time = 0;   // micros() returns an unsigned long
time = micros();
byte test = 10;
if (test < 20)
    test = 20;
time = micros() - time;
Serial.println(time, DEC);
delay(1000);
On my Atmega168 Arduino, I get 0 or 4 microseconds. In reality it's probably a microsecond or two, but the resolution of the micros() function is 4 microseconds.
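If you need finer resolution than that, one trick (my own suggestion, not something the measurement strictly requires) is to run the snippet many times in a loop and divide by the iteration count; marking the variable volatile keeps the compiler from optimizing the otherwise-unused code away:

unsigned long time = micros();
for (int i = 0; i < 1000; i++) {
    volatile byte test = 10;
    if (test < 20)
        test = 20;
}
time = micros() - time;
Serial.println(time / 1000.0);   // average microseconds per iteration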
The other example though:
unsigned long time = 0;
time = micros();
int angle = 0;
while (angle < 180)
    Serial.println(sin(angle++ / 57.295));
time = micros() - time;
Serial.println(time, DEC);
delay(1000);
takes about 38900 microseconds (about 39 milliseconds). At 16 MHz that works out to roughly 600,000 clock cycles, not quite millions of instructions, but still about 4 orders of magnitude longer to execute. It also took only a couple of minutes to determine empirically, and requires no knowledge at all of assembly/machine language or compilers.
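If you end up benchmarking a lot of snippets, you could fold the pattern into a small helper. This is just a sketch of one way to do it; the function names here are mine, not anything standard:

// Times any parameterless function and returns the elapsed microseconds.
unsigned long benchmark(void (*fn)()) {
    unsigned long start = micros();
    fn();
    return micros() - start;
}

void slowLoop() {
    int angle = 0;
    while (angle < 180)
        Serial.println(sin(angle++ / 57.295));
}

void setup() {
    Serial.begin(9600);
}

void loop() {
    Serial.println(benchmark(slowLoop), DEC);
    delay(1000);
}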