Should I use delay() in loop, if not particularly needed?

For example, I need to time that the user has held the button for 3 seconds. I do this by increasing a TIMER value (++timer;) while the button is pressed.

That is the wrong approach. Note the time when the switch is pressed (using millis() or micros(), depending on the level of accuracy you need) and note the time when the switch is released. The difference is the time that the switch was held down.
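A minimal sketch of that idea follows. The pin number and the active-low wiring with the internal pull-up are assumptions for the example, not details from the question, and debouncing is left out for brevity.

```cpp
const byte buttonPin = 2;                // assumed: button between pin 2 and GND
const unsigned long holdTime = 3000;     // 3 seconds, in milliseconds

unsigned long pressedAt = 0;
bool wasPressed = false;

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);      // LOW means pressed
  Serial.begin(9600);
}

void loop() {
  bool isPressed = (digitalRead(buttonPin) == LOW);

  if (isPressed && !wasPressed) {
    pressedAt = millis();                // note the time the switch was pressed
  }
  if (!isPressed && wasPressed) {
    unsigned long heldFor = millis() - pressedAt;  // time the switch was held down
    if (heldFor >= holdTime) {
      Serial.println("Button was held for at least 3 seconds");
    }
  }
  wasPressed = isPressed;
}
```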

If I don't have any delays restricting the main loop, then I have to wait for it to reach numbers like 10 000 instead of 300. This takes away memory, doesn't it?

300 and 10000 are both ints, so no extra memory is needed. The approach is still wrong, though.

As you develop more complex code, the number of iterations of a function becomes a poorer and poorer measure of elapsed time.

Or should I use a different type of timer instead of incrementing a variable (like millis())? That would pretty much solve this problem, since I could run the loop at any rate with fixed timings.

Yes, you should use millis(). You can't "run the loop at any rate with fixed timings". The function of the program, external and internal interrupts, and user input will dictate the timing, which can vary from one iteration to the next.
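The standard pattern here is the one from the "Blink Without Delay" example: let loop() run freely and trigger an action whenever enough milliseconds have elapsed. The pin and interval below are arbitrary values for illustration.

```cpp
const byte ledPin = 13;
const unsigned long interval = 1000;   // act once per second

unsigned long lastToggle = 0;
bool ledState = LOW;

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  unsigned long now = millis();
  if (now - lastToggle >= interval) {  // unsigned subtraction, rollover-safe
    lastToggle = now;
    ledState = !ledState;
    digitalWrite(ledPin, ledState);
  }
  // other non-blocking work (reading buttons, sensors, etc.) goes here
}
```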

But I'd need to run the device for more than 50 days (hopefully; it's battery powered), so won't the internal timer overflow?

Your watch overflows every day. When was the last time that caused you a problem? As long as you compute elapsed time with an unsigned subtraction (now - start), the rollover of millis() after roughly 49.7 days takes care of itself.
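A quick illustration, using made-up timestamps straddling the 32-bit rollover, of why the unsigned subtraction still yields the correct elapsed time:

```cpp
void setup() {
  Serial.begin(9600);

  unsigned long start = 4294966296UL;   // 1000 ms before the 32-bit rollover
  unsigned long now   = 500UL;          // 500 ms after millis() wrapped to 0

  unsigned long elapsed = now - start;  // modular arithmetic handles the wrap
  Serial.println(elapsed);              // prints 1500, the true elapsed time
}

void loop() {}
```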