I think the biggest factors are the OS you are running on the PC, the PC's hardware, and the other software running on the PC.
In other words: a fast machine connected only to the USB device (no network, no mouse, no keyboard, ...), running a real-time OS with nothing but the communication software (no music playing, ...), will give you the smallest delay.
Change any of those and the delay will go up.
I'm running Windows (which is a poor real-time OS), I can't complain about the hardware, and I run way too much software (virus scanners, firewalls, ...), but I have never had a delay on the serial communication that struck me as too long.
I think you need to run the time-critical part of your application entirely on the Arduino, where you have full control over timing.
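To illustrate the idea, here is a minimal sketch of that principle: the PC only sends a one-byte "go" command, and the Arduino itself generates a precisely timed pulse on its own clock, so OS jitter on the PC can only shift when the pulse starts, never how long it lasts. The pin number, pulse width, and the `firePulse()` helper are made-up for illustration; the `micros()`/`digitalWrite()` stubs stand in for the Arduino core so the example also compiles on a desktop compiler.

```cpp
#include <chrono>
#include <cstdint>

// --- Stand-ins for the Arduino core so this compiles on a desktop
// --- compiler. On a real board, delete these and use the real
// --- micros()/digitalWrite() from the Arduino core.
static int pin_state = 0;  // records the last level written to the pin

static uint32_t micros() {
    using namespace std::chrono;
    return (uint32_t)duration_cast<microseconds>(
        steady_clock::now().time_since_epoch()).count();
}
static void digitalWrite(int /*pin*/, int level) { pin_state = level; }

const int LASER_PIN = 9;        // hypothetical output pin
const uint32_t PULSE_US = 500;  // hypothetical pulse width

// Called when the PC's "go" byte arrives over Serial. All timing
// happens here, on the microcontroller's own clock, so the PC's
// scheduling delays cannot change the pulse width.
uint32_t firePulse() {
    uint32_t start = micros();
    digitalWrite(LASER_PIN, 1);
    while (micros() - start < PULSE_US) { /* busy-wait on our clock */ }
    digitalWrite(LASER_PIN, 0);
    return micros() - start;    // actual width, for verification
}
```

On the real board the loop() would just poll Serial for the command byte and call something like firePulse(); the point is that nothing time-critical crosses the serial link.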
But then again I always use the same principle as proposed by privatier.
The most time-critical communication I have had between a PC and an Arduino was for a 3D laser scanner. There the PC sends a signal to the Arduino, which controls a laser line, while the PC captures video. Since the laser line's position in the camera image is the basis of the triangulation calculations, there is a timing constraint.
However, the actual delay matters less than its repeatability: a constant delay can be calibrated out, but variation (jitter) cannot. (For more details on this see this forum post: http://www.david-laserscanner.com/forum/viewtopic.php?f=6&t=1216&hilit=arduino)
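A small numeric sketch of why repeatability beats raw speed, using made-up latency samples: a setup that is slower on average but steady is easier to work with than a faster but jittery one, because the constant part can be calibrated away.

```cpp
#include <algorithm>
#include <numeric>
#include <vector>

// jitter = spread between the fastest and slowest sample; for
// triangulation this is what hurts, not the mean delay.
struct Stats { double mean; double jitter; };

Stats analyze(const std::vector<double>& samples_ms) {
    double mean = std::accumulate(samples_ms.begin(), samples_ms.end(), 0.0)
                  / samples_ms.size();
    auto [lo, hi] = std::minmax_element(samples_ms.begin(), samples_ms.end());
    return { mean, *hi - *lo };
}

// Made-up round-trip latencies (ms) from two imaginary setups:
const std::vector<double> steady  = {20.0, 20.1, 19.9, 20.0};  // slow, stable
const std::vector<double> jittery = {5.0, 12.0, 3.0, 15.0};    // fast, erratic
```

Here the steady setup has the higher mean (about 20 ms vs. about 9 ms) but far less jitter (0.2 ms vs. 12 ms), and it is the steady one you want for a scanner.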
I hope this helps.
Greetings from sunny Belgium