I ran your code, and got this output, in part:
Calculating how long it takes to print that:
(19 characters) * (10 bits/character) / (9600 bits/second) = 0.0198 seconds, or, pretty close to 20 milliseconds. (Ten bits per character: one start bit, eight data bits, one stop bit.) Almost all of the processor's time is spent waiting for something to print - waiting for the serial buffer to drain so that new characters can be added.
To see how fast it's really going, you might consider saving the value of micros(), acquiring some number of data points, logging the time at the end of that process, subtracting the start time, and printing the result. That will get you very close.
Note that you won't be able to save a thousand ten-bit ADC readings that you will acquire in one second, because you'd run out of memory. But, you can process them if your purpose doesn't call for knowing the values of individual samples at the end - like, maybe taking the sum of their squares for an RMS calculation, or maybe keeping track of the maximum and minimum for a peak-to-peak measurement.
Note also that your variable to is an int, while the return value of millis() is an unsigned long. If you want to keep track of how long things take after millis() overflows an int - at about 32 or 33 seconds - you'll want to be sure to use an unsigned int, which will get you to about 65 seconds, or an unsigned long, which will take you out to about seven weeks (2^32 milliseconds is about 49.7 days).
You haven't said whether it's important to you to have your samples precisely spaced in time. It might not be. If it is, though, you'll want to eliminate uncertainty about when the ADC reading is initiated, by auto-triggering the ADC from one of the hardware timers, as opposed to letting the sketch manage ADC conversions directly. That's a lot of learning for someone whose current skill level has him asking this question, but certainly not, I believe, beyond your capabilities.
If precise timing isn't important for you, please take a look at the example program, "Blink without delay," that comes with the IDE. That sketch blinks an LED without using the delay function, by recording the time that the LED last changed state, watching the current time until an appropriate interval has passed, and then changing the state again. The method used in that program generalizes to a lot of other timing tasks - like this one - and it's a reliable scheme for getting the processor to perform actions at, or nearly at, known times. It also avoids delay(), which makes the processor wait for an interval to elapse while doing nothing else. It's customary to call delay() a "blocking" function, because it blocks the processor from doing anything other than waiting for the delay to elapse. That may be OK for you today, but you'll need better techniques later, so you will want to go ahead and learn this one.
I note that you're doing a floating point calculation after every sample. Floating point takes a long time compared to integer calculations. You might be well advised to do your intermediate calculations with integer math - ints or longs, as your data needs require - and then do the floating point calculations once, at the end of the sample period. It won't matter much for this application, since you've got a whole millisecond between samples, but it will matter if you do things faster in the future. For the same reason, you'll want to avoid using Strings during the sample period - I don't know how long they actually take, but, intuitively, I'd think they're kind of slow.