ideas for storing/logging a LOT of measured data

I am creating a project where I log various measured parameters, e.g. values from a sensor connected to my Arduino Uno. I plan to visualize/graph the values afterwards.

But I want to store values extremely frequently because I want to track even very small changes in measurement, e.g., maybe even once every 500 milliseconds, in which case I will end up logging around 173,000 values per day.

The hurdle is that I can't think of an appropriate way to store so many values. If I write each value to an SD card or EEPROM, I will have plenty of capacity but only a limited number of write cycles (only 100,000 to 300,000 cycles guaranteed at best, according to the datasheet).

My best thinking so far involves using the Arduino RAM in combination with an SD card:
Pseudocode:
--Initially, define an array with a very large length in the Arduino's RAM, e.g. 1,000 elements.
--Accumulate the first 1,000 measured values in the array in RAM.
--When the array fills up, write those 1,000 values together from the array to the SD card in one go.
--Then refill the array in RAM with the next set of 1,000 measured values
--... and repeat
(this way, I will only use a few thousand write cycles on the SD card)
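The fill-and-flush idea above can be sketched in plain C++. This is only a simulation of the logic: the SD write is mocked as a vector append, the buffer is sized at 512 bytes of 2-byte samples (an assumption, not something stated above), and all names are mine.

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Fill-and-flush buffering: samples accumulate in RAM and are written to
// "the card" one whole block at a time, so write cycles are spent per block,
// not per sample. 256 int16 samples = 512 bytes = one SD block.
const size_t BUF_SAMPLES = 256;

int16_t buf[BUF_SAMPLES];
size_t bufIndex = 0;
std::vector<int16_t> card;               // stands in for the SD card

// Called once per sample; returns 1 when a full block was flushed.
size_t logSample(int16_t value) {
    buf[bufIndex++] = value;
    if (bufIndex == BUF_SAMPLES) {
        card.insert(card.end(), buf, buf + BUF_SAMPLES);  // one block write
        bufIndex = 0;
        return 1;
    }
    return 0;
}
```

On a real Uno, `card.insert` would be replaced by a single SD library write of the whole 512-byte buffer.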

How would you guys approach, or how have you in the past approached, this type of problem? Or is my above solution the best option?

Initially, on the Arduino RAM itself, define an array with a very large length, e.g. 1,000 elements.

So, you have 2K of SRAM in which to create arrays. 1,000 elements, if they are ints, will take 2,000 bytes of the 2,048 you have to work with. Good luck with that. Look for the FreeMemory function, so you have it handy when your Arduino behaves erratically.

Buffering data and writing large blocks to the SD card at once will be needed for performance. Look at the block size that makes sense, though. Typically, 512 bytes makes sense. Odd (non-power-of-two) sizes like 1000 (or 2000) do not.

Even if you are writing 100,000 values per day, the SD card will last quite a while. You are not writing all 100,000 values to the same cell. If you fill the card every day, it should still last 100,000 to 300,000 days. You planning to live that long?

From an internal-memory point of view, consider a Mega: it has 8 KB of RAM, and can take some extra RAM on top of that.

Might be part of the puzzle ...

@robtillaart: I was trying to keep it smaller (perhaps switch to my own breadboard Arduino), but in this case it seems a Mega might indeed be better/necessary.

@PaulS: Ah, so the tech in the SD card (I believe this is what "wear leveling" is?) implies that EACH cell/sector gets 100K erase/write cycles. That means, assuming a small enough write size each time, I would technically be able to write as many times as 100K multiplied by the number of cells/sectors, which would come out to many years.

POSSIBLE TIMING CONCERN
As for the buffer/write strategy, how can one prevent the following problem?
--Let's say data is being collected from the sensor into a 1,000-element array buffer at intervals of 500 milliseconds per datapoint.
--And let's say writing each buffered block (of 1,000 array elements) to the SD card takes 1,000 milliseconds, as an exaggerated example.
Doesn't this mean that, during those 1,000 milliseconds of SD write time, one or two datapoints will fail to be collected into RAM from the sensor because the Arduino is busy writing to the SD card?
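One common way around this is double buffering: keep two buffers, and while the full one is being written out, keep sampling into the other. Below is a plain-C++ sketch of just the swap logic; the actual SD write and the timer/interrupt that drives sampling are left out, and all names are my own.

```cpp
#include <cstdint>
#include <cstddef>

// Double buffering: sampling never stalls during a slow SD write, because
// new samples go into the *other* buffer while the full one is pending.
const size_t BUF_SAMPLES = 256;

int16_t bufA[BUF_SAMPLES], bufB[BUF_SAMPLES];
int16_t* active = bufA;                  // buffer currently being filled
size_t fillIndex = 0;

// Store one sample; returns the buffer that just became ready for the
// (slow) SD write, or nullptr if no buffer is full yet.
int16_t* addSample(int16_t value) {
    active[fillIndex++] = value;
    if (fillIndex < BUF_SAMPLES) return nullptr;
    int16_t* ready = active;                      // hand this one to the writer
    active = (active == bufA) ? bufB : bufA;      // swap; sampling continues
    fillIndex = 0;
    return ready;
}
```

In a real sketch the sampling would typically run from a timer interrupt, so the main loop can spend its milliseconds on the SD write without dropping datapoints.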

How about writing the data to one of the very large FLASH memory type shields that have been posted recently, such as the one Rugged Circuits offers (and there have been posts about an even larger one too).

I would technically be able to write as many times as 100K multiplied by the number of cells/sectors which would come out to many years.

As many as 100K * the number of cells / the number of bytes written.
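That arithmetic can be made concrete. The card size and cycle count below are illustrative assumptions (a hypothetical 2 GB card at 100K cycles per block), not datasheet values:

```cpp
#include <cstdint>

// Rough endurance arithmetic for "100K * cells / bytes written".
// All constants are illustrative assumptions, not from any datasheet.
const uint64_t CARD_BYTES       = 2ULL * 1024 * 1024 * 1024; // hypothetical 2 GB card
const uint64_t BLOCK_BYTES      = 512;                       // one write = one block
const uint64_t CYCLES_PER_BLOCK = 100000;                    // conservative endurance

// Total block writes the card can absorb, assuming wear leveling
// spreads writes evenly across all blocks.
uint64_t totalWrites() {
    uint64_t blocks = CARD_BYTES / BLOCK_BYTES;              // ~4.2 million blocks
    return blocks * CYCLES_PER_BLOCK;
}

// With 2-byte samples every 500 ms, a 512-byte block fills every
// 256 * 0.5 s = 128 s, giving this many block writes per day:
uint64_t writesPerDay() { return 86400 / 128; }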

POSSIBLE TIMING CONCERN
As for the buffer/write strategy, how can one prevent the following problem?
--Let's say data is being collected from the sensor into a 1,000-element array buffer at intervals of 500 milliseconds per datapoint.
--And let's say writing each buffered block (of 1,000 array elements) to the SD card takes 1,000 milliseconds, as an exaggerated example.
Doesn't this mean that, during those 1,000 milliseconds of SD write time, one or two datapoints will fail to be collected into RAM from the sensor because the Arduino is busy writing to the SD card?

As stated before, write in blocks of 512 bytes or a multiple thereof. Optionally add "filler bytes" to reach that size; it is much faster. 2(?) months ago there was a thread about SD card speed, and I don't recall the numbers, but IIRC they were fast given the right library (the SD lib is slower than sdfatlib).
google ==> - http://arduino.cc/forum/index.php/topic,69263.0.html -

Also recall that there is a serious difference between cards themselves in terms of speed and reliability. Check Wikipedia on SD cards.

..in order to discuss this properly you have to state how big the element (the datapoint) you want to write every 500 ms actually is.. but it seems you can simply write each datapoint directly to the SD card. SdFat will manage that for you; any SD card is fast enough to write a few bytes every half second..p.

OK all, Thanks again for the insights. Here's a positive update:
I've used the sdfatlib suggested by robtillaart; huge difference in speed using that library, or using certain strategies with the regular SD.h, as suggested in this thread by the author of sdfatlib: http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1293975555
And I'm also using 512-byte blocks for storing data before writing to the SD card and things are going smoothly so far, I think. Will post a generic version of the code for general use when I'm done.
I'm able to write each block of 512 bytes in around 60 milliseconds, which is fantastic, and doesn't really interrupt the flow of other things for my requirements.

Sounds like a "speed barrier" is broken and the way is free to a working sketch!

Guys - been following this one for a while; just got back to a project I'd put aside..

I've got everything working with SDFat, but I'm running into what I'm sure is a boneheaded mistake on my part. My data structure is 12 bytes wide, and I'm writing to a 512 byte buffer before flushing to the card. When I look at the data on the card, I see my 512 bytes written out perfectly, but for some reason I'm getting an additional 8 bytes written to the file as well?!

Obviously this is not what I want - I want a file of 12 byte records, and the "extra" 8 bytes causes an offset. Any thoughts?

sizeof(MyBuffer) is 512, and what I get in the file is
512bytes + 8 bytes + 512 bytes + 8 bytes, etc... The "extra" bytes are all 0x00.

EDITED : Ok, so I fail at basic math - problem solved!
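For anyone hitting the same thing, the math in question is just that 512 is not a multiple of 12: a 512-byte buffer holds 42 complete 12-byte records, leaving 8 leftover (zero) bytes per block. A tiny sketch of the arithmetic:

```cpp
#include <cstddef>

// The "extra" 8 bytes explained: 512 is not a multiple of 12, so each
// 512-byte buffer holds 42 whole 12-byte records plus 8 bytes of padding.
const size_t BLOCK_BYTES  = 512;
const size_t RECORD_BYTES = 12;

size_t recordsPerBlock() { return BLOCK_BYTES / RECORD_BYTES; } // 42
size_t paddingPerBlock() { return BLOCK_BYTES % RECORD_BYTES; } // 8
```

Sizing the buffer as a multiple of the record (e.g. 42 * 12 = 504 bytes) or skipping the padding when reading back both avoid the offset.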