I need to store 6 raw outputs from the MPU-6050 to a PC for post-processing. The amount of data is on the order of a million samples. I was simply copying the data from the Serial Monitor into Excel, but later found that the Serial Monitor discards older values after a certain time. I have also tried PLX-DAQ, but it crashes after 2-3 minutes.
Can you suggest a reliable method to store the data? The experiment may last 9 hours with a sampling frequency of around 100 kHz.
Abhishek_parida:
Can you suggest me a reliable method to store the data as the experiment may last for 9 hours with sampling frequency around 100 kHz?
Are you sure it's possible at such a rate? If you do the math, 100 kHz is 100,000 samples per second, and there are 6 outputs, so 100,000 × 6 = 600,000 values per second. A UART (serial) frame needs at least 10 bits to transmit 8 bits (a byte) of data (aka payload), 11 with a parity or second stop bit; thus 600,000 × 11 = 6,600,000 bits per second (6.6 Mbps).
Of course, this assumes 8-bit outputs; the MPU-6050's accelerometer and gyro readings are actually 16-bit (two bytes each), so you would have to multiply that figure again by the number of bytes per output.
As you can see, even in theory this is not possible at 100 kHz; for that you would need an MCU clocked at 100 MHz or more, and a faster link such as UDP over Fast Ethernet (or better) wired LAN.
Abhishek_parida:
I was simply copying the data from the Serial Monitor into Excel, but later found that the Serial Monitor discards older values after a certain time. I have also tried PLX-DAQ, but it crashes after 2-3 minutes.
I think the data volume is too large to process in real time. Maybe you should dump the binary data stream first, then do the rest.
Serial buffers usually aren't that big, so you need to dump the data into a file as soon as it is received. I know RealTerm, at least, can do binary dumps from a serial port, but I'm not sure whether it has any issue with file sizes.
Keep in mind that if you somehow managed to transfer 6 bytes 100,000 times per second, a 9-hour dump of the continuous raw data stream would result in a file as large as 18.1 GB, yikes! :o Definitely don't save that on a FAT32 drive (files there are capped at 4 GB)!
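If you'd rather roll your own capture tool than depend on RealTerm, a bare-bones dump program isn't much work either. Here is a minimal, untested sketch for a POSIX PC (Linux/macOS); the device path /dev/ttyUSB0, the 9600 baud rate and the file name capture.bin are placeholders for whatever your setup actually uses:

```cpp
// dump_serial.cpp -- copy raw bytes from a serial port straight to a file,
// so nothing accumulates in RAM no matter how long the session runs.
#include <cstdio>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int main() {
    int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);  // placeholder port
    if (fd < 0) { perror("open serial"); return 1; }

    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);               // raw mode: no line editing or translation
    cfsetispeed(&tio, B9600);      // must match the Arduino's Serial.begin()
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    FILE* out = fopen("capture.bin", "wb");
    if (!out) { perror("open output"); return 1; }

    unsigned char buf[4096];
    for (;;) {                     // run until interrupted with Ctrl+C
        ssize_t n = read(fd, buf, sizeof buf);
        if (n <= 0) break;
        fwrite(buf, 1, (size_t)n, out);  // straight to disk, not to RAM
        fflush(out);               // so a PC crash loses almost nothing
    }
    fclose(out);
    close(fd);
    return 0;
}
```

Compile it with g++ dump_serial.cpp -o dump_serial, start it before the experiment, and stop it with Ctrl+C afterwards. On Windows, RealTerm's capture tab does the same job.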
Sorry for the mistake. It is 100 Hz, not 100 kHz.
All right then: 100 Hz × 6 bytes × 11 bits per byte (remember, it's protocol overhead) = 6,600 bps.
As a raw data stream, even the usual speed of 9600 bps should now do the trick.
However, even at this pace, a 9-hour capture adds up to roughly 19.4 MB (600 bytes/s × 32,400 s); it can still overflow the receive buffer (causing data loss when full) unless you save the data, as it comes, somewhere more spacious like a file. This is why I suggested a direct capture (dump) of the data stream first, and processing it afterwards.
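For what it's worth, here is a rough, untested sketch of what the Arduino side of such a raw stream could look like. The register addresses come from the MPU-6050 datasheet, but the 0xAA 0x55 sync header and the 100 Hz pacing are arbitrary choices of mine; and since each reading is 16-bit, a frame is really 14 bytes (~15 kbps with framing overhead), so I picked 115200 baud for headroom:

```cpp
#include <Wire.h>

const uint8_t MPU = 0x68;              // MPU-6050 I2C address with AD0 low

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU);
  Wire.write(0x6B);                    // PWR_MGMT_1 register
  Wire.write(0x00);                    // clear the sleep bit so the chip runs
  Wire.endTransmission();
  Serial.begin(115200);                // plenty of headroom for ~15 kbps
}

void loop() {
  static uint32_t last = micros();
  if (micros() - last < 10000UL) return;  // pace the loop at 100 Hz
  last += 10000UL;

  Wire.beginTransmission(MPU);
  Wire.write(0x3B);                    // ACCEL_XOUT_H, first data register
  Wire.endTransmission(false);         // repeated start, keep the bus
  Wire.requestFrom(MPU, (uint8_t)14);  // accel(6) + temp(2) + gyro(6)

  uint8_t raw[14];
  for (uint8_t i = 0; i < 14; i++) raw[i] = Wire.read();

  Serial.write(0xAA);                  // sync header so the PC can resync
  Serial.write(0x55);
  Serial.write(raw, 6);                // accel X/Y/Z, high byte first
  Serial.write(raw + 8, 6);            // gyro X/Y/Z (skip the temperature)
}
```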
I don't know the programs you're using, but I believe they do everything in RAM and only create the actual file when you're done, not when their workspace (the amount of RAM allocated to that particular program) fills up. That is most likely what happens with the one that, as you describe, eventually crashes; the other behaves strangely in a way I'd dare to call an unfixed bug.
Unless the application is specifically coded to allocate enough RAM for the whole 9-hour session (19.4 MB is not much unless you run it on an early-2000s machine), a direct dump to a file is the way to go (and again, I know RealTerm can do that).
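And once you have the dump, the post-processing is trivial. A matching decoder for the hypothetical frame format sketched above, turning capture.bin into CSV that Excel can import, could be as simple as:

```cpp
// decode_capture.cpp -- walk the dump, resync on the 0xAA 0x55 header,
// and print the six big-endian 16-bit values of each frame as CSV.
#include <cstdint>
#include <cstdio>

int main() {
    FILE* in = fopen("capture.bin", "rb");
    if (!in) { perror("capture.bin"); return 1; }

    int c;
    while ((c = fgetc(in)) != EOF) {
        if (c != 0xAA || fgetc(in) != 0x55) continue;  // hunt for the header
        uint8_t frame[12];
        if (fread(frame, 1, 12, in) != 12) break;      // truncated tail
        for (int i = 0; i < 6; i++) {
            // MPU-6050 registers store the high byte first (big-endian)
            int16_t v = (int16_t)((frame[2 * i] << 8) | frame[2 * i + 1]);
            printf(i < 5 ? "%d," : "%d\n", v);
        }
    }
    fclose(in);
    return 0;
}
```

Redirect its output to a file (./decode_capture > data.csv) and open that in Excel.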