I'm working on something to measure raindrops with a piezo buzzer and an Arduino (at this point I've posted about this project a ton of times, but it's due in a week, so this is the final stretch). Anyhow, I finally got an op-amp working - I'm using the LTC1050 from Linear Technology.
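For context, here's a minimal sketch of the kind of sampling loop involved - this is an illustration rather than my exact code, and it assumes the amplifier output feeds A0 (a placeholder pin) on a 5 V board with the default 10-bit ADC:

    // Minimal sampling sketch (assumptions: op-amp output on A0,
    // 5 V reference, default 10-bit ADC; pin choice is a placeholder).
    const int SENSOR_PIN = A0;

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      int raw = analogRead(SENSOR_PIN);           // 0-1023 over 0-5 V
      Serial.print(micros());                     // timestamp in microseconds
      Serial.print('\t');
      Serial.println(raw * (5.0 / 1023.0), 3);    // convert to volts
    }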
This is my circuit: http://imgur.com/9EFf9
Anyhow, I'm seeing a weird phenomenon: all of the raindrop signals I record show a slowly increasing voltage that levels off, instead of a sharp peak followed by rapidly diminishing oscillations. The signals are shown here: http://imgur.com/sY9ck - the plot on top is from a sensor without amplification (it has a ton of noisy oscillations, but if you look at the first peak it's a nice response), and the plot on the bottom is from the amplified sensor. The different colored lines represent different drop sizes that I released onto the sensor (red is the biggest, blue the smallest). The X axis is time in microseconds; the Y axis is, I believe, voltage - I don't quite remember.
Does anybody have an idea of why this happens? I know I have a HUGE gain set, but without it I'm not sure I could see any drops at all. Is the circuit acting like some kind of voltage integrator? The data still looks nice if I just take the max of each drop response over the 5000-microsecond window, but I want to understand this curve, and I have a really poor circuitry background.
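For reference, the max-over-window reduction I mentioned looks roughly like this - again just a sketch, assuming one drop response per capture and the same placeholder A0 pin as above:

    // Take the largest ADC reading over a 5000-microsecond window
    // (assumption: one drop response fits within the window).
    const int SENSOR_PIN = A0;
    const unsigned long WINDOW_US = 5000;

    int peakOverWindow() {
      int peak = 0;
      unsigned long start = micros();
      while (micros() - start < WINDOW_US) {
        int v = analogRead(SENSOR_PIN);
        if (v > peak) peak = v;    // keep the largest sample seen
      }
      return peak;                 // ADC counts; multiply by 5.0/1023.0 for volts
    }

    void setup() { Serial.begin(115200); }

    void loop() { Serial.println(peakOverWindow()); }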