How to measure the power consumption of a device that draws variable current?

Devices I've encountered like GSM boards, RF transmitters (XBees/BT), and ADCPs (Acoustic Doppler Current Profilers) draw the most current when they "transmit" signals. Most of the time they operate in "idle" mode, drawing a fraction of the transmit current. They transmit for milliseconds, on a controller's schedule (in the case of an ADCP).

I'd like to calculate the power consumption of these devices so I can buy a battery with the appropriate capacity. What I can do is measure the current drawn by, and the voltage across, the power terminals of each device (using a digital oscilloscope, a Rigol DS1054Z).

Can you guys help verify that my method is a reasonable way to measure power consumption? I can think of 2 ways to measure the circuit:

[Method 1: Using oscope probes] Connect one probe of the oscope in series with the + power terminal (to measure current) and the other in parallel across the +/- power terminals (to measure voltage). Have the device operate while the oscope is programmed to sample the current and voltage simultaneously (saving the data to a thumbdrive).

Measure the device while it operates in "typical operation mode".

Take the current and voltage data and multiply them sample-by-sample to get the measured instantaneous power consumption. Fit the sampled power consumption curve to a function and integrate that function across the time duration of the measurement. The result of the integration is the amount of energy absorbed by the device over the duration of the measurement.
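The multiply-and-integrate step above can be sketched in a few lines. This is a minimal sketch with made-up numbers (an assumed 3.3 V supply, a 50 ms 250 mA transmit burst over a 10 mA idle floor); in practice the time, voltage, and current arrays would come from the scope's saved data, and a plain trapezoidal sum over the samples is usually simpler and safer than fitting the power curve to a function first:

```python
# Made-up sampled data -- in practice, load t, v, i from the scope's export.
n = 1001
dt = 0.001                                   # seconds per sample (1 kSa/s here)
t = [k * dt for k in range(n)]               # 1 s capture window
v = [3.3] * n                                # volts (constant supply assumed)
i = [0.250 if 400 <= k <= 449 else 0.010 for k in range(n)]  # A: TX burst vs idle

p = [vk * ik for vk, ik in zip(v, i)]        # instantaneous power, watts

# Trapezoidal rule: energy is the area under the power curve.
energy_J = sum(0.5 * (p[k] + p[k + 1]) * (t[k + 1] - t[k]) for k in range(n - 1))
avg_power_W = energy_J / (t[-1] - t[0])
print(energy_J, avg_power_W)
```

The average power from the last line, divided into your battery's energy (capacity in mAh times its voltage), gives a first-order runtime estimate.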

[Method 2: Using a 1-ohm resistor] Same measurement procedure, except place a 1-ohm resistor in series with the device and measure the voltage drop across it (which gives the current directly, since I = V / 1 Ω) along with the voltage across the device. Sample both with the oscope and run the same calculations described above.
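With a 1-ohm shunt, the conversion from measured voltages to power is trivial, since the current in amps is numerically equal to the shunt voltage in volts. A small sketch with invented channel readings (CH1 across the shunt, CH2 across the device terminals):

```python
# Invented sample triplets -- replace with the two channels from the scope.
r_shunt = 1.0                     # ohms
v_shunt = [0.010, 0.250, 0.010]   # volts across the 1-ohm resistor
v_dev   = [3.29, 3.05, 3.29]      # volts across the device terminals

i = [v / r_shunt for v in v_shunt]          # amps: with 1 ohm, I == V numerically
p = [vd * ia for vd, ia in zip(v_dev, i)]   # instantaneous power samples, watts
print(p)
```

Note that the device voltage sags by whatever the shunt drops, which is why the answer below cautions about the shunt drop being too large.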

Another question that's on my mind:

How does the sample rate of the oscope affect our measurements and power-consumption estimates? I've taken an intro course to DSP in which we learned about aliasing. Taking a shot at my own question: if our sample rate is too slow (relative to the frequency content of the current/voltage fluctuations), our measurements are aliased and unreliable as measures of power consumption. To figure out how fast we need to sample, we can take the FFT of the current and voltage time series, find the highest significant frequency, and make sure the oscope samples at more than twice that rate.
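That Nyquist check can be sketched as follows. Note the chicken-and-egg: you need at least one capture at the scope's fastest usable rate before you can inspect the spectrum. The waveform below is synthetic (a 10 mA idle current with an assumed 5 kHz ripple riding on it), just to show the mechanics:

```python
import numpy as np

fs = 100_000.0                       # capture sample rate, Hz (assumed)
t = np.arange(0, 0.01, 1.0 / fs)     # 10 ms record, 1000 samples
# Synthetic current waveform: 10 mA idle plus a 2 mA, 5 kHz ripple component.
i = 0.010 + 0.002 * np.sin(2 * np.pi * 5_000 * t)

spectrum = np.abs(np.fft.rfft(i - i.mean()))   # drop DC before looking at peaks
freqs = np.fft.rfftfreq(len(i), d=1.0 / fs)
f_max = freqs[np.argmax(spectrum)]   # dominant component here; in general, take
                                     # the highest bin with significant energy
print(f_max)                         # sample at more than 2 * f_max
```

Here the peak lands at 5 kHz, so the capture rate would need to exceed 10 kSa/s; with real data you would look for the highest frequency that still carries meaningful energy, not just the largest peak.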

Set the 'scope to math/integrate, capture/measure the charge for one of the transmit pulses, multiply by the number of pulses, divide by the time, and add in the idle current?
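The bookkeeping in that recipe works out like so. All numbers are invented placeholders; the charge per pulse would come from the scope's math/integrate reading:

```python
# Invented figures -- substitute values read off the scope and your schedule.
q_pulse_C = 0.012        # charge per transmit pulse, coulombs (from math/integrate)
pulses_per_hour = 120
idle_current_A = 0.010   # idle-floor current

# Average current = idle floor + (charge per pulse * pulse rate).
avg_current_A = idle_current_A + q_pulse_C * pulses_per_hour / 3600.0
mAh_per_day = avg_current_A * 1000.0 * 24.0   # handy for sizing the battery
print(avg_current_A, mAh_per_day)
```

Dividing the battery's rated capacity (mAh) by the daily figure gives a rough runtime in days, before derating for temperature and battery aging.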


Let's think first: will it always be the same data being sent?

Because if not, a one-off measurement is of limited use: each time the device is used, its consumption may be different.

It may be simpler to use the device for a period with a compatible battery and note how long the battery charge lasts.

Then you will know whether you need to buy a larger battery, or whether you can even use a smaller one.

Maybe a phone power bank could serve for this first test.

Take the case of a smartphone: even the manufacturer can hardly estimate precisely how long a battery charge will last.

The easiest method is to use a battery charger that can also measure battery discharge capacity; many can.

Fully charge the battery, then run a discharge test so you know its capacity.

Charge it again, run the project on it for a period, then run another discharge test. The difference between the two capacity readings tells you how much charge the project consumed.

Can you tell whether the transmit current is approximately constant during the entire transmit time? If so, try method #2, but instead of making complex measurements, simply read off the average current during transmission by inspection. Caution: verify that the voltage drop across the 1-ohm resistor is not so large that it affects your measurement.

Then you simply calculate the average power by weighting each state's current by the proportion of time the device spends in it. This method makes it easy to look at the effect of different transmit periods.
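That duty-cycle average works out like this; all figures are hypothetical placeholders for values read off the scope and the device's schedule:

```python
# Hypothetical figures -- replace with measured values.
i_tx_A = 0.250        # average current while transmitting
i_idle_A = 0.010      # average current while idle
t_tx_s = 0.005        # transmit time per cycle
period_s = 10.0       # one full cycle (transmit + idle)
supply_V = 3.3        # supply voltage (assumed constant)

duty = t_tx_s / period_s
avg_current_A = duty * i_tx_A + (1.0 - duty) * i_idle_A
avg_power_W = supply_V * avg_current_A
print(avg_current_A, avg_power_W)
```

Changing `t_tx_s` or `period_s` immediately shows the effect of a different transmit schedule, which is the convenience this answer points out.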