However, I am having some trouble determining the best way to measure the voltage and current because they are AC signals. Here is my best guess at a method; let me know if it sounds reasonable:

Use a biasing circuit to make sure all AC signals are translated into a 0-5V measurement range for all operating conditions. I was going to use something like this: https://openenergymonitor.org/emon/buildingblocks/ct-sensors-interface

I would then convert this measurement to a value by saying:

VALUE = abs(INPUT - 2.5) * scalingfactor

(where scalingfactor is the correction factor to convert measurement voltage back to "real" V or A)
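A minimal sketch of that conversion, assuming the ADC reading has already been converted to volts. The names (`MIDPOINT`, `SCALING_FACTOR`) and the example scaling value are illustrative, not from any particular sensor:

```python
MIDPOINT = 2.5          # bias voltage that represents "zero" in the AC signal
SCALING_FACTOR = 30.0   # example only: depends on your divider/CT burden resistor

def to_value(adc_volts):
    """Remove the 2.5 V bias and rescale back to 'real' volts or amps.
    abs() folds the negative half-cycle onto the positive side."""
    return abs(adc_volts - MIDPOINT) * SCALING_FACTOR
```

Note that abs() here is optional if you only use VALUE for an RMS calculation, since squaring discards the sign anyway.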

Calculating an RMS value for my power calculation is where I am struggling. Is it reasonable to simply collect a good number of data points (say 20) over a period of 0.25 seconds, store them in an array, square each value, average them, and take the square root?

RMS = sqrt(sum(VALUE[i]**2, i = 0..N-1) / N)

(where N is the number of samples)
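That formula is just the standard root-mean-square, which can be sketched like this (pure Python for clarity, though on a microcontroller you would do the same thing in a loop over your sample array):

```python
import math

def rms(samples):
    """Root-mean-square: square each sample, average, then square-root."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

# Sanity check: one full cycle of a unit-amplitude sine should give ~0.707
one_cycle = [math.sin(2 * math.pi * k / 100) for k in range(100)]
print(rms(one_cycle))  # approximately 1/sqrt(2)
```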

Let me know if I am totally confused or on the right track.