AC voltmeter

What's the best practice for an AC voltmeter? I made a DC one and I want to improve it now.

I know, English is not my native language... but I really don't understand what you're expecting :confused:

EDIT: I might be too tired lol.

You can use a [u]precision full-wave rectifier[/u] to convert AC to DC. You then have the option of calculating true RMS, using a true-RMS chip, or just estimating the RMS from the peak. (True RMS is usually found only on more-expensive meters.)
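Here's a rough sketch of the "estimate RMS from the peak" approach, assuming a sine-wave input that has already been rectified and divided down into the 0-5V range on analog pin A0. The pin, the divider ratio, and the sampling window are placeholders, not values from a real circuit:

[code]
// Estimate RMS from the peak of a rectified, divided-down sine wave.
// Assumes: signal on A0, 5V ADC reference, 10-bit ADC, and a sine waveform
// (RMS = peak / sqrt(2) only holds for sine waves).

const float ADC_REF = 5.0;           // ADC reference voltage
const float DIVIDER_RATIO = 100.0;   // placeholder: input divider scale factor
const unsigned long SAMPLE_MS = 40;  // sample for ~2 cycles at 50Hz

void setup() {
  Serial.begin(9600);
}

void loop() {
  int peakCounts = 0;
  unsigned long start = millis();

  // Sample as fast as possible for a couple of mains cycles, keep the maximum.
  while (millis() - start < SAMPLE_MS) {
    int v = analogRead(A0);
    if (v > peakCounts) peakCounts = v;
  }

  float peakVolts = (peakCounts * ADC_REF / 1023.0) * DIVIDER_RATIO;
  float rmsVolts  = peakVolts / sqrt(2.0);   // sine-wave assumption

  Serial.print("Estimated RMS: ");
  Serial.println(rmsVolts);
  delay(500);
}
[/code]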

You can also "get by" with a precision half-wave rectifier. In fact, if you are reading power-line voltage you can use a regular diode (or bridge rectifier) in front of the voltage divider, and then add back the voltage drop across the diode(s) in software.
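Adding back the diode drop could look roughly like this. The 0.7V-per-diode figure and the divider ratio are assumptions; measure or adjust them for the actual parts:

[code]
// Add back the rectifier drop before the reading is scaled up.
// One series diode drops roughly 0.7V; a bridge has two diodes conducting,
// so roughly 1.4V. Adjust to match the actual parts used.

const float DIODE_DROP    = 1.4;    // assumed: bridge rectifier, 2 x ~0.7V
const float DIVIDER_RATIO = 100.0;  // placeholder divider scale factor

float peakMainsVolts(float peakAdcVolts) {
  // peakAdcVolts is the peak seen at the ADC, after the diode(s) and divider.
  return peakAdcVolts * DIVIDER_RATIO + DIODE_DROP;
}
[/code]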

You'll probably want to have some switchable voltage dividers so you can read voltages higher than 5V peak. You can make it auto-ranging if you wish.
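If you do make it auto-ranging, the logic can be as simple as stepping up a range when the reading is near full scale and stepping down when it's using too little of the ADC span. The pin numbers and divider ratios below are hypothetical and depend entirely on how the dividers are switched (transistors, relays, an analog mux, etc.):

[code]
// Simple auto-ranging sketch: pin assignments and divider ratios are illustrative.

const int   RANGE_PINS[]   = {2, 3, 4};           // selects one of three dividers
const float RANGE_RATIOS[] = {1.0, 10.0, 100.0};  // divider scale factors
const int   NUM_RANGES     = 3;

int currentRange = 2;  // start on the least sensitive range

void selectRange(int r) {
  for (int i = 0; i < NUM_RANGES; i++) {
    digitalWrite(RANGE_PINS[i], i == r ? HIGH : LOW);
  }
  currentRange = r;
  delay(10);  // let the input settle after switching
}

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_RANGES; i++) pinMode(RANGE_PINS[i], OUTPUT);
  selectRange(currentRange);
}

void loop() {
  int counts = analogRead(A0);

  // Near the top of the ADC span: move to a less sensitive range if possible.
  if (counts > 1000 && currentRange < NUM_RANGES - 1) {
    selectRange(currentRange + 1);
    return;
  }
  // Using less than ~8% of the span: move to a more sensitive range if possible.
  if (counts < 80 && currentRange > 0) {
    selectRange(currentRange - 1);
    return;
  }

  float volts = counts * (5.0 / 1023.0) * RANGE_RATIOS[currentRange];
  Serial.println(volts);
  delay(200);
}
[/code]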

You can also switch between the 1.1V reference and the 5V reference. With the 1.1V reference you can get resolution down to about 1mV. If you want to read lower voltages, you'll need to add an amplifier.
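On an AVR-based board (e.g. an Uno) the reference is switched with analogReference(); with the internal 1.1V reference one ADC count is about 1.1V / 1024 ≈ 1.07mV, which is where the "about 1mV" figure comes from. A rough sketch, throwing away the first few conversions after switching so the reference can settle:

[code]
// Switch between the 5V (DEFAULT) and internal 1.1V (INTERNAL) ADC references
// on an AVR Arduino. With 1.1V full scale, one count is about 1.07mV.

float readVolts(bool useInternalRef) {
  analogReference(useInternalRef ? INTERNAL : DEFAULT);

  // Discard a few conversions so the reference can settle after switching.
  for (int i = 0; i < 3; i++) analogRead(A0);

  int counts = analogRead(A0);
  float fullScale = useInternalRef ? 1.1 : 5.0;
  return counts * fullScale / 1023.0;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  // High-resolution reading; only valid for inputs below about 1.1V.
  Serial.println(readVolts(true), 4);
  delay(500);
}
[/code]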

It's also good practice to protect the input from unexpected high voltages. You won't damage a good multimeter by applying 120 or 240VAC when the meter is set to the 1V range or to resistance. (You generally will blow a fuse if your meter is set to read current and you apply power-line voltage, or any "voltage source", across it.)

How can I protect it? With big resistors?