- ---I am not sure where to start with
This is an advanced project... You have a lot of studying (and experimenting) ahead of you!
Regulated power supplies work by comparing the output voltage to a reference voltage in a feedback loop. The voltages can be scaled (with a voltage divider, etc.) so the reference doesn't have to match the output voltage. They just have to be proportional. Usually, that's done with a comparator circuit so you don't need an ADC and you don't have to actually measure the voltage. But you do need a DAC to "program" the reference voltage.
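To make the "programming" part concrete, here's a minimal sketch (with made-up divider and DAC numbers, not from any particular design) of how you'd pick the DAC code for a target output voltage:

```python
# Sketch: picking the DAC code that "programs" a target output voltage.
# Hypothetical assumptions:
#   - the output is scaled by a divider ratio K = R2/(R1+R2) before the comparator
#   - the DAC is N bits with full-scale voltage VFS
# The loop settles when K * Vout == Vdac, so Vdac = K * Vout_target.

def dac_code_for_output(vout_target, k_divider, n_bits, vfs):
    """Return the integer DAC code closest to k_divider * vout_target."""
    vdac = k_divider * vout_target            # voltage the feedback loop will servo to
    code = round(vdac / vfs * (2**n_bits - 1))
    return max(0, min(2**n_bits - 1, code))   # clamp to the DAC's range

# Example: 0-5V output, divider K = 0.5, 12-bit DAC with 4.096V full scale
print(dac_code_for_output(3.300, 0.5, 12, 4.096))  # -> 1650
```

The `round` and the clamp are the only subtleties: the DAC can only hit discrete steps, and codes outside 0 to 2^N - 1 have to be clipped.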
Current limiting/control is similar except you need a current sensor (or "small resistor") to "convert" current to voltage. If you use a resistor, the voltage-regulation sense point goes after the resistor to compensate for the voltage drop across it.
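A quick worked example of the shunt-resistor idea, with an illustrative 0.1-ohm value (not a recommendation):

```python
# Sketch: "converting" current to a measurable voltage with a shunt resistor,
# then back to current. The 0.1-ohm value is illustrative only; a low value
# keeps the voltage drop (and wasted power) small.
R_SHUNT = 0.1  # ohms

def current_from_shunt(v_shunt, r_shunt=R_SHUNT):
    """Ohm's law: current through the shunt is its voltage drop / resistance."""
    return v_shunt / r_shunt

def shunt_drop(current, r_shunt=R_SHUNT):
    """Voltage the regulator must compensate for if it senses after the shunt."""
    return current * r_shunt

print(current_from_shunt(0.150))  # 0.150V across 0.1 ohm -> ~1.5A
print(shunt_drop(3.0))            # at 3A the shunt drops ~0.3V
```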
- ---There is no limit for Input voltage and current
What? 1,000V or 10,000V? There's always a voltage limit.
The current depends on the power supply design and the current/power-out. (You can't get more power out than you put-in.)
- ---Current output 0-3A (1mA resolution)
A normal (constant voltage*) power supply can't also have fixed current (Ohm's Law). The current limit can be fixed or variable/programmable. If you "try" to exceed the current limit, either the voltage is reduced to hold at that current or the voltage is cut-off (or nearly cut-off) to cut-off the current (similar to a circuit breaker or fuse).
A constant-current* power supply adjusts the voltage as necessary to maintain the current. If the load is "too light" (resistance is too high) the voltage goes-up to the maximum of the particular power supply.
If you have a current-limited power supply you can "make" a constant-current power supply by setting the current limit and then setting the voltage to maximum. But of course you might not achieve the current depending on the maximum available voltage and the resistance of the load.
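The CV/CC behavior described above can be sketched as a little Ohm's-law model of an ideal current-limited supply driving a resistive load:

```python
# Sketch of how an ideal current-limited supply behaves into a resistive load:
# it stays in constant-voltage (CV) mode until Vset/Rload would exceed the
# current limit, then it folds the voltage back to hold the current (CC mode).

def supply_output(v_set, i_limit, r_load):
    """Return (volts, amps, mode) for an ideal CV/CC supply into r_load ohms."""
    i_cv = v_set / r_load                     # current the load would draw at full voltage
    if i_cv <= i_limit:
        return (v_set, i_cv, "CV")            # load is light enough: voltage regulation wins
    return (i_limit * r_load, i_limit, "CC")  # current limit wins: voltage folds back

print(supply_output(5.0, 3.0, 10.0))  # light load: (5.0, 0.5, 'CV')
print(supply_output(5.0, 3.0, 1.0))   # heavy load: (3.0, 3.0, 'CC')
```

Setting `v_set` to the supply's maximum turns this into the "constant-current" trick from above: the CC branch holds the current as long as the load resistance allows it.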
- ---Voltage output 0-5vdc (1mV resolution)
1mV (or 1mA) resolution is not "easy" and you'll get voltage drop in the wires & connections.** There is something called a 4-wire connection where a separate pair of wires is used for voltage-monitoring. That way, the voltage drop across the current-carrying wires is automatically adjusted-for by the voltage regulator.
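As a sanity check on those resolution specs (0-5V in 1mV steps is 5000 codes; 0-3A in 1mA steps is 3000 codes), here's the minimum DAC width each implies:

```python
# Rough check of what 1mV / 1mA resolution demands of the DACs.
# 0-5V in 1mV steps = 5000 distinct codes; 0-3A in 1mA steps = 3000 codes.
import math

def bits_needed(steps):
    """Smallest DAC width (in bits) with at least `steps` distinct codes."""
    return math.ceil(math.log2(steps))

print(bits_needed(5000))  # 13 bits for the voltage DAC
print(bits_needed(3000))  # 12 bits for the current DAC
```

And that's the bare minimum; in practice you'd want some extra bits of headroom for calibration and offset trimming.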
Input voltage or output voltage? Output voltage is presumably already regulated. Input voltage protection is often used along with a fuse. That's called a "crowbar" circuit. The analogy is a crowbar dropped across the input (when voltage exceeds the limit), shorting it out and blowing a fuse.
- ---and over-current protection (maybe send alert to Arduino)
That's already built into your current control/limiting.
- ---Short circuit protection
Again, built into your current limiting.
* In the case of a variable power supply the voltage (or current) isn't actually constant, but it's controlled and independent of the load.
** I'm testing some boards at work right now. They run at 5V and about 250mA. The power supply has voltage & current displays and variable voltage (with a knob, no "programming") but no current control, just current limiting somewhere over 1 Amp. When I compare the voltage at the power supply to the voltage on the board, I'm getting about 150mV of voltage-drop through the wires/connections. So, 1mV accuracy (from the power supply) would be useless... It is nice to have 1mV (or better) from the multimeter.
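For reference, the back-of-envelope math on that measurement (150mV of drop at roughly 250mA):

```python
# Back-of-envelope from the numbers above: ~150mV of drop at ~250mA implies
# roughly 0.6 ohm of total wire/connection resistance -- and the drop scales
# with current, which is exactly what 4-wire (remote) sensing corrects for.

def wire_resistance(v_drop, current):
    """Ohm's law: total series resistance of the wiring/connections."""
    return v_drop / current

def drop_at(current, r_wire):
    """Voltage lost in the wiring at a given load current."""
    return current * r_wire

r = wire_resistance(0.150, 0.250)
print(r)                 # ~0.6 ohm
print(drop_at(3.0, r))   # at 3A that same wiring would drop ~1.8V
```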