That regulator needs a minimum of 1.5V differential across it, so you can only get to about 10.5V out if you're feeding it 12V. The minimum output is 1.3V.
Do you -really- need 0V output? Everyone says they want a sig gen that does 0-30MHz, but 0Hz is DC.
I'm in the process of modifying a PC power supply so that it puts out about 18V from the 12V output. My plan is to rewire it so that the 3.3V secondary windings go to the 5V line, and a voltage divider from there makes it look like the 3.3V output is still there. Then the power supply should boost the PWM until the 3.3V winding is putting out 5V, about 1.5 times as much. So the 12V output should put out 18V.
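Quick sanity check on the scaling. This assumes the controller simply holds whatever winding feeds its feedback path at the 5V setpoint, so every rail scales by the same factor (the names and setpoint here are just illustrative, not pulled from a specific supply):

```python
# Rough check of the rail scaling, assuming the supply regulates
# the winding wired into its feedback path to the 5 V setpoint.
FEEDBACK_SETPOINT = 5.0   # volts the controller tries to hold
ORIG_FEEDBACK_RAIL = 3.3  # rail now wired into the feedback

scale = FEEDBACK_SETPOINT / ORIG_FEEDBACK_RAIL  # ~1.515

for name, nominal in [("3.3V", 3.3), ("5V", 5.0), ("12V", 12.0)]:
    print(f"{name} rail -> {nominal * scale:.1f} V")
```

The 12V rail lands at about 18.2V, which is where the "about 18V" figure comes from; the old 5V rail rides up to roughly 7.6V along with it.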
Old PC power supplies used to just regulate 5V and trust the other outputs to be within spec. Newer supplies regulate 3.3V, but check all the other voltages. So I'll need to spoof the other outputs back to the chip.
I'm not sure yet if it is really checking each one individually, or just feeding them all into a voltage divider so that if any one output goes dead, it shuts down.
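For the spoofing dividers, the arithmetic is the same either way: pick the bottom resistor, then solve for the top one so the boosted rail reads back as the nominal voltage. A small sketch (the resistor values and rail voltages here are assumptions for illustration, not measured from the actual supply):

```python
# Hypothetical divider to present a fake "nominal" sense voltage
# to the supervisor chip from a higher, boosted rail.
def divider_r_top(v_source, v_target, r_bottom):
    """R_top so that v_source * r_bottom / (r_top + r_bottom) == v_target."""
    return r_bottom * (v_source - v_target) / v_target

r_bottom = 3300  # ohms, chosen arbitrarily to keep divider current small
print(divider_r_top(5.0, 3.3, r_bottom))  # fake 3.3 V from a 5 V rail
print(divider_r_top(7.6, 3.3, r_bottom))  # fake 3.3 V from the boosted ~7.6 V rail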
I am going to use the same linear regulator, with 10-turn potentiometers to adjust the voltage and the maximum current. I really, really hate having a "coarse" and a "fine" control; that is the mark of a cheap POS power supply.