Ok, I can, but I'll need to find some time to do it, so in the meantime I'm just explaining what the different parts of the circuit are for.
I think the exact voltage is somewhat beside the point, because whether it's 0.7 V or 1 V, the datasheets I've looked at say the amps will only swing to within 3-5 V of the low rail and 3-5 V of the high rail. So even if I were charging two cells at nominal voltage (2.4 V in the case of Ni-Cd), I still couldn't do it.
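To make the headroom problem concrete, here's a quick sanity-check sketch. The 3 V worst-case headroom and the 0 V / 24 V rails are illustrative numbers based on the datasheet figures mentioned above, not from any specific part:

```python
# Sketch: can the amp's output reach a target charge voltage, given
# that it can only swing to within `headroom` volts of each rail?
# All rail and headroom values here are illustrative assumptions.

def reachable(v_target, v_low_rail, v_high_rail, headroom=3.0):
    """True if v_target lies inside the amp's usable output window."""
    return (v_low_rail + headroom) <= v_target <= (v_high_rail - headroom)

# Single 0-24 V supply, worst-case 3 V headroom:
print(reachable(2.4, 0.0, 24.0))   # two Ni-Cd cells at 2.4 V nominal -> False
print(reachable(12.0, 0.0, 24.0))  # mid-range pack -> True
```

With a 0 V low rail, the lowest reachable output is 3 V, so the 2.4 V target for two Ni-Cd cells falls outside the window, which is exactly the limitation described above. Pulling the low rail negative (e.g. -5 V) would move the window down to cover 0 V.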
Another reason it's important to have 0 V output: if I don't, the amplifier will always have some voltage on its output, which will wreck my discharge mode. The battery would be fighting a constant applied voltage, I couldn't implement any sort of "rest" period, and there would always be heat and load on the components.

To me it just seems sloppy to ship a design limitation like that and call it good enough, or to charge 90% of batteries but not small single cells. I could never sell a product like that; it would look Mickey Mouse. I'd rather find a way to reliably charge all batteries from 0 to 25 V, like commercial products do. I'm sure they've already solved the power supply issues rather than compromising on the capabilities of the charging circuit. Granted, most probably don't use amplifiers but rely on switched power instead, and I can't figure out how such a circuit would achieve what I'm after. Maybe that's why their designs are proprietary: they've found an elegant way to do it with switches, gaining high efficiency along the way.