Charging circuit verification

So I've been playing with electronics for a while now, but have never done a really big project, and I decided that for my first one I wanted to make a sealed lead-acid battery charging circuit.

My eventual goal is to lay out the board, have a PCB manufactured, and solder up the board myself. I then intend to use it to charge 12 V sealed lead-acid batteries I want to use in a mid-sized robot chassis.

I have done quite a bit of research and managed to get a schematic drawn up, and I was wondering if anyone here could look at it, as well as my charging logic (below), and tell me whether I am on the right track. The schematic is attached.

Charging logic:

  1. Set the ctrl pin (PWM) such that the current into the battery is always limited to the max charging current on the battery spec (should this current be at 14.7 V?)
  2. As the voltage comes up toward 14.7 V the current will start to drop inherently as the battery resistance rises (is this correct?)
  3. Once the current is low enough, light up an LED to indicate the charge is complete (see the sketch below)
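
In very rough pseudo-C, this is the flow I have in mind (the helper names and threshold values are just placeholders, not anything from the actual circuit yet):

    /* Very rough outline of the intended logic; helper names/values are placeholders. */
    #include <stdint.h>

    #define MAX_CHARGE_MA    100   /* max charging current from the battery spec */
    #define CHARGE_DONE_MA    10   /* "low enough" current to call it charged    */

    uint16_t read_battery_current_ma(void);   /* placeholder measurement helpers  */
    uint16_t read_battery_voltage_mv(void);
    void     pwm_duty_down(void);              /* nudges the ctrl-pin PWM down     */
    void     led_on(void);

    void charge_step(void)                     /* called periodically              */
    {
        uint16_t ma = read_battery_current_ma();
        uint16_t mv = read_battery_voltage_mv();

        if (ma > MAX_CHARGE_MA)                /* step 1: limit the charge current */
            pwm_duty_down();

        if (mv >= 14700 && ma < CHARGE_DONE_MA)
            led_on();                          /* steps 2-3: current tapers near   */
                                               /* 14.7 V, then signal completion   */
    }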

  1. Set the ctrl pin (PWM) such that the current into the battery is always limited to the max charging current on the battery spec (should this current be at 14.7 V?)

Does that mean you're going to regulate the charging current using PWM?
What is the approximate max current? The IRF510 isn't logic-level, and at +5 V on the gate it could pass a few amps; have you verified that against the datasheet?
Are you aware that by smoothing out the PWM (1k and 10 µF) you are going to dissipate a significant amount of power in the transistor?
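
For a rough sense of scale (the supply voltage isn't given here, so these numbers are only an assumed example): in this linear mode the MOSFET drops whatever voltage the battery doesn't, so P ≈ (V_supply − V_battery) × I_charge. With, say, an 18 V supply, a battery sitting around 12 V, and 1 A of charge current, that is (18 − 12) × 1 = 6 W on the transistor; at 0.1 A it would be closer to 0.6 W.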

  1. As the voltage comes up toward 14.7 V the current will start to drop inherently as the battery resistance rises (is this correct?)
  2. Once the current is low enough, light up an LED to indicate the charge is complete

  1. That isn't correct; the battery's resistance actually lowers as it charges.

I was planning on using the low-pass filter to turn the PWM signal into a pseudo digital-to-analog converter. And the transistor is not logic-level, but the battery's datasheet specifies a charging current of 100 mA, so setting the MOSFET gate at around its threshold voltage (4 V, if I remember correctly) should allow about 100 mA per the MOSFET datasheet.
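
For what it's worth, my thinking on the filter: with the 1 kΩ and 10 µF values the cutoff is f_c = 1 / (2πRC) = 1 / (2π × 1 kΩ × 10 µF) ≈ 16 Hz, so as long as the PWM runs at a few hundred hertz or more the gate should see roughly duty cycle × 5 V with only modest ripple.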

I am aware the MOSFET will get hot and am expecting to need a heat sink of some kind.

And OK, good to know I had the resistance thing backwards, but from my understanding the current will still decrease as the battery charges at a constant voltage.

My biggest question is: when I am providing the constant current, what voltage should that current be delivered at (the full-charge voltage of 14.7 V?)

Depends on the battery; the Wikipedia article says:

Open-circuit (quiescent) at full charge: 2.10 V
Open-circuit at full discharge: 1.95 V
Loaded at full discharge: 1.75 V
Continuous-preservation (float) charging: 2.23 V for gelled electrolyte; 2.25 V for AGM (absorbed glass mat) and 2.32 V for flooded cells
All voltages are at 20 °C (68 °F), and must (for a 6 cell battery) be adjusted by −0.0235 V/°C for temperature changes.
Float voltage recommendations vary among manufacturers.
Precise float voltage (±0.05 V) is critical to longevity; insufficient voltage (causes sulfation) is almost as detrimental as excessive voltage (causing corrosion and electrolyte loss)
Typical (daily) charging: 2.37–2.4 V (depending on temperature and manufacturer's recommendation)
Equalization charging (for flooded lead acids): 2.5 V for no more than 2 hours. Battery temperature must be absolutely monitored.
Gassing threshold: 2.4 V

This is the datasheet for the battery I was looking at: http://www.batteryspace.com/prod-specs/LA-12V1.2.pdf

I will be charging it at room temperature (which for me at the moment is 22 °C), so 20 °C is plenty close enough. And the datasheet actually says to ignore temperature variations unless you are outside the range of 5 to 35 °C, which I am not.

So from my understanding, from a code perspective all I need to do is set the PWM value such that the MOSFET sees 4 V at the gate, so that the current is limited to 0.1 A. And taking the datasheet's average charge voltage for cycle use on a 6-cell battery is how I came up with 14.7 V (which works out to 2.45 V per cell), so it would seem that's correct.

So here's my scenario:
The battery will draw as much current as is available (0.1 A in this case) until it gets close to 14.7 V, at which point the current will start to drop off? Is that it?

If so then I am confident my circuit should work.

So from my understanding, from a code perspective all I need to do is set the PWM value such that the MOSFET sees 4 V at the gate, so that the current is limited to 0.1 A. And taking the datasheet's average charge voltage for cycle use on a 6-cell battery is how I came up with 14.7 V (which works out to 2.45 V per cell), so it would seem that's correct.

No, there are a lot of variables: temperature, MOSFET threshold voltage, battery state. You can't control a current by setting a specific gate voltage. Better to measure the current itself and adjust the voltage accordingly to keep 0.1 A. The charging chart shows 13.8 V as the maximum voltage, and they recommend constant current / constant voltage as the charging method. It looks like it takes only 7 hours for a fully discharged battery to get up to 13.8 V (at exactly 0.1 A), and after that the current drops but the voltage stays at 13.8 V. If it were me, I wouldn't reinvent the wheel; I'd follow their recommendations.

Ok.

I put the resistor and the two voltage dividers in series so I could effectively measure the voltage drop across the resistor and, from that, determine the current, so it should be easy enough to implement a feedback loop that adjusts the voltage on the gate of the MOSFET to maintain 0.1 A.

The threshold is between 2 V and 4 V, but 5 V corresponds to about 1 A, so I will never need to get that high; I can definitely adjust my output voltage between 2 and 4.x V to maintain the 0.1 A.
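
Just to sanity-check my own thinking, here's a rough sketch of the loop I have in mind (the scaling details and helper functions are placeholders, not values from the actual schematic):

    /* Rough sketch of the current-control loop; helper names are placeholders. */
    #include <stdint.h>

    #define TARGET_MA     100   /* desired charge current, mA                  */
    #define DEADBAND_MA     5   /* tolerance band so the loop doesn't chatter  */

    uint16_t read_charge_current_ma(void);  /* from the sense-resistor reading  */
    void     set_gate_pwm(uint8_t duty);    /* PWM that feeds the RC filter     */

    void control_step(uint8_t *duty)        /* call every few tens of ms        */
    {
        uint16_t ma = read_charge_current_ma();

        if (ma < TARGET_MA - DEADBAND_MA && *duty < 255)
            (*duty)++;                      /* too little current: raise gate V */
        else if (ma > TARGET_MA + DEADBAND_MA && *duty > 0)
            (*duty)--;                      /* too much current: lower gate V   */

        set_gate_pwm(*duty);
    }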

I am going to choose to ignore temperature for now because I know they will be charged in a steady, climate-controlled room very close to 20 °C.

I was following their recommendations, but I was doing so for cycle service, not standby use, and their graph only shows standby use, so I was going off the table a little below that graph and to the left.

OK, I see, so you chose 2.45 V per cell?
Using a high-side current sensor you'd only have a 0.1 V drop across the resistor, and after the voltage divider that diminishes to about 1/3, so only about a 30 mV difference would be present at the analog inputs. That amounts to about 6 "ticks" of the ADC without a voltage amplifier. I think it's better to move the current-sensing resistor down into the source path of the MOSFET: a 0.1 V loss there wouldn't make much difference, and at the same time the current-control loop would look locked and be more accurate.

Correct: I chose 2.45 V per cell.

If the resistance is 0.1 Ω and the current is 0.1 A, the voltage drop would be 0.01 V, correct? Which would mean I would only be seeing about a 3 mV difference at the analog pins, which is probably extraordinarily useless (that's less than one ADC step with a 5 V reference, since 5 V / 1024 ≈ 4.9 mV per step). But thanks for the heads up! I wasn't thinking of that being an issue.

So what voltage drop would I expect to see if I move the current-sense resistor to the other side of the MOSFET (the source path)?

If the resistance is 0.1 Ω and the current is 0.1 A, the voltage drop would be 0.01 V, correct?

Oops, right, I thought it was 1 Ω. Probably you should make it 1 Ω. Moving the resistor to the ground path, you could select the internal voltage reference of the ADC and get 5x (or 2x) better accuracy. I'm not ready to comment on the ATtiny ADC internals; is there a PGA?

I'm not sure what a PGA is, but the ATtiny has a 1.1 V or 2.56 V internal reference, so if I used the 1.1 V one that would definitely help improve accuracy. And unless I am mistaken, raising the resistor value would also help with accuracy, so yeah, I see no reason not to make it 1 Ω or even 10 Ω. Of course, the higher the value, the more power it has to dissipate, but as long as the resistor is rated correctly that's not an issue.

So if I used the 1.1 V reference and a 1 Ω resistor I would expect to see about 30 ticks at 0.1 A, and a 10 Ω resistor would give about 310 ticks.
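
Working that through (and presumably still keeping the roughly 1/3 divider from before in the numbers): 0.1 A × 1 Ω = 0.1 V, ÷ 3 ≈ 33 mV, and 33 mV / (1.1 V / 1024) ≈ 31 counts; with 10 Ω it's about 333 mV after the divider, or roughly 310 counts. If the resistor moves into the source/ground path and the divider goes away, the 1 Ω case would be closer to 0.1 V / (1.1 V / 1024) ≈ 93 counts.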

Could I set the internal voltage reference and still leave the resistor on the side it is on currently, since I am only looking at the difference between the two voltages?

PGA is a Programmable Gain Amplifier.

Could I set the internal voltage reference and still leave the resistor on the side it is on currently, since I am only looking at the difference between the two voltages?

No, as shown in your circuit there is about 5 V at the voltage divider outputs, and the reference voltage can't be lower than that.
BTW, what ATtiny part number are you going to use? Look in the datasheet to see whether it supports differential mode and a PGA. Something to think about; otherwise 10 Ω resistors may be the answer.

I was thinking of an ATtiny85.

I had no idea you could do a differential analog setup or use gain until you mentioned it here, but after looking at the datasheet it appears as though the ATtiny85 has both of those. It seems to be a very good option for my goals here.

No, as shown in your circuit there is about 5 V at the voltage divider outputs, and the reference voltage can't be lower than that.

I am likely going to use the differential method, but for curiosity's sake:

If I set the voltage reference to 1.1 V and set up the resistors in such a way as to give me 1.1 V out when I have 14.7 V at the top, would that give better, worse, or the same resolution compared to just leaving the voltage dividers as they are now with the voltage reference at 5 V?

The datasheet says it has a PGA of 20x, which may be enough.

18.7.2 Unipolar Differential Conversion

If differential channels and unipolar input mode are used, the result is

    ADC = ((V_POS − V_NEG) × 1024 / V_REF) × GAIN

I did some math, and it appears that since my target is a 0.1 V difference, the best resolution I can get is by using a Vref of 2.56 V and the 20x multiplier.
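
For reference, plugging my numbers into that formula: 0.1 V × 1024 / 2.56 V × 20 = 800 counts at the full 0.1 A, whereas with the 1.1 V reference the 20x-amplified 2 V would clip at 1023. And here is a rough register-level sketch of that setup; I'm assuming the two ends of the sense resistor go to ADC2 (PB4) and ADC3 (PB3), so double-check the REFS/MUX values against the datasheet tables before trusting them:

    /* Rough ATtiny85 setup for the differential reading discussed above.     */
    /* Assumes the shunt is measured between ADC2 (PB4, +) and ADC3 (PB3, -). */
    #include <avr/io.h>
    #include <stdint.h>

    static void adc_init_diff_20x(void)
    {
        /* Internal 2.56 V reference, no external bypass (REFS2:0 = 0b110),   */
        /* differential ADC2(+)/ADC3(-) with 20x gain (MUX3:0 = 0b0111).      */
        ADMUX = (1 << REFS2) | (1 << REFS1)
              | (1 << MUX2) | (1 << MUX1) | (1 << MUX0);
        /* Enable the ADC, prescaler /64 -> 125 kHz from an 8 MHz clock.      */
        ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);
    }

    static uint16_t adc_read(void)
    {
        ADCSRA |= (1 << ADSC);            /* start a conversion               */
        while (ADCSRA & (1 << ADSC)) { }  /* wait for it to finish            */
        return ADC;                        /* 10-bit result (0..1023)          */
    }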

Thanks for the help!