I bought some micro metal gearmotors (1006:1 ratio), specified with a stall torque of about 5.3 kg·cm, a long-term rating of 0.8 kg·cm, and the ability to manage 2 kg·cm for reasonable periods. I've run them through a 3D-printed worm-gear stage that reduces speed by 45:1, which should give a corresponding 45× increase in torque; even if the gear train's efficiency is poor (say 50%), that's still a 22× multiplier, a pretty big factor.
My application needs torques of around 90 kg·cm but very little speed.
I'm using four motors working together at the input of the gear train, arranged in two pairs. In each pair, two motors drive a shaft together; the shafts from the two pairs are then combined through a mechanism somewhat like an automotive differential.
I've measured the torque, and at the slow end my system is producing only about 18 kg·cm: a fifth of what I need, and about a twentieth of what I reasonably calculate it should deliver, given that the datasheet says each motor manages about 2 kg·cm for periods.
I could believe overall losses of 50% to various inefficiencies, but I'm getting twentyfold less torque than I expect. Any idea what to check for?
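For concreteness, here's the arithmetic behind that twentyfold figure as a sketch. The assumptions are mine: the four motors' torques simply sum at the gear-train input, and the 50% worm-gear efficiency is a guess rather than a measured value.

```python
# Expected vs. measured output torque for the four-motor worm-gear system.
# Assumptions: 2 kg·cm sustained per motor (datasheet), all four motors'
# torques sum at the input, 45:1 worm stage, guessed 50% efficiency.

PER_MOTOR_TORQUE = 2.0   # kg·cm, sustained rating per motor
NUM_MOTORS = 4
WORM_RATIO = 45          # speed reduction, hence torque multiplication
WORM_EFFICIENCY = 0.5    # pessimistic guess for the printed worm stage

input_torque = PER_MOTOR_TORQUE * NUM_MOTORS   # 8 kg·cm at the worm input
ideal_output = input_torque * WORM_RATIO       # lossless output torque
with_losses = ideal_output * WORM_EFFICIENCY   # allowing 50% gear losses

measured = 18.0  # kg·cm, what the system actually delivers

print(f"ideal:       {ideal_output:.0f} kg·cm")
print(f"with losses: {with_losses:.0f} kg·cm")
print(f"measured:    {measured:.0f} kg·cm "
      f"({ideal_output / measured:.0f}x below ideal)")
```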
My printed gearing turns smoothly by hand, although I can only check this while the slow end is disconnected from the heavy load it has to move; once it's connected there's no way to reach the gearing to hand-turn it, so motor power is the only option then.
The motors seem to be working as well as when new, no apparent damage.
I've checked that the motors are getting enough voltage, and there shouldn't be any trouble with the supply's current limit. The motors don't draw much current overall: despite the high torques they turn slowly, so the power is modest.
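As a quick sanity check on that "the power isn't much" point, mechanical output power is just torque times angular speed. The output speed below is a hypothetical placeholder (I haven't stated the actual figure), so treat the result as illustrative only:

```python
import math

# Mechanical power at the slow end: P = tau * omega.
# OUTPUT_RPM is a hypothetical placeholder speed, not a measured value.
TORQUE_KGCM = 90.0  # required output torque
OUTPUT_RPM = 2.0    # hypothetical slow-end speed

tau_nm = TORQUE_KGCM * 0.0980665        # kg·cm -> N·m (1 kg·cm = 0.0980665 N·m)
omega = OUTPUT_RPM * 2 * math.pi / 60   # rpm -> rad/s

power_w = tau_nm * omega
print(f"{power_w:.2f} W of mechanical output power")
```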
The gearing is handling the stress fine; it's solid-infill ABS with multiple perimeters, done on a high-quality printer.
I'm at a loss as to where all the torque is being lost.