Digital scale-driven robot

I want to make a robot that reacts to values from a digital scale. I can get a scale with either an analog output or a USB output. I'm not entirely sure how a USB host shield would work with the Arduino, or if that's even the way to go. Analog would be easier, but the 10-bit ADC limitation makes me sad. I'm using an Arduino UNO. I'd also like the Arduino to be able to move a servo or motor, preferably multiple motors.
Is this feasible? Has anyone worked with USB input? I'm still leaning toward analog... thinking about trying to add a higher-resolution external ADC, but I know the UNO is pretty limited on memory.

Thanks everyone!

In case anyone is curious--I'm planning on using loadstarsensors.com. They're the only place I could find that offers nice outputs for their scales. I'll want up to 1 kg with 0.1 g precision. Money isn't an issue, but I'd rather it not be horribly expensive. I'm willing to try a different microcontroller if I have to as well. I like Arduino, though.

0.01% resolution is pretty tight. In theory that's 14 bits, but you will need 16-18 (or more) because of noise and other errors. You will need a precision reference of the highest quality, and you will have to calibrate the system regularly.

I don't know of any available shields with higher accuracy ADCs, but they can be made pretty easily as in the links below.

http://forums.adafruit.com/viewtopic.php?f=31&t=12269
http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1261408885

You can gain some accuracy by oversampling and by changing the reference, but you probably won't get 0.01% using the ADC on the Arduino.
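For what it's worth, here is a minimal sketch of what I mean by oversampling and changing the reference on a stock UNO. The A0 pin, the 1.1 V internal reference, and the sample count are just placeholders for illustration; they are not tuned for a real load-cell front end.

```cpp
// Minimal sketch: average many readings and use the UNO's internal 1.1 V
// bandgap reference instead of the (noisy) 5 V supply. Pin A0 and the
// sample count are assumptions for illustration only.
const uint8_t SENSE_PIN = A0;
const uint16_t NUM_SAMPLES = 64;

void setup() {
  Serial.begin(9600);
  analogReference(INTERNAL);   // 1.1 V internal reference on the UNO
  analogRead(SENSE_PIN);       // discard the first conversion after switching references
}

void loop() {
  uint32_t sum = 0;
  for (uint16_t i = 0; i < NUM_SAMPLES; i++) {
    sum += analogRead(SENSE_PIN);
  }
  float counts = (float)sum / NUM_SAMPLES;   // averaged 10-bit reading
  Serial.println(counts, 2);
  delay(200);
}
```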

IMO your best option is to get a scale with a serial port and read it digitally. The OEM has done all the critical design work and you can buy as much accuracy as you can afford.

A scale with RS-232, or one that can be hacked to intercept SPI or I2C, would be the easiest. If you get a USB scale, I don't know whether the Arduino can act as a USB host; I've not done anything with USB. Hopefully others can enlighten us, or you can ask in the other forum categories where people may be more familiar with the issues of USB interfacing.

The simplest, though not the cheapest, solution may be to use a cheap, old PC to host the USB device and a simple program (Python, Visual Studio .NET, etc.) to read the USB scale and pass instructions over serial to the Arduino, which controls the robot movement. If you need any heavy-duty calculations or lots of memory, that can be done on the PC as well.
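If you went that route, the Arduino end could be very dumb. A minimal sketch of the receiving side is below; the one-line ASCII command format ("ANGLE 90") is something I made up for illustration, not any standard.

```cpp
// Arduino side of the "PC does USB + math, Arduino does motion" split.
// Assumes the PC sends simple ASCII lines such as "ANGLE 90\n"; that
// command format is invented here for illustration.
#include <Servo.h>

Servo spoutServo;

void setup() {
  Serial.begin(9600);
  spoutServo.attach(9);   // servo signal on pin 9 (arbitrary choice)
}

void loop() {
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    if (line.startsWith("ANGLE ")) {
      int angle = constrain(line.substring(6).toInt(), 0, 180);
      spoutServo.write(angle);
    }
  }
}
```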

Steve

P.S. Please post more details about your project when you have them. I imagine it is some kind of sorting operation, which might be similar to a future project of mine to sort spent cartridge brass by weight, length and possibly other factors (caliber, headstamp, brass/nickel/steel, split/damaged, etc.). I am an EE, so the electronics are pretty straightforward, but I need a brilliant idea for the mechanical-handling side of things that doesn't cost a fortune.

If you could give me a little more information on dealing with noise, that'd be cool, haha. I just did the simple log(10000)/log(2) ≈ 13.3 and figured I'd get away with 14 bits. The website I posted seems to specialize in easy-to-access outputs, so I was hoping to be able to focus on the rest of the project rather than just on reading the scale.
"The base output from our digital load cells (iLoad Series and iLoad Pro Series) and interfaces (DI-100) is Serial TTL output (5V level digital output)."
That came from their website; it seems to be what you're talking about. Thanks for pointing me in the right direction. I'm an EE student, but I'm only a sophomore, so I'm pretty confused. It sounds like with serial output I can forget about both USB and analog altogether, which is awesome.

I want to avoid using a PC because I want this robot to be able to run independently when I'm not around and, in theory, continue to run after I leave the state.

For the curious reader (sdturner):
The purpose of the robot is to fill bottles with unknown liquids to an exact amount (100 mL, 250 mL, 500 mL). The robot will be given a density as input. Between that and the volume it needs to fill to, it can figure out how many grams the liquid should weigh at 100 mL, etc. It will drive a servo attached to a spigot, turning the spigot on while filling and off once the bottle is full. Aside from that, it needs to be able to move a new bottle under the spigot, put the cap back on the filled bottle, and put the filled bottle into a storage area. I can talk to you about the mechanics a little later on; I'm working hard to use as few moving parts as I can (since they are the most likely to fail).
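To make the fill logic concrete, here is a rough sketch of the core loop I have in mind, assuming the scale reading comes back in grams. readScaleGrams() is only a placeholder for whatever the real scale interface ends up being, and the servo angles, pin number, and early-cutoff margin are guesses.

```cpp
// Rough sketch of the fill loop: compute the target mass from density and
// volume, open the spigot servo, and close it when the scale reaches the
// target. readScaleGrams() is a placeholder and must be replaced with the
// real scale read; angles, pins, and the 2 g early cutoff are guesses.
#include <Servo.h>

Servo spigotServo;
const int OPEN_ANGLE   = 90;
const int CLOSED_ANGLE = 0;

float readScaleGrams() {
  return 0.0;   // placeholder: replace with the actual scale reading
}

void fillBottle(float density_g_per_mL, float volume_mL) {
  float tare   = readScaleGrams();                     // empty bottle on the scale
  float target = tare + density_g_per_mL * volume_mL;  // grams = g/mL * mL

  spigotServo.write(OPEN_ANGLE);
  while (readScaleGrams() < target - 2.0) {            // cut off early; liquid is still in flight
    delay(50);
  }
  spigotServo.write(CLOSED_ANGLE);
}

void setup() {
  spigotServo.attach(9);
  spigotServo.write(CLOSED_ANGLE);
  fillBottle(1.00, 250.0);   // e.g. a water-like liquid, 250 mL
}

void loop() {}
```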

Serial TTL is ideal; that is what the Arduino (or any micro with a UART) actually has built in. It means you don't need an RS-232 level shifter (such as one from the MAX232 family).
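A minimal example of reading such a scale is below. It puts the scale on a SoftwareSerial port so the hardware UART stays free for the PC; the pins, baud rate, and the assumption that the scale streams one plain ASCII weight per line are all guesses, so check Loadstar's documentation for the real protocol and commands.

```cpp
// Minimal sketch: read ASCII lines from a 5 V TTL-serial scale on a
// SoftwareSerial port, leaving the hardware UART free for debugging.
// Pins 10/11, 9600 baud, and "one weight reading per line" are all
// assumptions; check the scale's documentation for the real protocol.
#include <SoftwareSerial.h>

SoftwareSerial scaleSerial(10, 11);   // RX, TX

void setup() {
  Serial.begin(9600);       // debug output to the PC
  scaleSerial.begin(9600);  // scale's baud rate (assumption)
}

void loop() {
  if (scaleSerial.available()) {
    String line = scaleSerial.readStringUntil('\n');
    float grams = line.toFloat();   // assumes the line is just a number
    Serial.print("Weight: ");
    Serial.println(grams, 1);
  }
}
```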

I tend to lump all sources of error into the label 'noise'. This can be RF and audio frequency noise as most think of it, but also temperature, voltage, aging and anything else that causes error in the measurement.

Most ADCs spec their total error as +/- 1/2 to 1 LSB (some even more). This means that you need to ignore the least significant bit when comparing two values, because the real value could be one bit higher or lower. Also pay attention to the test conditions: is the spec at 25 C or from max temp to min temp? Does the error get worse when sampling faster? Does it need a certain settling time when you change channels?

The ADC reference is critically important. If you calibrate the system when the reference is 5.00 V and later it is 5.05 V, all measurements will be 1% higher. If the voltage reference is noisy, for example because it is also powering digital circuitry that is switching on and off, then you can take consecutive readings with the exact same input and get different results, or get different readings depending on which outputs are on and off. This can be a significant source of error on the Arduino, since the default reference is the main power supply for everything. A precision reference (Precision References http://de.mouser.com/Search/Refine.aspx?Keyword=precision+reference) that supplies only the ADC reference and the sensor will significantly reduce this source of error.
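On the sketch side, using an external reference is a one-line change; the hardware (wiring the reference to the AREF pin) is where the real work is. The 2.5 V value below is just an example.

```cpp
// With an external precision reference (e.g. a 2.5 V part) wired to AREF,
// the sketch only needs to tell the ADC to use it. 2.5 V is an example value.
const float AREF_VOLTS = 2.5;

void setup() {
  Serial.begin(9600);
  analogReference(EXTERNAL);   // call this BEFORE analogRead when AREF is driven externally
  analogRead(A0);              // discard the first conversion after switching references
}

void loop() {
  int counts  = analogRead(A0);
  float volts = counts * AREF_VOLTS / 1023.0;
  Serial.println(volts, 4);
  delay(500);
}
```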

Silicon changes with temperature. Normally 'room temp' changes of +/- a few degrees are not a problem, but if you are trying to measure with extreme accuracy then even small temperature changes can be significant. That is one of the primary reasons laboratory scales say they should be turned on for 30 minutes before they are used. Some precision instruments use a heater to keep the critical circuitry (sensor, ADC & reference) at the same temperature all the time.

I'm sure you've had basic circuits and know RC filters. If you've had a filters / op-amp class, then you will have seen some active filters. If you have had classes in sampling, you will have heard of the Nyquist frequency (Nyquist frequency - Wikipedia). If you use a single-pole filter with a cutoff frequency of 1/2 your sample rate, it will not eliminate the higher frequencies, only attenuate them. You need a multiple-pole filter and/or a cutoff frequency far lower than your sample rate. Ideally you want to filter the heck out of the signal, but the R's and C's (and op-amps) will also be sensitive to all the sources of error, so it is a trade-off.
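To put a number on "only attenuate them": a single-pole filter's gain is 1/sqrt(1 + (f/fc)^2), so noise one decade above the cutoff is only knocked down by about a factor of 10 (20 dB). The little program below just prints that roll-off; the 50 Hz cutoff is an arbitrary example value.

```cpp
// Quick check of a single-pole RC filter's roll-off: |H(f)| = 1/sqrt(1 + (f/fc)^2).
// The 50 Hz cutoff is an arbitrary example value.
#include <math.h>
#include <stdio.h>

double singlePoleGain(double f, double fc) {
  return 1.0 / sqrt(1.0 + (f / fc) * (f / fc));
}

int main(void) {
  const double fc = 50.0;   // cutoff frequency, Hz
  for (double f = 50.0; f <= 5000.0; f *= 10.0) {
    printf("f = %6.0f Hz   gain = %.4f\n", f, singlePoleGain(f, fc));
  }
  return 0;   // prints ~0.71 at fc, ~0.10 one decade up, ~0.01 two decades up
}
```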

The simple way to get around all this is to use the best components you can get, sample with higher resolution than the calculations say you need, oversample and average to eliminate short-term noise, and then ignore a few of the least significant bits.

I think your best bet for this project is to use a commercial scale and read it via serial TTL, but...

If I had to do this, I would look on the websites of Analog Devices, Linear Technology, Burr-Brown, etc. and see if there are any reference designs and/or white papers for something similar. The maker of the load cell should also have some tips and, hopefully, reference designs.

If I had to do it all myself, I would want a load cell rated for at least 2x my needed resolution (0.05 g, preferably even better) with very low error specs, the best precision reference I could get, and an 18- or 24-bit ADC. I would oversample 32x or more if there was time, sum the readings, and then right-shift to get a 14- or 15-bit result. Then I would put a calibration weight on the scale and log continuous readings for a day or so to see what my max and min are. I would test it with the external power supply voltage at min and max and, if possible, I would test it in a thermal chamber across a slightly greater temperature range than would be expected. If no thermal chamber is available, I'd test it warm with a blow-dryer, space heater, black box in the sun, etc., and I'd test it cool/cold in a refrigerator, outside, or in front of an air-conditioner vent.
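The "sum and right-shift" part looks something like the fragment below. readAdc24() is just a stand-in for whatever driver the real external ADC needs (SPI or I2C); the shift amounts follow the 32x / 15-bit numbers above.

```cpp
// Oversample / sum / right-shift with an external 24-bit ADC.
// readAdc24() is a placeholder for the real SPI/I2C driver.
uint32_t readAdc24() {
  return 0;   // replace with the actual conversion read from the external ADC
}

uint32_t readAveraged() {
  uint32_t sum = 0;                 // 24-bit reading x 32 = 29 bits, fits in 32
  for (uint8_t i = 0; i < 32; i++) {
    sum += readAdc24();
  }
  // >>5 averages the 32 samples; the extra >>9 drops the noisy least
  // significant bits, leaving roughly a 15-bit result.
  return sum >> (5 + 9);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(readAveraged());
  delay(500);
}
```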

This assumes the accuracy is really needed. In my experience, people will tell you they need a lot more accuracy than they are willing to pay for. In a laboratory situation the researchers may have thought about what they need, but it may be even more likely that they are asking for higher precision because the last thing they want is to publish a paper and then find out their great discovery was just an artifact of poor measurement.

Good luck,
Steve

I really appreciate your in-depth response; I'm learning a lot from just attempting this project. I got in touch with Loadstar and they do have the serial TTL output, which I'm going to go ahead and jump on since that seems to be the best option. The things are so damn expensive, I really hope the project works. I'm trying to figure out what makes them so expensive: the scales are nearly a grand and their precision is 0.1-0.2 grams. I didn't even see an option for 0.05 grams, so I'm curious what exactly they're charging so much for. I will take a look at which other companies offer a serial TTL option, because that truly does seem like the right approach. Thanks a lot, I'll update with pictures and more information or questions when I can.