I’m getting confused lately by the various serial protocols. Do the serial pins and the USB on a Duemilanove (2009) or a Mega use the RS-232 protocol, or what?

The Atmel chips use 0–5V TTL logic-level serial on pins 0 and 1.

RS-232 on computer systems swings from about -12V to +12V and must be level-converted to be compatible.

USB is something different entirely.

As for the serial pins: the RS-232 spec defines a number of things, including timing, but most importantly for our purposes the voltages. The serial pins on the Arduino use 0V for a logic 0 and 5V for a logic 1. RS-232 uses positive and negative voltages (negative is logic 1), and voltages near 0 are undefined. So you can do RS-232 with the Arduino through a simple voltage-level converter (look at the MAX232 chip). RS-232 does not define what the bits mean: if you send a 65, that’s an “A” in ASCII, but it doesn’t have to mean “A” under RS-232. Framing is not part of the spec either; you don’t even need 8 bits in a byte.

Do the serial pins and the USB on a Duemilanove (2009) or a Mega use the RS-232 protocol, or what?

RS-232 is not a protocol. The RS-232 standard mostly describes the electrical characteristics of the interface, which the Arduino (and any TTL-level serial) does not follow without electrical modification. Even the data format (number of data bits, stop bits) is not part of the standard.

From Wikipedia:

"The Electronics Industries Association (EIA) standard RS-232-C[1] as of 1969 defines:

Electrical signal characteristics such as voltage levels, signaling rate, timing and slew-rate of signals, voltage withstand level, short-circuit behavior, and maximum load capacitance.

Interface mechanical characteristics, pluggable connectors and pin identification.

Functions of each circuit in the interface connector.

Standard subsets of interface circuits for selected telecom applications.

The standard does not define such elements as

character encoding (for example, ASCII, Baudot code or EBCDIC)

the framing of characters in the data stream (bits per character, start/stop bits, parity)

protocols for error detection or algorithms for data compression

bit rates for transmission, although the standard says it is intended for bit rates lower than 20,000 bits per second. Many modern devices support speeds of 115,200 bit/s and above

power supply to external devices. "