The compiler is seeing three ints - 10, 10, and 1000 - so it performs integer arithmetic on them. 10 * 10 = 100, but 100 * 1000 = 100000, which doesn't fit in a 16-bit int (maximum 32767), so it wraps around to -31072.
On the other hand, 10UL * 10UL * 1000UL tells the compiler that the values are unsigned longs, so 10UL * 10UL = 100UL and 100UL * 1000UL = 100000UL.
Compilers are pretty complex, but not all that smart. They don't know that an intermediate value should be promoted to a larger type unless you tell them. The UL on the end is how you tell the compiler to use a larger type for the intermediate results.
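For illustration, here's a minimal sketch that prints both versions side by side. It assumes an AVR-based board (e.g. an Uno) where int is 16 bits; strictly speaking signed overflow is undefined behavior, but in practice you'll see the wrapped value:

```cpp
void setup() {
  Serial.begin(9600);

  // All three literals are ints, so the math is done in 16-bit int.
  // 10 * 10 = 100, but 100 * 1000 = 100000 doesn't fit in a 16-bit int,
  // so it wraps around to -31072.
  int asInts = 10 * 10 * 1000;

  // The UL suffix makes every operand an unsigned long (32 bits),
  // so every intermediate result is computed as unsigned long too.
  unsigned long asUnsignedLongs = 10UL * 10UL * 1000UL;

  Serial.println(asInts);           // prints -31072
  Serial.println(asUnsignedLongs);  // prints 100000
}

void loop() {}
```

Note that it's enough to put the suffix on one operand per multiplication; once one side is unsigned long, the other is promoted to match.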