The second bit of code defines two different variables (both named i) in different scopes: the inner i shadows the global one. The first bit of code defines only one variable called i, in the global scope.
I'd bet the compiler detects that nothing is ever done with the i in the inner scope, so it doesn't bother assigning 5 to it; it just optimizes the whole thing away.
In the first case, the global i is equal to 5 after setup runs. In the second case, the global i will be 0, since you never assign to it and globals in C/C++ are zero-initialized before your code starts (unlike locals, which really would hold garbage).