Differences in machine code from different IDE versions? (analog I/O timing)

I have a (reasonably complex) analog I/O routine, which works just fine when compiled on my Linux host, but does not work at all when compiled on my Windows box. (Needless to say, the source code is identical and compiles without errors on both platforms.)

Is it possible that the same hardware has radically different timing (or other) behaviors under one compiler versus another?

The "successful" IDE is identified as "Arduino 2:1.0.5+dfsg2-4" running on Debian. The "unsuccessful" IDE is 1.6.5 on Windows 10.

Has anyone seen this kind of issue? Thanks! John

FWIW: My sensor requires a sequence of actions in order to be read, which I've coded as follows:

pinMode(myPin, OUTPUT);
digitalWrite(myPin, HIGH);

// do other stuff for a while

pinMode(myPin, INPUT);
rawValue = analogRead(myPinAnalogChannel);
pinMode(myPin, OUTPUT);
digitalWrite(myPin, HIGH);

// do stuff until time to read the sensor again

With the "good" compiler, I get a nice spread of readings that agree with the physical situation. With the "bad" compiler, every reading is around 320 +/- 20.

> Is it possible that the same hardware has radically different timing (or other) behaviors under one compiler versus another?

Yes. You not only have different compilers, but also different core libraries. Offhand, I don't see anything that should be different, though.