MCP3301 13-bit ADC problem

Playing with an MCP3301 ADC with an SPI interface. The following code should work fine, but on an experimental setup with a pot it gives a maximum of only 3990 instead of 4095, and at the other end it drops down nicely to about 4 or 5 and then suddenly outputs 4025.

#include <SPI.h>

int result;
byte highByte;
byte lowByte;

void setup()
{
  pinMode(10, OUTPUT);            // chip select for the MCP3301
  SPI.begin();
  Serial.begin(19200);
}

void loop()
{
  digitalWrite(10, LOW);          // select the ADC to start it off
  delayMicroseconds(10);
  highByte = SPI.transfer(0x00);  // get high & low bytes
  lowByte  = SPI.transfer(0x00);
  digitalWrite(10, HIGH);         // stop
  delay(20);
  result = ((0xF & highByte) << 8) | (lowByte);  // put them together
  Serial.println(result);
  delay(200);
}

In the case of the 3301, lowByte fills the bottom 8 places as it is, so nothing needs doing to it: "xxxxxxxx BBBBBBBB"
The highByte arrives in the form "xxx-AAAA". Masking with 0xF keeps just the AAAA part, and shifting it left 8 places puts it above the low byte: "0000AAAA 00000000"
The two are then joined with OR (|), giving "0000AAAA BBBBBBBB"

So the formula is ((0xF & highByte) << 8) | (lowByte)

I can't understand why I get this sudden jump from 2 or 3 to 4025.
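
For what it's worth, on the MCP3301 the "-" slot in the high byte is the sign bit (the chip outputs a 13-bit two's-complement value), and the 0xF mask throws it away, so a reading that goes even slightly negative wraps round to a large positive number. A decode that keeps the sign bit and sign-extends might look like this -- just a sketch, not tried on hardware, and readMCP3301 is only an illustrative helper name:

int readMCP3301(byte hi, byte lo)
{
  int value = ((hi & 0x1F) << 8) | lo;  // keep the sign bit (bit 4 of hi) as well as the 12 data bits
  if (value > 4095)                     // sign bit set, so the reading is negative
    value -= 8192;                      // sign-extend the 13-bit two's-complement value
  return value;
}

Used in the sketch above it would simply replace the assembly line: result = readMCP3301(highByte, lowByte);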

Is the pot in good shape? It could have a mechanical problem. I'd try a different one, or better yet fixed resistors, and also try connecting the ADC input directly to ground and Vcc (or whatever voltages represent min and max).

Thanks for your input, Jack. No, I tried a couple of pots and a resistor wheel with the same result. But another test I did suggests it is something to do with timing. The code herewith has a delay(1000) in the read loop, allowing one to see a reading being constructed bit by bit. This works as it should, so obviously the one-second delay has some effect. In spite of not using SPI.h it seems to be (I think the term is) asynchronous.

int readvalue;

void setup()
{
  pinMode(10, OUTPUT);            // chip select
  pinMode(12, INPUT);             // data in from the ADC
  pinMode(13, OUTPUT);            // clock out to the ADC

  digitalWrite(10, HIGH);         // disable adc
  digitalWrite(13, LOW);          // set clock low ready

  Serial.begin(9600);
}

void loop()
{
  readvalue = 0;                  // zero output value

  digitalWrite(10, LOW);          // enable adc
  delayMicroseconds(10);          // wait a bit

  clockCycle();                   // lose first 4 bits
  clockCycle();
  clockCycle();
  clockCycle();

  for (int i = 0; i <= 11; i++)   // set up 12x loop
  {
    readvalue += digitalRead(12);
    readvalue <<= 1;              // shift left so the next bit lands in bit 0

    Serial.println(readvalue, BIN);  // show binary so far
    delay(1000);

    clockCycle();                 // cycle clock to read the next bit
  }
  digitalWrite(10, HIGH);         // turn off device
  Serial.println(readvalue, DEC); // show final result
}

void clockCycle()
{
  digitalWrite(13, HIGH);
  digitalWrite(13, LOW);
}
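
If it is timing, another thing worth trying with the original SPI version is pinning the clock rate and mode down explicitly instead of relying on the library defaults. A rough sketch of the read as a helper that drops into that first sketch (readADC is just an illustrative name, and the 1 MHz figure is only a guess at staying under the MCP3301's maximum clock -- check the datasheet limit for the supply voltage in use):

int readADC()
{
  SPI.beginTransaction(SPISettings(1000000, MSBFIRST, SPI_MODE0)); // explicit clock, bit order and mode
  digitalWrite(10, LOW);                    // select the ADC
  delayMicroseconds(10);
  byte hi = SPI.transfer(0x00);             // clock out the high byte
  byte lo = SPI.transfer(0x00);             // then the low byte
  digitalWrite(10, HIGH);                   // deselect
  SPI.endTransaction();
  return ((0xF & hi) << 8) | lo;            // same assembly as before
}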

Sorry, I cocked up the bit-banged code above. I can't remember which of the versions below works -- anyhow one of them does.

for(int i=0; i<=11; i++){
readvalue += digitalRead(12);
readvalue <<=1;

or

for(int i=11; i>=0; i--){
readvalue += digitalRead(12) << i;

Oopsy !!
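
Written out in full, the indexed (MSB-first) version of that read loop would be something like this -- just a sketch, assuming pin 12 is the data line and clockCycle() pulses pin 13 as in the code above. (As posted, the shift-after-add version shifts once more after the last bit, so every reading comes out doubled.)

  readvalue = 0;
  for (int i = 11; i >= 0; i--)          // bit 11 (the MSB) arrives first
  {
    readvalue += digitalRead(12) << i;   // drop each bit straight into its final position
    clockCycle();                        // clock the next bit out of the ADC
  }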