#define passing wrong values within calculations

I won't post my code here as it runs to 13 pages, but if any developers want a copy, please ask.

This is the second occasion where I have positively identified that defines are not being passed correctly! Makes you doubt your sanity!

The code concerned selects a chunk of data to present to a graph. The size of the chunk and the memory locations are calculated on the fly.

In this case I have several defines; one is a calculated data-stripe width for the stored data (16 bytes in this case). In this routine I also use a memory start location and the number of data stripes required.

All values within the defines are confirmed to be correct (checked with Serial.print within the subroutine).

My graph was showing rubbish! When I examined the values I eventually found that the data-stripe size was using a value of 14, not 16! So my data was not being read from a data-stripe boundary.

I assigned the defines to integer variables and it worked perfectly, which kind of destroys the point of a define!

What's going on?

The best way to proceed is to write a minimal, complete sketch that demonstrates the problem. This will make it easy for us to provide an answer. I've found that usually by the time I get done writing such a sketch I understand the problem much better.

And that the problem is with MY code, and MY assumptions, NOT the preprocessor, NOT the compiler, etc.
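
For what it's worth, here is the sort of minimal sketch I mean. Since the original code isn't posted, this assumes (and it is only an assumption) that the stripe width is a calculated define written without parentheses around its expression; every name in it is made up for illustration.

```cpp
// Hypothetical names throughout; the stripe width is assumed to be a
// calculated define with no parentheses around its expression.
#define HEADER_BYTES   2
#define PAYLOAD_BYTES  14
#define STRIPE_WIDTH   PAYLOAD_BYTES + HEADER_BYTES   // expands to: 14 + 2
#define START_ADDR     0

void setup() {
  Serial.begin(9600);

  // Printed on its own, the define looks correct:
  Serial.println(STRIPE_WIDTH);                        // 16

  int stripesRequired = 3;

  // Intended: 0 + 3 * 16 = 48.
  // Actual expansion: 0 + 3 * 14 + 2 = 44, because * binds tighter than +,
  // so the multiplication only "sees" the 14.
  int chunkStart = START_ADDR + stripesRequired * STRIPE_WIDTH;
  Serial.println(chunkStart);                          // 44

  // Copying the define into an int evaluates the whole expression once,
  // which is why assigning the defines to integers made the graph behave:
  int stripeWidth = STRIPE_WIDTH;                      // 16
  Serial.println(START_ADDR + stripesRequired * stripeWidth);   // 48
}

void loop() {}
```

If something like that is the cause, putting parentheses around the define's expression, e.g. #define STRIPE_WIDTH (PAYLOAD_BYTES + HEADER_BYTES), fixes it at the source; copying into an int only appears to fix it for the same reason, because the expression is evaluated once before it takes part in any further arithmetic.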