In a part of my code I have to convert binary into decimal; I've included the code below:
long BINtoDEC(byte b[]) {
  long n = 0, p = 1;
  for (int i = 15; i >= 0; i--) {
    n += p * b[i];
    p = p * 2;
  }
  return n;
}
This works just fine for inputs of at most 2 bytes (16 0s and 1s), but I want to go higher than that, to 3 or even 4 bytes of data. To do this I replaced the number 15 in my for loop with "sizeof(b) - 1", which should work for any size of input data. But it doesn't work: when I feed it a byte array of length 20 and print "sizeof(b) - 1" inside the function on the serial monitor, I get 1 instead of 19:
long BINtoDEC(byte b[]) {
  long n = 0, p = 1;
  Serial.println(sizeof(b) - 1);
  for (int i = sizeof(b) - 1; i >= 0; i--) {
    n += p * b[i];
    p = p * 2;
  }
  return n;
}
Can someone please help me figure out what's going on here? Finding the size of an array inside a function is an integral part of the rest of my project. Any help would be highly appreciated.
Your function is really taking a pointer to the array, so it has no knowledge of the size of the array you passed. You will have to pass the size of the array as an extra argument to the function.
Always post full, compilable code please.
You can't determine the size of your array inside the function, because all you have there is a pointer, and on an AVR a pointer is 2 bytes (2 - 1 = 1).
Do the sizeof before you call the function and hand over the length as a second parameter:
long BINtoDEC(byte* b, int len) {
  long n = 0;
  while (len--) {
    n <<= 1;
    n += *b++;
  }
  return n;
}
Although I'd question why you even need the array.
Wherever these bits are coming from in the first place, just shift them into a long as they arrive. Job done.
Thank you for your help. I'm super new to C++ and I'm still learning as I code. In a different part of my code I get 2 bytes of data in an array, split them in the middle, and convert each of the resulting arrays into decimal to be sent over the SPI lines to the master. The master then receives the two bytes, smashes them together, and turns the result back into decimal. This allows me to send 16 bits of data from the slave instead of 8. It seems redundant and overcomplicated, but it does the job...
There is undoubtedly a much easier way to solve the real problem, if you could take the time to describe that, instead of the problem you are having with your attempt to solve it.
In C/C++ arrays are utterly primitive. If you want to take account of the array size at runtime, you have to explicitly pass the size as another function argument. And you have to get it right, as the language does no checking for you; it's dead easy to shoot yourself in the foot, unlike in most high-level languages.
sizeof() is a compile-time thing, and is only meaningful for things whose type implies their size, such as an int or a fixed-size array. It doesn't give you the array size for arrays passed to functions, as only a pointer is passed.
jremington:
There is undoubtedly a much easier way to solve the real problem, if you could take the time to describe that, instead of the problem you are having with your attempt to solve it.
So I have a master (Arduino Uno) and 4 identical slaves (Arduino Nanos) that are connected to each other using SPI. The master sends commands to each of the slaves and they do whatever the master tells them to do; then they need to send the result back to the master (upon the master's request, of course). The result often consists of 2 bytes of data, but I can only send one byte at a time, so I'm trying to turn my result into a binary number, split it in half, and send each of the resulting bytes to the master. Then I'm going to mash the bytes together on the master's side and turn that back into decimal for further processing.
I'm pretty sure there is a better way to do this, but I'm new to this and it seems like it's going to work.
jremington:
Everything in computer memory is binary.
Please describe the "result". If it is a multibyte number it is trivial to split it into individual bytes for transmission and reception.
I know that everything is in binary. Let's say my "result" is the number 54321; this number is 2 bytes. Now I want to send that number over SPI to the master, so as you said I need to send it byte by byte. I will convert 54321 into binary on the slave side, split the two bytes, and then send them one byte at a time over SPI to the master.
On the master side I will receive the two bytes, put them back together, and then convert it back to decimal for my dead-reckoning calculations.
I hope this makes more sense now.
Please pay attention to the first line of reply #2. Along with fulfilling that request, please explain in detail how the behaviour of the program differs from your expectations.