So first my goal: I'm looking to get an 8-byte token for wireless communication. To achieve this I'm still looking at using the built-in random().
The problem I'm having is that 8 bytes generated by repeated calls of random() will only contain 4 bytes of randomness, since even an ideal random starting seed is only 4 bytes long and the code is the same on every startup.
Is it possible to run 2 instances of the random function in parallel, each with a different seed? So I can do something like this:
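(Just to illustrate the idea, using two xorshift32 states with separate seeds as stand-ins for two random() instances, since the built-in random() keeps only one hidden state:)

#include <stdint.h>

uint32_t stateA, stateB;          // two independent generator states

// Marsaglia xorshift32, used here only as a stand-in for a second random() instance
uint32_t xorshift32(uint32_t &s) {
  s ^= s << 13;
  s ^= s >> 17;
  s ^= s << 5;
  return s;
}

void makeToken(uint32_t seedA, uint32_t seedB, uint32_t token[2]) {
  stateA = seedA ? seedA : 1;     // xorshift state must never be zero
  stateB = seedB ? seedB : 1;
  token[0] = xorshift32(stateA);  // "instance 1"
  token[1] = xorshift32(stateB);  // "instance 2"
}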
I was thinking some more on this.
Would it be feasible, and would it be smart, to make a copy of the library that contains random() and only change the function name to random1() to get a second instance? If this would be smart, how would I go about it?
Alternatively, is there another way to achieve my wishes?
You are probably overthinking this. If all you want is 8 random bytes, just call random() twice in a row. Each time you will get 4 random bytes. Whatever you said above about that not working makes no sense.
I know I'll be able to get 8 random bytes by calling random() twice. However, my problem is the fact that these 8 bytes are based on a single 4-byte seed. In other words, if you look at the code:
long Token[2];
randomSeed(LongRandomNumber);  // LongRandomNumber holds the single 32-bit seed
Token[0] = random();
Token[1] = random();
Token will be 64 bits long, but it has only 2^32 possible outcomes. It is thus unfair to call it a 64-bit random number.
monitou:
I know I'll be able to get 8 random bytes by calling random() twice. However, my problem is the fact that these 8 bytes are based on a single 4-byte seed. In other words, if you look at the code:
You could call randomSeed() between calls to random(). I see what you are thinking: for a given randomSeed() the subsequent random numbers will always be the same sequence. But that is the same effect as using a 64-bit random function, or two different 32-bit random functions that start with different seeds; the two functions will always produce the same combined sequence once you get both seeds correct.
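A rough sketch of what I mean (the entropy gathering shown here is just a crude placeholder that reads noisy ADC bits):

long collectSeed() {
  long s = 0;
  for (int i = 0; i < 32; i++) {          // one noisy LSB per bit; slow but simple
    s = (s << 1) | (analogRead(A0) & 1);
    delay(2);
  }
  return s;
}

void makeToken(long token[2]) {
  randomSeed(collectSeed());
  token[0] = random(0x7FFFFFFF);
  randomSeed(collectSeed());              // fresh 32-bit seed before the second half
  token[1] = random(0x7FFFFFFF);
}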
@sherzaad
Ohh, a union seems like a more proper way to code it than the way I did. That said, no, I would not consider Token.val a 64-bit random number.
The reason is that random() does not provide true random numbers. For a given seed the first call of random() will give the same "random" number every time, and the second call of random() will always give the same number as well. And with only 2^32 possible seeds, there are only that many possible tokens.
Another way to explain my point: think of random() as a cyclic (extremely long) list of numbers, where each call of random() gives the next number in the list, but for a given seed it always starts at the same place.
As the number of possible lists is equal to the number of possible seeds, there are 2^32 lists. Meaning the token will only have 2^32 possible outcomes; after all, you are just looking at the first two places in one of the lists.
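To make that concrete (assuming the standard AVR core, where randomSeed() and random() just wrap the C library srandom() and random()): run this twice and you get the exact same two numbers both times, because the fixed seed pins down the whole list.

void setup() {
  Serial.begin(9600);
  randomSeed(12345UL);                 // same seed every reset...
  Serial.println(random(0x7FFFFFFF));  // ...so the same first "random" number
  Serial.println(random(0x7FFFFFFF));  // ...and the same second number
}

void loop() {}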
@aarg
Require is a big word. I'm aiming to use wireless communication for sensors and controls. The communication is encrypted using AES. However, to prevent capture-and-resend attacks I want to add a token to each message. So my communication protocol is basically (a rough sketch of node 2's side follows the list):
node 1: Request token (plain-text message)
node 2: Send token in an encrypted AES message (token is generated here)
node 1: Send the actual message with the token to verify authenticity
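Roughly like this on node 2 (the AES layer and radio calls are left out, and random64() is just a placeholder for whatever 64-bit generator I end up with):

uint64_t issuedToken;             // token handed out for the pending exchange
bool tokenOutstanding = false;

uint64_t random64() {             // placeholder; only as strong as its seed, which is this whole thread's point
  return ((uint64_t)random(0x7FFFFFFF) << 32) | (uint32_t)random(0x7FFFFFFF);
}

uint64_t issueToken() {           // step 2: generate the token and remember it
  issuedToken = random64();
  tokenOutstanding = true;
  return issuedToken;             // goes back to node 1 inside an encrypted message
}

bool acceptMessage(uint64_t tokenInMessage) {  // step 3: check the token on the real message
  if (tokenOutstanding && tokenInMessage == issuedToken) {
    tokenOutstanding = false;     // single-use, so a captured message can't be replayed
    return true;
  }
  return false;
}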
With a 32-bit token, as an attacker I could capture the encrypted token and the encrypted message, then request tokens as a fake slave until the encrypted token I receive equals the captured encrypted token, and then send back the captured encrypted message. At a rate of 1 request per ms, cycling through all 2^32 tokens takes about 49.7 days, so on average an attack would "only" take about 25 days. And that is when capturing a single packet; if you capture multiple encrypted-token/encrypted-message pairs, the time goes down.
And yes, this is long enough for practical purposes. And yes, it becomes less simple after you call random() a few times. And yes, there's also security by obscurity.
So require is the wrong word to use. But by now I'm invested and want to try to make it work, and make it overly secure. So I'd rather say that I'd like to have a 64-bit random number.
david_2018:
You could call randomSeed() between calls to random(). I see what you are thinking: for a given randomSeed() the subsequent random numbers will always be the same sequence. But that is the same effect as using a 64-bit random function, or two different 32-bit random functions that start with different seeds; the two functions will always produce the same combined sequence once you get both seeds correct.
Calling randomSeed() in between calls is a solution, except it takes quite some time to get a proper 32-bit seed by repeated calls of analogRead() and grabbing the LSB.
Also, if that means that for each call of random() I'd need a random 32-bit seed, then I have no need for the random() function at all; after all, I already have two random 32-bit seeds.
Also, a proper 64-bit random function would have a 64-bit (or larger) seed, meaning there are just that many more possible outcomes.
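For comparison, here's what I mean by that (Marsaglia's xorshift64, just as an example of a generator with a full 64-bit state; it's not part of the Arduino core):

#include <stdint.h>

uint64_t state64;                 // 64-bit state, so 2^64 - 1 possible starting points

uint64_t xorshift64() {           // Marsaglia's xorshift64 step
  state64 ^= state64 << 13;
  state64 ^= state64 >> 7;
  state64 ^= state64 << 17;
  return state64;
}

void seed64(uint64_t seed) {
  state64 = seed ? seed : 1;      // state must never be zero
}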
DO ya really think the difference between 2^32 different numbers and 2^64 different numbers is significant in your application? The Arduino itself will die of old age long before you can actually generate those 2^32 different numbers....
sherzaad:
Something like this maybe, in code (compiles):
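(The snippet itself isn't quoted here; a sketch of what such an overload could look like, assuming it builds a 64-bit value out of two calls to the built-in 32-bit random():)

uint64_t random(uint64_t howbig) {
  // the random(long) calls below resolve to the core's 32-bit version,
  // so this is overloading, not recursion
  uint64_t hi = (uint32_t)random(0x7FFFFFFFL);
  uint64_t lo = (uint32_t)random(0x7FFFFFFFL);
  // note: two 31-bit values give 62 random-looking bits, but all of them
  // still come from the one 32-bit seed
  return ((hi << 32) | lo) % howbig;   // assumes howbig > 0
}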
If it is possible like that, that'd be ideal for me. But I'll have to admit I'm not sure how I should interpret the random() function like that. I know how to work with normal functions, but last time I tried looking into libraries I got confused. So how should I interpret the random() call inside the random(uint64_t howbig) function? Is this some kind of recursion?
Also I'm not sure how to modify randomSeed() to work with this; after all, as long as you have a 32-bit seed, the argument about the limited number of outcomes remains.
RayLivingston:
DO ya really think the difference between 2^32 different numbers and 2^64 different numbers is significant in your application?
Significant, yes. Will it matter a lot? Definitely no. But I'd like to do this properly.
Also, I sure hope an Arduino doesn't die after 4 billion operations. After all, the millisecond timer on it is known to overflow after about 50 days, and that doesn't brick the Arduino.
If you're really aiming for security, using random() is a bad idea. The numbers are pseudorandom, so they follow a predictable pattern that can easily be replicated, especially as the source code is open. Seeding won't help you with this. As you observed, the seed values have to be random, and if you have a truly random source to use as a seed, you should just use that as your random number instead.
Here is a sketch I developed that coaxes some extra randomness out of analogRead(). It's 32-bit, but you could adapt it easily. I haven't tested it statistically, but I think you will find it performs well:
// random seed 1.04
//
// produces a generally random number based on processing
// error voltages from the readings taken from the analog to digital
// conversion port on the Arduino
// 2016-02-17 32 bit test/demo version

const byte analogPort = A1;
const int BUCKET_SIZE = 64;
int buckets[BUCKET_SIZE];

void setup() {
  Serial.begin(9600);
}

void loop() {
  long temp = getRandomSeed(31);
  buckets[temp % BUCKET_SIZE]++;
  for (int i = 0; i < BUCKET_SIZE; i++)
  {
    if (buckets[i] == 0)
    {
      Serial.print(' ');
    }
    else
    {
      Serial.print(buckets[i]);
    }
    Serial.print(' ');
  }
  Serial.println('*');
}

long getRandomSeed(int numBits)
{
  // magic numbers tested 2016-03-28
  // try to speed it up
  // Works Well. Keep!
  //
  if (numBits > 31 or numBits < 1) numBits = 31;  // limit input range
  const int baseIntervalMs = 1UL;       // minimum wait time
  const byte sampleSignificant = 7;     // modulus of the input sample
  const byte sampleMultiplier = 10;     // ms per sample digit difference
  const byte hashIterations = 3;

  int intervalMs = 0;
  unsigned long reading;
  long result = 0;
  int tempBit = 0;

  Serial.print("randomizing...");
  pinMode(analogPort, INPUT_PULLUP);
  pinMode(analogPort, INPUT);
  delay(200);
  // Now there will be a slow decay of the voltage,
  // about 8 seconds,
  // so pick a point on the curve
  // offset by the processed previous sample:
  //
  for (int bits = 0; bits < numBits; bits++)
  {
    Serial.print('*');
    for (int i = 0; i < hashIterations; i++)
    {
      // Serial.print(' ');
      // Serial.print( hashIterations - i );
      delay(baseIntervalMs + intervalMs);
      // take a sample
      reading = analogRead(analogPort);
      tempBit ^= reading & 1;
      // take the low "digits" of the reading
      // and multiply it to scale it to
      // map a new point on the decay curve:
      intervalMs = (reading % sampleSignificant) * sampleMultiplier;
    }
    result |= (long)(tempBit & 1) << bits;
  }
  Serial.println(result);
  // Serial.println();
  return result;
}
westfw:
Which CPU are you using?
Yeah, but a modern desktop can loop through 2**32 crack attempts in less time than it takes to go get coffee...
Which is completely irrelevant. I think it's safe to assume this application will be doing something more than spending ALL of its time generating one 64-bit number after another. If it generates one new number every second (and I suspect the actual rate would be FAR lower), it will take more than 132 YEARS to generate 2^32 numbers. Do you think the Arduino will still be functioning at that point?
This seems to be truly trying to pick the fly poop out of the pepper....
monitou:
Significant, yes. Will it matter a lot? Definitely no. But I'd like to do this properly.
Significant how, exactly??
monitou:
Also, I sure hope an Arduino doesn't die after 4 billion operations. After all, the millisecond timer on it is known to overflow after about 50 days, and that doesn't brick the Arduino.
We're not talking about "4 billion operations", whatever that means. We're talking about however many random number generations you will actually perform in the expected lifetime of whatever this thing is. I think I am quite safe in assuming in the lifetime of the device you will actually generate FAR fewer than 4 billion 64-bit numbers, and instead spend the VAST majority of your time doing other things. You will never get anywhere near 4 billion numbers generated.
If the random number is part of the key to cracking an encryption or authentication scheme, and an attacker "only" has to try 2^32 guesses to successfully crack the scheme, then it's essentially not secure enough for any "real" modern purpose.
I took a Cryptography MOOC some years ago, and IIRC one of the assignments was specifically to attack an algorithm that used a "weak" random number generator, and I think it involved a loop of ~2^32 possible iterations. It didn't take very long to run...
Could you take two 32-bit random numbers and combine the individual bits in a non-sequential order to increase the difficulty?
Sure. One of the standard techniques for generating longer crypto random numbers is to gather a bunch of different sources of "entropy" (PRNG here, timestamp there, first few bytes of a heavily used buffer, etc.), and then "mix" them using irreversible math (the one I did used MD5). But the point is that with the Arduino random number generator, your second 32 bits isn't at all random if you know the first 32 bits. It's good enough for games (maybe), but not for any cryptographic purposes.
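A tiny illustration of that mixing idea (using splitmix64's finalizer instead of a real hash like MD5, purely because it fits in a few lines; mixing spreads whatever entropy the sources have across all 64 bits, it doesn't create any):

#include <stdint.h>

uint64_t mix64(uint64_t z) {
  // splitmix64 finalizer: a simple 64-bit bit mixer
  // (unlike MD5 it is invertible, so it only stands in for the idea)
  z = (z ^ (z >> 30)) * 0xBF58476D1CE4E5B9ULL;
  z = (z ^ (z >> 27)) * 0x94D049BB133111EBULL;
  return z ^ (z >> 31);
}

uint64_t gatherToken() {
  uint64_t acc = 0;
  acc = mix64(acc ^ (uint64_t)analogRead(A1));      // ADC noise
  acc = mix64(acc ^ (uint64_t)micros());            // timing jitter
  acc = mix64(acc ^ (uint64_t)random(0x7FFFFFFF));  // PRNG output as one more source
  return acc;
}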
(The reason that I asked "which CPU" is that some (many!) of the newer chips have a hardware TRUE random number generator that is much better for such applications.)