Every language has a random() function or something similar to generate a pseudo-random number. I am wondering what happens underneath to generate these numbers? I am not programming anything that makes this knowledge necessary, just trying to satisfy my own curiosity.
The entire first chapter of Donald Knuth’s seminal work Seminumerical Algorithms is taken up with the subject of random number generation. I really don’t think an SO answer is going to come close to describing the issues involved. Read the book.
The Wikipedia page is a good reference.
The actual algorithm used is going to be dependent on the language and the implementation of the language.
To answer your question exactly: the random function is usually provided by the operating system. But how the operating system creates these random numbers is a specialized area of computer science. See, for example, the Wikipedia page linked in the answers above.
random() is a so-called pseudorandom number generator (PRNG). It is most often implemented as a linear congruential generator, which is a function of the form X(n+1) = (a·X(n) + c) mod m, where X(n) is the sequence of generated pseudorandom numbers. The generated sequence is easily guessable, so this algorithm can't be used as a cryptographically secure PRNG.
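To make the recurrence concrete, here is a minimal sketch of a linear congruential generator. The constants a = 1664525, c = 1013904223, m = 2^32 are the well-known "Numerical Recipes" parameters, chosen only for illustration; they are not what any particular language's random() necessarily uses.

```python
# Minimal LCG sketch: x -> (a*x + c) mod m.
A = 1664525
C = 1013904223
M = 2**32

def lcg(seed):
    """Yield an endless stream of pseudorandom integers in [0, M)."""
    x = seed
    while True:
        x = (A * x + C) % M
        yield x

gen = lcg(seed=42)
first_five = [next(gen) for _ in range(5)]

# Same seed -> same sequence: the generator is fully deterministic,
# which is exactly why it is "pseudo" random and easily guessable.
gen2 = lcg(seed=42)
assert [next(gen2) for _ in range(5)] == first_five
```

Note that once an attacker sees one output X(n), every later output follows from the recurrence, which is the guessability problem mentioned above.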
Also take a look at the Diehard battery of tests for PRNGs: PRNG Diehard Tests
It turns out to be surprisingly easy to get half-way-decent pseudorandom numbers. For decades the gold standard was a remarkably simple algorithm: keep state x, multiply by constant A (32×32 => 64 bits), then add constant B, then return the low 32 bits, which also become the new x. If A and B are chosen carefully this actually works fairly well.
Pseudorandom numbers need to be repeatable, too, in order to reproduce behavior during debugging. So, seeding the generator (initializing x with, say, the time-of-day) is typically avoided during debugging.
In recent years, and with more compute cycles available to burn, more sophisticated algorithms have become available, some of them invented since the publication of the otherwise quite authoritative Seminumerical Algorithms. Operating systems are also starting to provide hardware- and network-derived entropy bits for specialized cryptographic purposes.
One thing you might want to examine is the family of random devices available on some Unix-like OSes such as Linux and Mac OS X. For example, on Linux, the kernel gathers entropy from a variety of sources into a pool, which it then uses to seed its pseudorandom number generator. The entropy can come from a variety of sources, the most notable being device driver jitter from keypresses, network events, hard disk activity and (most of all) mouse movements. Aside from this, there are other techniques to gather entropy, some of them even implemented entirely in hardware. There are two character devices you can get random bytes from, and on Linux they behave in the following way:
- /dev/urandom gives you a constant stream of bytes which is very random but not cryptographically safe because it reuses whatever entropy is available in the pool.
- /dev/random gives you cryptographically safe random numbers but it won’t give you a constant stream as it uses the entropy available in the pool and then blocks while more entropy is collected.
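As a quick sketch of what reading these devices looks like in practice: the direct read from /dev/urandom is Unix-only, while os.urandom() is the portable wrapper that draws from the same OS entropy source.

```python
import os

# Portable: os.urandom() asks the OS for random bytes
# (backed by /dev/urandom on Linux).
key = os.urandom(16)

# Unix-only: read the character device directly.
with open("/dev/urandom", "rb") as dev:
    raw = dev.read(16)

assert len(key) == 16 and len(raw) == 16
```

Reading /dev/random the same way works too, but as described above it may block until the kernel has collected enough entropy, so it is usually reserved for small amounts of key material.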
Note that while Mac OS X uses a different method for its PRNG and therefore does not block, my personal benchmarks (done in college) have shown it to be ever-so-slightly less random than the Linux kernel. Certainly good enough, though.
So, in my projects, when I need randomness, I typically read from one of the random devices, at least to seed the algorithm in my program.
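That seeding pattern can be sketched as follows: pull a few bytes from the OS entropy source once, then use them to seed an ordinary in-process PRNG (here Python's random.Random, purely as an example) for the bulk of the random numbers.

```python
import os
import random

# Seed an in-process PRNG from the OS entropy source once;
# all further numbers come cheaply from the PRNG itself.
seed = int.from_bytes(os.urandom(8), "big")
rng = random.Random(seed)

sample = [rng.randint(0, 99) for _ in range(5)]

# Logging the seed preserves debuggability: re-seeding with the
# same value reproduces the exact same sequence.
rng2 = random.Random(seed)
assert [rng2.randint(0, 99) for _ in range(5)] == sample
```

This gives you unpredictable starting points across runs while keeping each individual run reproducible if you record the seed.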