Randomness & Chance in Art and Computation

The concept of randomness has long been explored in art. Marcel Duchamp made several works using ‘chance operations’. Duchamp’s 3 Stoppages Étalon employed the curves of dropped threads to derive novel units of measurement.


He also created several compositions based on chance operations. His “Erratum Musical” of 1913 is a score for three voices derived from a chance procedure. Duchamp composed this vocal piece with his two sisters by randomly picking twenty-five notes from a hat; the notes were recorded in the score according to the sequence of the drawing.

When his work, The Bride Stripped Bare by Her Bachelors, was smashed in transit following an exhibition at the Brooklyn Museum in 1926–27, Duchamp laboriously glued it back together ten years later, securing the original glass between new panes and housing it in an aluminum frame. He famously stated that only then was the work finished.


Many of the Dada artists of the 1910s and ’20s also developed chance procedures.

This is a procedure devised by the artist Tristan Tzara, who composed poems by drawing words from a hat:

To make a Dadaist poem / Take a newspaper / Take a pair of scissors / Choose an article as long as you are planning to make your poem / Cut out the article / Then cut out each of the words that make up this article and put them in a bag / Shake it gently / Then take out the scraps one after the other in the order in which they left the bag / Copy conscientiously / The poem will be like you / And here you are a writer, infinitely original and endowed with a sensibility that is charming though beyond the understanding of the vulgar.

Hans Arp would create compositions by tearing pieces of paper and letting them fall onto a canvas.

Jean (Hans) Arp (1886–1966), According to the Laws of Chance, 1933

Much later, many of John Cage’s musical compositions explored chance and ways of incorporating deliberate random processes into the production of music. He aimed to open up his compositions to include unforeseen or unintended sounds, attempting to eliminate subjective or conscious arrangement of sounds.

He would make use of chance procedures to compose scores, inventing rule based chance systems to render unfixed compositional techniques, a sort of programmed indeterminacy.


Fontana Mix consists of a selection of tools and rules for how to make the composition. This makes it possible for the performer both to render and to perform the musical score, a score that renders differently every time.

This development comes to a culmination in Cage’s composition 4′33″, known as the silence piece. It consists of three movements amounting to 4 minutes and 33 seconds, during which the performers are instructed not to play their instruments. Here’s a version by the BBC Symphony Orchestra.

And here’s a death metal cover.

For more examples:

So what about randomness in computation?

Randomness is very important in computation. It enables convincing simulations, is useful in decision making, and is used to achieve specific aesthetic results.


Very early on in the development of computation, people started searching for ways to obtain random numbers. This has been an ongoing challenge, because computers are deterministic, precise calculating machines.

The RAND Corporation published the book A Million Random Digits with 100,000 Normal Deviates in 1955 to address this problem. You can grab your own copy right here. Lookup tables like these were one solution to the problem.


John von Neumann first suggested calculating random numbers in 1946; his idea was to take the square of the previous random number and to extract the middle digits. There is a fairly obvious objection to this technique: how can a sequence generated in such a way be random, since each number is completely determined by its predecessor? The answer is that the sequence isn’t random, but it appears to be; this is the concept of pseudorandomness. It is a simulation of randomness, and it turns out that this particular method often results in predictable sequences of numbers.
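Von Neumann’s “middle-square” idea is simple enough to sketch in a few lines. Here it is in Python (rather than Processing) with a four-digit seed; the seed value 5735 is just an arbitrary choice for illustration:

```python
def middle_square(seed, n):
    """Generate n pseudorandom numbers via von Neumann's middle-square method."""
    results = []
    x = seed
    for _ in range(n):
        sq = str(x * x).zfill(8)  # square the number, padded to 8 digits
        x = int(sq[2:6])          # keep the middle 4 digits as the next value
        results.append(x)
    return results

print(middle_square(5735, 5))  # → [8902, 2456, 319, 1017, 342]
```

Each output is fully determined by the previous one, which is exactly the objection above: many seeds quickly collapse to zero or fall into short repeating cycles, which is why the method is only of historical interest today.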

From the Wikipedia article on pseudorandomness:

To generate truly random numbers requires precise, accurate, and repeatable system measurements of absolutely non-deterministic processes. Linux uses, for example, various system timings (like user keystrokes, I/O, or least-significant digit voltage measurements) to produce a pool of random numbers. It attempts to constantly replenish the pool, depending on the level of importance, and so will issue a random number.
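From a program, the usual way to tap that kernel-maintained pool is through the operating system’s interface rather than a pseudorandom generator. As a minimal sketch in Python, the standard `os.urandom` and `secrets` APIs draw from this OS source:

```python
import os
import secrets

# Draw bytes from the operating system's randomness source
# (on Linux, backed by the kernel's pool of event timings).
raw = os.urandom(8)  # 8 unpredictable bytes
print(raw.hex())

# The secrets module wraps the same source for common tasks,
# e.g. an unpredictable integer in the range [0, 100):
print(secrets.randbelow(100))
```

Unlike a seeded pseudorandom generator, these calls are not reproducible: running the script twice gives different output each time.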

Perlin Noise

Perlin noise was developed by Ken Perlin and helps add realistic randomness to CG renderings of natural elements like smoke, fire, and water. Perlin developed it in the early ’80s for a strange little film he was working on called Tron. In 1997, he won an Academy Award for the technique. He also never patented it, which is why we can all use it freely today.

For more details, see this presentation, or this one, which gives some insight into the math behind it.



It works by interpolating between random values to create smoother transitions than the numbers returned from random(). The noise function in Processing can be used to generate Perlin noise.
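To make the interpolation idea concrete, here is a simplified one-dimensional “value noise” sketch in Python. This is not Perlin’s actual gradient-noise algorithm (and not how Processing implements `noise()` internally), just an illustration of the core idea: fix a table of random values, then smoothly interpolate between neighboring entries so nearby inputs give nearby outputs:

```python
import random

def smoothstep(t):
    # Easing curve so transitions flatten out at the lattice points.
    return t * t * (3 - 2 * t)

def make_noise(seed=1):
    rng = random.Random(seed)
    lattice = [rng.random() for _ in range(256)]  # fixed random values

    def noise(x):
        i = int(x) % 255          # index of the lattice point below x
        frac = x - int(x)         # position between lattice points
        a, b = lattice[i], lattice[i + 1]
        return a + smoothstep(frac) * (b - a)  # interpolate neighbors

    return noise

noise = make_noise()
# Nearby inputs give nearby outputs -- unlike independent random() calls.
print([round(noise(v), 3) for v in (0.0, 0.1, 0.2, 0.3)])
```

Just like Processing’s `noise()`, the function is deterministic for a given seed and returns values between 0 and 1; sampling it with small input steps yields a smooth curve, while large steps look more like plain random numbers.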

Noise is used in the following example to generate the y position of these rectangles. The larger the noise increment, the more varied the results.

Here is the code for the above example, with the noise increment set to 0.1:

float v = 0.0;   // starting input value for noise()
float inc = 0.1; // amount we advance the noise input for each rect

void setup() {
  size(600, 100);
  for (int i = 0; i < width; i = i + 4) { // loop through x of each rectangle
    float n = noise(v) * 70.0; // generate y offset for each rectangle
    rect(i, 10 + n, 3, 20);    // draw rectangle
    v = v + inc;               // advance noise input for next rectangle
  }
}

Here’s a random selection of contemporary examples: