Cellular Automata and
Amorphous Computing
Melanie Mitchell
Portland State University and Santa Fe Institute
Complex Systems Summer School
Friday June 20, 2008
Copyright © 2008 by Melanie Mitchell
What are cellular automata?
Game of Life
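The Game of Life update rule (a live cell survives with two or three live neighbors; a dead cell becomes alive with exactly three) can be sketched in a few lines of Python. This is a generic illustration, not code from the lecture:

```python
from collections import Counter

def life_step(live):
    """One Game of Life step; `live` is a set of (x, y) cells."""
    # Count the live neighbors of every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A horizontal "blinker" becomes a vertical bar, then flips back (period 2):
blinker = {(0, 0), (1, 0), (2, 0)}
```

Representing the grid as a set of live cells keeps the sketch independent of any fixed board size.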
Review (?) of Computation Theory
• Hilbert’s problems
• Turing machines
• Universal Turing Machines
• Uncomputability of the halting problem
Turing machine
Example of Turing machine rule set
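The deck's example rule set appears as a figure, so here is a minimal sketch of a Turing machine simulator with an illustrative rule set of its own (a machine that flips every bit and halts at the first blank). The rule format (state, symbol) → (write, move, next state) is an assumption, not the lecture's notation:

```python
def run_tm(rules, tape, state="q0", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine; returns (tape contents, final state)."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += {"R": 1, "L": -1, "N": 0}[move]
    out = "".join(cells[i] for i in sorted(cells)).strip(blank)
    return out, state

# Illustrative machine: flip each bit, moving right; halt at the blank.
rules = {
    ("q0", "0"): ("1", "R", "q0"),
    ("q0", "1"): ("0", "R", "q0"),
    ("q0", "_"): ("_", "N", "halt"),
}
# run_tm(rules, "0110") returns ("1001", "halt")
```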
Two fundamental theorems of computation theory:
1. There exists a universal Turing machine
2. There is no Turing machine that can solve the halting
problem.
Interesting problem: Given an initial configuration, can
you calculate analytically how many steps Life will run for
before it reaches a fixed configuration?
Universal Computation in the Game of Life
What is the feasibility of using this kind of
universal computation in practice?
Von Neumann’s self-reproducing automaton
Von Neumann’s self-reproducing automaton
John von Neumann
1903–1957
• After his key role in
designing the first
electronic computers, von
Neumann became
interested in links between
computers and biology
In the last years of his life, von
Neumann worked on the
“logic” of self-reproduction
and devised the first instance of
a self-reproducing “machine”
(in software; finally
implemented in the 1990s).
Von Neumann’s design is complicated, but some
of its key ideas can be captured by a simpler
problem:
Design a computer program that will print out a
copy of itself.
A candidate self-copying program
program copy
print( “program copy”);
print( “ print(“program copy”);”);
print(“ print( “ print(“program copy”);”);”);
“A machine can’t reproduce itself; to do so it would
have to contain a description of itself, and that
description would have to contain a description of itself,
and so on ad infinitum”.
Some commands we will need in our programming language

• mem
– the memory location of the instruction currently being executed

For example, with this program stored in computer memory at locations 1–4, mem = 2 while the first print statement is executing:

1  program test
2  print(“Hello, world”);
3  print(“Goodbye”);
4  end
Some commands we will need in our programming language

• line(n)
– the string of characters in memory location n

With the same program in memory:

1  program test
2  print(“Hello, world”);
3  print(“Goodbye”);
4  end

print(line(2)); will print
print(“Hello, world”);
Some commands we will need in our programming language

• loop until condition
– loops until the condition is true

x = 0;
loop until x = 4
{
  print(“Hello, world”);
  x = x + 1;
}

Output:
Hello, world
Hello, world
Hello, world
Hello, world
A self-copying program

1  program copy
2  L = mem + 1;
3  print(“program copy”);
4  print(“ L = mem + 1;”);
5  loop until line[L] = “end”
6  {
7    print(line[L]);
8    L = L+1;
9  }
10 print(“end”);
11 end
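The program's logic can be checked with a short Python simulation that stores the eleven lines in a list standing in for memory locations 1–11 (a sketch with simplified spacing and straight quotes; L is set to 3 directly, since mem = 2 while line 2 executes):

```python
# Memory locations 1..11 hold the program's own text.
memory = [
    "program copy",
    "L = mem + 1;",
    'print("program copy");',
    'print("L = mem + 1;");',
    'loop until line[L] = "end"',
    "{",
    "print(line[L]);",
    "L = L+1;",
    "}",
    'print("end");',
    "end",
]

output = ["program copy", "L = mem + 1;"]  # printed verbatim by lines 3 and 4
L = 3                                      # line 2: mem = 2, so L = mem + 1 = 3
while memory[L - 1] != "end":              # lines 5-9: copy the text of lines 3..10
    output.append(memory[L - 1])
    L = L + 1
output.append("end")                       # line 10 prints the final "end"

assert output == memory  # the program has printed an exact copy of itself
```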
A self-copying program: trace of execution

Lines 3 and 4 print the program’s first two lines verbatim. When line 2 executes, mem = 2, so L is set to 3. The loop (lines 5–9) then prints the text stored at memory locations 3 through 10, incrementing L after each one, and exits when line[L] = “end” at L = 11. Control then falls through to line 10, whose print(“end”) emits the final line.

Output:

program copy
L = mem + 1;
print(“program copy”);
print(“ L = mem + 1;”);
loop until line[L] = “end”
{
print(line[L]);
L = L+1;
}
print(“end”);
end
Significance of the self-copying program
• The essence of self-copying in this program is to use the
same information stored in memory in two ways:
– interpret it as instructions in a computer program
– interpret it as “data” to be used by the instructions in
the computer program
• This is also a key mechanism in biological self-reproduction
– DNA = program and data
• This principle was formulated by von Neumann in the
1940s, before the details of biological self-reproduction
were well-understood.
Programs and interpreters
• Notice that the self-copying program needs an external
interpreter—the part of the computer that carries out the
instructions
• Rough analogy to biology:
– DNA = program and data
– RNA, ribosomes, and enzymes = interpreter
• DNA contains instructions for copying itself, and also for
building its own interpreter
• Von Neumann’s self-reproducing automaton also did this.
What are cellular automata actually used for?
• Different perspectives:
– CAs are models of physical (or biological or social) systems
– CAs are alternative methods for approximating differential
equations
– CAs are devices that can simulate standard computers
– CAs are parallel computers that can perform image processing,
random number generation, cryptography, etc.
– CAs are a framework for implementing molecular scale
computation
– CAs are a framework for exploring how “collective computation”
might take place in natural systems (and that might be imitated in
novel human-made computational systems)
Dynamics and Computation in Cellular Automata
http://math.hws.edu/xJava/CA/EdgeOfChaosCA1.html
Wolfram’s classes for elementary CAs
1. Fixed point
2. Periodic
3. Chaotic
4. “Complex”
– long transients
– universal computation?
ECA 110 is a universal computer
(Matthew Cook, 2002)
Rule: Wolfram’s numbering of elementary CAs assigns one output bit per neighborhood, from 111 down to 000:
0 1 1 0 1 1 1 0 = 110 in binary
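Wolfram's numbering can be unpacked directly: bit k of the rule number gives the new state for the neighborhood whose three cells, read as a binary number, equal k. A sketch of one synchronous update of an elementary CA on a ring (an illustration, not code from the lecture):

```python
def eca_step(rule, row):
    """One update of an elementary CA with periodic boundaries."""
    n = len(row)
    # Each neighborhood (left, center, right) indexes a bit of the rule number.
    return [(rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
            for i in range(n)]

# Rule 110 = 0b01101110: e.g. neighborhood 110 (= 6) maps to bit 6, which is 1.
row = [0, 0, 0, 1, 0, 0, 0]
# eca_step(110, row) gives [0, 0, 1, 1, 0, 0, 0]
```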
Outline of A New Kind of Science (Wolfram, 2002)
(from MM review, Science, 2002)
• Simple programs can produce complex and random-looking behavior
– Complex and random-looking behavior in nature comes
from simple programs.
• Natural systems can be modeled using cellular-automata-like
architectures
• Cellular automata are a framework for understanding nature
• Principle of computational equivalence
Principle of Computational Equivalence
1. The ability to support universal computation is very
common in nature.
2. Universal computation is an upper limit on the
sophistication of computations in nature.
3. Computing processes in nature are almost always
equivalent in sophistication.
How can we describe information processing in complex systems?

[Figure: a cellular automaton evolved by the genetic algorithm; space-time diagrams for initial configurations with a majority of cells on and a majority of cells off.]
Stephen Wolfram’s last problem from “Twenty Problems in the Theory of Cellular Automata” (Wolfram, 1985):

20. What higher-level descriptions of information processing in cellular automata can be given?

“It seems likely that a radically new approach is needed.”
What is needed?
1. How can we characterize patterns as computations?
2. How can we design computations in the language of
patterns?
1. How can we characterize patterns as computations?
Components of computation in spatially extended systems:
– Storage of information
– Transfer of information
– Integration of information from different spatial locations
First step: What structures in the observed patterns implement
these components?
Second step: How do these structures implement the computation?
– Transfer of information:
moving particles
– Integration of information
from different spatial
locations: particle collisions
From http://www.stephenwolfram.com/publications/articles/ca/86-caappendix/16/text.html
How to automatically identify information-carrying
structures in spatio-temporal patterns?
Three proposals:
• Filter by regular languages (Crutchfield and Hanson,
1993; Crutchfield et al., 2002)
• Filter by local statistical complexity (Shalizi et al., 2006)
• Filter by local information measures (Lizier et al., 2007)
Filter by regular languages
(Crutchfield and Hanson, 1993; Crutchfield et al., 2002)
Regular language: a simple way to describe a periodic pattern
Examples:
CA for performing
density classification
Regular domains:
(0)*, (1)* , (01)*
Regular domains
filtered out
Particles: spatially localized, temporally periodic boundaries or “defects” between regular domains
Rule 54
Regular domains:
(0001)* , (1110)*
Regular domains
filtered out
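As a toy version of this idea, defects of the checkerboard domain (01)* can be located by flagging adjacent equal symbols (a sketch only; the actual filtering method builds a finite-state machine from the domain's regular language):

```python
def checkerboard_defects(row):
    """Flag sites where the (01)*/(10)* pattern breaks, with periodic boundaries."""
    n = len(row)
    return [1 if row[i] == row[(i + 1) % n] else 0 for i in range(n)]

# One inserted extra 1 creates a phase slip between two checkerboard domains;
# on a ring, such domain walls necessarily come in pairs.
row = [0, 1, 0, 1, 1, 0, 1, 0]
# checkerboard_defects(row) flags positions 3 and 7
```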
Filter by local statistical complexity
(Shalizi et al., 2006)

• Local statistical complexity of site i:
– the amount of information from the past needed for optimal prediction of the future in the vicinity of site i

[Diagram: site i at time t, with its light cone of past influence on site i and its light cone of future influence of site i. The question asked at each site: how well does the past light cone predict the future light cone?]
Example: Rule 110, filtered by local statistical complexity
[Figure: original and filtered space-time diagrams.]

Note: this filter requires no prior determination of “regular domains,” but it is more computationally expensive than filtering by regular domains.
Filter by local information measures
(Lizier et al., 2006)
Filter by local information measures
(Lizier et al., 2006)
Degree of information
storage at site i:
Mutual information
between state at site i at t
and states of site i at k
previous time steps.
Degree of information
Degree of information
transfer from site i-j to site modification
i:
at site i:
Average information in the
state of the source (site i-j)
about next state of
destination (site i)
Degree to which sum of
information storage and
information transfer do not
predict state at site i.
Filter by local information measures
(Lizier et al., 2006)
Degree of information
storage at site i:
Mutual information
between state at site i at t
and states of site i at k
previous time steps.
Degree of information
Degree of information
transfer from site i-j to site modification
i:
at site i:
Average information in the
state of the source (site i-j)
Filter by local information measures
(Lizier et al., 2006)

Degree of information storage at site i:
the mutual information between the state of site i at time t and the states of
site i at the k previous time steps. (How predictable is the current state
from past states?)

Degree of information transfer from site i-j to site i (j ≤ radius of the
neighborhood): the average information in the state of the source (site i-j)
about the next state of the destination (site i). (How predictable is the
state at site i from the previous state at site i-j?)

Degree of information modification at site i:
the degree to which the sum of information storage and information transfer
does not predict the state at site i.
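These measures can be estimated from a space-time history by simple plug-in counting. Below is a minimal sketch of local information storage for an elementary CA such as Rule 54 with k = 1; this is our own illustration (invented function names), not Lizier et al.'s estimator:

```python
# Minimal plug-in estimate of local information storage for an elementary CA.
# Illustrative sketch only, not Lizier et al.'s estimator.
import math
from collections import Counter
import numpy as np

def eca_step(state, rule=54):
    """One synchronous update of an elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    code = 4 * left + 2 * state + right              # neighborhood code 0..7
    table = np.array([(rule >> c) & 1 for c in range(8)])
    return table[code]

def local_storage(history, k=1):
    """Local storage a(i, t) = log2 p(x_t | x_{t-k..t-1}) / p(x_t) per sample."""
    T, N = history.shape
    joint, past, marg = Counter(), Counter(), Counter()
    samples = []
    for t in range(k, T):
        for i in range(N):
            p, x = tuple(history[t - k:t, i]), history[t, i]
            samples.append((p, x))
            joint[(p, x)] += 1; past[p] += 1; marg[x] += 1
    n = len(samples)
    return [math.log2((joint[(p, x)] / past[p]) / (marg[x] / n))
            for (p, x) in samples]

rng = np.random.default_rng(0)
hist = [rng.integers(0, 2, 64)]
for _ in range(200):
    hist.append(eca_step(hist[-1]))
a = local_storage(np.array(hist))
print(f"mean local storage: {np.mean(a):.3f} bits")  # average = plug-in MI, >= 0
```

The average of the local values equals the plug-in mutual information, so it is non-negative, while individual local values can be negative at "misinformative" sites.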
[Figure: degree of information storage at each site, Rule 54 (from Lizier et
al., 2007). Positive values: information has been stored at this site.]
[Figure: degree of information transfer at each site (for j = 1), Rule 54.
Positive t(i, j, n+1): information has been transferred from site i-j to site i.]
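The transfer measure can be estimated the same way, as a plug-in local transfer entropy; again a sketch of our own, not Lizier et al.'s implementation:

```python
# Plug-in estimate of local information transfer (transfer entropy) from
# site i-j to site i. Illustrative sketch only, not Lizier et al.'s estimator.
import math
from collections import Counter
import numpy as np

def eca_step(state, rule=54):
    """One synchronous update of an elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    table = np.array([(rule >> c) & 1 for c in range(8)])
    return table[4 * left + 2 * state + right]

def local_transfer(history, j=1, k=1):
    """t(i, j, t) = log2 p(x_t | past, src) / p(x_t | past), src = x_{t-1} at i-j."""
    T, N = history.shape
    c_psx, c_ps, c_px, c_p = Counter(), Counter(), Counter(), Counter()
    samples = []
    for t in range(k, T):
        for i in range(N):
            p = tuple(history[t - k:t, i])
            s, x = history[t - 1, (i - j) % N], history[t, i]
            samples.append((p, s, x))
            c_psx[(p, s, x)] += 1; c_ps[(p, s)] += 1
            c_px[(p, x)] += 1; c_p[p] += 1
    return [math.log2((c_psx[(p, s, x)] / c_ps[(p, s)]) / (c_px[(p, x)] / c_p[p]))
            for (p, s, x) in samples]

rng = np.random.default_rng(0)
hist = [rng.integers(0, 2, 64)]
for _ in range(200):
    hist.append(eca_step(hist[-1]))
t_local = local_transfer(np.array(hist))
print(f"mean local transfer: {np.mean(t_local):.3f} bits")
```

The mean over all samples is the plug-in conditional mutual information, hence non-negative; gliders show up as sites with strongly positive local transfer.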
Degree of information modification ("local separable information"):
how unpredictable is the state at site i from storage and transfer?
Positive: no information modification; negative: information modification.
[Figure: local information modification, Rule 54. Negative s(i, n): the state
at site i is not well predicted by information storage or transfer. Negative
s(i, n) is plotted on top of information transfer.]
First step: What structures in the observed patterns implement
these components?
Second step: How do these structures implement the computation?
[Figure: space-time diagrams ("majority on" and "majority off" initial
configurations) of a cellular automaton evolved by the genetic algorithm;
performance ≈ 80%. From Crutchfield et al., 2001.]
Regular domains: (0)*, (1)*, (01)*
Particles: boundaries between regular domains
Laws of "particle physics": how particles interact
Hordijk, Crutchfield, and Mitchell, 1996: Models of
CA computation in terms of particle kinematics
(Note “condensation phase”)
Particle model of CA computation

[Figures: particle models for CAs evolved for density classification and for
synchronization (generations 17 and 18).]
First step: What structures in the observed patterns implement
these components?
Second step: How do these structures implement the computation?
How can we design computations in the language of
patterns?
“Programming” the CA in the language of the high-level
structures, and compiling the program to the level of the
CA rule
Open problem!
Amorphous Computing
Abelson, Sussman et al., MIT
Amorphous Computing
(Abelson et al., 2000; 2007)
• Main ideas:
– Produce vast quantities of tiny, unreliable computing
elements (“particles”) at very low cost
– Give them limited wireless communication abilities so
each particle can communicate with nearby particles
– Spray paint them onto a surface. Spatial arrangement,
and thus communication, will be irregular. Processing
and communication will be asynchronous.
– Have them self-organize into a reliable network that
does something useful
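As a toy model of this set-up (our own sketch, not from the amorphous-computing papers; the parameters n and r are arbitrary), we can scatter particles at random on a unit square and link each one to all others within a communication radius r:

```python
# Toy model of the amorphous set-up: scatter particles at random on a unit
# square and connect each to all others within communication radius r.
# Illustrative sketch; n and r are arbitrary choices.
import numpy as np

def spray(n=500, r=0.08, seed=1):
    rng = np.random.default_rng(seed)
    pts = rng.random((n, 2))                              # random positions
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    nbrs = [list(np.flatnonzero((dist[i] < r) & (np.arange(n) != i)))
            for i in range(n)]
    return pts, nbrs

pts, nbrs = spray()
print(f"average neighborhood size: {np.mean([len(x) for x in nbrs]):.1f}")
```

The resulting neighborhood lists are irregular, which is exactly why amorphous-computing primitives must tolerate variable connectivity.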
Some possible applications
• Smart buildings that sense usage and adjust to save energy
• Smart bridges that monitor traffic load and structural
integrity
• Smart arrays of microphones for optimizing acoustics
• Self-assembly of nano-machines
One example:
Origami-based self-assembly
(R. Nagpal et al.)
• Set-up:
– “Spray paint” thousands of MEMS “agents” on a 2D
square foldable material.
– Agents have no knowledge of global position or
interconnect topology.
– All agents have identical program.
– Agents communicate locally (out to distance r)
– Agents run asynchronously
– Agents will collectively “fold” material to desired
shape.
• High-level language: Origami Shape Language
– Six paper-folding axioms that can be used to construct a
large class of flat folded shapes.
• Low-level language primitives:
– gradients
– neighborhood query
– cell-to-cell contact
– polarity inversion
– flexible folding
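The gradient primitive can be sketched as repeated local relaxation: a source particle holds 0, and every other particle sets its value to one more than its smallest neighbor's value, converging to hop-count distance from the source. This is our illustration (real agents would do it asynchronously by message passing):

```python
# Sketch of the "gradient" primitive: hop-count distance from a source,
# computed by repeated local min-plus-one relaxation in random order
# (a stand-in for asynchronous execution). Illustrative only.
import numpy as np

def gradient(nbrs, source):
    n = len(nbrs)
    val = [float("inf")] * n
    val[source] = 0
    rng = np.random.default_rng(0)
    changed = True
    while changed:
        changed = False
        for i in rng.permutation(n):          # random order mimics asynchrony
            if i != source and nbrs[i]:
                best = min(val[j] for j in nbrs[i]) + 1
                if best < val[i]:
                    val[i] = best
                    changed = True
    return val

# A 10-particle chain: particle i talks to i-1 and i+1.
chain = [[j for j in (i - 1, i + 1) if 0 <= j < 10] for i in range(10)]
print(gradient(chain, 0))    # hop distances: [0, 1, 2, ..., 9]
```

Because the fixed point of this relaxation is the shortest hop count, the result is the same regardless of the update order.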
Folding a cup
Origami Shape Language
Primitives: points pi, lines Li, regions Ri
Axioms (Huzita):
1. Given two points p1 and p2, fold a line through them.
2. Given two points p1 and p2, fold p1 onto p2 (constructs the crease
that bisects the segment p1p2 at right angles).
3. Given two lines L1 and L2, fold L1 onto L2 (constructs the crease
that bisects the angle between L1 and L2).
4. Given p1 and L1, fold L1 onto itself through p1 (constructs
a crease through p1 perpendicular to L1).
5. Given p1, p2, and L1, make a fold that places p1 on L1
and passes through p2 (constructs the tangent through p2 to the
parabola with focus p1 and directrix L1).
6. Given p1 and p2 and lines L1 and L2, make a fold that
places p1 on L1 and p2 on L2 (constructs a common tangent to two
parabolas).
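As a worked example, axiom 2's crease is just the perpendicular bisector of the segment p1p2. A minimal computation (our own sketch; the point-plus-direction representation of a line is our choice):

```python
# Axiom 2: folding p1 onto p2 creates the perpendicular bisector of p1p2.
# Minimal sketch; the (point, direction) line representation is our choice.
import numpy as np

def axiom2_crease(p1, p2):
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    midpoint = (p1 + p2) / 2                  # the crease passes through here
    d = p2 - p1
    direction = np.array([-d[1], d[0]])       # rotate p1->p2 by 90 degrees
    return midpoint, direction / np.linalg.norm(direction)

mid, u = axiom2_crease((0, 0), (2, 0))
print(mid, u)   # crease through (1, 0), direction (0, 1): the vertical bisector
```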
Primitive operations we’ll need:
(create-region p1 L1)
(within-region r1 op1 ...): restricts operations to the given region
(execute-fold L1 type landmark-point)
Fold types: Valley (apical), Mountain (basal)
apical: puts apical surface on inside
basal: puts basal surface on inside
Landmark-point: defines new “apical” and “basal” surfaces after fold
by defining side of fold that will reverse polarity
(intersect L1 L2): returns the point at the intersection of the two lines
Corner points: c1-c4
Edge lines: e12-e41
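Geometrically, `intersect` reduces to solving a 2x2 linear system. A sketch assuming lines are given as (point, direction) pairs; that representation and the function signature are our assumptions, not OSL's actual interface:

```python
# Sketch of the `intersect` primitive for two lines in (point, direction)
# form: solve p + t*d = q + s*e for the crossing point. Representation and
# signature are our assumptions, not OSL's actual interface.
import numpy as np

def intersect(l1, l2):
    (p, d), (q, e) = l1, l2
    p, d, q, e = (np.asarray(v, float) for v in (p, d, q, e))
    A = np.column_stack([d, -e])              # singular if lines are parallel
    t, _ = np.linalg.solve(A, q - p)
    return p + t * d

# The x-axis meets the vertical line through (1, 2) at (1, 0):
print(intersect(((0, 0), (1, 0)), ((1, 2), (0, 1))))
```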
Low-level agent operations
(Nagpal, 2001)
How to implement OSL in low-level cell programs
Example
I will put a link to all my slides on the CSSS wiki (under
“Readings” “Melanie Mitchell”).
Thanks for listening!