History of Complexity
Lance Fortnow
NEC Research Institute
History of Logic
• Edited by Dirk van Dalen, John Dawson
and Akihiro Kanamori.
• Published by Elsevier.
• Chapter: History of Complexity
• Authors: Lance Fortnow and Steve Homer
• This talk
• Lessons learned from writing this chapter.
Lesson One
• Impossible to please everyone.
• Often disagreements on who is responsible for
what and which results are important.
• Everyone wants a mention.
• Resolutions
• Can’t mention everything in 75 minutes.
• Opinions in this talk are mine alone.
• How do I mention everyone?
Birth of Computational Complexity
General Electric
Research Laboratory
Niskayuna, New York
November 11, 1962
Birth of Computational Complexity
• Juris Hartmanis and Richard Stearns 1965
• On the Computational Complexity of
Algorithms, Transactions of the AMS
• Measure resources, time and memory, as a
function of the size of the input problem.
• Basic diagonalization results: More time
can compute more languages.
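• A hedged modern statement of the time hierarchy theorem behind this claim (not the exact form of the 1965 paper): for time-constructible f,
\[ g(n)\log g(n) = o(f(n)) \;\Rightarrow\; \mathrm{DTIME}(g(n)) \subsetneq \mathrm{DTIME}(f(n)). \]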
No “Immaculate Conception”
• Idea of algorithm goes back to ancient
Greece and China and beyond.
• Cantor developed diagonalization in 1874.
• Kleene, Turing and Church formalized
computation and recursion theory in 30’s.
• Earlier work by Yamada (1962), Myhill (1960) and Smullyan (1961) looked at specific time- and space-bounded machines.
Complexity in the ’60s
• Better simulations and hierarchies
• Relationships between time and space,
deterministic and nondeterministic.
• Savitch’s Theorem (stated below)
• Blum’s abstract complexity measure
• Union, speed-up and gap theorems.
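• The statement of Savitch’s Theorem referenced above, in its usual form (for space-constructible s(n) ≥ log n):
\[ \mathrm{NSPACE}(s(n)) \subseteq \mathrm{DSPACE}(s(n)^2). \]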
Polynomial Time
• Cobham (1964) – Independence of polynomial time from the choice of deterministic machine model.
• Edmonds (1965)
• Argues that polynomial time represents
efficient computation.
• Gives informal description of nondeterministic
polynomial time.
P versus NP
• Gödel to von Neumann letter in 1956.
• Cook showed Boolean formula satisfiability
NP-complete in 1971.
• Karp in 1972 showed several important
combinatorial problems were NP-complete.
• Industry in the 1970s of showing that problems were NP-complete.
Europe in 1970
Complexity in the Soviet Union
• Perebor – Brute Force Search
• 1959 – Yablonski – On the impossibility of
eliminating Perebor in solving some
problems of circuit theory.
• 1973 – Levin – Universal Sequential Search
Problems
Importance of P versus NP Today
• Thousands of natural problems known to be
NP-complete in computer science, biology,
economics, physics, etc.
• A resolution of the P versus NP question is
the first of seven $1,000,000 prizes offered
by the Clay Mathematics Institute.
• We are further away than ever from settling
this problem.
Structure of NP
• Ladner – 1975 – If P ≠ NP then there are sets in NP that are neither in P nor NP-complete.
• Berman-Hartmanis – 1977 – Are all NP-complete sets isomorphic?
• Mahaney – 1982 – Sparse complete sets for
NP imply P = NP.
Alternation
• Development of the polynomial-time
hierarchy by Meyer and Stockmeyer in
1972.
• Chandra-Kozen-Stockmeyer – 1981
• Alternating Time = Space
• Alternating Space = Exponential Time
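• In symbols, the standard polynomial-level forms of these Chandra-Kozen-Stockmeyer results:
\[ \mathrm{AP} = \mathrm{PSPACE}, \qquad \mathrm{APSPACE} = \mathrm{EXPTIME}. \]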
Relativization
• Baker-Gill-Solovay – 1975
• All known techniques relativize.
• There exist oracles A and B such that
• P^A = NP^A
• P^B ≠ NP^B
• Many other relativization results followed.
Oracles and Circuits
• Is there an oracle where the polynomial-time hierarchy is infinite, or at least different from PSPACE?
• Sipser relates to question about circuits:
• Can parity be computed by a constant-depth circuit with a quasipolynomial number of gates?
• In 1983, Sipser solves an infinite version of
this question.
Oracles and Circuits
• Furst-Saxe-Sipser / Ajtai – Parity does not have constant-depth, poly-size circuits.
• Yao – 1985 – Separating the polynomial-time
• Håstad – 1986 – Switching lemma and
nearly tight bounds for parity
Circuits and Polytime Machines
• 1975 – Ladner – Every language in P has
polynomial-size circuits.
• 1980 – Karp-Lipton – If NP has poly-size
circuits then polytime hierarchy collapses.
• To show P  NP, need only show that some
problem in NP does not have poly-size
circuits.
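• For concreteness, the Karp-Lipton theorem in its standard modern form (collapse to the second level):
\[ \mathrm{NP} \subseteq \mathrm{P/poly} \;\Rightarrow\; \mathrm{PH} = \Sigma_2^p. \]
• Combined with the 1975 result above (every language in P has poly-size circuits), exhibiting an NP problem without poly-size circuits would indeed show P ≠ NP.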
Circuit Results
• Razborov – 1985 – Clique does not have
poly-size monotone circuits.
• Razborov-Smolensky – 1987 – Lower
bounds for constant depth circuits with
modp-gates.
The Fall of Circuit Complexity
• No major results in circuit complexity since
1987, particularly for non-monotone
circuits.
• Razborov – 1989 – Monotone techniques
will not extend to non-monotone circuits.
• Razborov-Rudich – 1997
• “Natural Proofs”
Different Models
• As technology changes so does the notion
of what is “efficient computation”.
• Randomized, Parallel, Non-uniform, Average-Case,
• Complexity theorists tackle these issues by
defining models and proving relationships
between these classes and more traditional
models.
Randomized Computation
• Solovay-Strassen – 1977 – Fast randomized
algorithm for primality.
• 1977 – Gill
• Probabilistic Classes: ZPP, R, BPP
• Sipser – 1983 – A complexity theoretic
approach to randomness
• BPP in the polynomial-time hierarchy (stated below).
• Various oracle results, e.g., an oracle relative to which BPP = NEXP.
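• The “BPP in the polynomial-time hierarchy” result is usually cited in the form of the Sipser-Gács theorem:
\[ \mathrm{BPP} \subseteq \Sigma_2^p \cap \Pi_2^p. \]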
Derandomization
• Cryptographic one-way functions give
pseudorandom generators that can save on
randomness.
• Hard languages in nonuniform models give
pseudorandom generators.
• Derandomization results for space-bounded
classes.
Randomness and Proofs
• Goldwasser-Micali-Rackoff – 1989
• Cryptographic primitive for not releasing
information.
• Babai-Moran – 1988
• Classifying certain group problems.
• Interactive Proof Systems
• Public = Private; One-sided error
Power of Interaction ’89-’91
• IP = PSPACE
• MIP = NEXP
• FGLSS – Limits on approximation based on
interactive proof results.
• NP = PCP(log n,1)
• Better bounds on PCPs and approximation
Audience Poll
• What was more surprising in early 90’s?
• The power of interactive proofs and their
applications to hardness of approximation.
• The end of the cold war, the collapse of the
Soviet Union and the Eastern Bloc, the fall of
the Berlin wall and the reunification of
Germany.
The Role of Mathematics
• Computational Complexity has often drawn
insights, definitions, problems and
techniques from many different branches of
mathematics.
• As complexity theory has evolved, we have
continued to use more sophisticated tools
from our mathematician friends.
Logic
• Complexity has its foundations in logic.
• Turing machines, Diagonalization, Reductions,
and the polynomial-time hierarchy.
• Logical characterizations of classes have led
to NL = coNL and formalization of
MAX-SNP.
• Proof complexity studies limitations of
various logical systems to prove tautologies.
Probability
• Probabilistic Models
• BPP, Interactive Proofs, PCPs
• Resource-Bounded Measure
• Basic Techniques
• Chernoff Bounds
• Union bound: probability of an OR bounded by the sum of the probabilities (see below)
• Dependent Variables
• Probabilistic Method
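• For reference, one common form of the two bounds named above (constants vary by source); here X is a sum of independent 0/1 random variables with mean μ and 0 < ε ≤ 1:
\[ \Pr\Big[\bigcup_i A_i\Big] \le \sum_i \Pr[A_i], \qquad \Pr\big[\,|X - \mu| \ge \varepsilon\mu\,\big] \le 2e^{-\varepsilon^2\mu/3}. \]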
Algebra
• NC1 = Bounded-Width Branching Programs
• Polytime Hierarchy reduces to the Permanent (see below)
• Mod3 requires large constant-depth circuits with parity gates.
• Interactive Proofs/PCPs
• Coding Theory
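• The reduction from the polytime hierarchy to the Permanent noted above combines two standard results, Toda's theorem and Valiant's #P-completeness of the permanent:
\[ \mathrm{PH} \subseteq \mathrm{P}^{\#\mathrm{P}}; \qquad \text{the Permanent is } \#\mathrm{P}\text{-complete}. \]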
Discrete Math/Combinatorics
• Lower Bounds
• Circuit Complexity
• Branching Programs
• Proof Systems
• Ramsey Theory/Probabilistic Method
• Expanders/Extractors
Information Theory
• Entropy
• Kolmogorov Complexity
• Cryptography
• VLSI/Communication Complexity
• Parallel Repetition
• Quantum
The Future
P = NP?
Showing P  NP
• Other areas of mathematics
• Algebraic Geometry
• “Higher Cohomology”
• New techniques for circuits, branching
programs or proof systems.
• Completely new model for P and NP.
• Diagonalization.
Besides P = NP?
• Same Old, Same Old
• Handling new models
• Complex Systems: The Other “Complexity”
• Financial Markets, Biological Systems,
Weather, The Internet
• The Big Surprise
Conclusions
• Juris Hartmanis Notebook Entry 12/31/62:
• “This was a good year.”
• This was a good forty years.
• Who knows what the future will bring?
• Fasten your seatbelts!