Computational Complexity and the limits of computation - PowerPoint Presentation

Uploaded by tatiana-dople on 2019-06-27
Presentation Transcript

Slide1

Computational Complexity and the limits of computation

Slide2

Today’s class

1) Lecture

2) Blackbox presentations

3) Guest Lecture: Jonathan Mills

Slide3

Organized complexity

- Study of organization: the whole is more than the sum of its parts ("systemhood" properties)
- Holism vs. reductionism
- Need for new mathematical and computational tools: massive combinatorial searches
- Problems that can only be tackled with computers: "computer as lab"
- Understanding the function of wholes: systems biology, evolutionary thinking, systems thinking
- Emergence: how do elements combine to form new unities?

[Diagram: complexity vs. randomness, with regions labeled organized simplicity, disorganized complexity, and organized complexity]

From systems science to informatics

Slide4

What is complexity?

Dictionary: having many varied or interrelated parts, patterns, or elements
- Quantity of parts and extent of interrelations

Subjective or epistemic connotation: the ability to understand or cope
- Complexity is in the eyes of the observer (a brain to a neuroscientist vs. to a butcher)

Quantity of information required to describe a system

Slide5

complexity and information

Descriptive complexity: proportional to the amount of information required to describe the system, in a syntactic way
- Measures the number of entities (variables, states, components) and the variety of relationships among them

General requirements for indicators:
- Nonnegative quantity
- If system A is a homomorphic image of B, then the complexity of A should not be greater than that of B
- If A and B are isomorphic, then their complexities should be the same
- If system C consists of two non-interacting subsystems A and B, and neither is a homomorphic image of the other, then the complexity of C should equal the sum of the complexities of A and B

Size of the shortest description or program in a standard language or on a universal computer, aka Kolmogorov complexity:
- Applicable to any system
- Difficult to determine the shortest description
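Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a computable upper bound. A minimal sketch using Python's standard zlib (the specific byte strings are illustrative):

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable upper bound
    on the (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

# Highly regular data has a short description...
regular = b"ab" * 500
print(compressed_size(regular))   # much smaller than 1000 bytes

# ...while random data is essentially incompressible.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))
print(compressed_size(noisy))     # close to (or above) 1000 bytes
```

The gap between the two sizes illustrates why "size of shortest description" captures an intuitive notion of complexity, even though the true shortest description cannot be found in general.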

Slide6

Uncertainty-based complexity and information

Proportional to the amount of information needed to resolve any uncertainty associated with the system, in a syntactic way
- Related to the number of alternatives left undecided when characterizing a particular element
- Examples: Hartley measure, Shannon entropy
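Both measures from the slide are one-liners. A sketch (the example distributions are arbitrary):

```python
import math

def hartley(n_alternatives: int) -> float:
    """Hartley measure: bits needed to resolve a choice among
    n equally possible alternatives."""
    return math.log2(n_alternatives)

def shannon_entropy(probs) -> float:
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(hartley(8))                        # 3.0 bits
print(shannon_entropy([0.25] * 4))       # 2.0 bits (uniform case = Hartley)
print(shannon_entropy([0.5, 0.5]))       # 1.0 bit
```

For a uniform distribution over n alternatives the Shannon entropy reduces to the Hartley measure log2(n); any non-uniform distribution has lower entropy.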

Slide7

Trade-off between descriptive and uncertainty-based complexity:

When one is reduced, the other is likely to increase

Trade certainty for acceptable descriptive complexity

Models of phenomena in the realm of organized complexity require large descriptive complexity

But to be manageable, we must simplify by accepting larger uncertainty (and smaller descriptive complexity)

Slide8

Computational complexity of algorithms

Descriptive and uncertainty-based complexity pertain to systems, which are characterized by information.

Computational complexity pertains to problems: a characterization of the time or space (memory) requirements for solving a problem by a particular algorithm.

Slide9

Types of Computational Problems

Algorithms are procedures for solving problems.

Types of problems:
- Search problems: find an X in the input satisfying property Y (e.g., find a prime number in a random sequence of numbers)
- Structuring problems: transform the input to satisfy property Y (e.g., sort a random sequence of numbers)
- Construction problems: build an X satisfying property Y (e.g., generate a random sequence of numbers with a given mean and standard deviation)
- Optimization problems: find the best X satisfying property Y (e.g., find the largest prime number in a given sequence)
- Decision problems: decide whether the input satisfies property Y (e.g., is the input number a prime number?)
- Adaptive problems: maintain property Y over time (e.g., grow a sequence of numbers such that there are always m prime numbers with a given mean and standard deviation)
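The distinction between a decision problem and a search problem over the same property can be sketched in a few lines (trial-division primality is used purely for illustration):

```python
def is_prime(n: int) -> bool:
    """Decision problem: does the input satisfy property 'prime'?"""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def find_prime(seq):
    """Search problem: find an element of the input satisfying
    the same property (or None if there is none)."""
    return next((x for x in seq if is_prime(x)), None)

print(is_prime(97))                    # True
print(find_prime([4, 6, 9, 11, 15]))   # 11
```

The same property Y ("is prime") underlies both; what changes is the question asked about the input.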

Slide10

Problem Difficulty

Conceptually hard problem: no algorithm is known to solve the problem (Artificial Intelligence)

Analytically hard problem: an algorithm exists to solve the problem, but we don't know how long it will take to solve every instance of the problem (Algorithmic Complexity Theory)

Computationally hard problem: an algorithm exists to solve the problem, but it takes millions of years (or effectively forever) to solve even relatively modest instances (Algorithmic Complexity Theory)

Computationally unsolvable problem: no algorithm can exist to solve the problem; computability, decidability (Computability Theory)

Slide11

Hanoi Problem

Invented by French mathematician Édouard Lucas in 1883.

At the Tower of Brahma in India, there are three diamond pegs and sixty-four gold disks. When the temple priests have moved all the disks, one at a time preserving size order, to another peg, the world will come to an end. If the priests can move a disk from one peg to another in one second, how long does the world have to exist?

See Clarke's "The Nine Billion Names of God".

Slide12

Solving the Hanoi Problem

Solve for the smallest instances and then try to generalize (n = 2, n = 3).

Use Hanoi_2 (H2) as a building block (3 moves).
H3 uses H2 twice, plus 1 move of the largest disk: 2 × 3 + 1 = 7 moves.

[Figure: step-by-step moves for n = 2 and n = 3]

Slide13

Hanoi Problem for n disks

Algorithm to move n disks from A to C:
1) Move the top n-1 disks from A to B
2) Move the biggest disk from A to C
3) Move the n-1 disks on B to C

Recursion: until H2.
Use Hanoi_2 (H2) as a building block (of 3 moves); H3 uses H2 twice, plus 1 move of the largest disk.
An algorithm that uses itself to solve a problem.

Slide14

Pseudocode for Hanoi Problem

Hanoi(Start, Temp, End, n)
  If n = 1 then
    Move Start's top disk to End
  Else
    Hanoi(Start, End, Temp, n-1)
    Move Start's top disk to End
    Hanoi(Temp, Start, End, n-1)

[Figure: three pegs labeled Start, Temp, End]
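The pseudocode translates directly into a short recursive Python function; a sketch (peg names and the returned move list are illustrative choices, not part of the slide):

```python
def hanoi(start, temp, end, n, moves=None):
    """Recursively move n disks from start to end, using temp as a spare.
    Returns the list of moves as (from_peg, to_peg) pairs."""
    if moves is None:
        moves = []
    if n == 1:
        moves.append((start, end))
    else:
        hanoi(start, end, temp, n - 1, moves)  # top n-1 disks: start -> temp
        moves.append((start, end))             # largest disk: start -> end
        hanoi(temp, start, end, n - 1, moves)  # n-1 disks: temp -> end
    return moves

print(len(hanoi("A", "B", "C", 3)))  # 7 moves = 2^3 - 1
```

Note how the two recursive calls swap the roles of the spare and destination pegs, exactly as in the pseudocode.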

Slide15

Computational Complexity

Resources required during computation of an algorithm to solve a given problem:
- Time: how many steps does it take to solve a problem?
- Space: how much memory does it take to solve a problem?

The Hanoi Towers problem:
f(n) is the number of times the HANOI algorithm moves a disk for a problem of n disks.
f(1) = 1, f(2) = 3, f(3) = 7
f(n) = f(n-1) + 1 + f(n-1) = 2·f(n-1) + 1
Every time we add a disk, the time to compute at least doubles.
f(n) = 2^n - 1

With 64 disks at one move per second: 2^64 - 1 seconds ≈ 585 billion years!
Earth: ~5 billion years old. Universe: ~15 billion years old.
Fastest computer: 1 petaflops = 10^15 (approx. 2^50) instructions per second, so 2^14 s ≈ 4.6 hours needed.
("FLOPS" = FLoating point Operations Per Second)
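The recurrence, the closed form, and the headline numbers from the slide can all be checked in a few lines (a sketch; 2^50 is used for "1 petaflops", matching the slide's approximation):

```python
def f(n: int) -> int:
    """Hanoi move count: f(n) = 2*f(n-1) + 1, with f(1) = 1."""
    return 1 if n == 1 else 2 * f(n - 1) + 1

# Closed form: f(n) = 2^n - 1
assert all(f(n) == 2**n - 1 for n in range(1, 20))

seconds = 2**64 - 1                       # 64 disks, one move per second
years = seconds / (365.25 * 24 * 3600)
print(f"{years / 1e9:.0f} billion years")  # ~585

ops = 2**50                               # ~1 petaflops (approx 10^15 ops/s)
hours = seconds / ops / 3600              # 2^64 / 2^50 = 2^14 seconds
print(f"{hours:.1f} hours")               # ~4.6
```

The "time at least doubles per disk" claim is exactly the factor 2 in the recurrence, which is what produces the exponential 2^n - 1.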

Slide16

Fastest Computers

Fastest computer (early 2005): 135.5 teraflops (135.5 trillion calculations per second), approaching a petaflops (2^50).

Fastest computer (late 2005): 280.6 teraflops (280.6 trillion calculations per second); 3 petaflops projected for late 2006?

Fastest computer (June 2006): 1 petaflops! (1 quadrillion calculations per second): MDGRAPE-3 @ RIKEN, Japan. MDGRAPE-3 is not a general-purpose computer.

Fastest general-purpose computer (May 2008): 1 petaflops! (1 quadrillion calculations per second): Roadrunner @ Los Alamos. Approx. 2^14 s ≈ 4.6 hours needed for the Hanoi problem (assuming one disk move per operation).

[Images: IBM BlueGene/L, IBM Roadrunner, K computer]

RIKEN & Fujitsu "K" computer: 8.162 petaflops, 672 racks, 68,544 CPUs…

Slide17

Bremermann's Limit

Physical limit of computation, Hans Bremermann (1962): "no data processing system, whether artificial or living, can process more than 2 × 10^47 bits per second per gram of its mass."
- Based on the idea that information could be stored in the energy levels of matter
- Calculated using Heisenberg's uncertainty principle, the Hartley measure, Planck's constant, and Einstein's famous E = mc² formula

A computer with the mass of the entire Earth, operating for a time period equal to the estimated age of the Earth, would not be able to process more than about 10^93 bits. Problems requiring more than this are transcomputational problems.
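The ~10^93 figure is an order-of-magnitude estimate that can be reproduced from the bound. A sketch, where the Earth mass and age values are my assumed inputs, not from the slide:

```python
BREMERMANN = 2e47                          # bits per second per gram
earth_mass_g = 6e27                        # assumed: Earth mass ~6e27 g
earth_age_s = 4.5e9 * 365.25 * 24 * 3600   # assumed: Earth age ~4.5e9 years

total_bits = BREMERMANN * earth_mass_g * earth_age_s
print(f"{total_bits:.1e} bits")            # on the order of 10^92-10^93
```

With these inputs the product lands just below 10^93, consistent with the slide's "about 10^93 bits" as an order-of-magnitude bound.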

Slide18

Transcomputational Problems

A system of n variables, each of which can take k different states, has k^n possible system states. When is k^n larger than 10^93?

Smallest n for which a k-state system becomes transcomputational:
k:   2    3    4    5    6    7    8    9   10
n: 308  194  154  133  119  110  102   97   93

Examples:
- Pattern recognition: a grid of n = q² squares of k colors. The human retina contains millions of light-sensitive cells.
- Blackbox: 10^100 possible states!
- Large-scale integrated digital circuits: with k = 2 (bits), a circuit with 308 inputs and one output!

Complex problems need simplification/compression!
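The table can be checked directly: a system is transcomputational when n·log2(k) exceeds the number of bits in 10^93. A sketch (strict inequality is my choice; the table's entries are the approximate thresholds):

```python
import math

THRESHOLD_BITS = 93 * math.log2(10)   # 10^93 states, expressed in bits

def is_transcomputational(n_vars: int, k_states: int) -> bool:
    """True if a system of n_vars variables with k_states states each
    has more than 10^93 possible states (k^n > 10^93)."""
    return n_vars * math.log2(k_states) > THRESHOLD_BITS

print(is_transcomputational(309, 2))   # True: 2^309 > 10^93
print(is_transcomputational(100, 2))   # False
print(is_transcomputational(94, 10))   # True: 10^94 > 10^93
```

Working in log space avoids computing the astronomically large k^n itself, which is the same trick used to build the table.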

Slide19

What happens to Moore’s law?

Slide20

BUT!

Even more remarkable, and even less widely understood, is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed.

http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-nitrd-report-2010.pdf

Slide21

Is the singularity near?

Ray Kurzweil, Vernor Vinge: technological progress grows exponentially and reaches infinity in finite time (= singularity). Or is it a sigmoidal function?

http://www.shirky.com/herecomeseverybody/2008/04/looking-for-the-mouse.html

Also noteworthy: metasystem transition (Turchin/Heylighen), Clay Shirky's "gin analogy".

Slide22

computational complexity (algorithms)

Computational complexity pertains to problems: a characterization of the time or space (memory) requirements for solving a problem by a particular algorithm.

The time complexity function f(n) is the largest amount of time needed for an algorithm to solve a problem instance of size n.

Polynomial time algorithms:
- Complexity O(n^k): can be computed by a deterministic Turing machine in a polynomial time function of order k
- "Efficient" or "tractable" algorithms: the class P

Exponential time algorithms:
- Exponential (or worse) time functions, e.g. O(2^n), O(10^n), O(n^n), O(n!)
- Inefficient or intractable algorithms

Non-deterministic polynomial time algorithms (NP):
- Solvable by a non-deterministic Turing machine in polynomial time ("guessing")
- Answers verifiable by a deterministic Turing machine in polynomial time ("verification")
- NP-complete: no polynomial time solution is known, although answers can be verified by a deterministic Turing machine in polynomial time (e.g. the traveling salesman problem)

P ⊆ NP, but does P = NP? Hence the need for approximation algorithms.
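The "guessing vs. verification" asymmetry can be illustrated with subset sum, an NP-complete problem (the example numbers are arbitrary): finding a solution by brute force takes exponential time, but checking a proposed solution (a "certificate") takes only one pass.

```python
from itertools import combinations

def solve_subset_sum(numbers, target):
    """Brute-force search over all 2^n subsets: exponential time."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return subset
    return None

def verify_subset_sum(subset, target):
    """Verifying a certificate: a single polynomial-time pass."""
    return sum(subset) == target

nums = [3, 34, 4, 12, 5, 2]
cert = solve_subset_sum(nums, 9)
print(cert)                         # a subset summing to 9
print(verify_subset_sum(cert, 9))   # True
```

This gap between finding and checking is exactly what the P vs. NP question asks about: can every problem whose answers are quickly verifiable also be quickly solved?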

Slide23

Computational complexity

Growth rates for time complexity functions

Assuming a million operations per second
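The growth-rate table the slide refers to (not reproduced in this transcript) can be regenerated; a sketch assuming the stated rate of 10^6 operations per second:

```python
import math

OPS_PER_SEC = 10**6   # a million operations per second, as on the slide

def runtime_seconds(steps: float) -> float:
    """Wall-clock time to execute the given number of steps."""
    return steps / OPS_PER_SEC

for n in (10, 20, 50):
    print(f"n={n:3d}  "
          f"n^2: {runtime_seconds(n**2):.6f} s   "
          f"2^n: {runtime_seconds(2**n):.3f} s   "
          f"n!: {runtime_seconds(math.factorial(n)):.3e} s")
```

Already at n = 50 the polynomial time is a fraction of a millisecond while 2^n runs to decades, which is the point such tables make about tractable vs. intractable algorithms.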

Slide24

Computational complexity

Effect of increasing computing speed