SAT Solving Overview
Alexander Nadel
February 2017
Agenda
- Introduction
- Early Days of SAT Solving
- Core SAT Solving
  - Conflict Analysis and Learning
  - Boolean Constraint Propagation
  - Decision Heuristics
  - Restart Strategies
  - Inprocessing
- Extensions to SAT
  - Incremental SAT Solving under Assumptions
  - Diverse Solutions Generation
  - High-level (group-oriented) MUC Extraction
What is SAT?
Find a variable assignment (AKA a solution or model) that satisfies a propositional formula, or prove that there are no solutions.
SAT solvers operate on CNF formulas; any propositional formula can be reduced to CNF.
CNF formula example: F = (a + c)(b + c)(a' + b' + c')
Each disjunction, e.g. (a + c), is a clause; a, b and c are positive literals; a', b' and c' are negative literals.
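As a small illustration (mine, not from the slides), a CNF formula can be held as a list of clauses of signed integers (DIMACS convention: 3 encodes c, -3 encodes c'); an assignment is a model iff every clause contains at least one true literal:

```python
# F = (a + c)(b + c)(a' + b' + c') with the numbering a=1, b=2, c=3.
F = [[1, 3], [2, 3], [-1, -2, -3]]

def satisfies(formula, assignment):
    """assignment maps variable -> bool; True iff every clause has a true literal."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in formula
    )

# a=1, b=1, c=0 satisfies all three clauses of F.
print(satisfies(F, {1: True, 2: True, 3: False}))  # True
```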
SAT: Theory and Practice
Theory: SAT is the first known NP-complete problem (Stephen Cook, 1971). One can check a solution in polynomial time; whether one can find a solution in polynomial time is the P = NP question.
Practice: amazingly, nowadays SAT solvers can solve industrial problems having millions of clauses and variables. SAT has numerous applications in formal verification, routing, planning, scheduling, bioinformatics, combinatorics, ...
Approaches to SAT Solving
- Backtrack search: DFS search for a solution. The baseline approach for industrial-strength solvers; in focus today.
- Look-ahead: BFS search for a solution. Helpful for certain classes of formulas; recently there have been attempts to combine it with backtrack search.
- Local search: helpful mostly for randomly generated formulas.
Early Days of SAT Solving: Agenda
- Resolution
- Backtrack Search
Resolution: a Way to Derive New Valid Clauses
Resolution operates on a pair of clauses with exactly one pivot variable: a variable appearing in the two clauses in different polarities.
Example: resolving the source clauses (a + b + c' + f) and (g + h' + c + f) over the pivot c yields the resolvent clause (a + b + g + h' + f).
The resolvent clause is a logical consequence of the two source clauses.
Known to be invented by Davis & Putnam, 1960. According to Chvatal & Szemeredi, 1988 (JACM), it had been invented independently by Lowenheim in the early 1900s (as had the DP algorithm, presented next).
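The rule can be sketched in a DIMACS-style integer encoding (an illustrative helper, with the assumed numbering a=1, b=2, c=3, f=4, g=5, h=6):

```python
def resolve(c1, c2, pivot):
    """Resolvent of two clauses over a pivot literal of c1 (signed-int encoding)."""
    assert pivot in c1 and -pivot in c2
    resolvent = [l for l in c1 if l != pivot] + [l for l in c2 if l != -pivot]
    return sorted(set(resolvent))

# (a + b + c' + f) and (g + h' + c + f) resolved over c:
print(resolve([1, 2, -3, 4], [5, -6, 3, 4], -3))  # [-6, 1, 2, 4, 5], i.e. (a + b + f + g + h')
```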
DP Algorithm: Davis & Putnam, 1960
Remove the variables one by one by resolution over all the clauses containing that variable.
UNSAT example: (a + b)(a + b')(a' + c)(a' + c'). Eliminating b yields (a)(a' + c)(a' + c'); eliminating a yields (c)(c'); eliminating c yields the empty clause ( ): UNSAT.
SAT example: (a + b + c)(b + c' + f')(b' + e)(a + c + e)(c' + e + f)(a + e + f). All the variables can be eliminated without ever deriving the empty clause: SAT.
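The elimination step can be sketched as follows (a toy illustration, not the original implementation, in a signed-int clause encoding with a=1, b=2, c=3); it reproduces the UNSAT derivation:

```python
def eliminate(formula, var):
    """One DP step: resolve every clause containing var with every clause
    containing -var, drop the originals, and skip tautological resolvents."""
    pos = [c for c in formula if var in c]
    neg = [c for c in formula if -var in c]
    rest = [c for c in formula if var not in c and -var not in c]
    resolvents = []
    for p in pos:
        for n in neg:
            r = sorted(set([l for l in p if l != var] + [l for l in n if l != -var]))
            if not any(-l in r for l in r):  # skip tautologies
                resolvents.append(r)
    return rest + resolvents

# (a + b)(a + b')(a' + c)(a' + c'): eliminate b, then a, then c.
f = [[1, 2], [1, -2], [-1, 3], [-1, -3]]
f = eliminate(f, 2)  # -> (a)(a' + c)(a' + c')
f = eliminate(f, 1)  # -> (c)(c')
f = eliminate(f, 3)  # -> the empty clause: UNSAT
print(f)  # [[]]
```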
Backtrack Search, or DLL: Davis-Logemann-Loveland, 1962
Example formula: (a + b)(b' + c)(b' + c')(a' + b)
- Decide a' at decision level 1: a is the decision variable; a' is the decision literal.
- Decide b' at decision level 2. A conflict: a blocking clause (a clause falsified by the current assignment) is encountered, namely (a + b).
- Backtrack and flip b' to b. Then decide c': the clause (b' + c) is falsified. Flip c' to c: the clause (b' + c') is falsified.
- Both values of c fail and b was already flipped, so backtrack to decision level 1 and flip a' to a.
- Under a, the same pattern repeats: b with c' falsifies (b' + c), c falsifies (b' + c'), and flipping b to b' falsifies (a' + b).
- Every branch ends in a conflict: UNSAT!
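A minimal sketch of this plain backtrack search (my illustration of the slides' no-BCP setting, not the original 1962 code), in a signed-int clause encoding:

```python
def dll(formula, assignment=None):
    """Plain DLL backtrack search (decisions only, no BCP). formula is a list
    of clauses of signed ints; returns a satisfying assignment dict or None."""
    if assignment is None:
        assignment = {}

    def value(lit):
        v = assignment.get(abs(lit))
        return None if v is None else (v == (lit > 0))

    # A blocking clause: every one of its literals is falsified.
    if any(all(value(l) is False for l in c) for c in formula):
        return None
    free = {abs(l) for c in formula for l in c} - set(assignment)
    if not free:
        return assignment                 # all clauses satisfied
    var = min(free)                       # decision variable
    for polarity in (False, True):        # decision literal, then its flip
        assignment[var] = polarity
        result = dll(formula, assignment)
        if result is not None:
            return result
    del assignment[var]                   # backtrack
    return None

# The slides' formula (a + b)(b' + c)(b' + c') (a' + b) is UNSAT (a=1, b=2, c=3):
print(dll([[1, 2], [-2, 3], [-2, -3], [-1, 2]]))  # None
```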
Core SAT Solving: the Principles
DLL could solve problems with fewer than 2000 clauses. How can modern SAT solvers solve problems with millions of clauses and variables? The major principles:
- Learning and pruning: block already explored paths
- Locality and dynamicity: focus the search on the relevant data
- Well-engineered data structures: extremely fast propagation
Duality between Basic Backtrack Search and Resolution
One can associate a resolution derivation with every invocation of DLL over an unsatisfiable formula. Consider again the formula (a + b)(b' + c)(b' + c')(a' + b):
- A parent clause P(x) is associated with every flip operation for a variable x. It contains the flipped literal and a subset of the previously assigned falsified literals. The parent clause justifies the flip: its existence proves that the explored subspace has no solutions. In the example, after the conflict on (a + b) under a', b', the clause (a + b) becomes the parent clause P(b) of the flip to b.
- Backtracking over a flipped variable x can be associated with a resolution operation: P is initialized with the last blocking clause and is updated by P := P resolved with P(x); the resulting P is to become the parent clause for the upcoming flip. In the example, after both c' and c fail under a', b, resolving the blocking clause (b' + c') with P(c) = (b' + c) yields (b'), and resolving (b') with P(b) = (a + b) yields (a): the parent clause P(a) of the flip of a', derived by resolution. The resolution proof π(a) of a parent clause is called its parent resolution.
The final trace of DLL is both a decision tree (top-down view) and a resolution refutation (bottom-up view):
- Variables associated with the edges are both decision variables in the tree and pivot variables for the resolution.
- A forest of parent resolutions is maintained.
- The forest converges to one resolution refutation in the end (for an UNSAT formula).
Conflict Clause Recording
The idea: update the instance with conflict clauses, that is, some of the clauses generated by resolution. Introduced in SAT by Bayardo & Schrag, 1997 (rel_sat).
In the duality example above, assume the derived clause (b') was recorded after the conflicts under a'. Then, after flipping a' to a, the subtree that re-explores b with c' and c would not have been explored: it is redundant, since the recorded clause (b') blocks it immediately.
Conflict Clause Recording (cont.)
Most of the modern solvers record every non-trivial parent clause (since Chaff).
Enhancing CCR: Local Conflict Clause Recording
The parent-based scheme is asymmetric w.r.t. polarity selection.
Solution: record an additional local conflict clause: a would-be conflict clause if the last polarity selection were flipped.
Dershowitz & Hanna & Nadel, 2007 (Eureka, Fiver's predecessor at Intel / Fiver)
Managing Conflict Clauses
Keeping too many clauses slows down the solver, so deleting irrelevant clauses is very important. Some of the strategies:
- Size-based: remove clauses that are too long. Marques-Silva & Sakallah, 1996 (GRASP)
- Age-based: remove clauses that were not used for BCP recently. Goldberg & Novikov, 2002 (Berkmin)
- Locality-based (glue): remove clauses whose literals are assigned far apart in the search tree. Audemard & Simon, 2009 (Glucose)
Modern Conflict Analysis
Next, we present two techniques commonly used in modern SAT solvers:
- Non-chronological backtracking (NCB): GRASP
- The 1UIP scheme: GRASP & Chaff
Both techniques prune the search tree and the associated forest of parent resolutions.
Non-Chronological Backtracking (NCB)
NCB is an additional pruning operation before flipping: eliminate all the decision levels adjacent to the decision level of the flipped literal, so that the parent clause is still falsified.
Example: assume we are about to flip a, whose parent clause is (a + f). The decision levels between f's level and a's level (e.g., the level of e, together with its parent resolution π(e)) do not contribute to the parent clause, so they are eliminated before the flip; the parent clause (a + f) remains falsified, and a is flipped at a lower decision level.
1UIP Scheme: Pruning the Last Decision Level While Backtracking
The 1UIP scheme consists of:
- A stopping condition for backtracking: stop whenever P contains exactly one variable of the last decision level, called the 1UIP variable.
- A rewriting operation: consider the 1UIP variable as a decision variable and P as its parent clause.
- A pruning technique: eliminate all the disconnected variables of the last decision level (along with their parent resolutions).
After the last decision level has been pruned, NCB can eliminate further irrelevant decision levels before the flip.
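As a rough sketch (toy code of my own, not taken from any solver), 1UIP learning can be phrased as resolving the conflict clause backwards along the assignment trail until exactly one current-level literal remains; `level_of`, `reason` and `trail` are assumed bookkeeping structures:

```python
def first_uip(conflict_clause, level_of, reason, trail, current_level):
    """Toy 1UIP learning. level_of: var -> decision level; reason: var ->
    implying clause (None for decisions); trail: signed literals in
    assignment order. Returns the learned clause, sorted."""
    clause = set(conflict_clause)
    for lit in reversed(trail):
        current = [l for l in clause if level_of[abs(l)] == current_level]
        if len(current) == 1:                  # stopping condition: 1UIP found
            break
        if -lit in clause and reason[abs(lit)] is not None:
            # Resolution step: replace -lit by the rest of its reason clause.
            clause = (clause - {-lit}) | {l for l in reason[abs(lit)] if l != lit}
    return sorted(clause)

# Level 1: decide a (1). Level 2: decide b (2); c (3) implied by (a'+b'+c);
# d (4) implied by (c'+d). Conflict clause: (b'+d').
learned = first_uip(
    conflict_clause=[-2, -4],
    level_of={1: 1, 2: 2, 3: 2, 4: 2},
    reason={1: None, 2: None, 3: [-1, -2, 3], 4: [-3, 4]},
    trail=[1, 2, 3, 4],
    current_level=2,
)
print(learned)  # [-2, -1], i.e. the clause (a' + b')
```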
Boolean Constraint Propagation (BCP)
The unit clause rule: a clause is unit if all of its literals but one are assigned to 0 and the remaining literal is unassigned. E.g., (a + b' + c) is unit when a = 0, b = 1 and c is unassigned.
- Pick unassigned variables of unit clauses as decisions whenever possible.
- 80-90% of the running time of modern SAT solvers is spent in BCP.
- Introduced already in the original DLL.
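The unit clause rule lends itself to a direct (if naive) propagation loop; the following sketch is an illustration, not solver code:

```python
def bcp(formula, assignment):
    """Repeatedly apply the unit clause rule. Mutates and returns assignment,
    or returns None on a falsified clause (conflict)."""
    changed = True
    while changed:
        changed = False
        for clause in formula:
            unassigned = [l for l in clause if abs(l) not in assignment]
            satisfied = any(assignment.get(abs(l)) == (l > 0)
                            for l in clause if abs(l) in assignment)
            if satisfied:
                continue
            if not unassigned:
                return None                    # conflict: clause falsified
            if len(unassigned) == 1:           # unit clause
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment

# (a + b' + c) with a = 0, b = 1 is unit: c is implied (a=1, b=2, c=3).
print(bcp([[1, -2, 3]], {1: False, 2: True}))  # {1: False, 2: True, 3: True}
```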
Data Structures for Efficient BCP
Goal: identify all the unit clauses.
Naive approach: for each clause, hold pointers to all its literals. How can the number of clause visits be minimized?
The 2-watch scheme: pick two literals in each clause to watch and ignore any assignments to the other literals in the clause.
Introduced by Zhang, 1997 (SATO solver); enhanced by Moskewicz & Madigan & Zhao & Zhang & Malik, 2001 (Chaff)
Watched Lists: Example
Consider a clause over the literals a, b, c, d, e, f, g, h, two of which are watched (W).
- A watched literal is falsified (e.g., a'): the clause is visited, and the corresponding watch moves to any unassigned literal. No pointers to the previously visited literals are saved.
- An unwatched literal is falsified (e.g., c', then g' and e'): the clause is not visited!
- When a watched literal is falsified and the only remaining unassigned literal is the other watched literal b, the clause is visited and it is identified that the clause became unit.
- On backtracking, the watches do not move (b and f are simply unassigned again), so there is no need to visit the clause while backtracking. Even when all the literals are unassigned again, the watch pointers do not get back to their initial positions.
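A toy sketch of the watch-update rule for a single clause (real solvers keep per-literal occurrence lists of watched clauses; the names here are my own):

```python
def on_falsified(clause, watches, value):
    """Watch update when a watched literal of `clause` was just falsified.
    `watches` holds two indices into the clause; `value(lit)` returns
    True/False/None. Returns 'moved', 'unit' or 'conflict'."""
    w0, w1 = watches
    if value(clause[w0]) is not False:       # make w0 the falsified watch
        w0, w1 = w1, w0
    for i, lit in enumerate(clause):
        if i != w1 and value(lit) is not False:
            watches[:] = [i, w1]             # move the watch; no other literal visited
            return 'moved'
    # No replacement found: unit on the other watch, or a conflict.
    return 'unit' if value(clause[w1]) is None else 'conflict'

# Clause (a + b + c), a=1, b=2, c=3, watching a and b; a is falsified:
assignment = {1: False}
val = lambda lit: (None if abs(lit) not in assignment
                   else assignment[abs(lit)] == (lit > 0))
w = [0, 1]
print(on_falsified([1, 2, 3], w, val))  # 'moved' (the watch moves to c)
```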
Watched Lists: Caching
Chu & Harwood & Stuckey, 2008: divide the clauses into various cache levels to improve cache performance.
Most of the modern solvers put one literal of each clause directly in the watch list. Special data structures are used for clauses of length 2 and 3.
Decision Heuristics
Which literal should be chosen at each decision point? Critical for performance!
Old Days' Static Decision Heuristics
Go over all the clauses that are not yet satisfied, compute some frequency-based function f(l) for each literal l, and choose the literal with the maximal f(l).
Variable-based Dynamic Heuristics: VSIDS
VSIDS was the first dynamic heuristic (Chaff). Each literal is associated with a counter:
- Initialized to the number of occurrences in the input formula
- Increased when the literal participates in a conflict clause
- Occasionally, all the counters are halved
The unassigned literal with the maximal counter is chosen.
A breakthrough compared to static heuristics:
- Dynamic: focuses the search on recently used variables and clauses
- Extremely low overhead
VSIDS Example
- Before the search, all scores are 0. Count the literal appearances in the initial formula, e.g.: a:4, a':5, b:3, b':3, c:2, c':3, d:2, d':4, e:2, e':6.
- Pick the literal with the maximal score: e' is decided first; then the unassigned literal with the maximal score, a', is picked.
- A conflict is encountered and the conflict clause (h' + a + c + b' + k) is derived: increment the scores of its literals (a: 4 to 5, b': 3 to 4, c: 2 to 3, ...).
- Suppose the 1000-conflict threshold is reached with the scores a:10, a':12, b:18, b':6, c:12, c':6, d:2, d':6, e:16, e':6: all the scores are halved (a:5, a':6, b:9, b':3, c:6, c':3, d:1, d':3, e:8, e':3).
Enhancements to VSIDS
- Adjusting the scope: increase the scores of every literal in the newly generated parent resolution (Berkmin)
- Additional dynamicity: multiply the scores by 95% after each conflict rather than occasionally halving them. Eén & Sörensson, 2003 (Minisat)
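A sketch of the Minisat-style variant (names are illustrative, not Minisat's actual code): instead of multiplying every score by 0.95 after each conflict, the bump increment grows by 1/0.95, which is equivalent and much cheaper:

```python
class Vsids:
    """Toy VSIDS with multiplicative decay implemented via a growing increment."""
    def __init__(self, decay=0.95):
        self.score = {}
        self.inc = 1.0
        self.decay = decay

    def bump(self, lit):
        self.score[lit] = self.score.get(lit, 0.0) + self.inc

    def on_conflict(self, conflict_clause):
        for lit in conflict_clause:
            self.bump(lit)
        self.inc /= self.decay        # equivalent to decaying all scores by 0.95

    def pick(self, unassigned_literals):
        return max(unassigned_literals, key=lambda l: self.score.get(l, 0.0))

v = Vsids()
v.on_conflict([1, -2, 3])
v.on_conflict([1, 4])
print(v.pick([1, -2, 3, 4]))  # 1: bumped in both conflicts, second time with a larger increment
```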
The Clause-Based Heuristic (CBH)
The idea: use relevant clauses to guide the decision heuristic. Dershowitz & Hanna & Nadel, 2005 (Eureka).
- All the clauses (both initial and conflict clauses) are organized in a list.
- The next variable is chosen from the top-most unsatisfied clause.
- After a conflict, all the clauses that participate in the newly derived parent resolution are moved to the top, and then the conflict clause is placed at the top.
Partial clause-based heuristics: Berkmin, HaifaSAT
CBH: More
CBH is even more dynamic than VSIDS: it prefers variables from very recent conflicts.
CBH tends to pick interrelated variables: variables whose joint assignment increases the chances of satisfying clauses in satisfiable branches and of quickly reaching conflicts in unsatisfiable branches.
Variables appearing in the same clause are interrelated: picking variables from the same clause results in either the clause becoming satisfied or a contradiction.
Polarity Selection
Phase saving: assign a new decision variable the last polarity it was assigned; locality rules again.
Strichman, 2000; Pipatsrisawat & Darwiche, 2007 (RSAT)
Core SAT Solving: the Major Enhancements to DLL
- Boolean Constraint Propagation
- Conflict Analysis and Learning
- Decision Heuristics
- Restart Strategies
- Inprocessing
Restarts
Restarts: the solver backtracks to decision level 0 when certain criteria are met. Restarts have a crucial impact on performance.
Motivation:
- Dynamicity: refocus the search on relevant data. Variables identified as important will be picked first by the decision heuristic after the restart.
- Avoid spending too much time in 'bad' branches.
Restart Criteria
Restart after a certain number of conflicts has been encountered, either:
- Since the previous restart: global. Gomes & Selman & Kautz, 1998
- Above a certain decision level: local. Ryvchin & Strichman, 2008 (Eureka, Fiver)
Next: methods to calculate the threshold on the number of conflicts. They apply to both the global and the local schemes.
Restart Strategies
Arithmetic (or fixed) series. Parameters: x, y.
Init(t) = x
Next(t) = t + y
Restart Strategies (cont.)
Luby et al. series. Parameter: x. Ruan & Horvitz & Kautz, 2003.
Init(t) = x
Next(t) = t_i * x, where t_i is the i-th element of the Luby sequence:
1 1 2 1 1 2 4 1 1 2 1 1 2 4 8 1 1 2 1 1 2 4 1 1 2 1 1 2 4 8 16 1 1 2 1 1 2 4 1 1 2 1 1 2 4 8 ...
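The t_i sequence can be computed directly; the following helper is an illustration of the standard formulation of the Luby sequence:

```python
def luby(i):
    """i-th element (1-based) of the Luby sequence 1 1 2 1 1 2 4 1 1 2 ..."""
    while True:
        k = i.bit_length()            # 2**(k-1) <= i < 2**k
        if i == (1 << k) - 1:         # i ends a subsequence: return 2**(k-1)
            return 1 << (k - 1)
        i -= (1 << (k - 1)) - 1       # otherwise recurse into the repeated prefix

# Restart thresholds are then luby(i) * x for the i-th restart.
print([luby(i) for i in range(1, 16)])  # [1, 1, 2, 1, 1, 2, 4, 1, 1, 2, 1, 1, 2, 4, 8]
```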
Restart Strategies (cont.)
Inner-Outer Geometric series. Parameters: x, y, z. Armin Biere, 2007 (Picosat).
Init(t) = x
if (t*y < z): Next(t) = t*y
else: Next(t) = x; Next(z) = z*y
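A sketch of the scheme as a generator (parameter names follow the slide; this is an illustration, not Picosat's code): the inner threshold t grows geometrically until it passes the outer limit z, then t resets and z itself grows:

```python
def inner_outer(x, y, z, n):
    """Yield the first n restart thresholds of the inner-outer geometric scheme."""
    t, out = x, z
    for _ in range(n):
        yield t
        if t * y < out:
            t = t * y              # inner geometric growth
        else:
            t, out = x, out * y    # reset inner, grow the outer limit

print(list(inner_outer(x=100, y=2, z=1000, n=8)))
# [100, 200, 400, 800, 100, 200, 400, 800]
```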
Preprocessing and Inprocessing
The idea: simplify the formula prior to (pre-) and during (in-) the search.
History:
- Freeman, 1995 (POSIT): first mention of preprocessing in the context of SAT
- Eén & Biere, 2005 (SatELite): a commonly used efficient preprocessing procedure
- Heule & Järvisalo & Biere (2010-2012): a series of papers proposing a variety of inprocessing techniques; used in Armin Biere's Lingeling
- Nadel & Ryvchin & Strichman (2012-2014): apply SatELite in incremental SAT solving; used in Fiver
SatELite
- Subsumption: remove the clause (C + D) if (C) exists.
- Self-subsuming resolution: given (C + l), replace (C + F + l') by (C + F). Example: (x + l)(x + y + l') becomes (x + l)(x + y).
- Variable elimination: apply DP to variables whose elimination does not increase the number of clauses. Example: (a + b)(a + b')(a' + c)(a' + c') becomes (a)(a' + c)(a' + c').
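The first two rules can be sketched as follows (toy helpers in a signed-int clause encoding with x=1, y=2, l=3; not SatELite's actual code):

```python
def subsumes(c, d):
    """Clause c subsumes clause d iff every literal of c occurs in d."""
    return set(c) <= set(d)

def self_subsume(c, d):
    """If c = (C + l) and d = (C + F + l'), strengthen d to (C + F).
    Returns the strengthened clause, or None if the rule does not apply."""
    for l in c:
        if -l in d and set(c) - {l} <= set(d) - {-l}:
            return sorted(set(d) - {-l})
    return None

# (x + l)(x + y + l') -> (x + l)(x + y):
print(self_subsume([1, 3], [1, 2, -3]))  # [1, 2]
print(subsumes([1], [1, 2]))             # True
```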
Extensions to SAT
Nowadays, SAT solving is much more than finding one solution to a given problem. Extensions to SAT include:
- MAXSAT: maximize the number of satisfied clauses
- Incremental SAT under assumptions
- Simultaneous SAT (SSAT): SAT over multiple properties at once
- Diverse solution generation
- Minimal Unsatisfiable Core (MUC) extraction
- Push/pop support
- Model minimization
- ALL-SAT
- XOR clauses support
- ISSAT: assumptions are implications
- ...
Incremental SAT Solving under Assumptions
The challenge: speed up the solving of related SAT instances by enabling the re-use of relevant data. Numerous applications.
Incremental solving since Minisat. Repeat:
- Add clauses using AddClause()
- Solve(Assumptions Ai = {li1, li2, ..., lin}): uses all the clauses added so far, but only the current assumptions Ai
until the problem is solved.
Incremental SAT Solving in Minisat
- Reuse the same SAT solver instance.
- Model assumptions as the first decision literals.
- The solver is finished when either a model is found (SAT), or an assumption is negated or a global contradiction is learnt (UNSAT).
Incremental SAT Solving in Minisat: Example
Clauses: (a + b + c)(a' + b + c). Current assumptions: {b'}.
- Assumptions are picked first: decide b'.
- Decide c'. Now (a + b + c) implies a and (a' + b + c) implies a': a conflict. Resolving the two clauses yields the new learned clause (b + c).
- Backtrack: under b', the learned clause (b + c) implies c; assigning a completes the model: SAT.
Next invocation, with the same clauses plus the learned (b + c). Current assumptions: {c'}.
- Decide c'. b is deduced by unit propagation in the learned clause (b + c); assigning a completes the model: SAT.
Incremental SAT Solving through Pervasive Clauses (pre-Minisat)
- Use a new SAT solver instance per invocation.
- Model assumptions as unit clauses.
- Share pervasive (assumption-independent) conflict clauses.
Pervasive Clause-based Incremental SAT Solving: Example
Clauses: (a + b + c)(a' + b + c). Current assumptions: {b'}. Solve the instance (a + b + c)(a' + b + c)(b'), where the assumption is modeled as the unit clause (b'). Assume the conflict clauses (b + c) and (a' + c) have been learnt: (b + c) is pervasive and will be reused in the next invocation, while (a' + c) depends on the assumption and is temporary, to be removed.
Next invocation under the assumptions {c'}: solve (a + b + c)(a' + b + c)(b + c)(c') in a fresh solver instance: SAT.
Comparison of the two incremental approaches:

Aspect                  Minisat                                      Pervasive clause-based
Instances               One                                          Many
Heuristics              Incremental                                  Initialized from scratch
Learning power          All conflict clauses are pervasive           Conflict clauses might be temporary
Assumption propagation  Not propagated                               Propagated as facts before any decision
Conflict clause size    Long, since pervasive clauses must           Short, no assumptions
                        contain the assumptions
Preprocessing           Incompatible with SatELite (eliminated       Compatible with SatELite
                        variables might be part of future clauses)
Incremental Minisat is Incompatible with SatELite
First incremental call: (a + b)(a' + c). SatELite eliminates a, and the problem is reduced to (b + c). Assume the assumption set A = {} is empty: the problem is SAT.
More clauses arrive in the second call: (b')(a'). Now (b + c)(b')(a') is SAT, whereas the original problem (a + b)(a' + c)(b')(a') is UNSAT.
Comparison including Fiver (Nadel & Ryvchin & Strichman, 2012-14):

Aspect                  Minisat                 Pervasive clause-based        Fiver
Instances               One                     Many                          One
Heuristics              Incremental             Initialized from scratch      Incremental
Learning power          All conflict clauses    Conflict clauses might        All conflict clauses
                        are pervasive           be temporary                  are pervasive
Assumption propagation  Not propagated          Propagated with BCP           Propagated with BCP
                                                at the beginning              at the beginning
Conflict clause size    Long (must contain      Short, no assumptions         Short, no assumptions
                        the assumptions)
Preprocessing           Incompatible with       Compatible with SatELite      Compatible with SatELite
                        SatELite
Incremental Solving in Fiver in a Glance
- Compatibility with SatELite: eliminated variables are re-introduced or re-eliminated
- Assumption propagation: assumptions are propagated and eliminated
- Pervasive vs. temporary: pervasive clauses are created from temporary clauses by inserting the relevant assumptions into each
Bottom line: incremental solving in Fiver/Hazel is substantially faster than in other solvers
136
Slide137
Agenda
- Introduction
- Early Days of SAT Solving
- Core SAT Solving: Conflict Analysis and Learning, Boolean Constraint Propagation, Decision Heuristics, Restart Strategies, Inprocessing
- Extensions to SAT: Incremental SAT Solving under Assumptions, Diverse Solutions Generation, High-level (group-oriented) MUC Extraction
137
Slide138
DiversekSet: Generating Diverse Solutions
DiversekSet in SAT: given a CNF formula, generate a user-given number of diverse solutions (Eureka; Nadel, 2011)
138
Slide139
Application: Semi-formal FPV
[Figure: bounded formal search from the initial states up to the max FV bound; new initial states are repeatedly picked to reach deep bugs]
Slide140
Multi-Threaded Search to Enhance Coverage
Choosing a single path through the waypoints might miss the bug; one must search along multiple, diverse calculated paths
Slide141
Diversification Quality as the Average Hamming Distance
Quality: the average Hamming distance between the solutions, normalized to [0…1]
Solutions (over variables a, b, c):
1: 0 0 0
2: 1 1 0
3: 0 1 1
4: 1 0 0
Hamming distances matrix (entry (i, j) is the distance between solutions i and j):
     1  2  3
2    2
3    2  2
4    1  1  3
The pairwise distances sum to 11 over 6 pairs and 3 variables, so the quality is 11/18 ≈ 0.61
Slide149
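The quality measure can be sketched in a few lines of Python (the function name is assumed):

```python
from itertools import combinations

def diversification_quality(solutions):
    """Average pairwise Hamming distance between the solutions,
    normalized by the number of variables so the result lies in [0, 1]."""
    n_vars = len(solutions[0])
    pairs = list(combinations(solutions, 2))
    total = sum(sum(x != y for x, y in zip(s, t)) for s, t in pairs)
    return total / (len(pairs) * n_vars)

# The four solutions over (a, b, c) from the slide:
sols = [(0, 0, 0), (1, 1, 0), (0, 1, 1), (1, 0, 0)]
print(round(diversification_quality(sols), 3))  # 11 / 18 -> 0.611
```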
Algorithms for DiversekSet in SAT in a Glance
The idea: adapt a modern CDCL SAT solver for DiversekSet; make minimal changes to remain efficient
Compact algorithms:
- Invoke the SAT solver once to generate all the solutions
- Restart after a solution is generated
- Modify the polarity and variable selection heuristics for generating diverse solutions
Slide150
Algorithms for DiversekSet in SAT in a Glance, Cont.
Polarity-based algorithms change solely the polarity selection heuristic:
- pRand: pick the polarity randomly
- pGuide: pick the polarity so as to improve the diversification quality; balance the number of 0’s and 1’s assigned to each variable by picking the value the variable has been assigned fewer times so far
pGuide outperforms pRand in terms of both diversification quality and performance
Quality can be improved further by taking BCP into account and adapting the variable ordering
Slide151
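A hypothetical sketch of the pGuide polarity balancing described above (the class name, the tie-breaking rule, and the integration point into the solver's decision loop are all assumptions of this sketch):

```python
from collections import Counter

class PGuide:
    """Track how often each variable was assigned 1 across the solutions
    generated so far; at decision time, pick the polarity seen fewer
    times, keeping the 0/1 counts balanced. Ties break to 0 here."""
    def __init__(self):
        self.ones = Counter()    # times each variable was assigned 1
        self.total = Counter()   # times each variable was assigned at all

    def record_solution(self, model):
        """model maps variable -> 0/1 for one generated solution."""
        for var, val in model.items():
            self.ones[var] += val
            self.total[var] += 1

    def pick_polarity(self, var):
        zeros = self.total[var] - self.ones[var]
        return 1 if self.ones[var] < zeros else 0

g = PGuide()
g.record_solution({'a': 0, 'b': 0, 'c': 0})
g.record_solution({'a': 0, 'b': 1, 'c': 1})
print(g.pick_polarity('a'))  # 1: 'a' was assigned 0 twice, balance with a 1
print(g.pick_polarity('b'))  # 0: tie (one 0, one 1), tie-break to 0
```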
Agenda
- Introduction
- Early Days of SAT Solving
- Core SAT Solving: Conflict Analysis and Learning, Boolean Constraint Propagation, Decision Heuristics, Restart Strategies, Inprocessing
- Extensions to SAT: Incremental SAT Solving under Assumptions, Diverse Solutions Generation, High-level (group-oriented) MUC Extraction
151
Slide152
Unsatisfiable Core Extraction
- An unsatisfiable core is an unsatisfiable subset of an unsatisfiable set of constraints
- An unsatisfiable core is minimal if the removal of any single constraint makes it satisfiable (a local minimum)
- UC extraction has numerous applications
Slide153
Example Application: Proof-based Abstraction Refinement for Model Checking; McMillan et al., ’03; Gupta et al., ’03
Inputs: model M, property P. Output: does P hold under M?
- Start from an empty abstract model A = { } and model check A; if P is valid on A, report No Bug
- Otherwise A yields a counterexample C at depth k; if C is not spurious on M, report Bug
- If C is spurious, run BMC(M, P, k) and refine: A := the latches/gates in the UNSAT core of BMC(M, P, k), turning the other latches/gates into free inputs
- The UNSAT core is used for refinement and is required in terms of latches/gates
Slide154
Example Application 2: Assumption Minimization for Compositional Formal Equivalence Checking (FEC); Cohen et al., ’10
- FEC verifies the equivalence between the design (RTL) and its implementation (schematics)
- The whole design is too large to be verified at once, so FEC is done on small sub-blocks, restricted with assumptions
- Assumptions required for the proof of equivalence of sub-blocks must be proved relative to the driving logic
- MUC extraction in terms of assumptions is vital for feasibility
[Figure: a sub-block with inputs and outputs; an assumption at its inputs becomes an assertion on the driving logic]
Slide155
Traditionally, a Clause-Level UC Extractor is the Workhorse
Clause-level UC extraction: given a CNF formula, extract an unsatisfiable subset of its clauses
F = (a + b) (b’ + c) (c’) (a’ + c) (b + c) (a + b + c’)
U1, U2, and U3 denote three different unsatisfiable cores of F, each a subset of its clauses; for example, (b’ + c) (c’) (b + c) is such a core
Rich literature on clause-level UC extraction since 2002
Slide156
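A minimal deletion-based extractor, one standard clause-level approach, can be sketched and run on F; the brute-force UNSAT check stands in for a real solver, and literals are DIMACS-style integers (both assumptions of this sketch):

```python
from itertools import product

def brute_unsat(clauses, n_vars):
    """Exhaustive UNSAT check, a stand-in for a real SAT solver."""
    return not any(
        all(any(asg[abs(l) - 1] == (l > 0) for l in cl) for cl in clauses)
        for asg in product([False, True], repeat=n_vars))

def deletion_based_muc(clauses, n_vars):
    """Drop each clause in turn; if the rest stays UNSAT, the clause is
    redundant and stays out, otherwise it is necessary. The result is a
    minimal core (a local minimum), not necessarily the smallest one."""
    core = list(clauses)
    i = 0
    while i < len(core):
        trial = core[:i] + core[i + 1:]
        if brute_unsat(trial, n_vars):
            core = trial   # clause was redundant
        else:
            i += 1         # clause is necessary for unsatisfiability
    return core

# F from the slide, with a=1, b=2, c=3:
F = [[1, 2], [-2, 3], [-3], [-1, 3], [2, 3], [1, 2, -3]]
print(deletion_based_muc(F, 3))  # [[-2, 3], [-3], [2, 3]]: (b'+c)(c')(b+c)
```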
Traditional UC Extraction for Practical Needs: the Input
The input comprises interesting constraints and the remainder (the rest of the formula); the user is interested in a MUC in terms of the interesting constraints
Slide157
Traditional UC Extraction: Example Input 1
Proof-based abstraction refinement: each interesting constraint is an unrolled latch; the remainder is the rest of the unrolled circuit
Slide158
Traditional UC Extraction: Example Input 2
Assumption minimization for FEV: each interesting constraint is an assumption; the remainder is the equivalence between the sub-block RTL and its implementation
Slide159
Traditional UC Extraction, Stage 1: Translate to Clauses
Both the interesting constraints and the remainder are translated to CNF; each small square in the figure is a propositional clause, e.g. (a + b’)
Slide160
Traditional UC Extraction, Stage 2: Extract a Clause-Level UC
A clause-level UC is extracted from the CNF; the colored squares belong to the clause-level UC
Slide161
Traditional UC Extraction, Stage 3: Map the Clause-Level UC Back to the Interesting Constraints
An interesting constraint belongs to the core if some of its clauses appear in the clause-level UC; in the example, the UC contains three interesting constraints
Slide162
High-Level Unsatisfiable Core Extraction
- Real-world applications require reducing the number of interesting constraints in the core rather than the number of clauses: latches for abstraction refinement, assumptions for compositional FEV
- Most of the algorithms for UC extraction are clause-level
- High-level UC extraction: extracting a UC in terms of interesting constraints only (Liffiton & Sakallah, 2008; Nadel, 2010; Ryvchin & Strichman, 2011; Nadel & Ryvchin & Strichman, 2014)
Slide163
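A hypothetical sketch of the group-oriented variant: minimization runs over interesting constraints, each a whole group of clauses, while the remainder always stays in the formula. The brute-force check, the instance, and the function names are assumptions of this sketch:

```python
from itertools import product

def brute_unsat(clauses, n_vars):
    """Exhaustive UNSAT check, a stand-in for a real SAT solver."""
    return not any(
        all(any(asg[abs(l) - 1] == (l > 0) for l in cl) for cl in clauses)
        for asg in product([False, True], repeat=n_vars))

def high_level_muc(groups, remainder, n_vars):
    """Deletion over groups: drop a whole interesting constraint (a group
    of clauses) at a time, never an individual clause; the remainder
    clauses are always kept."""
    core = list(groups)
    i = 0
    while i < len(core):
        trial = core[:i] + core[i + 1:]
        flat = remainder + [cl for g in trial for cl in g]
        if brute_unsat(flat, n_vars):
            core = trial   # the whole constraint is redundant
        else:
            i += 1         # the constraint is needed for unsatisfiability
    return core

# Hypothetical instance (a=1, b=2, c=3): remainder (a + b) and three
# interesting constraints (a'), (b'), (c):
groups = [[[-1]], [[-2]], [[3]]]
print(high_level_muc(groups, [[1, 2]], 3))  # [[[-1]], [[-2]]]: (c) drops out
```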
Small/Minimal Clause-Level UC vs. Small/Minimal High-Level UC
- A clause-level UC can be small while the high-level UC is the largest possible
- A clause-level UC can be large while the high-level UC is empty
Slide164
High-Level Unsatisfiable Core Extraction: Main Result
High-level MUC extractors solve benchmark families that are out of reach for clause-level MUC extractorsSlide165
Covered:
- Introduction
- Early Days of SAT Solving
- Core SAT Solving: Conflict Analysis and Learning, Boolean Constraint Propagation, Decision Heuristics, Restart Strategies, Inprocessing
- Extensions to SAT: Incremental SAT Solving under Assumptions, Diverse Solutions Generation, High-level (group-oriented) MUC Extraction
165
Slide166
166
Thanks!