Advanced Topics of AI – Constraint Satisfaction Problem
Gerald Steinbauer
Institute for Software Technology
Map Coloring
– Map Coloring Problem: assign a color to each region such that no two neighboring regions share the same color
– Representation as CSP:
  • V = {WA, NT, SA, Q, NSW, V, T}; dom(vi) = {red, green, blue}
  • C = {WA ≠ NT, WA ≠ SA, NT ≠ SA, NT ≠ Q, SA ≠ Q, SA ≠ NSW, SA ≠ V, Q ≠ NSW, NSW ≠ V}
[ArtInt2002]
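This CSP is small enough to solve by brute force. The sketch below (illustrative, not part of the original lecture; variable and color names chosen to mirror the slide) enumerates all 3⁷ = 2187 complete assignments and returns the first consistent one:

```python
from itertools import product

variables = ["WA", "NT", "SA", "Q", "NSW", "V", "T"]
colors = ["red", "green", "blue"]
constraints = [("WA", "NT"), ("WA", "SA"), ("NT", "SA"), ("NT", "Q"),
               ("SA", "Q"), ("SA", "NSW"), ("SA", "V"),
               ("Q", "NSW"), ("NSW", "V")]

def consistent(assignment):
    # an assignment is consistent iff no constrained pair shares a color
    return all(assignment[a] != assignment[b] for a, b in constraints)

def solve():
    # enumerate every complete assignment; return the first consistent one
    for values in product(colors, repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if consistent(assignment):
            return assignment
    return None
```

Brute-force enumeration is exponential in the number of variables; the backtracking and propagation techniques introduced later in these slides avoid exploring most of this space.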
Map Coloring
– example solution:
  • WA = red, NT = green, SA = blue, Q = red, NSW = green, V = red, T = red
[ArtInt2002]
N-Queens Problem
• position N queens on an N×N board so that no queen attacks any of the other queens
  (an example solution for the 8-queens problem)
Motivation
• many important and interesting problems can be efficiently and elegantly expressed by
  • a set of variables with domains
  • a set of constraints on these variables – constraining the domains of the variables
• the constraint satisfaction problem (CSP) deals with this representation
• many real-world application domains
  • configuration
  • task and resource allocation
  • scheduling
  • planning
  • generation of test cases
Relations
• definition: a relation over sets X1, …, Xn is a subset
  R ⊆ X1 × ⋯ × Xn =: Π(1≤j≤n) Xj
• the number n is referred to as the arity of R
• an n-ary relation on a set X is a subset
  R ⊆ Xⁿ := X × ⋯ × X (n times)
• since relations are sets, set-theoretical operations
(union, intersection, complement) can be applied to
relations as well
Constraints, Relations and Variables
• constraints can be expressed by relations that restrict
value assignments to variables
• consider variables x1, x2, x3 and relations B, C defined by:
  • B = {(x, y, z) ∈ {0, …, 3}³ | x < y < z}
  • C = {(x, y, z) ∈ {0, …, 3}³ | x > y > z}
Relations over Variables
• let 𝑉 be a set of variables. for 𝑣 ∈ 𝑉 , let 𝑑𝑜𝑚(𝑣) be a
non-empty set (of values), called the domain of 𝑣
• definition: a relation over (pairwise distinct) variables v1, …, vn ∈ V is a pair
  R(v1,…,vn) := ((v1, …, vn), R)
  where R is a relation over dom(v1), …, dom(vn)
• the sequence (v1, …, vn) is referred to as the scheme (or: range), the set {v1, …, vn} as the scope, and R as the graph of R(v1,…,vn)
• we will not always distinguish between a relation over variables and its graph (and between scope and scheme), e.g., we write R(v1,…,vn) ⊆ dom(v1) × ⋯ × dom(vn)
Constraint Satisfaction Problem (CSP) I
• a Constraint Satisfaction Problem (or CSP) is defined by
  • a set of variables V = {v1, v2, …, vm}, and
  • a set of constraints C = {c1, c2, …, cn}
• each variable vi has a nonempty domain dom(vi) of possible values
• each constraint cj involves some subset of the variables and specifies the allowable combinations of values for that subset
Constraint Satisfaction Problem (CSP) II
• a state of the problem is defined by an assignment of values to some or all of the variables
• an assignment that does not violate any constraints is called a consistent or legal assignment
• a complete assignment is one in which every variable is mentioned
• a solution to a CSP is a complete assignment that satisfies all the constraints
• let d be the maximum domain size → there are O(d^m) complete assignments
Map Coloring Interactive
Boolean Satisfiability
• problem instance (Boolean constraint network):
  • variables: (propositional) variables
  • domains: truth values {0, 1} for each variable
  • constraints: defined by a propositional formula over these variables
• example: (x1 ∨ ¬x2 ∨ ¬x3) ∧ (x1 ∨ x2 ∨ x4)
• SAT as a constraint satisfaction problem:
  • given an arbitrary Boolean constraint network, is the network solvable?
• SAT is NP-complete!
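The example formula above can be read directly as a Boolean constraint network and solved by enumerating all 2⁴ assignments. A minimal sketch (illustrative only; truth values encoded as 0/1):

```python
from itertools import product

def formula(x1, x2, x3, x4):
    # the example CNF: (x1 ∨ ¬x2 ∨ ¬x3) ∧ (x1 ∨ x2 ∨ x4)
    return (x1 or not x2 or not x3) and (x1 or x2 or x4)

# solving the Boolean constraint network = finding satisfying assignments
models = [bits for bits in product((0, 1), repeat=4) if formula(*bits)]
```

Enumeration is exponential in the number of variables, which is consistent with SAT being NP-complete: no known algorithm avoids exponential worst-case behavior in general.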
3-SAT
• 3-SAT: the problem of deciding whether a propositional formula in conjunctive normal form (CNF) with at most 3 literals per clause is satisfiable. 3-SAT is NP-complete
• 2-SAT: for this special case (only 2 literals per clause) polynomial algorithms exist
Computational Complexity
• in general the Constraint Satisfaction Problem (CSP) with finite domains is NP-complete
• proof sketch: membership in NP holds because a complete assignment can be verified in polynomial time; hardness holds because SAT is a special case of CSP
• special cases: some cases, such as CSPs with binary variables and binary constraints, can be solved faster
Constraint Network
• a constraint network is a triple N := (V, dom, C) where:
  • V is a non-empty and finite set of variables
  • dom is a function that assigns to each variable v ∈ V a non-empty set dom(v) (dom(v) is called the domain of v; elements of dom(v) are called values)
  • C is a set of relations over variables of V (called constraints), i.e., each constraint is a relation R(x1,…,xm) over some scheme S = (x1, …, xm) of variables in V
Constraint Network
• if we assume an ordering of the variables in V, we can write networks more compactly:
• definition: a constraint network is a triple N := (V, D, C) where:
  • V = (v1, …, vn) is a non-empty and finite sequence of variables
  • D = (D1, …, Dn) is a sequence of domains for V (Di is the domain of variable vi)
  • C is a set of constraints R_x where x = (v_i1, …, v_im) is a scheme of variables in V and R ⊆ D_i1 × ⋯ × D_im
Example – 4-Queens Problem
Solution of a Constraint Network
• definition: a solution of a constraint network N = (V, D, C) is a (variable) assignment
  a: V → ⋃(i: vi ∈ V) Di
• such that
  • a(vi) ∈ Di, for each vi ∈ V
  • (a(x1), …, a(xm)) ∈ R for each constraint R(x1,…,xm) in C
• N is called solvable (or: satisfiable) if N has a solution
• Sol(N) denotes the set of all solutions of N
Types of Constraints
• Unary Constraints – restrict the values of a single variable
  – e.g., v1 ≠ 5
• Binary Constraints – relate two variables
  – e.g., v1 < v2
• Higher-order constraints – involve three or more variables
  – e.g., (v1 = 3 ∨ v2 = 5) ∧ v3 > 4
• Extensional vs. intensional – e.g., with dom(v1) = dom(v2) = {1, 2}
  – intensional: v1 = v2
  – extensional: {(1, 1), (2, 2)}
Global Constraints
• what are global constraints?
  • a family of similar constraint relations…
  • …differing in the number of variables
  • semantically redundant: the same constraint can be expressed by a conjunction of simpler constraints
  • similar structure: can be exploited by constraint solvers
• examples
  • sum constraint
  • knapsack constraint
  • element constraint
  • all-different constraint
  • cardinality constraints
all-different Constraint
• let v1, …, vn be variables, each with a domain Di (1 ≤ i ≤ n):
  all-different(v1, …, vn) := {(d1, …, dn) ∈ D1 × ⋯ × Dn | di ≠ dj for i ≠ j}
• the all-different constraint is a simple, but widely used
global constraint in constraint programming
• it allows for compact modeling of CSPs
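For small domains, the definition above can be read off directly: the extensional graph of all-different is the set of tuples with pairwise distinct entries. A minimal Python sketch (illustrative only):

```python
from itertools import product

def all_different_graph(domains):
    # extensional graph of all-different over the given domains:
    # every tuple whose entries are pairwise distinct
    return {t for t in product(*domains) if len(set(t)) == len(t)}
```

For example, `all_different_graph([{1, 2}, {1, 2}])` yields `{(1, 2), (2, 1)}` — the compact intensional constraint expands to an exponentially large extensional graph, which is why solvers handle global constraints with specialized propagation instead.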
Sum Constraint
• let v1, …, vn, z be variables with subsets of ℚ as domains. for each vi, let ci ∈ ℚ be some fixed scalar, c = (c1, …, cn).
• definition: the sum constraint is defined as:
  sum(v1, …, vn, z, c) := {(d1, …, dn, d) ∈ Π(1≤i≤n) Di × Dz | d = Σ(1≤i≤n) ci·di}
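The definition above can be unfolded extensionally for finite domains. A minimal sketch (illustrative; function name is my own):

```python
from itertools import product

def sum_graph(domains, Dz, c):
    # extensional graph of sum(v1, ..., vn, z, c):
    # all tuples (d1, ..., dn, d) with d = Σ ci * di
    return {(*d, z) for d in product(*domains) for z in Dz
            if z == sum(ci * di for ci, di in zip(c, d))}
```

For instance, with domains [{1, 2}, {1}], Dz = {3, 4, 5} and coefficients c = (1, 2), only the tuples (1, 1, 3) and (2, 1, 4) satisfy d = c1·d1 + c2·d2.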
Global Cardinality Problem
• 𝑣1, … , 𝑣𝑛: assignment variables with 𝐷𝑣𝑖 ⊆ 𝑑1∗,… ,𝑑𝑚
∗
• 𝑐1, … , 𝑐𝑚: count variables with the set of integer as
domains
• the global cardinality constraint is defined as:
𝑔𝑐𝑐 𝑣1, … , 𝑣𝑛, 𝑐1, … , 𝑐𝑚 ≔(𝑑1, … , 𝑑𝑛, 𝑜1, … , 𝑜𝑚) ∈ 𝐷𝑣𝑖1≤𝑖≤𝑛 × 𝐷𝑐𝑗1≤𝑗≤𝑚 |
for each 𝑗, 𝑑𝑗∗ occurs 𝑖𝑛 𝑑1, … , 𝑑𝑛 exactly 𝑜𝑗 times
• can be considered a generalization of the all-different
constraint
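Checking whether a given tuple satisfies the gcc condition is a simple counting exercise. A minimal sketch (illustrative; names are my own):

```python
from collections import Counter

def gcc_satisfied(values, occurrences):
    # values: the assignment (d1, ..., dn)
    # occurrences: dict mapping each dj* to its required count oj
    # holds iff each dj* occurs in the assignment exactly oj times
    counts = Counter(values)
    return all(counts[d] == o for d, o in occurrences.items())
```

With every required count fixed to 1, this collapses exactly to the all-different check, illustrating why gcc generalizes all-different.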
Circuit Constraint
• let s = (s1, …, sn) be a permutation of {1, …, n}
• define C_s as the smallest set that contains 1 and with each element i also si
• (s1, …, sn) is called cyclic if C_s = {1, …, n}
• definition: let v1, …, vn be variables with domains Di = {1, …, n} (1 ≤ i ≤ n):
  circuit(v1, …, vn) := {(d1, …, dn) ∈ D1 × ⋯ × Dn | (d1, …, dn) is cyclic}
• given an assignment a = (d1, …, dn), define A := {(vi, v_di) | di ∈ Di, 1 ≤ i ≤ n}
• then a satisfies circuit(v1, …, vn) iff (V, A) is a directed cycle (without proper sub-cycles)
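The cyclicity test above amounts to following successors from node 1 and checking that all n nodes are reached. A minimal sketch (illustrative only; assumes s is a 1-based successor permutation):

```python
def is_cyclic(s):
    # s = (s1, ..., sn): successor of node i is s[i-1] (1-based values)
    # follow successors from node 1; cyclic iff all n nodes are visited
    n = len(s)
    seen, current = {1}, 1
    while s[current - 1] not in seen:
        current = s[current - 1]
        seen.add(current)
    return len(seen) == n
```

For example, (2, 3, 1) describes the single cycle 1 → 2 → 3 → 1 and is cyclic, while (2, 1, 3) splits into the sub-cycles 1 → 2 → 1 and 3 → 3 and is not.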
Solving CSP
• basic concept
  • solving using search
• goal
  • find an assignment for the variables that is consistent and complete
• basic approach
  • select a variable and assign a corresponding value
  • test consistency between instantiated variables and constraints
  • backtrack in the case of an inconsistency
• search with backtracking is a complete search algorithm
  • every existing solution can be found
State Spaces
• a state space is a 4-tuple 𝒮 = (S, s0, S*, O) where
  • S is a finite set of states
  • s0 ∈ S is the initial state
  • S* ⊆ S is the set of goal states
  • O is a finite set of operators, where each operator o ∈ O is a partial function on S, i.e., o: S′ → S for some S′ ⊆ S
• we say that an operator o is applicable in state s iff o(s) is defined
State Spaces for Constraint Networks
• states represent different partial variable assignments
• usually operators represent adding an assignment
• the state spaces for constraint networks usually have two special properties
  • the search graphs are trees (i.e., there is exactly one path from the initial state to any reachable search state)
  • all solutions are at the same level of the tree
• due to these properties, variations of depth-first search are usually the method of choice for solving constraint networks
Backtracking
• backtracking: search systematically for consistent partial instantiations in a depth-first manner
  • forward phase: extend the current partial solution by assigning a consistent value to some new variable (if possible)
  • backward phase: if no consistent instantiation for the current variable exists, return to the previous variable
Backtracking
backtracking(N, a):
  input: a constraint network N = (V, D, C) and a partial assignment a of N
  output: a solution of N or "inconsistent"

  if a is inconsistent with N then
    return "inconsistent"
  if a is defined for all variables in V then
    return a
  select some variable vi for which a is not defined
  for each value x from Di
    a′ := a ∪ {vi ↦ x}
    a′′ ← backtracking(N, a′)
    if a′′ is not "inconsistent" then
      return a′′
  return "inconsistent"
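The pseudocode above translates almost line by line into Python. A minimal sketch (illustrative; the constraint representation — a dict mapping a scope tuple to the set of allowed value tuples — is my own choice):

```python
def consistent(a, constraints):
    # a partial assignment is consistent iff every constraint whose scope
    # is fully instantiated maps to an allowed value tuple
    for scope, allowed in constraints.items():
        if all(v in a for v in scope):
            if tuple(a[v] for v in scope) not in allowed:
                return False
    return True

def backtracking(variables, domains, constraints, a=None):
    a = a or {}
    if not consistent(a, constraints):
        return "inconsistent"
    if len(a) == len(variables):
        return a
    v = next(x for x in variables if x not in a)  # select an unassigned variable
    for value in domains[v]:
        result = backtracking(variables, domains, constraints, {**a, v: value})
        if result != "inconsistent":
            return result
    return "inconsistent"
```

For example, for two variables v1, v2 with domains {1, 2, 3} and the constraint v1 < v2, the first solution found is v1 = 1, v2 = 2.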
Complexity - Backtracking
• backtracking performs uninformed search
• let d be the maximum number of values in dom(vi), i = 1, …, n
• in the worst case we have to check O(dⁿ) instantiations
• unfortunately: let e be the number of constraints and r the maximum arity of the constraints
• a consistency check (lookup in one related constraint) takes O(r log d)
• we may have to check up to e constraints, leading to O(e·r·log d)
• we have to check up to d values for a selection: O(d·e·r·log d)
Backtracking - Example
Improving Backtracking
• dead ends increase runtime
• we would like to improve backtracking in order to avoid as many dead ends as possible
• minimize the search space
Ordered Search Spaces
• let N = (V, D, C) be a constraint network
• definition (variable ordering)
  • a variable ordering of N is a permutation of the variable set V
  • we write variable orderings in sequence notation: (v1, …, vn)
• definition (ordered search space)
  • let σ = (v1, …, vn) be a variable ordering of N
  • the ordered search space of N along ordering σ is the state space obtained from the unordered search space of N by restricting each operator o(vi=ai) to states s with |s| = i − 1
  • in other words, in the initial state only v1 can be assigned, then only v2, then only v3, …
The Importance of Good Orderings
• all ordered search spaces for the same constraint
network contain the same set of solution states
• however, the total number of states can vary
dramatically between different orderings
• the size of a state space is a (rough) measure for the
hardness of finding a solution, so we are interested in
small search spaces
• one way of measuring the quality of a state space is
by counting the number of dead ends: the fewer, the
better
Variable Ordering Example
• three variables x, y, z with domains Dx = {2, 3, 4}, Dy = {2, 5, 6}, Dz = {2, 3, 5} and the constraint that the value assigned to z must evenly divide the values assigned to x and y
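The full solution set of this small example can be found by enumeration (an illustrative sketch, not part of the original slides):

```python
from itertools import product

Dx, Dy, Dz = {2, 3, 4}, {2, 5, 6}, {2, 3, 5}

# complete assignments where z evenly divides both x and y
solutions = {(x, y, z) for x, y, z in product(Dx, Dy, Dz)
             if x % z == 0 and y % z == 0}
```

There are exactly five solutions; ordering z first (its value constrains both x and y) prunes far more of the search tree than ordering x and y first, which is the point of the example.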
Dynamic Variable Ordering
• common heuristic: fail-first
• always select a variable whose remaining domain has a minimal number of elements
• intuition: few subtrees → small search space
• extreme case: only one value left → no search (compare unit propagation in the DPLL procedure)
• should be combined with a constraint propagation technique such as forward checking or arc consistency
Look-Ahead and Look-Back
• look-ahead: invoked when the next variable or next value is selected. for example:
  • which variable should be instantiated next? prefer variables that impose tighter constraints on the rest of the search space
  • which value should be chosen for the next variable? maximize the number of options for future assignments
• look-back: invoked when the backtracking step is performed after reaching a dead end. for example:
  • how deep should we backtrack? avoid irrelevant backtrack points (by analyzing reasons for the dead end and jumping back to the source of failure)
  • how can we learn from dead ends? record reasons for dead ends as new constraints so that the same inconsistencies can be avoided at later stages of the search
Backtracking with Look-Ahead
look-ahead(N, a):
  input: a constraint network N = (V, D, C) and a partial assignment a of N
  output: a solution of N or "inconsistent"
  selectValue(vi, a, N): procedure that selects and deletes a consistent value x ∈ Di; side effect: N is refined; returns 0 if all extensions a ∪ {vi ↦ x} are inconsistent

  if a is inconsistent with N then
    return "inconsistent"
  if a is defined for all variables in V then
    return a
  select a variable vi for which a is not defined
  N′ ← N, D′i = Di
  while D′i is not empty
    (x, N′) ← selectValue(vi, a, N′)
    if x ≠ 0
      a′ ← look-ahead(N′, a ∪ {vi ↦ x})
      if a′ is not "inconsistent" then
        return a′
  return "inconsistent"
Forward Checking
selectValue-FC(vi, a, N):
  select and delete x from Di
  for each vj sharing a constraint with vi for which a is not defined
    D′j ← Dj
    for each value y ∈ D′j
      if not consistent(a ∪ {vi ↦ x, vj ↦ y})
        remove y from D′j
    if D′j is empty then
      return 0
    else
      Dj ← D′j
  return x
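The pruning step at the heart of forward checking can be sketched on its own: after tentatively assigning v := x, remove every neighbor value with no support, and report a dead end if a domain empties. A minimal sketch (illustrative; the `allowed` representation — a dict from a directed variable pair to its permitted value pairs — is my own choice):

```python
def forward_check(domains, v, x, allowed):
    # prune the domains of all neighbors of v after tentatively assigning v := x
    # returns a pruned copy of the domains, or None if some domain becomes empty
    new = {u: set(d) for u, d in domains.items()}
    new[v] = {x}
    for (a, u), rel in allowed.items():
        if a != v:
            continue
        new[u] = {y for y in new[u] if (x, y) in rel}
        if not new[u]:
            return None  # dead end detected before descending further
    return new
```

For v1 < v2 with both domains {1, 2, 3}: assigning v1 := 1 shrinks D2 to {2, 3}, while v1 := 3 empties D2 and the dead end is detected immediately.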
Look-Ahead
• forward checking is O(e·d²) in the worst case, where d is the cardinality of the largest domain and e the number of constraints
• remark
  • keep the balance between pruning the search space and the cost of look-ahead
  • forward checking offers a good trade-off
Enforcing Consistency
• the more explicit and tight constraint networks are, the more restricted is the search space of partial solutions
• idea: infer new constraints without removing solutions (by methods called local consistency enforcing, bounded consistency inference, constraint propagation)
• consistency-enforcing algorithms aim at assisting search: how can we extend a given partial solution of a small subnetwork to a partial solution of a larger subnetwork?
Arc Consistency I
• let N = (V, D, C) be a constraint network
• a variable vi is arc-consistent relative to variable vj if for each value ai ∈ Di there exists an aj ∈ Dj with (ai, aj) ∈ Rij (in case Rij exists in C)
• an arc constraint Rij is arc-consistent if vi is arc-consistent relative to vj and vj is arc-consistent relative to vi
• a network N is arc-consistent if all its arc constraints are arc-consistent
Arc Consistency II
• lemma: checking whether a network 𝑁 = 𝑉,𝐷, 𝐶 is arc-
consistent requires at most 𝑒𝑑2 operations (where 𝑒 is
the number of its binary constraints and 𝑑 is an upper
bound of its domain sizes)
Arc Consistency - Example
• consider a constraint network with two variables 𝑣1 and 𝑣2, domains 𝐷1 = 𝐷2 = {1,2,3}, and the binary
constraint expressed by 𝑣1 < 𝑣2.
Arc Consistency III
• we can shrink the domains of the two variables to achieve arc-consistency
• if a value is not part of any solution of this subnetwork, it will not be part of a global solution
• if arc-consistency is not achievable (a domain becomes empty), no solution exists
Domain Revising
revise(𝑣𝑖,𝑣𝑗 , 𝑅𝑖𝑗)
input: a network with two variables 𝑣𝑖,𝑣𝑗,
domains 𝐷𝑖,𝐷𝑗 and constraint 𝑅𝑖𝑗
output: a network with refined 𝐷𝑖 such that 𝑣𝑖 is
arc-consistent relative to 𝑣𝑗
for each 𝑎𝑖 ∈ 𝐷𝑖
if there is no 𝑎𝑗 ∈ 𝐷𝑗 with (𝑎𝑖 , 𝑎𝑗) ∈ 𝑅𝑖𝑗 then
remove 𝑎𝑖 from 𝐷𝑖
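The revise procedure above keeps exactly those values of Di that have at least one support in Dj. A minimal functional sketch (illustrative; returns a new domain rather than mutating, and additionally reports whether anything was removed):

```python
def revise(Di, Dj, Rij):
    # keep only the values of Di that have at least one support in Dj under Rij
    # returns the revised domain and whether any value was removed
    revised = {ai for ai in Di if any((ai, aj) in Rij for aj in Dj)}
    return revised, revised != Di
```

For the example network with constraint v1 < v2 and D1 = D2 = {1, 2, 3}, revising D1 against D2 removes the unsupported value 3.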
Domain Revising
• lemma: the complexity of revise is 𝑂(𝑑2), where 𝑑 is
an upper bound of the domain sizes.
• note: with a simple modification of the revise
algorithm one could improve to 𝑂(𝑡), where 𝑡 is the
maximal number of tuples occurring in one of the
binary constraints in the network.
AC1
AC1(𝑁)
input: a constraint network 𝑁 = 𝑉,𝐷, 𝐶
output: 𝑁 arc-consistent, but equivalent to input
network
repeat
for each arc {𝑣𝑖 , 𝑣𝑗} with 𝑅𝑖𝑗 ∈ 𝐶
revise(𝑣𝑖 , 𝑣𝑗)
revise(𝑣𝑗 , 𝑣𝑖)
endfor
until no domain is changed
Complexity AC1
• lemma: let 𝑁 be a constraint network with 𝑛 variables,
each with a domain of size ≤ 𝑘, and 𝑒 binary
constraints. applying AC1 on the network runs in time
𝑂(𝑒 𝑛 𝑘3).
• proof: one cycle through all binary constraints takes
𝑂(𝑒 𝑘2). In the worst case, one cycle just removes
one value from one domain. moreover, there are at
most 𝑛𝑘 values. This results in an upper bound of
𝑂(𝑒 𝑛 𝑘3). ∎
• note: if the input network is already arc-consistent,
then AC1 runs in time 𝑂(𝑒 𝑘2).
AC3
• idea: no need to process all constraints if only a few domains have changed. operate on a queue of constraints to be processed.

AC3(N):
  input: a constraint network N = (V, D, C)
  output: N arc-consistent, but equivalent to the input network

  for each pair vi, vj that occurs in a constraint Rij
    queue ← queue ∪ {(vi, vj), (vj, vi)}
  endfor
  while queue is not empty
    select and remove (vi, vj) from queue
    revise(vi, vj)
    if revise changed Di then
      queue ← queue ∪ {(vk, vi) | k ≠ i, k ≠ j}
  endwhile
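The queue-driven scheme above can be sketched directly in Python (illustrative; `relations` maps each directed variable pair to its set of allowed value pairs, with both directions present):

```python
from collections import deque

def ac3(domains, relations):
    # enforces arc consistency on `domains` in place
    # returns False iff some domain becomes empty (network unsolvable)
    queue = deque(relations.keys())
    while queue:
        vi, vj = queue.popleft()
        rij = relations[(vi, vj)]
        # revise Di against Dj: keep values with at least one support
        revised = {ai for ai in domains[vi]
                   if any((ai, aj) in rij for aj in domains[vj])}
        if revised != domains[vi]:
            domains[vi] = revised
            if not revised:
                return False
            # reprocess every arc pointing at vi, except the one just used
            queue.extend((vk, vm) for (vk, vm) in relations
                         if vm == vi and vk != vj)
    return True
```

For v1 < v2 with both domains {1, 2, 3}, the call shrinks D1 to {1, 2} and D2 to {2, 3}, matching the earlier arc-consistency example.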
Complexity AC3
• lemma: let 𝑁 be a constraint network with 𝑛 variables,
each with a domain of size ≤ 𝑘, and 𝑒 binary
constraints. applying AC3 on the network runs in time
𝑂(𝑒 𝑘3).
• proof: consider a single constraint. each time it is reintroduced into the queue, the domain of one of its variables must have been changed. since there are at most 2k values, AC3 processes each constraint at most 2k times. because we have e constraints and processing each takes time O(k²), we obtain O(e k³). ∎
• note: If the input network is already arc-consistent,
then AC3 runs in 𝑂(𝑒 𝑘2).
Remark AC
• note: enforcing arc consistency may already be sufficient to show that a constraint network is inconsistent → one or more domains become empty!
• sometimes enforcing arc consistency is sufficient for detecting inconsistent (unsolvable) networks; but…
• enforcing arc consistency is not complete for deciding consistency of networks; because…
• inferences rely only on domain constraints and single binary constraints defined on the domains
Sudoku
Literature
• parts of the lecture are based on the “Constraint
Satisfaction Problems” lecture of the Research Group
Foundations of Artificial Intelligence of the University
of Freiburg (http://www.informatik.uni-freiburg.de/~ki/)
• Rina Dechter. Constraint Processing. The Morgan
Kaufmann Series in Artificial Intelligence. 2003.