Computationally Viable Handling of Beliefs in Arguments for Persuasion

Emmanuel Hadoux and Anthony Hunter
University College London
EPSRC grant: Framework for Computational Persuasion

November 6, 2016
Introduction
Persuasion problems

• One agent (the proponent) tries to persuade the other (the opponent)
• e.g., a doctor persuading a patient to quit smoking, a salesman, a politician, ...
• The agents exchange arguments during a persuasion dialogue
• These arguments are connected by an attack relation
Abstract argumentation framework
Figure 1: Argument graph with 3 arguments

Based on Dung's abstract argumentation framework [1].

Example (Figure 1):
A1 = "It will rain, take an umbrella"
A2 = "The sun will shine, no need for an umbrella"
A3 = "Weather forecasts say it will rain"
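An abstract argumentation framework of this kind can be sketched in a few lines. The attack relation below (A2 attacks A1, A3 attacks A2) is a hypothetical reading of Figure 1, whose edges are not reproduced here, and the grounded extension is just one of Dung's acceptability semantics.

```python
def grounded_extension(arguments, attacks):
    """Compute the grounded extension of an abstract argument graph.
    `attacks` is a set of (attacker, target) pairs."""
    attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}
    in_set, out_set = set(), set()
    changed = True
    while changed:
        changed = False
        for a in arguments:
            # An argument is accepted once all of its attackers are defeated.
            if a not in in_set and attackers[a] <= out_set:
                in_set.add(a)
                changed = True
        for a in arguments:
            # An argument is defeated once some accepted argument attacks it.
            if a not in out_set and attackers[a] & in_set:
                out_set.add(a)
                changed = True
    return in_set

# Hypothetical attack structure for Figure 1: A3 attacks A2, A2 attacks A1.
args = {"A1", "A2", "A3"}
atts = {("A2", "A1"), ("A3", "A2")}
print(grounded_extension(args, atts))  # A3 defeats A2, which reinstates A1
```

With these assumed attacks, A3 defeats A2, so A1 is reinstated and {A1, A3} is accepted.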
Purpose of the work

The objective for the proponent:

1. Have an argument or a set of arguments holding at the end of the dialogue
2. Have these arguments believed by the opponent

We need to maintain and update a belief distribution in order to posit the right argument.
Belief distribution

Epistemic approach to probabilistic argumentation (e.g., [2]).

Definition. Let G = ⟨A, R⟩ be an argument graph. Each X ⊆ A is called a model. A belief distribution P over 2^A is such that
  ∑_{X ⊆ A} P(X) = 1 and P(X) ∈ [0, 1] for all X ⊆ A.
The belief in an argument A is
  P(A) = ∑_{X ⊆ A s.t. A ∈ X} P(X).
If P(A) > 0.5, argument A is accepted.

Example (of a belief distribution). Let A = {A, B} with P({A, B}) = 1/6, P({A}) = 2/3, P({B}) = 1/6 (and P(∅) = 0).
Then P(A) = 1/6 + 2/3 = 5/6 > 0.5 and P(B) = 1/6 + 1/6 = 2/6 < 0.5.
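The definition above translates directly into code. This is a minimal sketch, with the example distribution taken from the slide; exact fractions are used to avoid floating-point noise.

```python
from fractions import Fraction as F

def belief(dist, arg):
    """Belief in `arg`: the sum of the probabilities of all models
    containing it. `dist` maps models (frozensets of arguments) to
    their probability."""
    return sum(p for model, p in dist.items() if arg in model)

# The distribution from the slide's example over A = {A, B}.
dist = {frozenset({"A", "B"}): F(1, 6),
        frozenset({"A"}): F(2, 3),
        frozenset({"B"}): F(1, 6)}

assert sum(dist.values()) == 1   # it is a belief distribution
print(belief(dist, "A"))         # 5/6 -> accepted (> 0.5)
print(belief(dist, "B"))         # 1/3 -> not accepted
```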
Refinement of a belief distribution

Each time a new argument is added to the dialogue, the distribution needs to be updated.

Figure 2: Argument graph with arguments A and B

AB    P     H^1_A(P)   H^0.75_A(P)
11    0.6   0.7        0.675
10    0.2   0.3        0.275
01    0.1   0.0        0.025
00    0.1   0.0        0.025

Table 1: Examples of belief redistribution

We can modulate the update to take into account different types of users (skeptical, credulous, etc.)
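One reading of the update that reproduces the numbers in Table 1 (an assumption, since the slide does not spell out the rule) is: when argument A is posited with strength k, a fraction k of the mass of every model not containing A is transferred to the same model with A added.

```python
from fractions import Fraction as F

def refine(dist, arg, k):
    """Hypothetical redistribution H^k_arg: for every model that does not
    contain `arg`, move a fraction k of its mass to the same model with
    `arg` added. This reproduces Table 1, but the exact rule is an
    assumed reading of the slide, not a quoted definition."""
    new = dict(dist)
    for model, p in dist.items():
        if arg not in model:
            moved = k * p
            new[model] -= moved
            target = model | {arg}
            new[target] = new.get(target, 0) + moved
    return new

# The distribution P from Table 1 (models written as AB bit-strings there).
P = {frozenset("AB"): F(6, 10), frozenset("A"): F(2, 10),
     frozenset("B"): F(1, 10), frozenset(): F(1, 10)}

H1 = refine(P, "A", F(1))        # column H^1_A(P)
H075 = refine(P, "A", F(3, 4))   # column H^0.75_A(P)
print(H1[frozenset("AB")])       # 7/10
print(H075[frozenset("AB")])     # 27/40, i.e., 0.675
```

Setting k = 1 models a fully credulous user; smaller k values model more skeptical ones, matching the modulation mentioned above.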
Splitting the distribution

• 30 arguments → 2^30 = 1,073,741,824 models → 8.6 GB if each value is stored as a double
• Fortunately, the arguments are not all directly linked to each other
• We can group related arguments into flocks, which are themselves linked to each other
• We create a split distribution from the metagraph, as opposed to the joint distribution
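The storage figures quoted above follow from a quick arithmetic check: a joint distribution over n arguments stores one double (8 bytes) per model.

```python
# A joint belief distribution over n arguments has 2**n models,
# each stored as an 8-byte double.
def joint_bytes(n_args):
    return 2 ** n_args * 8

print(joint_bytes(30))        # 8,589,934,592 bytes, i.e., ~8.6 GB
print(joint_bytes(10))        # 8,192 bytes, i.e., the 8 kB quoted later
```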
Metagraphs

Figure 3: Argument graph (a) over arguments A1–A10, and a possible metagraph (b) with flocks {A1}, {A2, A3}, {A4}, {A5}, {A6}, {A7, A8, A9, A10}
Creating a split distribution

Figure 4: Metagraph (as in Figure 3b)

We define three assumptions for the split to be clean:

1. Arguments from flocks that are not directly connected are conditionally independent
2. Arguments in a flock are considered connected
3. Arguments in a flock are conditionally dependent

We do not use Bayesian networks because the values are not probabilities, users are not rational, etc.
Creating a split distribution

• We can define an optimal, irreducible split w.r.t. the graph
• However, an irreducible split may not be computable
• Only the irreducible split is unique; we therefore need to rank the others
Ranking the splits

Definition (valuation of a split):
  x = ∑_{A ∈ A} ∑_{P_i ∈ S s.t. A ∈ E(P_i)} |P_i|
and P ≻ P′ iff x < x′.

Example (of valuation and ranking). Let P be the joint distribution for Figure 3a. Value of P: 10 × 2^10 = 10,240.

P1 = (P(A5), P(A6 | A5), P(A4 | A5, A6), P(A2, A3 | A4, A7), P(A1 | A2, A3), P(A7, A8, A9, A10)):
  2^1 + 2^2 + 2^3 + 2 × 2^4 + 2^3 + 4 × 2^4 = 118

P2 = (P(A1, A2, A3, A4, A5, A6 | A7), P(A7, A8, A9, A10)):
  6 × 2^7 + 4 × 2^4 = 832

We then see that P1 ≻ P2 ≻ P.
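The valuation can be sketched by abstracting each factor to a pair (number of arguments whose belief it carries, size of its full scope including conditioning arguments): each such argument is charged the factor's table size 2^scope, and smaller totals are better. The tuples below encode the joint P, the split P1, and the split P2 from the examples above.

```python
def valuation(factors):
    """Valuation of a split: each factor is (n_owned, n_scope), where
    n_owned arguments have their belief carried by a table of
    2**n_scope values. Lower valuations rank higher (P > P' iff x < x')."""
    return sum(n_owned * 2 ** n_scope for n_owned, n_scope in factors)

P  = [(10, 10)]                                    # joint distribution
P1 = [(1, 1), (1, 2), (1, 3), (2, 4), (1, 3), (4, 4)]
P2 = [(6, 7), (4, 4)]
print(valuation(P), valuation(P1), valuation(P2))  # 10240 118 832
```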
Experiments
Splitting the distribution

Figure 5: Metagraph (as in Figure 3b)

• Original graph: 10 arguments → 1,024 values → 8 kB
• Metagraph: 10 arguments in 6 flocks → 54 values → 432 B
• The time taken to update is also reduced.
• An argument can be updated by updating only its flock.
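The 54-value figure in the second bullet can be checked against the metagraph's factorisation: each factor stores 2^(number of arguments in its scope, conditioning included) values. The scope sizes below follow the split P1 shown on the ranking slide.

```python
# Scope sizes of the factors in the split P1:
# P(A5), P(A6|A5), P(A4|A5,A6), P(A2,A3|A4,A7), P(A1|A2,A3), P(A7..A10)
scopes = [1, 2, 3, 4, 3, 4]
values = sum(2 ** s for s in scopes)
print(values, values * 8)   # 54 values -> 432 bytes as doubles
```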
Experiments with flocks of different sizes

# flocks    # links    1 update    50 updates
2 flocks    10 links   2 ms        107 ms
            30 links   6 ms        236 ms
4 flocks    10 links   1 ms        45 ms
            30 links   3 ms        114 ms
10 flocks   10 links   0.03 ms     1.6 ms
            30 links   0.06 ms     2.5 ms

Table 2: Computation time for updates in different graphs of 50 arguments (in ms)
Experiments with different numbers of arguments

# args   Time for 20 updates   Comparative %
25       497 ns                +0%
50       517 ns                +4%
75       519 ns                +4%
100      533 ns                +7%

Table 3: Computation time for 20 updates (in ns)
Experiments
A new version of the library is currently being developed in C++ and is available at: https://github.com/ComputationalPersuasion/splittercell.

As a rule of thumb, flocks should be kept to fewer than 25 arguments each.
Conclusion

We have presented:

1. A framework to represent the opponent's beliefs in the arguments
2. How to create a split distribution using a metagraph
3. How to rank the splits in order to choose the most appropriate one w.r.t. the problem
4. Experiments showing the viability of the approach

Next step: adapt this work to the whole project in order to scale.
Thank you!
[1] Phan Minh Dung. On the acceptability of arguments and its fundamental role in nonmonotonic reasoning, logic programming, and n-person games. Artificial Intelligence, 77:321–357, 1995.

[2] Anthony Hunter. A probabilistic approach to modelling uncertain logical arguments. International Journal of Approximate Reasoning, 54(1):47–81, 2013.