Software components as monadic, weighted Mealy machines in typed linear algebra

J.N. Oliveira (joint work with L.S. Barbosa and D. Murta)

DCC/FCUP, Porto, 19th Sep., 2013

Grant PTDC/EIA-CCO/122240/2010

INESC TEC & University of Minho
Motivation MMM Components Faulty components LAoP Wrapping up References
Motivation
Safety and certification
Interested in the opportunities open for Formal Methods by RTCA DO 178C for certifying airborne software.
Challenged by
(...) the use of formal methods to be "at least as good as" a conventional approach that does not use formal methods. (Joyce, 2011)
[ ... ”at least as good as” ? ... ]
Qualitative vs quantitative
Quoting Jackson (2009):
A dependable system is one (..) in which you can place your reliance or trust. A rational person or organization only does this with evidence that the system's benefits outweigh its risks.
In formula
dependable system = benefit + risk
one finds:
• benefit = qualitative
• risk = quantitative.
P(robabilistic) R(isk) A(nalysis)
NASA/SP-2011-3421 (Stamatelatos and Dezfuli, 2011):
1.2.2 A PRA characterizes risk in terms of three basic questions: (1) What can go wrong? (2) How likely is it? and (3) What are the consequences?
The PRA process
answers these questions by systematically (...) identifying, modeling, and quantifying scenarios that can lead to undesired consequences
Moreover,
1.2.3 (...) The total probability from the set of scenarios modeled may also be non-negligible even though the probability of each scenario is small.
Doesn’t work in FMs — why?
Program semantics are usually qualitative — how does one quantify risk in standard denotational semantics?
PRA performed a posteriori — Hmmm... we've seen this mistake before, eg. in program correctness.
Need for a change:
Programming should incorporate risk as the rule rather than the exception (absence of risk = ideal case).
Need for combinators expressing risk of failure, eg. probabilistic choice (McIver and Morgan, 2005)

bad p⋄ good

between expected behaviour and misbehaviour.
In this talk
Interested in reasoning about the risk of faults propagating in component-based software (CBS) systems.
Traditional CBS risk analysis relies on semantically weak CBS models — eg. over component call-graphs as in (Cortellessa and Grassi, 2007).
Our starting point is the coalgebraic semantics of Barbosa (2001) for (safe) CBS systems, under the motto:
“Components as coalgebras”
Main ideas
Component = (Monadic) Mealy machine (MMM), that is, an F-branching transition structure of type:
S × I → F(S × O)
where F is a monad.
Component-oriented design = Algebra of MMM combinators
Semantics = Coalgebraic, calculational
To this framework we want to add calculation of
Risk = Probability of faulty (catastrophic) behaviour
Mealy machines in various guises
F -branching transition structure:
S × I → F(S × O)
Coalgebra:
S → (F (S × O))^I
State-monadic:
I → (F (S × O))^S
All versions useful in component algebra.
Example — stack
A stack at functional level
push = flip (:)
pop = tail
top = head
empty = (0=) · length
is a collection of partial functions on the free monoid of finite sequences.
Each such partial function gives rise to a method, ie. an (elementary) Mealy machine.
The stack component will arise as the sum of its methods.
Individual Mealy (Maybe) machines
Example of a method
push′ :: ([b], b) → ([b], 1)
push′ = uncurry push △ !

which resorts

(a) to the uncurry operator, uncurry f (a, b) = f a b,

(b) to the pairing ("split") operator, (f △ g) x = (f x, g x),

(c) and to the uniquely defined (total) function ! :: b → 1 ('bang').
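These three ingredients can be sketched in Haskell as follows (using `/\` as ASCII for the pairing operator △ and `bang` for `!` — both naming assumptions, since the slide symbols are not valid Haskell):

```haskell
-- Pairing ("split") operator: (f /\ g) x = (f x, g x)
(/\) :: (a -> b) -> (a -> c) -> a -> (b, c)
(f /\ g) x = (f x, g x)

-- The unique total function into the terminal object 1 (here ()):
bang :: a -> ()
bang _ = ()

push :: [b] -> b -> [b]
push = flip (:)

-- push as an elementary Mealy machine of type ([b], b) -> ([b], 1):
push' :: ([b], b) -> ([b], ())
push' = uncurry push /\ bang
```

For instance, push' ("ab", 'z') yields ("zab", ()): the new state paired with the trivial output.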
Partiality — rule rather than exception
Partiality, however, requires 'Maybe' (M) Mealy machines, one per totalized (partial) function, eg.:

pop′ :: ([a], 1) → M ([a], a)
pop′ = (pop △ top) ⇐ (¬ · empty) · π1

where · ⇐ · totalizes a partial function by fusion with a precondition,

(⇐) :: (a → b) → (a → B) → a → M b
f ⇐ p = p → (η · f), ⊥

where unit η (of M) means success and 'zero' element ⊥ means failure.
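A minimal Haskell rendering of the totalizer (written `totalize` here instead of the infix ⇐, purely a naming assumption):

```haskell
-- Totalize a partial function f with precondition p:
-- eta (Just) signals success, "zero" (Nothing) signals failure.
totalize :: (a -> b) -> (a -> Bool) -> a -> Maybe b
totalize f p x = if p x then Just (f x) else Nothing

-- pop as a Maybe-Mealy machine: defined only on non-empty stacks.
pop' :: ([a], ()) -> Maybe ([a], a)
pop' = (\(s, _) -> (tail s, head s)) `totalize` (not . null . fst)
```

Popping a non-empty stack succeeds (pop' ("ab", ()) is Just ("b", 'a')); popping an empty one yields Nothing.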
Standard stack methods
empty′ :: ([a], 1) → M ([a], B)
empty′ = η · (id △ empty) · π1

top′ :: ([a], 1) → M ([a], a)
top′ = (id △ top) ⇐ (¬ · empty) · π1

push′ :: ([b], b) → M ([b], 1)
push′ = η · (uncurry push △ !)

pop′ :: ([a], 1) → M ([a], a)
pop′ = (pop △ top) ⇐ (¬ · empty) · π1
Component = ∑ methods
The stack component
stack :: ([p], (1 + 1) + p) → M ([p], (p + p) + 1)
stack = pop′ ⊕ top′ ⊕ push′
is built thanks to the MMM sum combinator
(⊕) :: Functor F ⇒
  ((s, a) → F (s, b)) →      -- input machines
  ((s, i) → F (s, c)) →
  (s, a + i) → F (s, b + c)  -- output machine

-- definition
m1 ⊕ m2 = (F dr◦) · ∆ · (m1 + m2) · dr
where (next slide)
Component = ∑ methods
• m1 + m2 is categorial sum (coproduct);
• isomorphism
dr :: (a, b + c) → (a, b) + (a, c)
dr (a, i1 b) = i1 (a, b)
dr (a, i2 c) = i2 (a, c)

(resp. dr◦) distributes (resp. factorizes) the shared state across the sum of inputs (resp. outputs);
• “Cozip” operator
∆ :: Functor F ⇒ F a + F b → F (a + b)
∆ = [F i1, F i2]
promotes coproducts through F.
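The ⊕ combinator can be transcribed to Haskell using `Either` for sums (the names `mmSum`, `drConv` and `cozip` are assumptions):

```haskell
-- dr distributes the shared state across a sum of inputs...
dr :: (s, Either a b) -> Either (s, a) (s, b)
dr (s, Left a)  = Left (s, a)
dr (s, Right b) = Right (s, b)

-- ...and its converse dr° factorizes it back.
drConv :: Either (s, a) (s, b) -> (s, Either a b)
drConv (Left (s, a))  = (s, Left a)
drConv (Right (s, b)) = (s, Right b)

-- "Cozip": promote a coproduct through functor f.
cozip :: Functor f => Either (f a) (f b) -> f (Either a b)
cozip = either (fmap Left) (fmap Right)

-- Sum of two monadic Mealy machines sharing state s.
mmSum :: Functor f
      => ((s, a) -> f (s, b)) -> ((s, i) -> f (s, c))
      -> (s, Either a i) -> f (s, Either b c)
mmSum m1 m2 = fmap drConv . cozip . either (Left . m1) (Right . m2) . dr
```

Summing a push method and a pop method yields a machine that dispatches on the input tag and shares the stack state.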
Combining components
Queue = two stacks:
out stack (left) interacting with in stack (right)
More MMM combinators
For the two stacks to interact we need to be able to compose two MMM,

(;) :: (Strong F, Monad F) ⇒
  ((s, i) → F (s, j)) →          -- input machines
  ((r, j) → F (r, k)) →
  ((s, r), i) → F ((s, r), k)    -- output machine

-- definition
m1 ; m2 = ((F a◦) · τl · (id × m2) · xl) • (τr · (m1 × id) · xr)

which is the F-Kleisli composition (•) of suitably wrapped, "paired" m1 and m2.
(Explanatory diagrams in the following slide.)
More MMM combinators
Kleisli composition for monad F
A --g--> F B --F f--> F (F C) --µ--> F C

f • g = µ · F f · g

which is the composition in the corresponding Kleisli category, with η as identity:

f • (g • h) = (f • g) • h
f • η = f = η • f

Conceptually, it is as if one (typewise) drops the F's from f and g above.
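In Haskell, Kleisli composition is `(<=<)` from `Control.Monad`; a hand-rolled version for concreteness:

```haskell
import Control.Monad ((<=<))

-- f • g = mu . fmap f . g, equivalently via bind:
kcomp :: Monad m => (b -> m c) -> (a -> m b) -> a -> m c
kcomp f g a = g a >>= f

-- A partial "halving" arrow in the Maybe monad:
half :: Int -> Maybe Int
half n = if even n then Just (n `div` 2) else Nothing
```

Composing `half` with itself halves twice when both steps succeed and propagates failure otherwise; `return` (η) is the identity of this composition.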
More MMM combinators
Wrapping of m1:

((s, r), i) --xr--> ((s, i), r) --m1×id--> (F (s, j), r) --τr--> F ((s, j), r)

Wrapping of m2:

((s, j), r) --xl--> (s, (r, j)) --id×m2--> (s, F (r, k)) --τl--> F (s, (r, k)) --F a◦--> F ((s, r), k)

Mind the strength operators:

τr :: (F a, b) → F (a, b)
τl :: (b, F a) → F (b, a)
More MMM combinators
We will also need interface-wrapping
(·){· → ·} :: Functor F ⇒
  ((a, e) → F (a, c)) →  -- input machine
  (i → e) →              -- input wrapper
  (c → d) →              -- output wrapper
  (a, i) → F (a, d)      -- output machine

-- definition
m{f → g} = F (id × g) · m · (id × f)
for I / O “wiring” (analogy with hardware).
Queue component

We build a queue out of two stacks

queue :: (([q], [q]), q + 1) → M (([q], [q]), 1 + q)
queue = enq ⊕ deq
by providing two methods: enqueueing
enq = nop ; stack{pushIn→pushOut}
which does nothing (nop) on the output stack and pushes onto the input stack (wrapping selects the push method), and dequeueing,
deq = (stack{popIn→popOut} ; nop) • check
which pops from the output stack, preceded by flushing the input stack should the former be empty:
check = (empty′ ; nop) → (rev (flush′ ; write)) , nop
Wiring
Helper functions
popIn  = first
topIn  = second
pushIn = third
are convenient renamings of generic wiring functions
first  = i1 · i1
second = i1 · i2
third  = i2
which route data into place through the parameter sums.
Enriched stacks

NB: our stack component had meanwhile to be enriched with two more methods,

stackpp :: ([p], (((1 + 1) + p) + 1) + 1) → M ([p], (((p + p) + 1) + B) + [p])
stackpp = stack ⊕ empty′ ⊕ flush′
where empty ′ checks for stack emptiness and
flush′ :: Monad F ⇒ ([b], 1) → F ([b], [b])
flush′ = mkMM (nil △ reverse)
flushes a stack contents to its output (reversed), where
mkMM :: Monad F ⇒ (a → c → b) → (c, a) → F b
mkMM f = η · uncurry f · swap
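mkMM transcribes directly to Haskell (using `uncurry` to apply f to the swapped pair):

```haskell
-- Build an elementary monadic Mealy machine from a curried function.
mkMM :: Monad m => (a -> c -> b) -> (c, a) -> m b
mkMM f = return . uncurry f . swap
  where swap (c, a) = (a, c)

-- flush': empty the stack, emitting its reversed contents.
flush' :: Monad m => ([b], ()) -> m ([b], [b])
flush' = mkMM (\() s -> ([], reverse s))
```

Instantiated at the Maybe monad, flush' ("abc", ()) yields Just ("", "cba").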
Conditionals
Also mind the need for a MMM-level McCarthy-styled conditionalcombinator,
(· → ·, ·) :: (Monad F, Functor F) ⇒
  ((a, i) → F (a, B)) →  -- condition
  ((a, 1) → F (a, o)) →  -- 'then' branch
  ((a, 1) → F (a, o)) →  -- 'else' branch
  (a, i) → F (a, o)      -- output

-- definition
p → m1 , m2 = [m1, m2] • (F dr · p{id → outB})

where outB witnesses the isomorphism B ≅ 1 + 1.
Changing effect of composition
Finally, term rev (flush′ ; write) involves
rev :: Functor F ⇒
  (((b, a), i) → F ((b, a), o)) →  -- original MM
  ((a, b), i) → F ((a, b), o)      -- changed MM

-- definition
rev m = F (swap × id) · m · (swap × id)
where
swap (b, a) = (a, b)
which changes which component is affected first in a composition.
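A Haskell sketch of rev (the helper `first`, applying a function to the left component of a pair, is an assumption):

```haskell
swap :: (b, a) -> (a, b)
swap (b, a) = (a, b)

first :: (a -> c) -> (a, b) -> (c, b)
first f (a, b) = (f a, b)

-- Swap the two state components before and after running the machine.
rev :: Functor f
    => (((b, a), i) -> f ((b, a), o))
    -> (((a, b), i) -> f ((a, b), o))
rev m = fmap (first swap) . m . first swap
```

Given a machine that pushes onto its left state component, rev turns it into one that pushes onto the right component instead.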
Simulation (Haskell)
Enqueueing:
> coalg queue ("ab","cd") (i1 ’z’)
Just (("ab","zcd"),Left ())
Dequeueing:
> coalg queue ("ab","cd") (i2 ())
Just (("b","cd"),Right ’a’)
> coalg queue ("","cd") (i2 ())
Just (("c",""),Right ’d’)
NB: coalg m converts MMM m into the corresponding coalgebra (currying).
Faulty components
Risk of pop′ behaving like top′ with probability 1 − p,

pop′′ :: P → ([a], 1) → D (M ([a], a))
pop′′ p = pop′ p⋄ top′

and risk of push′ not pushing anything, with probability 1 − q,

push′′ :: P → ([a], a) → D (M ([a], 1))
push′′ q = push′ q⋄ skip′

where P = [0, 1], D is the (finite) distribution monad and

(· ⋄ ·) :: P → (t → a) → (t → a) → t → D a
(f p⋄ g) x = choose p (f x) (g x)

chooses between f and g.
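A sketch with a tiny finite-distribution type (the talk uses the PFP library; `Dist`, `choose` and `pchoice` are names assumed here):

```haskell
-- A finite distribution: list of (value, probability) pairs.
newtype Dist a = Dist { runDist :: [(a, Double)] } deriving (Show, Eq)

choose :: Double -> a -> a -> Dist a
choose p x y = Dist [(x, p), (y, 1 - p)]

-- Probabilistic choice between two functions of the same type:
pchoice :: Double -> (t -> a) -> (t -> a) -> t -> Dist a
pchoice p f g x = choose p (f x) (g x)
```

For instance, pchoice 0.75 (+ 1) (* 2) 3 is the distribution assigning 0.75 to the intended result 4 and 0.25 to the faulty result 6.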
Simulation
Example (no faults) — popping from one stack and pushing onto another,
m1 = pop′ ; push′
should produce the intended behaviour, eg.
> coalg m1 ([1],[2]) ()
Just (([],[1,2]),())
> coalg m1 ([],[2]) ()
Nothing
Example (faults) — now suppose the stacks are faulty,
m2 = pop′′ 0.95 ;D push′′ 0.8
over the same (global) state ([1],[2]).
Simulation
Running the same simulation, now for machine m2,
> coalg m2 ([1],[2]) ()
Just (([],[1,2]),()) 76.0%
Just (([],[2]),()) 19.0%
Just (([1],[1,2]),()) 4.0%
Just (([1],[2]),()) 1.0%
the risk of faulty behaviour is 24% (1 − 0.76), structured as: (a) 1% — both components misbehave; (b) 19% — the right stack misbehaves (push fails); (c) 4% — the left stack misbehaves (pop acts as top).
As expected,
> coalg m2 ([],[2]) ()
Nothing 100.0%
is catastrophic (popping from an empty stack).
Faulty components
Simulation:
Using the PFP library written by Erwig and Kollmansberger (2006).
Important:
Our MMMs have become probabilistic, leading to coalgebras of general shape
S → (D (F (S × O)))^I
Challenge:
Need for a probabilistic extension of the MMM combinators of Barbosa (2001), for instance (next slide):
Combining faulty components
Sequencing:
(;D) ::
  ((u, i) → D (M (u, k))) →        -- input probabilistic MMM
  ((v, k) → D (M (v, o))) →        -- input probabilistic MMM
  ((u, v), i) → D (M ((u, v), o))  -- output probabilistic MMM

m1 ;D m2 = ((D (M a◦)) · τDl · (id × m2) · xl) •D (τDr · (m1 × id) · xr)
where
τDl :: Strong F ⇒ (b, D (F a)) → D (F (b, a))
τDr :: Strong F ⇒ (D (F a), b) → D (F (a, b))
are “D”-extended strengths and... (next slide)
Combining faulty components
Combinator (•D) implements M-Kleisli composition "wrapped by" D(istributions),

(•D) :: (c → D (M b)) → (a → D (M c)) → a → D (M b)
g •D f = (D µ) · ((MF g) • f)

where

MF :: (Functor F, Monad F) ⇒ (b → F a) → M b → F (M a)

applies g "inside" M and pulls the F effect outside.
Back to research question
Recall coalgebraic type
S → (D (F (S × O)))^I
How tractable (mathematically) is this doubly-monadicframework? Can F be any monad?
Relatives of this have been studied elsewhere, eg.
- reactive probabilistic automata — S → (M (D S))^I
- generative probabilistic automata — S → M (D (O × S))
- bundle systems — S → D (P (O × S))
Cf. (Sokolova, 2011)
Back to research question
Related (and inspirational) work:
Bonchi et al. (2012) study coalgebras of functor
S → K × (K^S_ω)^I

for K a field, in their coalgebraic approach to weighted automata.
These coalgebras rely on the so-called field valuation (exponential) functor K^(−)_ω, calling for vector spaces.
Inspired by this approach, a similar framework was studied directly in suitable categories of matrices (Oliveira, 2012).
We will do the same in this talk concerning probabilistic MMMs.
Our strategy
Instead of working in Set,
Instead of working in Set, with arrows

g : A → D (F B),   f : B → D (F C)

we seek to work directly in the Kleisli category of D, where the same arrows read

g : A → F B,   f : B → F C

and compose as f • g, thus "abstracting from" monad D. But — how tractable is such a category?
Our strategy

It turns out to be the (monoidal) category of left (column) stochastic matrices,

A →Set D B  ≅  A →CS B

cf. the adjunction under which each CS-matrix M corresponds to the probabilistic function

f = ⟦M⟧ · η  (written ⌊M⌋)

where ⟦M⟧ is the linear transformation captured by matrix M:

⟦M⟧ v b = ⟨∑ a ∈ A :: M (b, a) × (v a)⟩
Our strategy
Another way to put it is
M = ⌈f⌉ ⇔ ⟨∀ b, a :: M (b, a) = (f a) b⟩

where ⌈f⌉ is the isomorphism converting probabilistic function f into the corresponding CS-matrix.
For instance, the probabilistic negation function

f = id 0.1⋄ (¬)

corresponds to matrix

              True  False
⌈f⌉ = True  ( 0.1   0.9 )
      False ( 0.9   0.1 )
Our strategy
Probabilistic choice is immediate on the matrix side,
⌈f p⋄ g⌉ = p ⌈f⌉ + (1 − p) ⌈g⌉

where (+) denotes addition of matrices of the same type.
Typed linear algebra: homset A →CS B contains all matrices indexed by input (column) type A and by output (row) type B; this typing ensures that matrix composition (multiplication),

(M · N) (r, c) = ⟨∑ x :: M (r, x) × N (x, c)⟩

is well-defined.
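Typed composition as matrix multiplication, in a small Haskell sketch (matrices as lists of rows; `mcomp` and `pnot` are assumed names):

```haskell
import Data.List (transpose)

type Matrix = [[Double]]  -- list of rows

-- (M . N)(r, c) = sum over x of M(r, x) * N(x, c)
mcomp :: Matrix -> Matrix -> Matrix
mcomp m n = [ [ sum (zipWith (*) row col) | col <- transpose n ] | row <- m ]

-- The CS matrix of the probabilistic negation f = id 0.1<> not:
pnot :: Matrix
pnot = [ [0.1, 0.9]
       , [0.9, 0.1] ]
```

Composing pnot with the identity matrix on either side leaves it unchanged, mirroring f • η = f = η • f.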
Matrix transform
As we wanted, D-Kleisli composition becomes matrix multiplication,

⌈f • g⌉ = ⌈f⌉ · ⌈g⌉

and the unit of D (the Dirac function) becomes the identity matrix, ⌈η⌉ = id.
This is a special case of the embedding:
Any normal ("sharp") function f in Set is representable by CS matrix ⌈η · f⌉, as expected.
For instance, the embedding of Set isomorphisms leads to permutation matrices, that is, CS matrices with exactly one 1 per column and per row.
Column stochasticity
Any CS (column) vector v : 1 → A represents a distribution.
The CS (row) vector ! : A → 1 — wholly filled with 1s — is known as the "bang vector" — a constant function.
Clearly, ! and id coincide on (scalar) type 1 → 1.
Bang is very useful, cf:
Definition: A matrix M is CS iff

! · M = !

□
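The condition ! · M = ! amounts to every column summing to 1, which is easy to check (the name `isCS` and the floating-point tolerance are assumptions of this sketch):

```haskell
import Data.List (transpose)

-- A matrix is column stochastic iff bang . M = bang,
-- i.e. every column sums to 1 (up to floating-point tolerance).
isCS :: [[Double]] -> Bool
isCS m = all (\s -> abs (s - 1) < 1e-9) (map sum (transpose m))
```

The probabilistic-negation matrix passes the check; a matrix with a column summing to 1.1 does not.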
Column stochasticity
Not every matrix combinator preserves column stochasticity; composition (of course) does,

! · (⌈f⌉ · ⌈g⌉)
=   { ! · ⌈f⌉ = ! }
! · ⌈g⌉
=   { ! · ⌈g⌉ = ! }
!

and thus CS is a subcategory of Mat_ℝ, but eg. sum does not: ! · (⌈f⌉ + ⌈g⌉) > !.
Weighted sum (choice) does preserve column stochasticity, ! · (⌈f⌉ p⋄ ⌈g⌉) = !, as do several other useful combinators (see below).
Column stochasticity
CS has coproducts,
(A + B) →CS C  ≅  (A →CS C) × (B →CS C)

(where A + B is disjoint union) as does Mat_ℝ, offering universal property

X = [M|N] ⇔ X · i1 = M ∧ X · i2 = N

where [i1|i2] = id.
[M|N] is one of the basic matrix block combinators — it puts M and N side by side ("junc").
Its dual puts P on top of Q ("split").
Back to the main problem
Rewritten in CS, composition of probabilistic MMM reduces to thenon-probabilistic case,
⌈m1 ;D m2⌉ = ⌈m1⌉ ; ⌈m2⌉

provided one unfolds the definition of ⌈m1⌉ ; ⌈m2⌉ in CS rather than in Set.
Our strategy:
Instead of staying in the original category and adapting the definition to the probabilistic case,
we
keep the original definition and change category (CS replaces Set).
Back to the main problem
The advantage is that all probabilistic accounting is smoothly carried out by composition in CS, and we don't need to be concerned with it.
However, there is a price to pay, since the definition of ⌈m1⌉ ; ⌈m2⌉ is strongly monadic:
Which strong monads in Set are still strong monads in CS?
Recall the types of the two strengths:

τl : (B × F A) → F (B × A)
τr : (F A × B) → F (A × B)
Probabilistic pairing
Pairing the outputs of probabilistic f : C → A and g : C → B is captured by their Khatri-Rao matrix product f △ g : C → A × B (dropping the ⌈·⌉'s for notational economy), which commutes with the projections fst : A × B → A and snd : A × B → B.
However — this is a weak categorial product:

k = f △ g ⇒ fst · k = f ∧ snd · k = g    (1)

cf. the ⇒ in (1). Unlike pairing in Set, Khatri-Rao is injective but not surjective.
Probabilistic pairing
Weak product (1) still grants the cancellation rule,
fst · (f △ g) = f ∧ snd · (f △ g) = g    (2)

Example, for f : 4 → 2 and g : 4 → 3:

fst : 2 × 3 → 2 = [ 1 1 1 0 0 0
                    0 0 0 1 1 1 ]

snd : 2 × 3 → 3 = [ 1 0 0 1 0 0
                    0 1 0 0 1 0
                    0 0 1 0 0 1 ]

f = [ 0.5 0.3 0 0.75
      0.5 0.7 1 0.25 ]

g = [ 0.3 0.4 0.1 0
      0.7 0.2 0.2 1
      0   0.4 0.7 0 ]

f △ g : 4 → 2 × 3 = [ 0.15 0.12 0   0
                      0.35 0.06 0   0.75
                      0    0.12 0   0
                      0.15 0.28 0.1 0
                      0.35 0.14 0.2 0.25
                      0    0.28 0.7 0 ]
Probabilistic pairing
... but fusion becomes side-conditioned
(f △ g) · h = (f · h) △ (g · h) ⇐ h is "sharp" (100%)    (3)

and reconstruction doesn't hold in general:

k ≠ (fst · k) △ (snd · k)

cf. eg. k : 2 → 2 × 3,

k = [ 0   0.4
      0.2 0
      0.2 0.1
      0.6 0.4
      0   0
      0   0.1 ]

(fst · k) △ (snd · k) = [ 0.24 0.4
                          0.08 0
                          0.08 0.1
                          0.36 0.4
                          0.12 0
                          0.12 0.1 ]

(k is not recoverable from its projections — Khatri-Rao is not surjective).
Probabilistic pairing
In general, Khatri-Rao gives rise to the well-known Kronecker (tensor) product

M ⊗ N = (M · fst) △ (N · snd)

(which is a bifunctor) and absorption holds:

(M ⊗ N) · (P △ Q) = (M · P) △ (N · Q)

Therefore:

M △ N = (M ⊗ N) · δ,   where δ = id △ id

However, δ is not a "uniform copying operation" in the sense of Coecke (2011), for it lacks naturality (next slide).
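Kronecker and Khatri-Rao in Haskell (matrices as lists of rows; `kron`, `khatriRao`, `mcomp` and `identity` are assumed names), verifying M △ N = (M ⊗ N) · δ on a small example:

```haskell
import Data.List (transpose)

type Matrix = [[Double]]

-- Kronecker (tensor) product M (x) N.
kron :: Matrix -> Matrix -> Matrix
kron m n = [ [ x * y | x <- mr, y <- nr ] | mr <- m, nr <- n ]

-- Khatri-Rao product M /\ N: column-wise Kronecker.
khatriRao :: Matrix -> Matrix -> Matrix
khatriRao m n =
  transpose [ [ x * y | x <- mc, y <- nc ]
            | (mc, nc) <- zip (transpose m) (transpose n) ]

-- Plain matrix multiplication and the identity matrix, to build delta.
mcomp :: Matrix -> Matrix -> Matrix
mcomp m n = [ [ sum (zipWith (*) r c) | c <- transpose n ] | r <- m ]

identity :: Int -> Matrix
identity k = [ [ if i == j then 1 else 0 | j <- [1 .. k] ] | i <- [1 .. k] ]
```

With δ = identity △ identity (the copying map), khatriRao f g coincides with mcomp (kron f g) δ, as the factorization above predicts.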
Probabilistic pairing
[Slide shows worked plots:] for a probabilistic f, evaluating δ · f and then (f ⊗ f) · δ, where δ : B → B × B is the copying map, yields different distributions — so δ is not natural.
Back to probabilistic MMMs
To extend the bicategorial construction of (Barbosa, 2001) to the probabilistic case we need to prove that h ; k = h ⊗ k (Kronecker product) is a (sharp) morphism between compound Mealy machines m ; n and m′ ; n′,

h ; k : (m ; n) → (m′ ; n′)    (4)

that is, the square

(m′ ; n′) · ((h ⊗ k) × id) = F ((h ⊗ k) × id) · (m ; n)

commutes, wherever h : m → m′ and k : n → n′.
MMM morphisms in CS
The proof of (4) will be (in CS) exactly the same as that given by Barbosa (2001) (in Set) for "sharp" Mealy machines.
However, such a proof requires:
• τl and τr natural
• µ natural
Which F ensure these properties? Probably many but not all — we are studying this at the moment.
Below we check the 'maybe' functor M A = 1 + A in this respect.
This functor is relevant because it captures the abrupt-termination (catastrophic) scenario so important in PRA.
To be or not to be (natural)
In general
τl : B × F A→ F (B × A)
Maybe instance:
τl : B × (1 + A) → 1 + B × A
τl = (! ⊕ id) · dr

Naturality of τl is easy to derive from the naturality of dr and that of ! ⊕ id.
The former is an isomorphism, therefore natural in the whole Mat_ℝ. Below we check the naturality of ! ⊕ id : A + B → 1 + B:
To be or not to be (natural)
Checking:
(id ⊕ N) · (! ⊕ id) = (! ⊕ id) · (M ⊕ N)
⇔   { · ⊕ · is a bifunctor }
! ⊕ N = (! · M) ⊕ N
⇔   { assumption: M is CS }
! ⊕ N = ! ⊕ N

Thus, in Mat_ℝ the property is constrained by one matrix being CS (represented by f below):

(id ⊕ N) · (! ⊕ id) = (! ⊕ id) · (f ⊕ N)    (5)

In CS, (5) always holds.
To be or not to be (natural)
Impact on the naturality of τl (f is CS):

τl · (f ⊗ (id ⊕ M))
=   { definition of τl }
(! ⊕ id) · dr · (f ⊗ (id ⊕ M))
=   { isomorphism dr is natural }
(! ⊕ id) · ((f ⊗ id) ⊕ (f ⊗ M)) · dr
=   { naturality of ! ⊕ id (5) }
(id ⊕ (f ⊗ M)) · (! ⊕ id) · dr
=   { definition of τl }
(id ⊕ (f ⊗ M)) · τl
To be or not to be (natural)
Naturality of
µ : 1 + (1 + A) → 1 + A
µ = [i1|id]

is granted since any category of matrices has coproducts, upon which µ is defined. This is a special case of

µ : F_B (F_B A) → F_B A
µ = ⦇[id | in · i2]⦈

where in : A + B (F_B A) ≅ F_B A is the free monad on polynomial B. This works because catamorphisms (vulg. folds) have solutions in CS for such functors.
Wrapping up
Weak tupling has opened new perspectives, namely in relation to Rel and to categorial quantum physics, under the umbrella of monoidal categories.
In fact, these also include FdHilb, the category of finite dimensional Hilbert spaces. Thus the remarks by Coecke and Paquette, in their Categories for the Practising Physicist (Coecke, 2011):
Rel [the category of relations] possesses more 'quantum features' than the category Set of sets and functions [...] The categories FdHilb and Rel moreover admit a categorical matrix calculus.
I agree: Set is too perfect to “belong to reality”...
Wrapping up
Back to main issue:
How does one compare CBS architectures with respect to fault propagation?
Relationally, the worst (= most unsafe, most risky) system of its type corresponds to the topmost relation (⊤) — anything can happen (chaos).
For a given type in CS, this corresponds to the probabilistic function which yields uniform distributions (of outputs) for every input (cf. 'completely mixed states').
This tallies with the partial order on classical and quantum states discussed in the homonymous chapter of (Coecke, 2011), whose maximal elements are the sharp functions.
Wrapping up
Summary:
• Risky software systems captured by probabilistic, monadic Mealy machines
• Kleisli categories reduce monadic complexity
• LAoP — exploiting the Kleisli category of the (countable-support) distribution monad.
Wrapping up
Current work:
• Predict which machines are less likely to embark on catastrophic behaviour
• Final (behavioural) semantics of pMMM calls for infinite-support distributions
• Measure theory — Kerstan and König (2012) provide an excellent starting point
• Are there “better” (less risky) software architectures?
Wrapping up
All in all...
On the right-hand side: what I think one should read before attempting to do PRA (probabilistic risk analysis) of software systems.
Enjoy your reading!
L.S. Barbosa. Components as Coalgebras. University of Minho, December 2001. Ph.D. thesis.
F. Bonchi, M. Bonsangue, M. Boreale, J. Rutten, and A. Silva. A coalgebraic perspective on linear weighted automata. Inf. & Comp., 211:77–105, 2012.
B. Coecke, editor. New Structures for Physics. Number 831 in Lecture Notes in Physics. Springer, 2011. doi: 10.1007/978-3-642-12821-9.
V. Cortellessa and V. Grassi. A modeling approach to analyze the impact of error propagation on reliability of component-based systems. In Component-Based Software Engineering, volume 4608 of LNCS, pages 140–156. 2007.
M. Erwig and S. Kollmansberger. Functional pearls: Probabilistic functional programming in Haskell. J. Funct. Program., 16:21–34, January 2006.
D. Jackson. A direct path to dependable software. Commun. ACM, 52(4):78–88, 2009.
J. Joyce. Proposed Formal Methods Supplement for RTCA DO 178C, 2011. High Confidence Software and Systems, 11th Annual Conference, 3–6 May 2011, Annapolis.
H. Kerstan and B. König. Coalgebraic trace semantics for probabilistic transition systems based on measure theory. In M. Koutny and I. Ulidowski, editors, CONCUR 2012 — Concurrency Theory, Lecture Notes in Computer Science, Heidelberg, 2012. Springer.
A. McIver and C. Morgan. Abstraction, Refinement and Proof for Probabilistic Systems. Monographs in Computer Science. Springer-Verlag, 2005. ISBN 0387401156.
J.N. Oliveira. Typed linear algebra for weighted (probabilistic) automata. In CIAA'12, volume 7381 of LNCS, pages 52–65, 2012.
A. Sokolova. Probabilistic systems coalgebraically: A survey. Theor. Comput. Sci., 412(38):5095–5110, 2011.
M. Stamatelatos and H. Dezfuli. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners. NASA/SP-2011-3421, 2nd edition, December 2011.