
DARTEL

John Ashburner

2008


Overview

• Motivation
  – Dimensionality
  – Inverse-consistency

• Principles

• Geeky stuff

• Example

• Validation

• Future directions


Motivation

• More precise inter-subject alignment
  – Improved fMRI data analysis
    • Better group analysis
    • More accurate localization
  – Improve computational anatomy
    • More easily interpreted VBM
    • Better parameterization of brain shapes
  – Other applications
    • Tissue segmentation
    • Structure labeling


Image Registration

• Figure out how to warp one image to match another

• Normally, all subjects’ scans are matched with a common template


Current SPM approach

• Only about 1000 parameters.
  – Unable to model detailed deformations


A one-to-one mapping

• Many models simply add a smooth displacement to an identity transform
  – One-to-one mapping not enforced
• Inverses approximately obtained by subtracting the displacement
  – Not a real inverse

Small deformation approximation


Overview

• Motivation

• Principles

• Optimisation

• Group-wise Registration

• Validation

• Future directions


Principles

Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra

Deformations parameterized by a single flow field, which is considered to be constant in time.


DARTEL

• Parameterising the deformation

• φ(0)(x) = x
• φ(1)(x) = ∫₀¹ u(φ(t)(x)) dt
• u is a flow field to be estimated


Euler integration

• The differential equation is
  dφ(x)/dt = u(φ(t)(x))
• By Euler integration
  φ(t+h) = φ(t) + h u(φ(t))
• Equivalent to
  φ(t+h) = (x + h u) ∘ φ(t)
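To make the recursion concrete, here is a minimal 1-D sketch of Euler integration of the flow. This is a toy illustration, not the SPM/DARTEL code; the velocity field, grid and interpolation scheme are all assumptions made for the example.

```python
import numpy as np

def euler_integrate(u, n_steps=8):
    """Integrate dφ/dt = u(φ(t)) with φ(0)(x) = x by Euler's method.

    Toy 1-D version: u is a velocity sampled on a regular grid and the
    evaluation u(φ) is approximated by linear interpolation.
    """
    x = np.arange(len(u), dtype=float)   # identity transform φ(0)(x) = x
    phi = x.copy()
    h = 1.0 / n_steps
    for _ in range(n_steps):
        # φ(t+h) = φ(t) + h u(φ(t)), with u(φ) obtained by interpolation
        phi = phi + h * np.interp(phi, x, u)
    return phi

# Example: a smooth rightward flow concentrated in the middle of the grid
u = 3.0 * np.exp(-((np.arange(64) - 32.0) / 10.0) ** 2)
phi_one = euler_integrate(u, n_steps=8)
```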


Flow Field


For (e.g.) 8 time steps

Simple integration
• φ(1/8) = x + u/8
• φ(2/8) = φ(1/8) ∘ φ(1/8)
• φ(3/8) = φ(1/8) ∘ φ(2/8)
• φ(4/8) = φ(1/8) ∘ φ(3/8)
• φ(5/8) = φ(1/8) ∘ φ(4/8)
• φ(6/8) = φ(1/8) ∘ φ(5/8)
• φ(7/8) = φ(1/8) ∘ φ(6/8)
• φ(8/8) = φ(1/8) ∘ φ(7/8)
7 compositions

Scaling and squaring
• φ(1/8) = x + u/8
• φ(2/8) = φ(1/8) ∘ φ(1/8)
• φ(4/8) = φ(2/8) ∘ φ(2/8)
• φ(8/8) = φ(4/8) ∘ φ(4/8)
3 compositions

• A similar procedure is used for the inverse, starting with φ(-1/8) = x − u/8
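The scaling-and-squaring shortcut can be sketched in the same toy 1-D setting. Again this only illustrates the idea, not the DARTEL implementation; compositions are approximated by linear interpolation.

```python
import numpy as np

def compose(phi_a, phi_b, x):
    """(φ_a ∘ φ_b)(x) = φ_a(φ_b(x)), approximated by linear interpolation."""
    return np.interp(phi_b, x, phi_a)

def scaling_and_squaring(u, n_squarings=3):
    """Exponentiate a constant velocity field u by scaling and squaring.

    Start with φ(1/8) = x + u/8, then square three times to obtain
    φ(2/8), φ(4/8), φ(8/8): only 3 compositions instead of 7.
    """
    x = np.arange(len(u), dtype=float)
    phi = x + u / (2 ** n_squarings)        # φ(1/2^K) = x + u/2^K
    for _ in range(n_squarings):
        phi = compose(phi, phi, x)          # φ(2t) = φ(t) ∘ φ(t)
    return phi

u = 3.0 * np.exp(-((np.arange(64) - 32.0) / 10.0) ** 2)
phi_one = scaling_and_squaring(u)           # close to the 8-step Euler result
```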


Scaling and squaring example


DARTEL


Jacobian determinants remain positive


Overview

• Motivation

• Principles

• Optimisation
  – Multi-grid

• Group-wise Registration

• Validation

• Future directions


Registration objective function

• Simultaneously minimize the sum of
  – Likelihood component
    • From the sum of squares difference
    • ½ Σᵢ (g(xᵢ) − f(φ(1)(xᵢ)))²
    • φ(1) parameterized by u
  – Prior component
    • A measure of deformation roughness
    • ½ uᵀHu
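In code, the objective is simply the two terms added together. A schematic sketch; names such as f_warped and H are illustrative stand-ins, not the SPM implementation.

```python
import numpy as np

def objective(g, f_warped, u, H):
    """E = ½ Σᵢ (g(xᵢ) − f(φ(1)(xᵢ)))² + ½ uᵀHu.

    g        : fixed (template) image, flattened
    f_warped : moving image already resampled through φ(1) generated from u
    u        : flattened flow field
    H        : regularisation matrix (dense or scipy sparse)
    """
    likelihood = 0.5 * np.sum((g - f_warped) ** 2)   # sum-of-squares term
    prior = 0.5 * u @ (H @ u)                        # deformation roughness
    return likelihood + prior
```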


Regularization model

• DARTEL has three different models for H
  – Membrane energy
  – Linear elasticity
  – Bending energy

• H is very sparse

An example H for 2D registration of 6x6 images (linear elasticity)
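As a hedged illustration of why H is sparse, here is a toy 1-D membrane-energy regulariser built from a finite-difference operator. The real H couples the components of a 3-D flow field and also comes in linear-elastic and bending-energy forms, but its sparsity pattern arises in the same way.

```python
import numpy as np
import scipy.sparse as sp

def membrane_energy_matrix(n, lam=1.0):
    """Toy 1-D membrane energy: H = lam * Lᵀ L with L the forward-difference
    operator, so uᵀHu penalises the squared gradient of the flow field."""
    ones = np.ones(n - 1)
    L = sp.diags([-ones, ones], offsets=[0, 1], shape=(n - 1, n), format="csr")
    return lam * (L.T @ L)

H = membrane_energy_matrix(64)
print(H.nnz, "non-zeros out of", 64 * 64)   # only a thin band is non-zero
```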


Regularization models


Optimisation

• Uses Levenberg-Marquardt
  – Requires a matrix solution to a very large set of equations at each iteration

  u(k+1) = u(k) − (H+A)⁻¹ b

  – b are the first derivatives of the objective function
  – A is a sparse matrix of second derivatives
  – Computed efficiently, making use of scaling and squaring
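A single update step, written out under the (unrealistic) assumption that the system is small enough for a direct sparse solve; in DARTEL the solve is done with full multi-grid instead.

```python
import scipy.sparse.linalg as spla

def lm_update(u, H, A, b):
    """One Gauss-Newton / Levenberg-Marquardt-style step:
        u(k+1) = u(k) − (H + A)⁻¹ b
    H : regularisation matrix, A : sparse matrix of second derivatives,
    b : first derivatives of the objective (scipy sparse / numpy arrays)."""
    return u - spla.spsolve((H + A).tocsc(), b)
```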


Relaxation

• To solve Mx = c, split M into E and F, where
  – E is easy to invert
  – F is more difficult
• Sometimes: x(k+1) = E⁻¹(c − F x(k))
• Otherwise: x(k+1) = x(k) + (E+sI)⁻¹(c − M x(k))
• Gauss-Seidel when done in place; Jacobi's method if not
• Fits high frequencies quickly, but low frequencies slowly
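A dense toy version of the Jacobi split (Gauss-Seidel is the same sweep done in place); purely illustrative.

```python
import numpy as np

def jacobi(M, c, x, n_iter=10):
    """Relaxation for M x = c with the split M = E + F, E = diag(M):
        x(k+1) = E⁻¹(c − F x(k))
    High-frequency error decays quickly, low-frequency error slowly,
    which is why the solver is wrapped in multi-grid."""
    E = np.diag(M)                 # the easy-to-invert part
    F = M - np.diag(E)             # the rest
    for _ in range(n_iter):
        x = (c - F @ x) / E
    return x
```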


H+A = E+F


Full Multi-Grid

[Figure: the solver cycles between the highest and lowest resolution grids]
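The structure of one V-cycle can be sketched as follows. This is a deliberately crude dense 1-D toy (simple injection for restriction and prolongation, Jacobi smoothing), meant only to show the smooth / restrict / recurse / correct / smooth pattern, not the DARTEL solver.

```python
import numpy as np

def v_cycle(M, c, x, depth=3, n_smooth=3):
    """One toy V-cycle for M x = c."""
    d = np.diag(M)                                   # Jacobi smoother uses diag(M)
    for _ in range(n_smooth):                        # pre-smoothing
        x = x + (c - M @ x) / d
    if depth > 0 and len(c) > 4:
        r = c - M @ x                                # fine-grid residual
        R = np.eye(len(c))[::2]                      # restriction: keep every other point
        P = R.T                                      # prolongation: inject back
        Mc = R @ M @ P                               # coarse-grid operator
        e = v_cycle(Mc, R @ r, np.zeros(Mc.shape[0]), depth - 1, n_smooth)
        x = x + P @ e                                # coarse-grid correction
    for _ in range(n_smooth):                        # post-smoothing
        x = x + (c - M @ x) / d
    return x
```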


Overview

• Motivation

• Principles

• Optimisation

• Group-wise Registration
  – Simultaneous registration of GM & WM
  – Tissue probability map creation

• Validation

• Future directions


Generative Models for Images

• Treat the template as a deformable probability density.
  – Consider the intensity distribution at each voxel of lots of aligned images.
    • Each point in the template represents a probability distribution of intensities.
  – Spatially deform this intensity distribution to the individual brain images.
    • Likelihood of the deformations given by the template (assuming spatial independence of voxels).


Generative models of anatomy

• Work with tissue class images.

• Brains of differing shapes and sizes.
• Need strategies to encode such variability.

Automatically segmented grey matter images.


Simultaneous registration of GM to GM and WM to WM

[Figure: grey matter and white matter images for the template and for Subjects 1–4]


Template Creation

• Template is an average shaped brain.
  – Less bias in subsequent analysis.
• Iteratively created mean using DARTEL algorithm.
  – Generative model of data.
  – Multinomial noise model.

Grey matter average of 471 subjects
White matter average of 471 subjects

[Diagram: template μ connected to subject tissue images t1–t5 through deformations ϕ1–ϕ5]


Average Shaped Template

• For CA (computational anatomy), work in the tangent space of the manifold, using linear approximations.
  – Average-shaped templates give less bias, as the tangent space at this point is a closer approximation.
• For spatial normalisation of fMRI, warping to a more average-shaped template is less likely to cause signal to disappear.
  – If a structure is very small in the template, then it will be very small in the spatially normalised individuals.
• Smaller deformations are needed to match with an average-shaped template.
  – Smaller errors.


Average shaped templates

[Figure: linear average (not on the Riemannian manifold) vs. average on the Riemannian manifold]


Template

[Figure: initial average, after a few iterations, and the final template]

• Iteratively generated from 471 subjects
• Began with rigidly aligned tissue probability maps
• Used an inverse-consistent formulation


Grey matter average of 452 subjects – affine


Grey matter average of 471 subjects


Multinomial Model

• Current DARTEL model is multinomial for matching tissue class images.

log p(t|μ,ϕ) = Σⱼ Σₖ tⱼₖ log(μₖ(ϕⱼ))

t – individual GM, WM and background
μ – template GM, WM and background
ϕ – deformation

• A general purpose template should not have regions where log(μ) is –Inf.
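A direct transcription of the log-likelihood, with a small epsilon making the −Inf point explicit (array shapes and names are assumptions made for the sketch):

```python
import numpy as np

def multinomial_loglik(t, mu_warped, eps=1e-8):
    """log p(t|μ,ϕ) = Σⱼ Σₖ tⱼₖ log(μₖ(ϕⱼ)).

    t         : voxels × classes array for one subject (GM, WM, background)
    mu_warped : template resampled through the deformation ϕ, same shape
    eps       : guards against log(0), i.e. regions where log(μ) would be −Inf
    """
    return float(np.sum(t * np.log(mu_warped + eps)))
```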


Laplacian Smoothness Priors on template

2D: nicely scale invariant
3D: not quite scale invariant – but probably close enough


Smoothing by solving matrix equations using multi-grid

Template modelled as softmax of a Gaussian process

μk(x) = exp(ak(x))/(Σj exp(aj(x)))

Rather than compute mean images and convolve with a Gaussian, the smoothing is done by maximising a log-likelihood for a MAP solution.

Note that Jacobian transformations are required (cf “modulated VBM”) to properly account for expansion/contraction during warping.
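The softmax step on its own is straightforward; a small sketch follows (the MAP smoothing of the underlying fields aₖ is not shown).

```python
import numpy as np

def softmax_template(a):
    """μₖ(x) = exp(aₖ(x)) / Σⱼ exp(aⱼ(x)).

    a stacks one scalar field per tissue class along the last axis;
    subtracting the per-voxel maximum keeps the exponentials stable."""
    a = a - a.max(axis=-1, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=-1, keepdims=True)
```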


Determining amount of regularisation

• Matrices too big for REML estimates.

• Used cross-validation.

• Smooth an image by different amounts, see how well it predicts other images:

[Figure: cross-validation fit for rigidly aligned and nonlinearly registered data]

log p(t|μ) = Σⱼ Σₖ tⱼₖ log(μⱼₖ)


ML and MAP templates from 6 subjects

[Figure: ML and MAP templates (and their logarithms) from nonlinearly registered and rigidly registered data]


Overview

• Motivation

• Principles

• Optimisation

• Group-wise Registration

• Validation
  – Sex classification
  – Age regression

• Future directions


Validation

• There is no “ground truth”
• Looked at predictive accuracy
  – Can information encoded by the method make predictions?
    • Registration method blind to the predicted information
    • Could have used an overlap of fMRI results
  – Chose to see whether ages and sexes of subjects could be predicted from the deformations
• Comparison with small deformation model


Training and Classifying

[Figure: control and patient training data, plus unlabelled scans to be classified]


Classifying

[Figure: controls and patients separated by a decision boundary]

y = f(aᵀx + b)


Support Vector Classifier


Support Vector Classifier (SVC)

[Figure: the separating hyperplane is determined by the support vectors]

a is a weighted linear combination of the support vectors


Nonlinear SVC


Support-vector classification

• Guess sexes of 471 subjects from brain shapes – 207 Females / 264 Males

• Use a random sample of 400 for training.

• Test on the remaining 71.

• Repeat 50 times.
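The evaluation protocol is easy to reproduce with any off-the-shelf SVM. A sketch using scikit-learn, where X is assumed to hold one feature vector per subject derived from the deformations (the feature choice and names here are illustrative, not the original pipeline):

```python
import numpy as np
from sklearn.svm import SVC

def repeated_split_accuracy(X, y, n_repeats=50, n_train=400, seed=0):
    """Train on a random 400 subjects, test on the remaining 71,
    repeat 50 times and average the test accuracy."""
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(n_repeats):
        idx = rng.permutation(len(y))
        train, test = idx[:n_train], idx[n_train:]
        clf = SVC(kernel="linear").fit(X[train], y[train])
        scores.append(clf.score(X[test], y[test]))
    return float(np.mean(scores))
```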


Sex classification results

• Small deformation
  – Linear classifier: 87.0% correct, Kappa = 0.736
  – RBF classifier: 87.1% correct, Kappa = 0.737
• DARTEL
  – Linear classifier: 87.7% correct, Kappa = 0.749
  – RBF classifier: 87.6% correct, Kappa = 0.748

An unconvincing improvement


Regression

[Figure: example brain images of subjects aged 23, 26, 30, 29, 18, 32 and 40]


Relevance-vector regression

• A Bayesian method, related to SVMs
  – Developed by Mike Tipping

• Guess ages of 471 subjects from brain shapes.

• Use a random sample of 400 for training.

• Test on the remaining 71.

• Repeat 50 times.


Age regression results

• Small deformation
  – Linear regression: RMS error = 7.55, correlation = 0.836
  – RBF regression: RMS error = 6.68, correlation = 0.856
• DARTEL
  – Linear regression: RMS error = 7.90, correlation = 0.813
  – RBF regression: RMS error = 6.50, correlation = 0.867

An unconvincing improvement (slightly worse for linear regression)


Overview

• Motivation

• Principles

• Optimisation

• Group-wise Registration

• Validation

• Future directions


Future directions

• Compare with variable velocity methods
  – Beg’s LDDMM algorithm.

• Classification/regression from “initial momentum”.

• Combine with tissue classification model.

• Develop a proper EM framework for generating tissue probability maps.


[Figure: a flow field u and the corresponding Hu]


“Initial momentum”

Variable velocity framework (as in LDDMM)



Thank you