COMPUTATIONAL INTELLIGENCE FOR NEUROSCIENCE
Concha Bielza, Pedro Larrañaga
Departamento de Inteligencia Artificial, Universidad Politécnica de Madrid
Discovery Science and Algorithmic Learning Theory 2009, Porto, October 3, 2009
Neuroscience, the science of the brain
Goal: understand the relationship between the physical brain and the functional mind: how we perceive, move, think, and remember
A principle: all behavior is an expression of neural activity
Involves answers to fundamental scientific questions:
How does the brain develop?
Are specific mental processes located in specific brain regions?
How do nerve cells in the brain communicate with one another?
How do different patterns of interconnections give rise to different perceptions and motor acts?
How is communication between neurons modified by experience?
How is it altered by diseases or drugs?
Some areas, such as the cortex and cerebellum, consist of layers, folded to fit within the available space
In mammals, the neocortex is a complex 6-layered structure, greatly enlarged in primates, especially the frontal lobe (extremely enlarged in humans). Concerned with the most evolved human behavior
Outermost = layer 1, innermost = layer 6
1 Dendrites receive information from another cell and transmit the message to the soma
2 The cell body (or soma) contains the nucleus, mitochondria and other organelles typical of eukaryotic cells
3 The axon conducts messages away to the next neuron via a specialized structure, the synapse
Axons fill most of the space in the brain → more than 150,000 km in the human brain!
Each neuron connected to 1,000 neighboring neurons
Messages travel within the neuron as electrochemical impulses called action potentials, lasting less than a thousandth of a second, at 1-100 m/s
An action potential arrives at a synapse and causes a chemical called a neurotransmitter, stored in small synaptic vesicles, to be released
Neurotransmitter binds to receptor molecules in the membrane of the target cell
Excitatory receptors (↑ rate of action potentials in the target cell) vs. inhibitory
Parkinson’s disease involves a deficiency of the neurotransmitter dopamine. Alcohol weakens connections between neurons. Drugs such as caffeine, nicotine, heroin, cocaine and Prozac act on particular neurotransmitters
Changes in blood flow (SPECT, fMRI) or metabolic activity (PET). Reduced flow and less energy consumed = injured sites. Flow changes to compensate for the increased metabolic demands of the area being used
Which brain structures are activated (structural info) and how (functional info) when performing different tasks (sound, movie, picture, pressing a button, improvising jazz...). Cognitive processes, causes of neurodegenerative diseases
EEG: high temporal resolution (millisecond), but poor spatial resolution. fMRI:great 3D spatial resolution, but slow temporal resolution (seconds and minutes)→ Integrate methods
3D images —See 2 animations
One of the 14 grand challenges for engineering that will influence science and technology in the 21st century (National Academy of Engineering, 2009, at the request of the NSF: http://www.engineeringchallenges.org)
1 Make solar energy economical
2 Provide energy from fusion
3 Develop carbon sequestration methods
4 Manage the nitrogen cycle
5 Provide access to clean water
6 Restore and improve urban infrastructure
7 Advance health informatics
8 Engineer better medicines
9 Reverse-engineer the brain (4th in Internet votes)
10 Prevent nuclear terror
11 Secure cyberspace
12 Enhance virtual reality
13 Advance personalized learning
14 Engineer the tools of scientific discovery
Some advances: neural prostheses to restore damaged hearing (cochlear implants) and sight (artificial retinas), and to enable movement and communication
BCIs: read the thoughts of paralyzed patients and translate them into commands like moving a cursor on a computer screen
Founded in 2005 by the Brain and Mind Institute of the École Polytechnique Fédérale de Lausanne (Switzerland), directed by Henry Markram
An attempt to reverse-engineer the mammalian brain, in order to understand brain function and dysfunction through detailed simulations
Between 1995 and 2005, Markram had mapped the types of neurons and their connections in a rat neocortical column, the smallest functional unit of the neocortex
In Dec 2006, the column was simulated (only 10,000 neurons; a human column has some 60,000)
∼2 million columns in humans, each 2 mm tall
Markram predicts it will be built within the next 10 years
At the end of 2008, Universidad Politécnica de Madrid (UPM) and the Instituto Cajal (IC) of the Spanish Research Council joined (subproject Cajal Blue Brain, until 2018)
UPM, including the Madrid Supercomputing and Visualization Center: data analysis, optimization and visualization
Neuroimaging as an interdisciplinary field at the intersection of image processing, computer science, physics, statistics and neurosciences
Investigate mental activities in healthy as well as in diseased situations
Two classes of images:
Anatomical or structural images: brain structures (MRI and SPECT)
Functional images: changes in neuronal population activity, measured directly (MEG) or indirectly (fMRI and PET), provoked by sensory stimulation or during cognitive tasks
Topics
Introduction: Classes of images and examples of problems to be solved
Preprocessing
Voxel (pixel) (set) selection. Region of interest (ROI)
Dimensionality reduction via feature extraction
Dimensionality reduction via feature selection
Modeling
Discovering associations
Unsupervised classification
Supervised classification
Regression
3D ultrastructure of the cortical column
Spatial normalization because of variations of brain volume, shape and position from one patient to another (Statistical Parametric Mapping software, SPM2)
High-dimensional data (e.g. 3D images of 128 × 128 × n, where n is the number of slices)
Voxel (pixel) (set) selection. Region of interest (ROI)
Dimensionality reduction via feature extraction
Dimensionality reduction via feature selection
Suboptimal if correlation among voxels carries important information
Filter approaches: provide a ranking of voxels
t-test for detecting active voxels (Friston et al., 1995), scoring the voxel by its corresponding p-value (valid for a 2-class problem)
F-test for classification problems with more than 2 class labels
Non-parametric tests (e.g. Mann-Whitney), after rejecting the assumption of Gaussian densities with the Lilliefors test
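A minimal sketch of the t-test filter on synthetic data (voxel counts, effect sizes and all variable names here are illustrative, not from the slides):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_class, n_voxels = 20, 50
# Class 0 and class 1 scans; only voxels 0-4 carry a real activation difference.
x0 = rng.normal(0.0, 1.0, size=(n_per_class, n_voxels))
x1 = rng.normal(0.0, 1.0, size=(n_per_class, n_voxels))
x1[:, :5] += 2.0

# One two-sample t-test per voxel; score each voxel by its p-value
# (smaller p-value = more discriminative).
_, pvals = stats.ttest_ind(x0, x1, axis=0)
ranking = np.argsort(pvals)   # voxel indices, most discriminative first
```

On this toy data the signal-carrying voxels dominate the top of the ranking; in practice the ranking feeds the multiple-testing step below.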
Wrapper approaches: provide a ranking of voxels
Accuracy: scores the voxel by how accurately a classifier based only on that voxel predicts the examples
Searchlight accuracy: the same as accuracy, but using the data from the voxel and its immediately adjacent neighbours in three dimensions instead of a single voxel
Multiple comparison criteria
From the ranking, decide how many of the top-ranked voxels to use
Multiple testing problem: (a) adjusting by Bonferroni (1936) or Benjamini-Hochberg (1995) corrections; (b) controlling the false discovery rate, the expected fraction of false positives among the selected voxels (Genovese et al., 2002)
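The Benjamini-Hochberg step-up correction mentioned above can be sketched as follows (a minimal NumPy implementation; the function name and example p-values are illustrative):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of rejected hypotheses at FDR level q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    sorted_p = p[order]
    # Step-up rule: find the largest k (1-based) with p_(k) <= (k/m) * q.
    thresh = (np.arange(1, m + 1) / m) * q
    below = sorted_p <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # last sorted index meeting the rule
        reject[order[:k + 1]] = True       # reject all hypotheses up to k
    return reject

mask = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205])
```

Here only the two smallest p-values survive at q = 0.05, whereas a plain per-voxel threshold of 0.05 would have kept five of them.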
Neuroimaging
Preprocessing. Voxel sets and region of interest (ROI)
Voxel sets: dynamic recursive partitioning (DRP) (Kontos et al., 2009)
DRP partitions the two-dimensional image adaptively into progressively smaller subregions until statistically significant discriminative regions are detected
By performing statistical tests on groups of pixels rather than on individual pixels, the number of statistical tests is effectively reduced
The average value over the group of pixels becomes the value of the variable (feature)
Region of interest (ROI)
ROIs as the result of partitioning the digital image into multiple segments (sets of voxels (pixels))
Feature value as the average of the values over the voxels (pixels) in such a ROI
Clustering, edge detection, level sets, watershed transformation, ...
Preprocessing. Dimensionality reduction via feature extraction
Ignore the class label
Principal Component Analysis (PCA) (Hansen et al., 1999): the M eigenvectors with largest eigenvalues of the covariance matrix (fMRI images: n ≥ 10⁴ and Σ has size n × n)
Singular Value Decomposition (SVD) (Bai et al., 2007): given the data set Xn×m, obtain Zn×k and Wk×m with k ≪ n such that minZ,W ||X − ZW||², where k is the rank of the approximation
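The low-rank approximation behind this can be illustrated with NumPy: by the Eckart-Young theorem, truncating the SVD to the top k singular values gives the best rank-k approximation in Frobenius norm (all sizes below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 30, 40, 5
# Build a data matrix that is exactly rank k, so the rank-k SVD recovers it.
X = rng.normal(size=(n, k)) @ rng.normal(size=(k, m))

U, s, Vt = np.linalg.svd(X, full_matrices=False)
Z = U[:, :k] * s[:k]   # n x k factor scores
W = Vt[:k]             # k x m loadings
approx_err = np.linalg.norm(X - Z @ W)   # ~0 since rank(X) = k
```

With real fMRI data X is only approximately low-rank, and the residual equals the energy in the discarded singular values.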
Independent Component Analysis (ICA) (McKeown et al. 1998):
Preprocessing. Dimensionality reduction via feature extraction
Class label taken into account
Partial Least Squares (PLS) (McIntosh and Lobaugh, 2004) finds a linear model by projecting the predicted variables Yn×p and the observable variables Xn×m to a new space (bilinear factor model). Given Xn×m and Yn×p, find T, P, Q, E, F such that:
X = TP^t + E,  Y = TQ^t + F
where Tn×l is the factor matrix, Pm×l and Qp×l are the loading matrices, and E and F are the error terms
Support Vector Decomposition Machine (SVDM) (Pereira and Gordon, 2006):
minZ,W,Θ ||X − ZW||² + f(Θ)
This objective trades reconstruction error (the first term) against a bound on classification error (the second term)
Gaussian naive Bayes: (Mitchell et al. 2004; Pereira et al., 2009; Zhang et al., 2005)
General linear model: (Bly, 2001; Friston et al., 1995)
Hidden Markov models: (Hojen-Gorensen et al., 1999)
Kernel logistic: (Horn et al., 2009)
k-nearest neighbor: (Fletcher-Heath et al., 2001; Haxby et al., 2001; Horn et al., 2009; Mitchell et al., 2004; Zhang et al., 2005)
Linear discriminant analysis: (Cox and Savoy, 2003; Horn et al., 2009; Pereira et al., 2009; Zhang et al., 2005)
Logistic regression: (Desco et al., 2001; Horn et al., 2009; Pereira et al., 2009)
Mixture of Gaussians (EM): (Desco et al., 2001; Tohka et al., 2007)
Multilayer perceptron: (Chaplot et al, 2006; El-Sayed et al., 2009; Horn et al., 2009)
Probabilistic graphical models: (Burge and Lane, 2007; Chen and Herskovits, 2007; Herkovits et al., 2004;Sun and Tang, 2009)
Sparse regression: (Carroll et al. 2009; Valdes-Sosa et al., 2005; van Gerven et al., 2009; Yamashita et al.,2008)
Support vector machine: (Chaplot et al., 2006; Cox and Savoy, 2003; El-Sayed et al., 2009; Horn et al.,2009; Jim et al., 2009; Li et al., 2007; Mitchell et al. 2004; Pereira et al., 2009; Zhang et al., 2005)
Discovering associations. Probabilistic graphical model (Burge and Lane, 2007)
Bayesian network representing the relationships among ROIs
Nodes: groups of voxels together in a ROI (hierarchically related; activation as the average of voxel activation)
Conditional probability tables: smoothing (small sample)
Structure learning: Bayesian Dirichlet equivalence criterion (BDe)
3D ultrastructure of the cortical column
Decipher the wiring diagram and the map of connections at the synaptic level of the cortical column
Electron microscopy 3D reconstruction methods
A level of detail never reached before
Neuropil and synaptic connectivity
Ultrastructure of the neuropil: number and proportion of synapses; length, diameter and volume of axons, dendrites and glia; density of synaptic vesicles per axon
Maps of synaptic connectivity: spatial distribution of synapses and spines;number of synapses per neuron —See 2 videos
Modelling. Performance versus interpretability
Prediction performance of a model is a good measure of model validity
In the sciences, interpretation remains key when applying predictive modeling techniques
Prediction accuracy alone does not guarantee validity
Genomics, proteomics, metabolomics. Integration of data
-omics refers to biology fields in which the objects of study form a totality
Genomics (study of the genome), proteomics (study of the proteome), metabolomics (study of the metabolome)
Neurological diseases: Alzheimer, Parkinson, Huntington, amyotrophic lateral sclerosis, Creutzfeldt-Jakob...
DNA microarrays, microRNA molecules, single nucleotide polymorphisms, mass spectrometry, clinical data ⇒ integrate them into a single study of the possible influences and interactions among them
Alzheimer’s disease cases
A neurological disorder primarily affecting the elderly that manifests through memory disorders, cognitive decline and loss of autonomy
17-20 million individuals affected worldwide
Every 70 seconds, someone develops Alzheimer’s
Alzheimer’s is the seventh-leading cause of death
Genomics-Alzheimer
Genetically complex and heterogeneous
Two types of neuropathologic lesions: (i) neurofibrillary degeneration from intraneuronal accumulation of certain proteins; (ii) amyloid deposits from extracellular accumulation of amyloid plaques, composed of Aβ peptides
The processes leading to these lesions and their combined association with AD are not adequately understood
APOE on chromosome 19 is the only confirmed susceptibility locus for late-onset AD (locus = specific location of a gene or DNA sequence on a chromosome)
Genes may have a role in more than 60% of AD susceptibility, and APOE may account for as much as 50% of this genetic susceptibility
More than 550 other genes have been proposed as candidates for AD susceptibility, but not confirmed to have a role in AD pathogenesis
Microarrays: Small et al. (2005)
Microarrays as a powerful tool for isolating molecular differences between healthy and diseased tissue, but with analytic challenges when applied to the brain
Idea: generate microarray data selectively from the brain site most vulnerable to AD, to maximize expression differences between AD and controls
Generate data also from a neighboring region relatively resistant to AD
The hippocampus is particularly vulnerable to AD (postmortem studies, fMRI data in living subjects)
Regions: the entorhinal cortex (EC), which forms the main input to the hippocampus, is the most vulnerable; the dentate gyrus (DG) is the most resistant
Microarrays: Small et al. (2005)
Assumed hypotheses: EC and DG are relevant regions, and differences in EC between AD and controls are known to be age independent
Search for the variable(s) that best matches these hypotheses
Is some molecule differentially expressed in EC over DG, comparing AD tocontrols?
kDB
Extension to allow a variable to have at most k parents (excluding the class)
Search of structure based on the (conditional) mutual information metric
0-dependence Bayesian classifier ← naive Bayes classifier
1-dependence Bayesian classifier ← tree augmented naive Bayes
(n − 1)-dependence Bayesian classifier ← the complete Bayesian classifier (no conditional independencies in the structure)
(Figure: example of a 2DB structure)
Induce many Bayesian classifiers by bootstrapping and assign confidence levels t to the arcs (number of times each one must appear) to find robust gene interactions
Different confidence level t, different model Gt (from simple structures with a few arcs when t ↑, to dense graphs when t ↓) ⇒ hierarchy of nested models G1 ⊇ G2 ⊇ ...
The approach is a consensus feature selection on the final graph, the gene interaction network
Consensus conclusions have been shown to provide good results on microarray data
‘Robust arc identification’ algorithm (Armananzas et al, 2008)
Step 1. Repeat B times
Step 1.1. Randomly sample N instances with replacement
Step 1.2. Select an optimal feature subset and reduce the data set (CFS)
Step 1.3. Learn a kDB model
Step 2. Compute frequencies of all the included arcs out of the B models
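A heavily simplified sketch of the bootstrap arc-frequency idea (as a stand-in for learning a full kDB model, each resample here declares an "arc" between strongly correlated feature pairs; the function name, thresholds and data are illustrative):

```python
import numpy as np

def robust_arcs(X, B=200, thresh=0.6, seed=0):
    """Count, over B bootstrap resamples, how often each feature pair is
    'arc-connected' (here: |correlation| above thresh, standing in for an
    arc in a learned kDB structure)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    counts = np.zeros((d, d), dtype=int)
    for _ in range(B):
        idx = rng.integers(0, n, size=n)           # n rows, with replacement
        corr = np.corrcoef(X[idx], rowvar=False)
        arcs = np.abs(corr) > thresh
        np.fill_diagonal(arcs, False)
        counts += arcs                             # Step 2: arc frequencies
    return counts

# Toy 'expression' data: gene 1 tracks gene 0; gene 2 is independent noise.
rng = np.random.default_rng(2)
n = 100
g0 = rng.normal(size=n)
X = np.column_stack([g0, g0 + 0.3 * rng.normal(size=n), rng.normal(size=n)])
counts = robust_arcs(X)
```

Thresholding `counts` at a confidence level t then yields the nested graphs Gt described above.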
⇒ 2 different analyses: EC-AD vs. EC-Control (12 samples, binary supervised classification problem) and EC-AD vs. EC-Control vs. DG-AD vs. DG-Control (24 samples, multiclass)
⇒ k = 4, B = 10,000, discretization into 3 intervals (up-regulated, down-regulated, baseline), t = 1,000
⇒ 2 disconnected graphs. Genes related to neurological diseases: Huntington, Parkinson, bipolar disorder, depression → related to AD? Some paper (2007) for Parkinson’s
SNP (single-nucleotide polymorphism)
A SNP is a DNA sequence variation occurring when a single nucleotide (A, T, C or G) in the genome differs between members of a species (or between paired chromosomes in an individual)
Example: 2 sequenced DNA fragments from different individuals, AAGGGA and AAGGAA, contain a difference at a single base-pair location
SNP: new markers published Sep 2009 in Nature Genetics
In addition to the APOE locus, identify other loci associated with the risk of late-onset AD
Williams et al. (2009):
529,218 SNPs involving 3,941 cases and 7,848 controls
Collections from the UK, Germany and the USA
Two loci gave replicated evidence of association: CLU on chromosome 8 (SNP rs11136000) and PICALM on chromosome 11 (SNP rs3851179)
Amouyel et al. (2009):
537,029 SNPs genotyped in 2,032 cases and 5,328 controls
Collections from France, Belgium, Finland, Italy and Spain
Loci: CLU (SNPs rs2279590, rs11136000, rs9331888) and CR1 on chromosome 1 (SNPs rs6656401, rs3818361)
Information on age, sex, geographical region and disease status
Golgi and Ramón y Cajal
Neuromorphology was intensively studied after the discovery of the Golgi method (Golgi, 1873), named the black reaction (la reazione nera), which allows seeing the entire neuron
Ramón y Cajal (1911) developed the neuron doctrine, pointed out the variety of patterns of neurons depending on their particular locations, and hypothesized about the role that particular forms could play
A better knowledge of the different neuronal populations that compose this heterogeneous brain structure may therefore contribute to elucidating their specific roles
Traditionally, neuronal cell types have been classified using qualitative descriptors, with criteria that vary across investigators
In the recent decade, several attempts to classify neurons quantitatively (objective classification) from multimodal information (physiological, molecular, morphological) have been developed
Naming neurons
Lorente de Nó (1938) qualitatively described classes of interneurons
Ramón-Moliner (1958) defined types of neurons according to their dendritic arborisations
Tyner (1975). The naming of neurons: applications of taxonomic theory to the study of cellular populations. Brain Behaviour Evolution, 12, 75-96
Rowe, Stone (1977). Naming the neurons. Classification and naming of neurons of cat retinal ganglia cells. Brain Behaviour Evolution, 14, 185-216
Dendritic morphology
Dendritic branching is influenced by a complex interaction of synaptic activity, intracellular transport, extracellularly initiated signaling cascades, membrane tension and electrical activity
Tree shapes help determine both the interconnectivity and the functional roles of neurons
How and why vastly different shapes arise is still largely unknown
Understanding how they are formed in the brain helps us learn about their normal function and why they are often malformed in neurological diseases or under the effects of some drugs (cocaine, morphine)
Dendritic morphology
Neurons are typically grouped into morphological classes based on prominent geometrical features of their arborizations
E.g.: uniform distribution in space (stellate cells), planar (retinal ganglion cells), or polarized structure (basal and apical dendrites in pyramidal cells)
(Figure: stellate, retinal ganglion, pyramidal and Purkinje cells)
Classes can be specific to a location in the brain: pyramidal cells differ between hippocampus and neocortex, and within the cortex, between lobes (occipital or prefrontal) and layers (II or V)
An extreme: some classes are only present in specific regions, like Purkinje cells in the cerebellum
No two neurons have the same morphology, although branching patterns exist
⇒ Anatomical characterization is statistical in nature
Computational modeling
Construct models that can simulate the huge variety of dendritic morphology
Synthetic neurons are generated within a virtual environment from a mathematical-statistical model based on experimental measures collected from real traced cells
Simulated neurons should be indistinguishable (by statistical analysis or expert visual inspection) from real cells
Steps
1 Measure from real cells different parameters controlling branching behaviour
2 Model these measures as statistical distributions
3 Design a simulation model to account for all the parameters together and run it
4 Compare the formed virtual trees to reality
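Steps 1-4 can be sketched for a single measure, e.g. fitting a lognormal to hypothetical branch path lengths and sampling synthetic ones (all data here are simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
# Step 1: hypothetical measured branch path lengths from real traced cells (µm).
measured = rng.lognormal(mean=3.0, sigma=0.5, size=500)

# Step 2: model the measure as a lognormal distribution, fitted by log-moments.
mu, sigma = np.log(measured).mean(), np.log(measured).std()

# Step 3: grow synthetic branches by sampling from the fitted distribution.
synthetic = rng.lognormal(mean=mu, sigma=sigma, size=500)

# Step 4: a crude virtual-vs-real comparison (full pipelines use statistical
# tests and expert visual inspection, not just the mean).
rel_err = abs(synthetic.mean() - measured.mean()) / measured.mean()
```

A real simulator does this jointly for many interacting parameters (see the following slides), not one measure at a time.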
Parameters. Donohue and Ascoli (2008)
5 basic parameters: branch path length, taper rate, daughter ratio, parent-daughter ratio and probability of bifurcating
With every basic parameter, 3 fundamental determinants are also measured to constrain the sampling of basic parameters according to their location within the tree: branch order (number of bifurcations towards the soma), path distance from the soma and branch radius
Parameters. Torben-Nielsen et al. (2008)
Basic + emergent (coming from the interaction between the basic ones) morphological properties. E.g. total dendritic length emerges from the number of stems, stem length and individual segment lengths
Probability distribution models
Different hypotheses:
Basic parameters are uniformly distributed throughout the dendritic tree (Donohue et al., 2002)
Dendritic segments are built from sequences of units of constant diameter, and the distribution of the lengths of units of similar diameter is independent of location within a dendritic tree (Lindsay et al., 2007)
Different dimensions of distributions (Lindsay et al., 2007):
Univariate distributions (marginal and conditional): diameter of the child conditioned on the parent diameter
Bivariate: joint distribution of the parent and child diameters
Trivariate: parent, child and peer diameters
A recent model: NETMORPH (Koene et al., 2009) (http://www.netmorph.org)
Neuronal outgrowth is a stochastic process where, at each time step, branching and elongation probabilities, direction of outgrowth, branching angles and segment diameters are functions with some random element (e.g. branching probability depends on the distance from the soma; the baseline branching rate decreases exponentially)
We can simulate:
Models of dendritic morphologies
Models of axonal morphologies
(Figures: pyramidal cell and basket cell axon)
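A toy sketch in the spirit of such stochastic outgrowth (not NETMORPH's actual growth rules; the function name and all parameter values are assumptions for illustration):

```python
import numpy as np

def grow_dendrite(steps=60, p0=0.2, lam=100.0, dx=5.0, seed=4):
    """Toy stochastic outgrowth: each terminal growth cone elongates by dx
    per time step and branches with a probability that decays exponentially
    with its path distance from the soma."""
    rng = np.random.default_rng(seed)
    tips = [0.0]                          # path distances of active growth cones
    for _ in range(steps):
        new_tips = []
        for d in tips:
            d += dx                       # elongation
            if rng.random() < p0 * np.exp(-d / lam):
                new_tips.extend([d, d])   # branching: one tip becomes two
            else:
                new_tips.append(d)
        tips = new_tips
    return tips

tips = grow_dendrite()   # final number of tips = number of terminal branches
```

Because branching probability decays with distance, most bifurcations happen near the soma, which is the qualitative behaviour the slide describes.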
Network development
Synaptic connectivity: look for sites where axon and dendrite are separated by less than a given distance
...connection diagrams, frequencies of synapses, 3D visualizations...
Comparing virtual and real cells: shape descriptors
Most commonly used method: Sholl’s analysis (Sholl, 1953), which uses standard measures (length, surface area, number of branches...). Not orientation-sensitive
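The core of Sholl's analysis, counting segment crossings of concentric spheres centered at the soma, can be sketched as follows (the function name, segment data and radii are illustrative):

```python
import numpy as np

def sholl_counts(segments, radii):
    """Count how many dendritic segments cross each concentric sphere.
    `segments` has shape (n, 2, 3): start and end point of each segment,
    with the soma at the origin. A segment crosses radius r when r lies
    between the radial distances of its two endpoints."""
    seg = np.asarray(segments, dtype=float)
    r0 = np.linalg.norm(seg[:, 0], axis=1)
    r1 = np.linalg.norm(seg[:, 1], axis=1)
    lo, hi = np.minimum(r0, r1), np.maximum(r0, r1)
    return np.array([int(((lo <= r) & (r < hi)).sum()) for r in radii])

# Two toy segments: one radial (soma outward), one roughly tangential.
segments = [[(0, 0, 0), (10, 0, 0)], [(10, 0, 0), (10, 5, 0)]]
counts = sholl_counts(segments, radii=[5.0, 11.0])
```

Plotting `counts` against the radii gives the classic Sholl profile; note the orientation of the segments plays no role, which is exactly the descriptor's limitation noted above.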
Or more sophisticated morphometrics related to branch patterns, defined in(Donohue and Ascoli, 2008)
Methods for the orientation of the dendritic tree, but losing important geometric information: “principal axes”, “circular orientation”, “cartesian grid density” (Uylings et al., 1986)
Fractal analysis to describe the complexity of dendritic trees: does not discriminate between trees of similar complexity but different spatial morphologies (Caserta et al., 1995)
A measure based on the Hausdorff distance, insensitive to outliers (also for 3D structures): connected and unconnected parts are indistinguishable (Mizrahi et al., 2000)
Comparing virtual and real cells: dendrograms
Transform dendritic trees into dendrograms. Traditionally done by hand; now, e.g., with computer-vision techniques (segmentation based on curvature, edge detection...) (Costa, 1999, 2000)
(Figure: neuron image, internal contours, skeleton and resulting dendrogram; vertical width indicates branch width)
But comparisons are qualitative and only capture some features of the structures
Comparing virtual and real cells: an explicit distance between topologies
Comparison measure based on the tree-edit distance from computer science (Heumann, Wittum, 2009)
Represent the topology as a tree (root = soma), assign labels to each vertex (describing the geometry of the section), and compute the distance between 2 trees as the minimal cost of a feasible sequence of edit operations (substitution, insertion, deletion) that converts T1 into T2
(Figure: example with Dist = 3)
High computational cost and nodes with single label (Gillette, Grefenstette, 2009)
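A much-simplified illustration of edit distance with unit costs: the classic Levenshtein dynamic program over label sequences (unlike Heumann and Wittum's tree-edit distance, this flattens the trees and ignores branching structure; the label strings are hypothetical):

```python
def edit_distance(a, b):
    """Minimal number of unit-cost substitutions, insertions and deletions
    converting sequence a into sequence b (Levenshtein DP)."""
    n, m = len(a), len(b)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        d[i][0] = i                      # delete everything
    for j in range(m + 1):
        d[0][j] = j                      # insert everything
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[n][m]

# Hypothetical vertex-label sequences of two dendritic topologies.
dist = edit_distance("ABBC", "ABC")
```

The tree version applies the same three operations to tree vertices under ancestor-order constraints, which is where the high computational cost comes from.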
Comparing virtual and real cells: clustering to validate this distance
Show that this distance D discriminates between different cell classes (known a priori)
Use the matrix of distances D between every pair of cells and then hierarchical Ward clustering
Distinguish CA1 and CA3 pyramidal cells from hippocampus using volume for D:
Only 1 error
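The clustering step can be sketched with SciPy, here on synthetic feature vectors standing in for the two cell classes (Ward linkage on the condensed pairwise-distance matrix; all data are illustrative):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)
# Hypothetical morphometric feature vectors: 10 cells per a-priori class.
cells = np.vstack([rng.normal(0.0, 0.5, size=(10, 4)),
                   rng.normal(3.0, 0.5, size=(10, 4))])

D = pdist(cells)                        # condensed pairwise-distance matrix
Z = linkage(D, method="ward")           # Ward agglomerative clustering
labels = fcluster(Z, t=2, criterion="maxclust")   # cut into 2 clusters
```

If the distance separates the classes, the two recovered clusters should match the known labels, as on the CA1/CA3 slide.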
Outline
1 Introduction
2 Pattern recognition
Neuroimaging-omics
Clustering of neurons
Classification of neurons
3 Computer simulation of dendritic morphology
Dendritic morphology
Simulation models of dendritic morphology
Comparing virtual and real cells
4 Parameter and structural optimization in neuronal and brain models
Compartmental model
Brain networks
5 Conclusions
Challenging
More material
Optimization
Two examples of optimization problems
Neuronal parameter optimization in compartmental neuronal models
Structural optimization in brain networks
Neuronal parameter optimization
Compartmental models
Compartmental models, based on a circuit representation of a neuron (Hodgkin and Huxley, 1952), describe the potential along the dendritic tree
The model is described by a system of ordinary differential equations
The model is used to replicate the response of a real neuron to a set of simple current stimuli
Neuronal parameter optimization
The optimization problem in compartmental models
The process of identifying sets of parameters that lead to a desired electrical activity pattern in a neuron or neural network model that is not fully constrained by experimental data (Prinz, 2007)
Related questions
Choice of an objective function or goodness measure to evaluate the possible solutions
Selection of an appropriate optimization algorithm
Objective functions
    Point-to-point comparison: based on the comparison of the electrophysiological traces provided by the data and the model
    Feature-based: (1) spike rate; (2) an accommodation index; (3) latency to first spike; (4) average AP overshoot; (5) average depth of afterhyperpolarization (AHP); (6) average AP width
    Multi-objective: several feature-based criteria at the same time
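A feature-based objective of this kind can be sketched as follows; the threshold-crossing spike detector and the two chosen features (spike rate, latency to first spike) are simplifying assumptions, and the helper names are hypothetical:

```python
import numpy as np

def spike_features(v, dt, threshold=0.0):
    """Two of the features listed above, extracted from a voltage trace:
    spike rate (Hz) and latency to first spike (ms).
    v: membrane potential samples (mV); dt: sample interval (ms)."""
    # indices where the trace crosses the threshold upward
    crossings = np.where((v[:-1] < threshold) & (v[1:] >= threshold))[0]
    duration_ms = len(v) * dt
    rate = len(crossings) / (duration_ms / 1000.0)
    # if the trace never spikes, report the full duration as latency
    latency = crossings[0] * dt if len(crossings) else duration_ms
    return rate, latency

def feature_error(model_v, target_v, dt, weights=(1.0, 1.0)):
    """Weighted absolute feature differences between model and data."""
    fm = spike_features(model_v, dt)
    ft = spike_features(target_v, dt)
    return sum(w * abs(a - b) for w, a, b in zip(weights, fm, ft))
```

A multi-objective formulation would keep the per-feature errors as separate criteria instead of collapsing them into one weighted sum.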
Optimization algorithms
Hand tuning: Bhalla and Bower, 1993; Foster et al., 1993; Prinz et al., 2003
Gradient descent: Bhalla and Bower, 1993
Evolutionary algorithms: Achard and De Schutter, 2006; Gerken et al., 2006;Keren et al., 2005; Pettinen et al., 2006; Vanier and Bower, 1999; Taylor andEnoka, 2004; van Geit et al., 2008
Hybrid methods: van Geit et al., 2007
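None of the cited algorithms is reproduced here, but the evolutionary-algorithm idea can be sketched in a few lines. This toy elitist search recovers a single scalar "conductance"; the quadratic objective is a stand-in for a real trace- or feature-based error:

```python
import random

def evolve(fitness, low, high, pop=20, gens=100, sigma=0.5, seed=1):
    """Minimal elitist evolutionary search over one scalar parameter:
    keep the best half of the population, refill it with
    Gaussian-mutated copies clipped to the search bounds."""
    rng = random.Random(seed)
    xs = [rng.uniform(low, high) for _ in range(pop)]
    for _ in range(gens):
        xs.sort(key=fitness)          # lower fitness = better
        parents = xs[:pop // 2]
        children = [min(high, max(low, p + rng.gauss(0.0, sigma)))
                    for p in parents]
        xs = parents + children
    return min(xs, key=fitness)

# Toy objective standing in for a trace-comparison error
target = 36.0
best = evolve(lambda g: (g - target) ** 2, low=0.0, high=100.0)
```

In real applications each candidate is evaluated by running the full compartmental simulation, which is why these searches are computationally expensive.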
Neuronal parameter optimization
Hybrid method (van Geit et al., 2007)
Neuronal parameter optimization
Multi-objective (Druckmann et al., 2007)
Neuronal parameter optimization
GEneral NEural SImulation System (GENESIS) (Bower andBeeman, 1998)
Structural optimization in brain networks
Brain networks
Brain activity can be modeled as a dynamical process acting on a network (De Lucia et al., 2005)
Each vertex of the structure represents an elementary component (brainareas, groups of neurons or individual cells)
Real and artificial brain networks
The network topology is a key element for understanding the behavior of the process
Based on real brain networks, similar artificial brain networks can be obtained via optimization
The artificial brain networks should satisfy some constraints identified in the real network
The similarity is measured in terms of predefined topological characteristics (structural motifs and functional motifs)
Structural optimization in brain networks
Extraction of a real brain structural connectivity network
Structural optimization in brain networks
Optimization approach
Structural optimization in brain networks implies the identification of a network topology that:
    satisfies a number of constraints identified in the real brain network (same node degree sequence)
    is optimal with respect to measure(s) defined in the space of networks (structural motifs and functional motifs)
Multiobjective optimization under constraints
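One standard way to search such a constrained space (a sketch, not necessarily the procedure used in the work cited here) is degree-preserving arc swapping: every rewiring step keeps each node's in- and out-degree fixed, so candidate networks automatically satisfy the degree constraints while the motif-based objectives vary:

```python
import random

def degree_preserving_rewire(arcs, n_swaps=100, seed=0):
    """Randomize a directed network (list of unique (source, target)
    arcs) while preserving every node's in- and out-degree: swap the
    targets of two arcs a->b, c->d into a->d, c->b, rejecting swaps
    that would create a self-loop or a duplicate arc."""
    rng = random.Random(seed)
    arcs = list(arcs)
    arc_set = set(arcs)
    for _ in range(n_swaps):
        i, j = rng.sample(range(len(arcs)), 2)
        (a, b), (c, d) = arcs[i], arcs[j]
        if a == d or c == b or (a, d) in arc_set or (c, b) in arc_set:
            continue  # swap would violate simple-digraph constraints
        arc_set -= {(a, b), (c, d)}
        arc_set |= {(a, d), (c, b)}
        arcs[i], arcs[j] = (a, d), (c, b)
    return arcs
```

A genetic algorithm can use such swaps as its mutation operator, scoring each rewired network by its structural- and functional-motif counts.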
Structural optimization in brain networks
Structural motifs and functional motifs
    Motifs (Milo et al., 2002) as small network building blocks which are defined by their size and interconnection patterns
    It is possible (Sporns and Kötter, 2004) to gain insight into the rules governing the structure of complex networks by investigating their composition from motifs
    Structural motifs as sets of brain areas and pathways that can potentially engage in different patterns of interactions
    Functional motifs as specific combinations of nodes and connections (contained in the structural motifs) that may be activated in the course of neural information processing
Structural optimization in brain networks
Structural motifs and functional motifs
A “motif” is a connected graph consisting of M vertices and a set of edges (arcs) forming a subgraph of a larger network (maximally M² − M arcs for directed graphs, minimally M − 1)
For M = 2, 3, 4, 5 the corresponding numbers of motif types are 2, 13, 199 and 9,364 (Harary and Palmer, 1973)
A “structural motif” of size M is composed of a specific set of M vertices that are linked by edges (arcs). The name is “structural” because a larger network could be structurally assembled from a finite set of such motifs
As different edges (arcs) become functionally engaged, different “functional motifs” emerge within a single structural motif
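The motif-type counts quoted above can be brute-force checked for small M, treating a motif type as an isomorphism class of weakly connected simple digraphs (no self-loops); the function names below are mine:

```python
from itertools import combinations, permutations

def weakly_connected(arcs, m):
    """True if the undirected version of the arc set connects all m nodes."""
    und = set(arcs) | {(b, a) for a, b in arcs}
    reach, frontier = {0}, [0]
    while frontier:
        u = frontier.pop()
        for a, b in und:
            if a == u and b not in reach:
                reach.add(b)
                frontier.append(b)
    return len(reach) == m

def motif_classes(m):
    """Number of motif types of size m: isomorphism classes of weakly
    connected directed graphs on m vertices without self-loops."""
    nodes = range(m)
    all_arcs = [(a, b) for a in nodes for b in nodes if a != b]
    seen = set()
    for r in range(m - 1, len(all_arcs) + 1):  # need >= m-1 arcs
        for arcs in combinations(all_arcs, r):
            if not weakly_connected(arcs, m):
                continue
            # canonical form: lexicographically smallest relabelling
            canon = min(tuple(sorted((p[a], p[b]) for a, b in arcs))
                        for p in permutations(nodes))
            seen.add(canon)
    return len(seen)
```

Counting by canonical relabelling is feasible only for small M, since both the number of arc subsets and the number of permutations grow rapidly.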
Structural optimization in brain networks
Max. functional motifs and min. structural motifs (Sporns and Kötter, 2004)
The results suggest the hypothesis that brain networks maximize the number of functional motifs, while the repertoire of structural motifs remains small
Results are consistent with the hypothesis that highly evolved neural architectures are organized to:
    maximize functional repertoires (maximizing the number of functional motifs)
    support highly efficient integration of information (minimizing the number of structural motifs)
Structural optimization in brain networks
Non-dominated solutions learned in each run of the genetic algorithm (blue points), absolute non-dominated solutions (red points), and the original network (triangle)
Intro PattRecog Simulation Optimization Conclusions Challenging More material
Computational intelligence for neuroscience
Neuroscience: challenging problems in modeling, simulation and optimization
Brain as the most complex organ
Computational intelligence
    Pattern recognition
        Neuroimaging-omics
        Clustering of neurons
        Classification of neurons
    Computer simulation of dendritic morphology
    Parameter and structural optimization in neuronal and brain networks
Computational intelligence in neuroscience
Challenging
    Understanding the human brain will have profound consequences for technology, health and society (two billion people on the planet are affected by mental disorders)
    Reverse engineering the brain and the mind has deep philosophical consequences in addition to its practical implications
    How long will it take? It depends on resources, quality and quantity of researchers, coordination, computing speed, storage capacity, automated imaging systems, ... (a multi-decade effort)
    Most important outcome: computational neuroscientists armed with the knowledge and tools to contribute to the future of humanity
Relevant conferences
NIPS, INCF
Neural Information Processing Systems (NIPS)
    Key words: brain imaging, cognitive science and artificial intelligence, neuroscience
    2009 Workshops:
        Connectivity inference in neuroimaging
        The curse of dimensionality problem: How can the brain solve it?
Congress of Neuroinformatics (INCF)
    The International Neuroinformatics Coordinating Facility (INCF) has, since 2005, coordinated and fostered international activities for discovery and innovation in neuroscience and related fields
    14 current member countries
    2009 Workshops:
        Advances in the automatic analysis of multi-dimensional data
        The neuroinformatics of neural connectivity
        High-performance computing and grid infrastructure for neuroinformatics applications
Fall 2009: applying data mining to map the “cables of the human brain” (∼300,000 fiber streamlines) into 20-50 cables or tracts. Still open
Relevant books
G. Ascoli (ed.) (2002) Computational Neuroanatomy: Principles and Methods, Humana Press
E. De Schutter (2003) Computational Neuroscience: Trends in Research 2003, Elsevier
R. Hecht-Nielsen, T. McKenna (2003) Computational Models for Neuroscience: Human Cortical Information Processing, Springer
P. Dayan, L.F. Abbott (2005) Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems, MIT Press
Relevant books
S. Koslow, S. Subramaniam (eds.) (2005) Databasing the Brain: From Data to Knowledge, Wiley
L. Benuskova, N. Kasabov (2007) Computational Neurogenetic Modeling, Springer
K. Doya, S. Ishii, A. Pouget, R. Rao (eds.) (2007) Bayesian Brain: Probabilistic Approaches to Neural Coding, MIT Press
J. McBrewster, F. Miller, A. Vandome (2009) Mind Uploading: Artificial Neural Network, Artificial Intelligence, Neuroinformatics, Computational Neuroscience. Alphascript Press
Relevant journals
Science
Nature
Neuroinformatics
Neurocomputing
BMC Neuroscience
Journal of Neuroscience Methods
Journal of Computational Neuroscience
Mathematical Biosciences
Frontiers in Neuroscience
Bioinformatics
PLoS Computational Biology
NeuroImage
Artificial Intelligence in Medicine
Pattern Recognition Letters
COMPUTATIONAL INTELLIGENCE
FOR NEUROSCIENCE
Concha Bielza, Pedro Larrañaga
Departamento de Inteligencia Artificial
Universidad Politécnica de Madrid
Discovery Science and Algorithmic Learning Theory - 2009
Porto, October 3, 2009