Page 1:

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

May 12 2017

Page 2:

Content

• 1. Introduction

• 2. Proposed Technique
  2.1 Learning Fast Localized Spectral Filters
  2.2 Graph Coarsening
  2.3 Fast Pooling of Graph Signals

• 3. Numerical Experiments

Page 3:

GCNN

Page 4:

Abstract

• In this work, we generalize CNNs from low-dimensional regular grids, where image, video and speech are represented, to high-dimensional irregular domains, such as social networks, brain connectomes or word embeddings, represented by graphs.

• We present a formulation of CNNs in the context of spectral graph theory. Importantly, the proposed technique offers the same linear computational complexity and constant learning complexity as classical CNNs.

Page 5:

Introduction

• The major bottleneck of generalizing CNNs to graphs, and one of the primary goals of this work, is the definition of localized graph filters which are efficient to evaluate and learn.

• CNNs extract the local stationarity property of the input data or signals by revealing local features that are shared across the data domain. These similar features are identified with localized convolutional filters or kernels.

• Localized kernels or compactly supported filters refer to filters that extract local features independently of the input data size, with a support size that can be much smaller than the input size.

Page 6:

Contributions

• Spectral formulation: a GCNN built on established tools in graph signal processing (GSP).

• Strictly localized filters: enhancing [4], the proposed spectral filters are provably strictly localized in a ball of radius K, i.e. K hops from the central vertex.

• Low computational complexity: the evaluation complexity of the filters is linear in the filter support size K and in the number of edges. The method avoids the Fourier basis altogether, and thus the expensive EVD needed to compute it as well as the need to store the basis, a matrix of size n^2. Besides the data, the method only requires storing the Laplacian, a sparse matrix.

• Efficient pooling: an efficient pooling strategy on graphs which, after a rearrangement of the vertices into a binary tree structure, is analogous to the pooling of 1D signals.

• Experimental results: superior in both accuracy and complexity to the pioneering spectral graph CNN [4], and performs similarly to a classical CNN on MNIST.

Page 7:

Proposed Technique

• Design of localized convolutional filters on graphs;

• Graph coarsening procedure that groups together similar vertices;

• Graph pooling operation that trades spatial resolution for higher filter resolution.

Page 8:

Learning Fast Localized Spectral Filters

• There are two strategies to define convolutional filters on graphs:

• 1. Spatial approach: faces the challenge of matching local neighborhoods across vertices.

• 2. Spectral approach: defines convolutions via a Kronecker delta implemented in the spectral domain. By the convolution theorem, convolutions are linear operators that are diagonalized in the Fourier basis.

- However, a filter defined in the spectral domain is not naturally localized, and translations are costly due to the O(n^2) multiplication with the graph Fourier basis. Both limitations can be overcome.

Page 9:

Learning Fast Localized Spectral Filters

• 1. Graph Fourier Transform

• 2. Polynomial parametrization for localized filters

• 3. Recursive formulation for fast filtering

Page 10:

Graph Fourier Transform

There are a lot of math symbols here, so I will just summarize the intuition (a small NumPy sketch follows below):

1. Transform the undirected graph into its graph Laplacian, L = D - W (degree matrix minus adjacency matrix), then normalize it.

2. Find the Fourier basis U: the Laplacian is indeed diagonalized by the Fourier basis, L = U Λ U^T.

3. The convolution operation becomes an element-wise (Hadamard) product in the Fourier domain.
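
Below is a minimal sketch of these three steps on a hypothetical toy graph, using plain NumPy. The adjacency matrix W, the signal x and the low-pass response g are made-up illustrations, not material from the slides or the paper.

```python
import numpy as np

# Hypothetical small undirected graph, given by its weighted adjacency matrix W.
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])

# 1. Laplacian L = D - W, then symmetric normalization: I - D^{-1/2} W D^{-1/2}.
d = W.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(len(d)) - D_inv_sqrt @ W @ D_inv_sqrt

# 2. Fourier basis U: the Laplacian is diagonalized as L = U diag(lam) U^T.
lam, U = np.linalg.eigh(L)

# 3. Filtering in the spectral domain: take the Fourier transform U^T x,
#    multiply element-wise by the filter response g(lam), transform back.
x = np.random.randn(len(d))      # a graph signal, one value per vertex
g = np.exp(-2.0 * lam)           # an example low-pass spectral response
y = U @ (g * (U.T @ x))          # filtered signal back in the vertex domain
```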

Page 11:

Polynomial parametrization for localized filters

1. A filter learned directly in the Fourier domain (via the graph Fourier transform) has high learning complexity, with O(n) parameters, and is not localized in space. Both issues can be overcome by using a polynomial filter (see the sketch below).

2. One can also consider a smoothing kernel, such as splines.
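
As a small illustration (again a NumPy sketch with made-up inputs, not code from the paper): a polynomial filter applies powers of the Laplacian to the signal, and because L^k only connects vertices that are at most k hops apart, a filter with K coefficients is strictly localized within K-1 hops.

```python
import numpy as np

def poly_filter(L, x, theta):
    """Apply the polynomial filter y = sum_k theta[k] * L^k @ x to a signal x.

    L^k only couples vertices at most k hops apart, so a filter with K
    coefficients is strictly localized within K-1 hops of each vertex.
    """
    y = np.zeros(x.shape)
    Lk_x = x.copy()              # L^0 x
    for theta_k in theta:
        y += theta_k * Lk_x
        Lk_x = L @ Lk_x          # move on to L^{k+1} x
    return y
```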


Page 13:

Recursive formulation for fast filtering

Using the polynomial parametrization is nice, but the cost to filter a signal x is still high, O(n^2), because of the multiplication with the Fourier basis. (These previous methods were actually proposed in [4].)

So they propose their own method: parametrize the filter as a polynomial of the Laplacian that can be computed recursively from L; since L is sparse, the cost drops to O(K|E|) sparse matrix-vector multiplications.

Approximation methods (a sketch of the first follows below):

1. Chebyshev expansion (the one they use in their experiments)

2. Lanczos algorithm
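
Here is a minimal sketch of the Chebyshev recursion T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x) applied to a rescaled Laplacian. It assumes a SciPy sparse matrix L, a known (or upper-bounded) largest eigenvalue lmax, and at least two coefficients in theta; the names and setup are illustrative, not the authors' code.

```python
import numpy as np
import scipy.sparse as sp

def chebyshev_filter(L, x, theta, lmax=2.0):
    """Compute y = sum_k theta[k] * T_k(L_tilde) @ x with L_tilde = 2L/lmax - I.

    Each recursion step is one sparse matrix-vector product, so the total cost
    is O(K|E|) instead of the O(n^2) dense Fourier transform.
    Assumes len(theta) >= 2.
    """
    n = L.shape[0]
    L_tilde = 2.0 * L / lmax - sp.identity(n, format="csr")
    Tk_prev, Tk = x, L_tilde @ x                  # T_0 x = x,  T_1 x = L_tilde x
    y = theta[0] * Tk_prev + theta[1] * Tk
    for theta_k in theta[2:]:
        Tk_next = 2.0 * (L_tilde @ Tk) - Tk_prev  # Chebyshev recurrence
        y = y + theta_k * Tk_next
        Tk_prev, Tk = Tk, Tk_next
    return y
```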

Page 14:

Learning the filters

• This is essentially the same as in a classical CNN: the filter coefficients are learned by backpropagation over a mini-batch of S samples (a small gradient sketch follows below).
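
Because the filter output is linear in the coefficients, the gradient used by backpropagation has a simple form: dE/dtheta_k is the inner product of T_k(L_tilde) x with dE/dy. The following toy NumPy sketch (made-up diagonal L_tilde, random data and an MSE loss, none of it from the slides) illustrates one forward and backward pass for a single sample.

```python
import numpy as np

n, K = 6, 4
rng = np.random.default_rng(0)
L_tilde = np.diag(rng.uniform(-1.0, 1.0, n))    # stand-in for 2L/lmax - I
x, y_target = rng.normal(size=n), rng.normal(size=n)
theta = rng.normal(size=K)

# forward pass: Chebyshev recursion, keeping each T_k(L_tilde) x
Tx = [x, L_tilde @ x]
for _ in range(2, K):
    Tx.append(2.0 * (L_tilde @ Tx[-1]) - Tx[-2])
y = sum(t * v for t, v in zip(theta, Tx))

# backward pass for the MSE loss E = 0.5 * ||y - y_target||^2
dE_dy = y - y_target
grad_theta = np.array([v @ dE_dy for v in Tx])  # one inner product per theta_k
# a mini-batch of S samples would simply average these per-sample gradients
```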

Page 15:

Graph Coarsening

• After the convolution operation, we need a pooling operation. This raises a problem: pooling requires meaningful neighborhoods on the graph, where similar vertices are clustered together. Doing this for multiple layers is equivalent to a multi-scale clustering of the graph that preserves local geometric structure.

• Although this clustering problem is NP-hard, they use the greedy coarsening phase of Graclus (which builds on METIS) to compute successive coarser versions of a given graph (a sketch of one coarsening level follows below).
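
As a rough illustration of one greedy coarsening level: pick an unmarked vertex, match it with the unmarked neighbor that maximizes the local normalized cut W_ij (1/d_i + 1/d_j), and mark both. The NumPy sketch below is my own toy code under simplifying assumptions (dense adjacency matrix, no isolated vertices), not the authors' implementation.

```python
import numpy as np

def coarsen_once(W, rng=np.random.default_rng()):
    """Greedily match vertices into pairs; returns a cluster id per vertex."""
    n = W.shape[0]
    d = W.sum(axis=1)
    marked = np.zeros(n, dtype=bool)
    cluster = np.full(n, -1, dtype=int)
    next_id = 0
    for i in rng.permutation(n):                 # visit vertices in random order
        if marked[i]:
            continue
        # score each unmarked neighbor j by the local normalized cut
        score = np.full(n, -np.inf)
        candidates = (~marked) & (W[i] > 0)
        candidates[i] = False
        score[candidates] = W[i, candidates] * (1.0 / d[i] + 1.0 / d[candidates])
        j = int(np.argmax(score))
        marked[i], cluster[i] = True, next_id
        if np.isfinite(score[j]):                # matched pair (i, j)
            marked[j], cluster[j] = True, next_id
        next_id += 1                             # unmatched i stays a singleton
    return cluster
```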


Page 17:

Fast Pooling
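
This slide is a figure; the idea from the paper is that after coarsening, fake nodes are added so every coarse vertex has exactly two children, and the vertices are reordered so the children of coarse vertex i sit at positions 2i and 2i+1. Graph pooling then reduces to ordinary 1D pooling of size 2. Below is a minimal sketch with made-up values; fake nodes carry a neutral value so they never win the max (the paper uses 0 together with ReLU activations).

```python
import numpy as np

def fast_max_pool(x):
    """Max-pool a rearranged graph signal of even length over pairs (2i, 2i+1)."""
    return x.reshape(-1, 2).max(axis=1)

# Example: 8 rearranged vertex values (with -inf on fake nodes) -> 4 coarse values.
x = np.array([0.3, -np.inf, 1.2, 0.7, 0.0, 0.5, 2.1, -np.inf])
print(fast_max_pool(x))   # [0.3 1.2 0.5 2.1]
```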

Page 18: